When your own voice becomes a weapon: The rise of AI-powered audio fraud
For a long time, phishing emails with poor grammar were the cybercriminal's primary tool. But with the rapid advancement of generative artificial intelligence (AI), the threat landscape has changed dramatically. Today, a short audio snippet is enough to clone an ordinary person's voice with uncanny realism.
This phenomenon, known as “voice cloning” or “deepfake audio,” is revolutionizing scams like the classic “grandchild scam.”

The technology: Three seconds are enough
Where the fraudsters get your voice from
The fraud scenarios
Statistics and risk perception
How to protect yourself