Until recently, scammers worked only by phone. Using psychological manipulation, they persuaded people to take actions that ended with money in the scammers' accounts.
However, times change, and criminals also keep pace with progress, using all the achievements of modern technology…

In recent years, deepfake technology has been developing at a rapid pace. These artificial intelligence algorithms make it possible to create realistic videos and audio recordings, which can be used both for entertainment and for fraud. Unfortunately, attackers increasingly use deepfakes to deceive people, damage their reputations, and extort money. To protect yourself from such threats, it is important to understand how this technology works and how to counter it.
What is a deepfake
Deepfake is a technology based on machine learning algorithms and artificial neural networks that can superimpose one person's face onto another's or alter a voice to make it identical to someone else's. These manipulations are often so realistic that they are hard to distinguish from the original. Attackers can use deepfakes for:
- Extorting money. For example, a deepfake call on behalf of your boss or a friend asking you to transfer money.
- Creating fake compromising videos. These can be used for blackmail or reputational damage.
- Spreading misinformation. Such deepfakes are used to fabricate false news.

How to recognize a deepfake
Despite the realism of deepfakes, there are several signs that can indicate a fake:
- Incorrect eye movements and facial expressions. The technology often cannot accurately recreate blinks, gaze direction, or natural facial expressions.
- Distortions on the borders of the face. Sometimes unnatural transitions between the face and background are visible.
- Unusual audio defects. A deepfake voice may sound mechanical or unnatural, with inconsistent intonation.
- Blurred or incorrect details. Pay attention to the teeth, hair, or ears: they may look fuzzy or contain artifacts.
- Unnatural head movements. Strange distortions may appear when the head turns.
Commentary from a cybersecurity expert
An acquaintance calls you on video and asks to borrow a large sum, or your manager demands an urgent transfer to a new client: ask the person to turn their head 90 degrees. If it is a deepfake, the face will "float."

Unfortunately, as AI improves, all of the signs above may soon become unreliable.
A widely publicized case occurred in Hong Kong in 2024. An employee of a multinational corporation, after a video call with a deepfake of the company's chief financial officer and other colleagues, transferred $25 million to scammers. The deception came to light only a week later, when the employee himself called the head office.
In May 2023, a Chinese businessman received a video call from someone who looked and sounded like a close friend. The caller asked him to transfer $610,000 to cover a guarantee for participation in a tender. The deepfake image was so convincing that the businessman complied.
How to protect yourself from deepfake scams
Here are some practical tips to protect yourself from deepfakes:
- Don't trust visual or audio information alone. If you receive a suspicious video or voice message, try to confirm the information another way: for example, call the sender back on a known number or ask them in person.
- Use modern tools to check for deepfakes. Several programs and online services analyze media files for signs of manipulation, for example:
- Deepware Scanner
- Sensity AI
- InVID (video analysis plugin)
- Be careful with personal data. Fraudsters can use your photos, videos and audio recordings to create deepfakes. Limit access to your data on social networks and avoid posting unnecessary information.
- Train your employees and loved ones. Attackers often use deepfakes to attack companies. Provide digital security training to increase your colleagues' awareness.
- Follow the news. Deepfake technologies are constantly evolving, so read regularly about new ways to combat them.
- Trust the professionals. If you are a victim of a deepfake scam, contact cybersecurity experts and law enforcement. They will help you identify the source of the threat and protect your data.
- Turn on two-factor authentication on your accounts. This will add an extra layer of security and make it more difficult for scammers to gain access to your personal data.
- Enable Viber's recently added phone-number check. Using a shared database, the feature rates how trustworthy a caller's phone number is. Turn it on; it really works.
- Agree on code words that strangers cannot know, and use them to verify a caller's identity. You can also ask about little-known facts from the other person's life, though be careful with this as well.
- You can ask the person to take a sip from a cup, smooth their hair, take off their glasses, or simply pass a hand in front of their face. In such situations the deepfake image typically starts to glitch, and you will notice it.
- Recognizing a deepfake during an audio call is harder. Experts recommend paying attention to intonation and characteristic word choices: be wary of a monotonous, emotionless voice. These signs, however, show up only in longer messages; inconsistencies are much harder to catch in a short phrase.
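For teams that want to build the code-word check above into a tool rather than rely on memory, the idea can be sketched in a few lines. This is a hypothetical helper, not part of any product: store only a salted hash of the agreed phrase, and compare candidates in constant time so the check itself leaks nothing.

```python
import hashlib
import hmac

def derive_verifier(codeword: str, salt: bytes) -> bytes:
    """Derive a stored verifier from the agreed code word (hypothetical helper)."""
    return hashlib.pbkdf2_hmac("sha256", codeword.encode("utf-8"), salt, 100_000)

def verify_codeword(candidate: str, salt: bytes, stored: bytes) -> bool:
    """Constant-time comparison of the candidate against the stored verifier."""
    return hmac.compare_digest(derive_verifier(candidate, salt), stored)

# Example: agree on "blue heron" in advance; only its hash is kept.
salt = b"example-salt"  # in practice, random and unique per contact
stored = derive_verifier("blue heron", salt)

print(verify_codeword("blue heron", salt, stored))    # True
print(verify_codeword("blue herring", salt, stored))  # False
```

Storing a hash rather than the phrase itself means that even if the device is compromised, the code word is not exposed in plain text.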
Future and responsibility
The world is grappling with the threat of deepfakes, and detection technology is advancing as well. For example, researchers are developing methods to automatically verify the authenticity of media files using blockchain and biometric identification. It is equally important that the companies developing such technologies regulate their use and put mechanisms in place to protect against abuse. Attackers, however, do not stand still either; they keep improving their methods of deception.
The most important advice: before taking any action, especially one involving money or valuables, pause and analyze the situation. In the vast majority of cases, that pause alone will save you from being scammed.