Illustration | ChatGPT
A spate of voice phishing scams has recently emerged in which criminals use artificial intelligence (AI) to fabricate the sound of a child crying and deceive parents into believing their child has been kidnapped, prompting calls for particular caution.
On the 1st, the Financial Supervisory Service issued a consumer alert (“caution” level), warning that “voice phishing scams pretending a child has been kidnapped are widespread.” In the wake of recent incidents such as the Kyowon Group hacking, concerns are growing that voice phishing targeting parents could spread further.
According to the Financial Supervisory Service, scammers have recently targeted parents in neighborhoods dense with private academies, calling them and reciting personal information such as the child’s name and the name of the academy: “We are in front of an academy in △△-dong; you are the mother of ○○, right?”
Without offering any concrete explanation, the scammer then briefly plays a recording of a crying child saying something like “Mom, a drunk man hit me” to stoke the parent’s anxiety. The crying, however, is a fake voice fabricated with AI.
The Financial Supervisory Service explained that crying blurs pronunciation and does not vary much from person to person, so even a startled parent can find it hard to tell whether the voice is really their child’s.
The scammer then claims that the child cursed at him, that he got angry, locked the child in his car, and beat the child, and demands money from the parents under pretexts such as covering his bar tab.
There have also been cases where scammers claimed the child struggled and broke their phone screen, then demanded the repair costs.
The Financial Supervisory Service noted that, unlike past scams that demanded large sums, many recent schemes ask for a small amount of around $375 (500,000 KRW) that can be deposited immediately, without canceling savings or time deposits or taking out a loan, so the crime can be wrapped up quickly.
The Financial Supervisory Service urged, “If someone plays the sound of a child crying over the phone and demands money, claiming the child has been kidnapped, you should suspect voice phishing that abuses AI,” and added, “First hang up on the scammer and directly verify the location and safety of your child.”
It added, “Scammers approach in plausible ways, using scenarios that could occur in a child’s daily life, making it difficult for victims to recognize the fraud,” and “Using the AI voice phishing detection services offered by telecom carriers can help prevent harm.”
#1
[Victim] Hello?
[Scammer] Yes, you are the mother of □□ (child name), correct? One moment. □□, quickly tell your mother. Do not cry!
[AI voice imitating child] Sob, sob... Mom... The man hit me...
[Victim] (to the scammer) Why? What happened?
[Scammer] I was just drinking and smoking on the street, and the kid cursed at me, saying XXXX. Ma’am, I got really angry, but I am holding back and calling. I do not intend to harm the kid, and I can just drop the kid off from the car.
[Victim] What are you talking about? Where are you?
[Scammer] I cannot just let the kid go. Show some goodwill and send me money to cover my drinks. Deposit only $375 (500,000 KRW). Make the transfer while we stay on the call.
#2
[Victim] Hello?
[Scammer] Hello? You are the mother of △△ (child name), right? I will put the child on.
[AI voice imitating child] Sob, sob... Mom... This person hit me... A drunk person hit me...
[Victim] Huh? What did you say? Where are you now?
[AI voice imitating child] I am in a car right now...
[Victim] Where are you?
[Scammer] I will tell you the location later, because if I tell you now, you will call the police. I just saw that the kid kicked and broke my phone screen. If you send only the phone repair cost, I will not hit the kid anymore and will let the kid out of the car. While we stay on the call, transfer only $375 (500,000 KRW) for the phone repair.