Telecom Fraud Case in China Using AI Voice Synthesis Technology
This case occurred in a region of China and serves as a striking example of telecom fraud that utilizes AI-generated voice cloning technology. The victims were an elderly couple in their seventies. It all began when 70-year-old Ms. Liu and her husband suddenly received a phone call from someone claiming to be their grandson. The voice on the line sounded exactly like their grandson’s — matching his tone, speech pattern, and emotional state almost perfectly.
In the call, the “grandson” frantically claimed that he had injured a classmate during an altercation at school. The classmate’s parents allegedly demanded a private settlement of 20,000 yuan (approximately 2,750 USD) to avoid police involvement. He pleaded with his grandparents for help and begged them not to inform his mother, even threatening to jump off a building if she found out.
Recognizing what they believed was their grandson’s voice, Ms. Liu and her husband rushed to the bank to withdraw 20,000 yuan in cash and went to a location designated by the caller, where they met a man who introduced himself as the classmate’s father. To strengthen their trust, the man placed a call to the supposed grandson, and Ms. Liu once again heard the familiar voice on the line urging her to hand the money over quickly. Without hesitation, she gave the cash to the man.
It wasn’t until several days later, when their real grandson returned home, that Ms. Liu mentioned the incident — only to realize they had been scammed. The grandson explained he had never made such a call and certainly hadn’t injured anyone. He also recalled having received several strange phone calls recently, where no one responded on the other end. Police later analyzed the situation and concluded that the fraudsters had deliberately made those silent calls to record the grandson’s voice. They then used AI voice-cloning technology to recreate his voice with uncanny accuracy and used it to carry out the scam.
This case illustrates a new and dangerous type of fraud, in which criminals leverage AI-based voice synthesis to impersonate specific individuals and emotionally manipulate their family members for financial gain. What was once science fiction is now a very real threat, especially to elderly people, who are more likely to trust familiar voices and react emotionally in urgent scenarios.
Police have issued a warning: if you receive a call from an unknown number, do not speak first — this prevents scammers from capturing your voice for cloning. If the caller remains silent, hang up immediately and block the number. Families should also establish secure communication habits, such as using pre-arranged code words or practicing fraud-awareness scenarios together. Additionally, avoid publicly sharing videos or voice recordings, especially of children or teens, as these can easily become training material for fraudsters using AI.
This case is more than just an isolated fraud incident — it is a wake-up call about the growing sophistication and accessibility of AI-driven scams. It reminds us all to stay vigilant and protect not only our personal information and finances, but also the unique data of our voices and identities.