Recently, the "Lei Feng Volunteer Police" at the Hunan Provincial Museum spotted an elderly woman behaving nervously outside a mobile phone store while on a FaceTime video call. Sensing a possible scam, they alerted the local police. Officers arrived quickly and found the woman in a video conversation with someone impersonating a police officer. The caller, dressed in a police uniform, was sternly instructing her to perform certain actions on her phone. The officers took the woman’s phone, identified themselves, and confronted the scammer, who abruptly ended the call.
The elderly woman, still confused, asked, “He wore a police uniform and could video call me—is this fake?” She explained that the caller accused her of criminal involvement and demanded her cooperation in a "financial investigation." She followed instructions out of fear, believing the scammer’s police identity.
# What Kind of Scam Was This?
The scam involved advanced AI face-swapping technology to create a fake video of the scammer posing as a police officer. Combined with FaceTime's video and screen-sharing features, the fraudster manipulated the victim into sharing sensitive data.
Key Methods Used by Scammers:
- AI Face-Swapping: The scammer superimposed a police officer's face onto their own in real-time, with synchronized facial expressions and speech, making the video call highly convincing.
- Screen Sharing: FaceTime’s built-in screen-sharing function enabled the scammer to view the victim's phone activity, including inputting banking details or confidential information.
# How Did the Scam Work?
The fraudster accused the victim of criminal activities and demanded cooperation to "verify" her innocence. They instructed her to buy a new phone and SIM card to “assist the investigation.” Through screen sharing, the scammer observed the victim’s phone operations in real-time, capturing her bank account details and personal data.
# Why Was It Effective?
- Dynamic Interaction: Unlike static images, AI-driven live video interactions seemed authentic, with natural facial expressions and gestures.
- Trust in Law Enforcement: Seeing a police uniform and receiving strict instructions prompted the victim to associate the caller with genuine authorities.
- Lack of Technical Knowledge: The victim was unfamiliar with AI face-swapping and FaceTime’s screen-sharing risks, making it difficult to detect the scam.
- Fear and Pressure: Allegations of criminal involvement induced panic, impairing the victim’s rational judgment and making her compliant.
# How Can the Elderly Prevent Such Scams?
- Be Wary of “Official Personnel” on Video Calls: Authorities such as police, banks, or government agencies typically do not contact individuals via FaceTime or similar video call apps. If uncertain, verify through official contact numbers instead of trusting caller IDs. Law enforcement does not conduct investigations via video calls, nor does it request bank transfers or account details. Always hang up and verify independently if you receive such a call.
- Disable FaceTime If You Do Not Use It: If FaceTime is rarely needed, it is advisable to turn it off to prevent misuse. On an iPhone, open “Settings,” tap “FaceTime,” and toggle it off to eliminate the risk of FaceTime-based scams.
- Refuse Screen Sharing with Strangers: Avoid sharing your screen with strangers on any video app. Be especially cautious if someone claiming to be an official asks to watch your banking operations.
- Watch Out for Phishing Websites and Fake Apps: Do not click on suspicious links, and be extra cautious of websites or pages you are directed to during a screen-sharing session. Always download apps from trusted sources such as the App Store.
- Protect Personal Information and Privacy: Avoid disclosing sensitive information such as facial data, bank card numbers, one-time passcodes, or passwords to strangers. Stay vigilant to prevent fraudsters from exploiting your personal details.
- Use Technical Safeguards: Tools such as Dingxiang Device Fingerprinting can detect screen-sharing activity in real time and alert the user to the potential risk.
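As a rough illustration of how such screen-sharing detection can work on iOS, the sketch below uses Apple's public `UIScreen` API, which reports when the screen is being recorded, mirrored, or shared. This is a generic example of the technique only, not Dingxiang's actual SDK, whose interface is not described in this article.

```swift
import UIKit

// Illustrative sketch: watch UIScreen's capture state, which becomes true
// when the screen is recorded, mirrored, or shared with another party.
// Not Dingxiang's API — just the general detection technique on iOS.
final class ScreenShareMonitor {
    private var observer: NSObjectProtocol?

    /// Starts observing capture-state changes and reports the current state.
    func start(onChange: @escaping (Bool) -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            onChange(UIScreen.main.isCaptured)
        }
        onChange(UIScreen.main.isCaptured) // report the initial state too
    }

    func stop() {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}

// Hypothetical usage on a sensitive screen (e.g. before entering bank details):
// let monitor = ScreenShareMonitor()
// monitor.start { captured in
//     if captured { print("Warning: your screen may be visible to others.") }
// }
```

An app that detects `isCaptured == true` can then hide sensitive fields or show a warning, which is the kind of real-time alert the article describes.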
Additionally, the public should remain informed about AI face-swapping technology’s ability to create fake dynamic videos. Exercise skepticism towards seemingly high-quality video images and refrain from entering sensitive data during unfamiliar calls or screen-sharing sessions. In such situations, avoid succumbing to fear and blindly following instructions. Instead, seek assistance from family members or the police.