Deepfake content a new reality of investment scams
![A still from the deepfake video featuring actor Zo In-sung presented to investors by an investment fraud gang. [JOONGANG PHOTO]](https://img4.daumcdn.net/thumb/R658x0.q70/?fname=https://t1.daumcdn.net/news/202402/24/koreajoongangdaily/20240224070029419ltyv.jpg)
A fraud victim in his 60s lost 160 million won ($120,500) after joining an investment scam group promoted through a YouTube video. The victim says he dropped his suspicions after seeing footage featuring renowned actors such as Zo In-sung and Song Hye-kyo.
Financial fraud using deepfake videos starring famous figures is increasing in Korea.
Deepfake content refers to images or videos manipulated with artificial intelligence (AI) to show digitally generated figures that are difficult to distinguish from real footage.
Earlier this year, a group of investment fraudsters in Korea used a deepfake video to deceive over 100 people into investing money, resulting in hundreds of millions of won in damages.
One of the scammers, holding the title of group manager, cheated the victims by claiming to be a close aide to "Battery Man" Park Soon-hyeok, a figure well known to EV battery investors on YouTube.
The manager claimed that he would invest on behalf of the group members with the money provided.
The scam group also uploaded a video online for investors featuring Zo and Song, who praised the unauthorized trading group for donating a portion of its investment proceeds.
Many investors, including the 60-something victim, believed the fraud was legitimate after seeing the video because they thought famous stars were participating alongside them.
In the end, however, everything was revealed to be a scam.
The video was a deepfake, using AI to make the actors appear to speak in the clip.
The manager was not an aide to "Battery Man," nor had he ever achieved an 800 percent return on investments.
The victim requested a refund from the management team, but it vanished after demanding commissions and taxes.
Another victim of the fraud lost 500 million won, saying he had invested savings set aside for his daughter's wedding and his business.
“The perpetrators are still on the run,” the victim added. “They may be deceiving somebody else using deepfake videos elsewhere."
Last November, a voice phishing gang that conned 149.1 billion won from 1,891 people was busted by the police right before it attempted to scam people using deepfake footage.
Police discovered that the gang was in the process of making deepfake videos with an actual prosecutor’s face and voice to deceive people.
Financial fraud using deepfake videos, however, is not an issue confined to Korea.
According to media reports Wednesday, a Hong Kong company employee mistakenly transferred 200 million Hong Kong dollars ($25.5 million) early this month after being scammed by a deepfake video featuring the company's finance officer.
Last December, the likeness of Singapore's Prime Minister Lee Hsien Loong was used for fraud, and in November, Australian entrepreneur Dick Smith was rendered in a deepfake video.
Both figures were depicted as being interviewed to promote an investment platform, which led to actual fraud damage.
Unfortunately, there is no way to prevent financial fraud using deepfake content other than “not being deceived.”
Measures such as labeling deepfake content as such are under discussion, but such labels are useless if criminals erase them.
Many consider advance detection the most effective way to prevent crimes using deepfake videos, but detection technology has struggled to keep up with the rapid development of generation technology.
“A high possibility exists of deepfake videos and voices being used for contactless financial transactions,” Park Ji-hong, a researcher at the Hana Institute of Finance, said to the JoongAng Ilbo. “We must put effort into securing detecting technology that can identify deepfake content beforehand.”
Some say policies addressing such new types of fraud need improvement, as it is almost impossible to recover victims' losses from the offenders.
“It is worth discussing measures to recover damages for victims by raising funds from the confiscated assets of criminals,” Kim Gye-hwan, a lawyer at the law firm Gamwoo, said.
BY OH HYO-JEONG [kim.jiye@joongang.co.kr]
Copyright © Korea JoongAng Daily. Unauthorized reproduction and redistribution prohibited.