Seoul to mobilize new AI solution in fight against child pornography

By Cho Jung-woo | May 22, 2024, 15:39

Seoul Mayor Oh Se-hoon, left, tries out the city's AI system that automatically detects sexually exploitative material during an anniversary event of the capital's digital sex crime support center, held in southern Seoul in March last year. [SEOUL METROPOLITAN GOVERNMENT]

The Seoul Metropolitan Government announced Wednesday that it will adopt newly developed AI technology that automatically identifies sexually exploitative material involving children to eradicate illegal content.

The city's think tank, the Seoul Institute, began developing the technology in March last year.

The capital is the first locality in Korea to adopt such a system to automatically detect and remove child sexual abuse material, or CSAM.

The technology produces a list of CSAM in just 90 seconds, dramatically faster than the two hours needed for a manual search. Detection accuracy is expected to improve by more than 300 percent. The city government expects up to 300,000 videos to be monitored with the technology, double last year's figure.

The technology was developed because children often refrain from telling their parents about the abuse, resulting in few reports to the authorities, the city government said. The metropolitan government implemented AI technology to remove sexually exploitative videos in March last year.

According to the digital sex crime support center run by the Seoul Foundation of Women and Family, cases in which children reported their digital sexual abuse to the authorities accounted for just 7.8 percent of the digital sex crime cases targeting children filed with the center.

Of the 2,720 pieces of material the center removed over the past two years, only 15.6 percent were deleted following such requests.

Unlike material involving adult victims, which can only be removed after the victims report it themselves, CSAM can be removed immediately without a request from the children or their parents.

According to the city government, the newly developed AI-detection system recognizes the gender and age of the children. The technology can detect whether an individual is a minor even without seeing their face by analyzing objects and elements in the material, such as books, dolls, uniforms and speaking styles.

The city government said that the technology would be used to search for videos worldwide, considering that such materials — which used to circulate mainly in the United States — have recently spread to other countries, including China, Russia and Vietnam.

Meanwhile, the city government assisted victims in 30,576 digital sex crime cases over the past two years through its digital sex crime support center.

Teenagers and people in their 20s accounted for the largest share of victims, and nearly 86 percent of the victims were women.

In particular, the number of digital sex crime cases against children filed at the center has risen over sevenfold in the past two years, from 2,026 in 2022 to 15,434 last year.

Online grooming accounted for the highest number of cases, followed by distribution and redistribution of CSAM.

BY CHO JUNG-WOO [cho.jungwoo1@joongang.co.kr]

Copyright © Korea JoongAng Daily. Unauthorized reproduction and redistribution prohibited.
