At least 25 arrests have been made in a worldwide operation against child abuse images generated by artificial intelligence (AI), the European Union's law enforcement agency Europol has said.
The suspects were part of a criminal group whose members were involved in distributing fully AI-generated images of minors, according to the agency.
The operation is one of the first to involve this kind of child sexual abuse material (CSAM), Europol says. The lack of national legislation against these crimes makes it “harder for investigators”, it added.
The arrests were made simultaneously on Wednesday 26 February as part of Operation Cumberland, led by Danish law enforcement, a news release said.
Authorities from at least 18 other countries have been involved and the operation is continuing, with more arrests expected in the next few weeks, Europol said.
In addition to the arrests, 272 suspects have been identified, 33 house searches have been conducted and 173 electronic devices have been seized, according to the agency.
It also said the main suspect was a Danish national who was arrested in November 2024.
The statement says he “ran an online platform where he distributed the material he produced”.
After making a “symbolic online payment”, users from around the world were given a password allowing them to “access the platform and watch children being abused”.
The agency says online child sexual exploitation is one of the top priorities for law enforcement organizations, which are dealing with “an ever-growing volume of illegal content”.
Europol added that even in cases where the content is fully artificial and no real victim is depicted, as with Operation Cumberland, AI-generated CSAM still “contributes to the objectification and sexualization of children”.
Europol’s executive director Catherine De Bolle said: “These artificially generated images are so easily created that they can be produced by individuals with criminal intent.”
She warned that law enforcement would need to develop “new investigative methods and tools” to meet these emerging challenges.
The Internet Watch Foundation (IWF) warns that more AI-generated sexual abuse images of children are being produced and are becoming more widespread on the open web.
In research last year, the charity found that over a one-month period 3,512 AI-generated child sexual abuse and exploitation images were discovered on a single dark web site. Compared with the same month a year earlier, the number of images in the most severe category (Category A) rose by 10%.
Experts say AI-generated child sexual abuse material is looking increasingly realistic, making it harder to tell real images from fake ones.