Over the past five years, there has been an overwhelming increase in Artificial Intelligence (AI) integration into many aspects of our society. In the fight against human trafficking, law enforcement and other professionals must be aware of this technology and its already vast reach.
Artificial Intelligence has proven to be a double-edged sword, working both for and against the mission to end human trafficking worldwide.
To combat this issue, it is important to first understand how traffickers are utilizing this technology. As technology improves and internet access expands, it becomes easier for people to maintain online anonymity. With platforms such as OpenAI’s ChatGPT, predators can generate language that appeals to young children.
For example, giving ChatGPT the prompt, “Write me a text using modern slang that a 10-year-old girl would understand,” could produce language that causes a potential victim to trust an online predator.
When social media began to gain popularity, it became easier for people to lie about details in ways that would be impossible without the shield of a screen. Fake profile pictures, posts, and bios were just the beginning of a much larger issue. Generative AI, as defined by the multinational technology company IBM, refers to deep-learning models that can generate high-quality text, images, and other content based on the data they were trained on. Predators can create fake profile pictures and posts featuring people who have never existed and never will, making the predators more difficult to locate.
Along with that, AI has enabled predators to modify and distort existing images of children, create instructional guides on the abuse of minors, and generate poetry and stories containing child sexual abuse content.
While content of this nature is obviously prohibited on these platforms, predators have found ways around these guidelines.
Like most technology, AI has both negatives and positives depending on its use. Law enforcement needs to be aware of the harms of AI and learn to utilize it as a tool for themselves.
The National Center for Missing & Exploited Children, a national non-profit organization that works to fight child victimization, began using Reveal AI late last year. Reveal has assisted the organization in processing over 21,000 cases, saving over 4,000 hours of labor.
Similarly, Street Grace, a non-profit organization working to eradicate child sexual exploitation, developed transactional interception: AI-generated decoy ads that target child predators. Gracie, the organization’s AI chatbot, communicates with the buyer about the ad. Once the purchase is confirmed, Gracie informs the buyer of the risks. Subsequently, information about the buyer is sent directly to law enforcement.
“The only answer is technology,” President and CEO of Street Grace Bob Rodgers said.
AI works faster than humans. The ability to identify high-risk locations, people, or patterns in a timely fashion can make a big difference. Evidence sorting, combing through video or photo surveillance, and advanced facial recognition are all made more efficient through the adoption of AI technologies.
– Ilaria Noonan is the Community Outreach Intern. She can be reached at ilaria@encstophumantrafficking.org