New world. Artificial intelligence against online hate

How to detect violent messages on social networks? (illustration photo) (PHOTOPQR / NICE MATIN / MAXPPP)

How to fight against hatred and false information on social networks? Can technology help solve the problems it itself has caused? Several companies are now trying to invent automatic techniques for monitoring and filtering content. This is the case of the Toulouse startup, Predicta Lab.

franceinfo: How does your technology limit hate and fake news on social networks?

Baptiste Robert, founder of Predicta Lab: The idea is to retrieve data from all the social networks, analyze it, and create alerts based on what is discovered. It could be online hate, harassment, or something else that is unfolding. We are able to capture any agitation on a network based in particular on semantic analysis.
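The approach described above could be sketched, in a deliberately simplified form, as a lexical filter plus an alert threshold. This is a minimal illustration, not Predicta Lab's actual system; the term list, threshold, and function names are all hypothetical:

```python
# Hypothetical sketch: flag messages containing violent terms and raise an
# alert when their share of recent traffic crosses a threshold.
# The term list and threshold are illustrative only.
VIOLENT_TERMS = {"kill", "beat", "attack", "die"}

def is_violent(message: str) -> bool:
    """Very coarse lexical check: does the message contain a flagged term?"""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return bool(words & VIOLENT_TERMS)

def agitation_alert(messages: list[str], threshold: float = 0.3) -> bool:
    """Alert if the fraction of flagged messages exceeds the threshold."""
    if not messages:
        return False
    flagged = sum(is_violent(m) for m in messages)
    return flagged / len(messages) >= threshold

stream = ["you deserve to die", "nice weather today", "I will beat you"]
print(agitation_alert(stream))  # 2 of 3 messages flagged -> True
```

A production system would replace the keyword set with proper semantic analysis (embeddings or a trained classifier) to capture context, which is exactly the nuance problem discussed later in the interview.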

What types of content can you detect?

We detect incidents of online violence and harassment very quickly. I recently saw a video of a woman being beaten by her husband, with a flood of hateful comments below it. There are people who spend their evenings, from 6 p.m. to midnight, insulting others on the Internet. This content can be picked up by our algorithms.

Who is your solution for?

Our solution may be of interest to the media, who need to monitor the news. It may also be of interest to the authorities, in the context of the fight against terrorism or online hatred, but also to detect disasters. Finally, it will be of interest to companies for everything relating to e-reputation and online crisis management. We are already in contact with several interested organizations.

Is technology the silver bullet against online hate?

Technology is not magic. It has a hard time picking up nuance. On the other hand, we can easily detect overtly violent language. If we managed to remove just that from social networks, they would already be a little calmer.
