Alphabet’s newest tool gives your comments a rating based on how toxic they are

You might just learn a lot about yourself.

Phillip Tracy

A unit of Google’s parent company is teaching computers the awful things people say online, so it can make the internet a safe and civil place.

Jigsaw, a subsidiary of Alphabet, released two machine learning models Thursday that will tell you how disrespectful your online comments are—and even help you avoid rude comments from others.

First, there is the reading experiment, which takes comments from three contentious topics (climate change, Brexit, and the U.S. election) and filters them based on how toxic they are. Jigsaw says this will help developers and publishers give real-time feedback to commenters, help moderators do their job, and even help readers find relevant information more easily.

The second module instantly rates any comment you input into its webpage with a score from 0 to 100, based on its toxicity. Just load up the website, scroll down to “writing experiment,” and start typing in the large text box. Once you stop typing, the machine will spit out a result about your comment that looks something like, “X% similar to comments people said were toxic.”
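To make that feedback loop concrete, here is a hedged Python sketch of how a client might talk to a Perspective-style scoring service. The request and response shapes below are assumptions modeled on the demo's "X% similar" message, not a documented contract, and the response is mocked rather than fetched from the live service.

```python
# Hypothetical sketch of a toxicity-scoring round trip.
# Field names (comment.text, requestedAttributes, attributeScores)
# are illustrative assumptions, not a published API contract.

def build_request(comment_text):
    """Build the JSON body for a toxicity-analysis request."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def format_score(response):
    """Turn a raw 0-to-1 score into the on-screen message the demo shows."""
    score = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return f"{round(score * 100)}% similar to comments people said were toxic"

# A mocked response standing in for the live service:
mock_response = {
    "attributeScores": {"TOXICITY": {"summaryScore": {"value": 0.22}}}
}

print(format_score(mock_response))
# 22% similar to comments people said were toxic
```

The interesting design choice is that the score is framed as similarity to comments humans flagged, rather than as an absolute verdict, which is exactly the wording the demo shows.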

The tool, called Perspective, was built using 17 million reader comments from the New York Times and Wikipedia, plus additional data collected from victims of online harassment. Those comments were then rated by several thousand people to help determine their toxicity levels.
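That labeling step, showing each comment to several raters and aggregating their judgments, can be sketched in a few lines of Python. The comments and vote counts here are made up for illustration; the real pipeline involved thousands of raters.

```python
# Toy sketch of crowd labeling: each comment is shown to several
# raters, and the fraction who flag it as toxic becomes its label.
# These example comments and votes are invented, not real data.
ratings = {
    "thanks, that was a helpful explanation": [0, 0, 0, 0, 0],
    "only a moron would believe this":        [1, 1, 1, 0, 1],
}

def toxicity_label(votes):
    """Fraction of raters who flagged the comment as toxic (0.0 to 1.0)."""
    return sum(votes) / len(votes)

for comment, votes in ratings.items():
    print(f"{toxicity_label(votes):.0%}  {comment}")
```

Labels like these are what a machine learning model is then trained against, which is also why the model inherits the raters' biases, as the scores further down suggest.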

Alphabet defines "toxic" as "a rude, disrespectful, or unreasonable comment that is likely to make you leave a discussion."

We played around with the tool for a bit and came up with some interesting results. “Donald Trump” has a toxicity level of 22 percent, while “Hillary Clinton” only has 5 percent. “Barack Obama” falls somewhere between with 16 percent.

Unsurprisingly, adding a curse word to any sentence will see your comment’s levels go through the roof. That thing our current president said about what he does to women has a staggering 92 percent toxicity rating.

As Vocativ points out, "Muslims" has a 57 percent score, "Mexicans" are at 66 percent, and "Jews" at 64 percent. The term "black" also has nearly twice the toxicity level of "white."

All of this just shows that Jigsaw has quite a lot of work to do if it wants to make the internet a civil place again. 

H/T Vocativ 

The Daily Dot