A system for your blog to receive more constructive comments

The comment sections of the most-viewed YouTube videos generally contain a large proportion of comments that are, in fact, insults, slander or outright obscenities, one after another. Visit any high-traffic blog and much the same happens. But are there really so many rude, abhorrent, aggressive and hateful people in the world? The answer is yes: given the right context, any of us can behave that way.

What is most worrying, though, is that many commenters do not even read the post before commenting (or read it without understanding it). To address that, a Norwegian blog has implemented a reading comprehension test as a sine qua non for posting your comment (or your bile).

Reading comprehension

It all started at NRKbeta, NRK's technology site, which decided to add a small multiple-choice quiz to some of its articles. The questions are simple, such as what certain acronyms mentioned in the text stand for. Only once they are answered correctly does the form unlock the comments section.
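As a rough idea of how such a gate might work (NRKbeta has not published its implementation, so the question data, types and function names below are a hypothetical sketch), the comment form could simply stay locked until every quiz answer matches:

```typescript
// Hypothetical sketch of a quiz-gated comment form.
// None of these names come from NRKbeta; they are illustrative only.

interface QuizQuestion {
  prompt: string;
  options: string[];
  correctIndex: number; // index into options
}

// Example questions about the article (invented for illustration).
const quiz: QuizQuestion[] = [
  {
    prompt: "What does the Perspective tool score?",
    options: ["Spelling", "Toxicity", "Length"],
    correctIndex: 1,
  },
  {
    prompt: "Which site introduced the comprehension quiz?",
    options: ["NRKbeta", "The Guardian", "Wikipedia"],
    correctIndex: 0,
  },
];

// Returns true only when every answer is correct,
// which is the condition for unlocking the comment box.
function quizPassed(answers: number[], questions: QuizQuestion[]): boolean {
  return (
    answers.length === questions.length &&
    questions.every((q, i) => answers[i] === q.correctIndex)
  );
}

// Usage: enable the comment form only after a passing submission.
const userAnswers = [1, 0];
if (quizPassed(userAnswers, quiz)) {
  console.log("Comments unlocked");
} else {
  console.log("Please re-read the article and try again");
}
```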

For NRKbeta's editor, Marius Arnesen, the extra 15 seconds or so of reading it takes to answer the questions about the story also help tone down the comments.

Perspective

It is not the only initiative aimed at making comments more constructive and less destructive. Last week Alphabet, Google's parent company, announced that it was working with The New York Times, The Economist, The Guardian and Wikipedia to test a new tool called Perspective, dedicated to identifying "toxic" comments.

"We have more information and more articles than at any other time in history, and yet the toxicity of the conversations that follow those articles are driving people away from the conversation," he said. Jared Cohen, president of Jigsaw, formerly known as Google Ideas.

Jigsaw is applying artificial intelligence (specifically, machine learning) to hundreds of thousands of comments to identify the kinds of comments that might drive people out of a conversation. Based on that data, Perspective assigns each new comment a toxicity score from 0 to 100, derived from how similar it is to the comments already identified as toxic.
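To make the scoring concrete, here is a minimal sketch of how a site might request a toxicity score for a new comment, assuming the publicly documented Perspective Comment Analyzer REST endpoint and an API key of your own; the exact request and response shape should be checked against the current documentation. The API returns a probability between 0 and 1, which is scaled here to the 0-100 range described above.

```typescript
// Minimal sketch: ask Perspective for a toxicity score and scale it to 0-100.
// Assumes the public Comment Analyzer endpoint and a valid API key;
// verify the request/response shape against the current Perspective docs.

const API_KEY = "YOUR_API_KEY"; // placeholder
const ENDPOINT =
  `https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=${API_KEY}`;

async function toxicityScore(text: string): Promise<number> {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      comment: { text },
      languages: ["en"],
      requestedAttributes: { TOXICITY: {} },
    }),
  });
  if (!response.ok) {
    throw new Error(`Perspective request failed: ${response.status}`);
  }
  const data = await response.json();
  // summaryScore.value is a probability in [0, 1]; scale it to 0-100.
  const probability = data.attributeScores.TOXICITY.summaryScore.value;
  return Math.round(probability * 100);
}

// Usage:
toxicityScore("You are a complete idiot")
  .then((score) => console.log(`Toxicity: ${score}/100`))
  .catch(console.error);
```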

This methodology is also being made available to editors, who can use the scores to flag the least constructive comments for review by human moderators. In the end, certain filters may bring out the best in us, just as the broken windows theory and other contextual cues supposedly make us more polite and less prone to vandalism.
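One hypothetical way an editor might use those scores is sketched below: comments at or above a chosen threshold go into a human review queue instead of being published straight away. The threshold value and all field names here are invented for illustration, not taken from Perspective or any publisher's workflow.

```typescript
// Hypothetical routing of scored comments: anything at or above the
// threshold is held for human moderation; the rest publish directly.

interface ScoredComment {
  author: string;
  text: string;
  toxicity: number; // 0-100, e.g. from a Perspective-style scorer
}

const REVIEW_THRESHOLD = 70; // illustrative cut-off

function routeComments(comments: ScoredComment[]) {
  const publish: ScoredComment[] = [];
  const review: ScoredComment[] = [];
  for (const c of comments) {
    (c.toxicity >= REVIEW_THRESHOLD ? review : publish).push(c);
  }
  return { publish, review };
}

// Usage:
const { publish, review } = routeComments([
  { author: "anna", text: "Interesting point about the quiz.", toxicity: 4 },
  { author: "troll42", text: "Everyone here is an idiot.", toxicity: 91 },
]);
console.log(`${publish.length} published, ${review.length} sent to moderators`);
```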

Video: Show & Tell. Eagle Creek Pack-It System (April 2020).