The uncontrolled moderation of social media


The imposition of content controls on search engines, websites and social media is having an ever greater impact on the presence and persistence of online content.

Discussions about the "need to control content on the Internet" are becoming more and more common. All the big corporations, which hardly need naming, keep committing to systems that automatically detect inappropriate content and remove it from social network timelines or search results.

There are many examples of this: Google lowering the ranking (and therefore the visibility) of certain websites, YouTube (also owned by Google) adopting systems that reduce the visibility of videos on controversial topics, Facebook removing posts deemed unsuitable. We have seen it applied at every level, up to the point where illustrations were cut to censor female nudity. In the case of those illustrations, the published content was removed for "violation of the terms of use," and although the control criteria were corrected a few months later, when it was "discovered" that the content should not have been censored, what was eliminated has been lost. At least the works published at the time. And thanks for all the fish.

Automated systems will become more and more widespread as Artificial Intelligence technologies evolve and become more prominent among companies offering services such as social networks and search engines. The amount of content generated every second makes human moderation impossible. However, to date, and according to some for at least another decade, "intelligent" systems are not smart enough to capture the nuances that distinguish a meaningful text from a mere sequence of ordered words, such as a "lorem ipsum," or to grasp the meaning hidden behind an illustration or a photograph.

Because "censoring" (without mincing words) content based on its appearance, rather than substance, is a perfect way to counter the diffusion of art in all its forms. Substance is what makes an Artwork, and it communicates it through its form: a machine, at least for today, can identify a cat or nudity, but is unable to determine the message it conveys.

Sebastian Zdrojewski

System, Network and Data Security advisor for over 20 years, in 2017 he co-founded Rights Chain, a company developing copyright and intellectual property protection and enforcement solutions.


Last update 2017-08-28


#Censorship, #Social, #Copyright
