The price of a “clean” internet | Hans Block and Moritz Riesewieck

Translator: Ivana Korom
Reviewer: Krystian Aparta

[This talk contains mature content]

Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feeds a video of a young girl being raped by an older man. Before this video was removed from Facebook, it had already been shared 16,000 times, and it was even liked 4,000 times. The video went viral and infected the net.

Hans Block: And that was the moment we asked ourselves: How could something like this get onto Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?

MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulty distinguishing pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on. Therefore, humans are needed to decide which of the suspicious content should be deleted and which should remain. Humans we know almost nothing about, because they work in secret. They sign nondisclosure agreements that prohibit them from talking about what they see on their screens and what this work does to them. They are forced to use code words to hide who they work for. They are monitored by private security firms to ensure they don't talk to journalists. And they are threatened with fines if they do speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.

HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you into a world that many of you may not know yet. Here's a short clip from our film.

(Music)

(Video) Moderator: I need to be anonymous, because we have a contract signed. We are not allowed to declare whom we are working with. The reason why I speak to you is because the world should know that we are here. There is somebody who is checking the social media. We are doing our best to make this platform safe for all of them. Delete. Ignore. Delete. Ignore. Delete. Ignore. Ignore. Delete.

HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world, in order to keep the wages low. Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift: ignore, delete, day and night. Much of this work is done in Manila, where for years the analog toxic waste of the Western world was transported by container ship; now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images, videos and all manner of intellectual garbage, so that we don't have to look at it.

MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories, where they can have unpredictable effects at any time: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them, and often never let them go again. If they are unlucky, they develop post-traumatic stress disorder, like soldiers after war missions. In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually took his own life. It's not an isolated case, we were told. This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.

Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads quickly, goes viral and excites the minds of people all around the globe. By the time it is deleted, it is often already too late: millions of people have already been infected with hatred and anger, and they either become active online, spreading or amplifying hatred, or they take to the streets and take up arms.

HB: Therefore, an army of content moderators sits in front of screens to avoid new collateral damage. They decide, as quickly as possible, whether the content stays on the platform (ignore) or disappears (delete). But not every decision is as clear as the decision about a child-abuse video. What about controversial, ambivalent content uploaded by civil rights activists or citizen journalists? The content moderators often decide such cases at the same speed as the clear-cut ones.

MR: We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not?

(Video) (Air strike sounds) (Explosion) (People speaking in Arabic)

MR: Yeah, we did some blurring for you. A child could be dangerously disturbed and extremely frightened by such content. So, would you rather delete it? But what if this video could help investigate war crimes in Syria? What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down? Airwars, a nongovernmental organization based in London, tries to find such videos as quickly as possible whenever they are uploaded to social media, in order to archive them. Because they know that sooner or later, Facebook, YouTube and Twitter will take such content down. People armed with mobile phones can make visible what journalists often have no access to. Civil rights groups often have no better option for quickly making their recordings accessible to a large audience than uploading them to social media. Wasn't this the empowering potential the World Wide Web was supposed to have? Weren't these the dreams people had about the World Wide Web in its early days? Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?

HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. Media, for example, more and more often put trigger warnings at the top of articles that some people may perceive as offensive or troubling. And more and more students at universities in the United States demand that ancient classics depicting sexual violence or assault be removed from the curriculum. But how far should we go with that? Physical integrity is guaranteed as a human right in constitutions worldwide, and in the Charter of Fundamental Rights of the European Union, this right expressly extends to mental integrity. But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice?

So what to do? Mark Zuckerberg recently stated that in the future, the users, we, or almost everybody, will decide individually what we would like to see on the platform, through personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like …

MR: I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much.

HB: Yeah, I'm more the opposite. I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.

MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge. One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection. It's a matter of principle: Do we want to design the digital space as an open or a closed society? At the heart of the matter lies "freedom versus security."

Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in many of our interviews.

(Video) The world that we are living in right now, I believe, is not really healthy. (Music) In this world, there is really an evil who exists. (Music) We need to watch for it. (Music) We need to control it, good or bad. (Music)

[Look up, Young man! –God]

MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission: to counter the sins of the world as they spread across the web. "Cleanliness is next to godliness" is a saying everybody in the Philippines knows.

HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has ruled the Philippines since 2016, and he won the election with the promise "I will clean up." What that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed. One moderator in our film says, "What Duterte does on the streets, I do for the internet."

And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil. Tasks formerly reserved for state authorities have been taken over by college graduates in their early twenties, equipped with three to five days of training (this is the qualification), who work on nothing less than rescuing the world.

MR: National sovereignties have been outsourced to private companies, which pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing. With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and therefore also its own new dangers, which did not exist in the pre-digital public sphere.

HB: When Mark Zuckerberg appeared before the US Congress or the European Parliament, he was confronted with all kinds of criticism. And his reaction was always the same: "We will fix that, and I will follow up on that with my team." But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google. It should be conducted openly, in new, cosmopolitan parliaments, in new institutions that reflect the diversity of the people contributing to the utopian project of a global network. And while it may seem impossible to take into account the values of users worldwide, it's worth believing that there's more that connects us than separates us.

MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late. The question of freedom and democracy must not come down to only these two options.

HB: Delete.

MR: Or ignore.

HB: Thank you very much.

(Applause)


  • Munching On the Tea says:

    If only we had a clean internet

  • Jonah Schwarz says:

    TED epic

  • Ghrae says:

    The largest problem with a 'clean' internet is that it's purely up to the cleansers to decide what's allowed to be let through. On the surface, this is great. Then you have China as a shining example of why it's not great.

  • Sandro says:

    If we only had clean internet

  • Watch your profanities says:

    I don't know what to think of this.

  • Xeno Phon says:

    If you want to change how privacy on private property works then do so. Until then facebook, youtube, all private corporations are private property and they are no more bound to allow you to spout idiocy on their property than you are if they were on yours.

  • iOnsteins Engineering says:

    This is dumb. I wouldnt vote for this even my life depended on it. A clean internet requires censorship. It is impossible to have a clean internet without bias, censorship, and monopolization. Kick rocks dummy. Your TED talk sucks harry spaghetti ballz

  • Topher says:

    Trust us; we'll set your filters.

  • Bill Titwell says:

    More script

  • Titans Tracks says:

    Damn we literally live in the "Wild West" of the internet. Hatred and chaos still run rampant and we still need to adapt to this new form of communication as a species.



  • xmastacrackax says:

    We need to see what we have become. If it’s being deleted we cannot fix it and address it as a society.

  • Optical Clarity says:

    The price of a "clean internet" is freedom of speech. China & North Korea would love to help you.

  • Armahan Kar says:

    millions of movies have kind of sick scenes.. look at the joker..made billion usd box-office. so what?
    just dont look at ugly images. and words, sentences, lyrics etc which have the same ugliness?
    save your eyes, ears.

  • Optical Clarity says:

    0:30 Use the video to prosecute the rapist… and get over your objection to capital punishment. You don't have any problem murdering children…

  • Hannah Louise says:

    “Those who would give up essential Liberty to purchase a little temporary Safety deserve neither Liberty or Safety.” – Benjamin Franklin

    Freedom of speech is a right that shouldn’t be taken away only for someone to ignore the truth, no matter how gruesome and rooted in evil it might be.

  • Remy Lebeau says:

    It is not security, it is slavery. Dangerous freedom vs peaceful slavery.

  • Casey Hutton says:

    I'm sorry, but those images and video of the air strike does not need to be viewed by the masses. people who have access to that sort of stuff should be the same people that can do something about it. Why would Joe blow working his MacDonald's job need to see an air strike happen in Syria?

  • John says:

    All I hear is how to create a censorship ruled internet

  • SirAwesomeness7 says:

    The problem is much less keeping it “clean” and much more keeping it “legal.” Clean is a very subjective term that can lead to oppression. Something toxic or offensive to one person might be normal or entertaining to someone else. To take away a lifestyle from certain people just to fit yours better is a form of oppression. Staying within a communities guideline is acceptable so long as a terms and conditions was agreed to by the users. Laws should only be applied via the country the media is being fed to. Making things emotion based, subjective, will only cause a civil war online. Banning toxicity will only bring toxicity. Following the rules will be the right path to take.

  • michael says:

    Face palm. Define clean. It's different for everyone. Now shut up and go home.

  • mostawesomedudeever1 says:

    By clean you mean, censoring don't you. Funny how regressive ideas are being passed of as progressive these days .

  • Ted JustAdmitIT says:

    In my humble opinion it's a simple answer & covered by 1st amendment. Users can set preferences. I don't need to be "protected" and am tired of Big Brother's growing power and almighty eyes.

  • Andres C. says:

    Sounds like commie talk

  • Karla Reyes says:

    This is scary. It's scary to think how sick some people are that they would record rapes, mutilations, animal abuse, child abuse!! And uploaded for other sick people to enjoy! "Likes" on a rape video?!! Wtf is wrong with the world. I wish we would use those videos to stop it but nothing is being done. We dont need to clean the internet, we need to clean our minds and become more human than beasts 😔😔

  • Wake Up says:

    Another garbage video from pro pédôphilé TED

  • Truthful Chap says:

    I bet the parents of these two worked for the Stasi.

  • SirAwesomeness7 says:

    If content online is too inappropriate for a kid to watch or read, then it is the supervisor's fault for not restricting access (parental controls) to that site and the kid's fault for going to that site to begin with. It isn't a poster's duty to ask if things are child friendly, because life itself is far from that. Hiding adult content online is going to cause problems for adults accessing the content. Access to information is essential to keep the truth in sight. Restricting access to these truths will only lead to an ignorant society. We can't leave people in the dark because kids go where they're not supposed to. Also, kid mentality is very unstudied and rather subjective, as it can only apply to specific individuals, not all kids. There are plenty of smart kids that this wouldn't faze, and many kids are exposed to worse than that video in video games. They end up just fine. There is no problem with a little exposure to adult content. It's parents that leave the kids ignorant, not the kids choosing that for themselves. If you can't force an adult to be ignorant and stupid, then you shouldn't force your kids to be either. Instead just teach them about the content so it doesn't risk any psychological harm, and let them grow up at their own rate, not the rate the parent feels is best for their kid. Parents, you hate to accept this, but you really don't know what's best for your kid. Let go of this idea of reliving your own childhood through your kids and let them be themselves. They're not your puppets. They're not fragile. They can handle mature content if they're informed on the matter. Don't oppress your child.

  • One Of Those Guys says:

    Lol technocratic activists?
    Ok ok, but dont say populism like that. What we need as a global protection is to root out the problem. Have worker coops run the global economy.
    Now….how to get to that point successfully….lol

  • PST3K NaN says:

    The internet is the actual abstraction of free speech….so censors can f*ck off

  • ResurrectionX says:

    Great content guys. This is one of those dilemmas where there is not really a right or wrong answer.

  • JVH312 says:

    I once had a dream that a "negativity filter" had been created. It could be used on the internet, smart, phones and smart tvs. It was an app that filtered content deemed by a board of directors from multiple tech companies. People could also submit content, words etc that they wanted blocked. A monthly fee was paid and you only saw content that was "positive" or optimistic…at first it was just a handful of people who used it and then it became more popular than Facebook. The downside was that without any exposure to realistic content the human experience became false. Real issues that required attention went unnoticed, unaddressed…like world hunger, war and violence in other countries (tour own country). I posted the dream on Facebook and tagged a bunch of tech companies. I understand wanting to contain certain things…but hatred, anger and violence is a symptom of a real problem or issue…blocking it online does not resolve what is happening in reality.

  • green eyed cat says:

    Would WW2 or the Vietnam War have gone as far as the dizzy heights they did, and possibly further, without graphic journalism?

    Just a thought.

  • Koozomec says:

    So, unrestricted access to content and information is a barrier to a supposed global utopia of "sanity" and "safety".
    Freedom vs security

    Kids, it's what happens when you smell your own farts.

  • Astrea Kaito says:

    Our liberty and freedom of speech is quite a fucking high price.

  • Matt Roszak says:

    This sounds like a scifi dystopia story.
    Except we're living in the dystopia!

    I'm all for freedom of speech online, but that becomes a problem when bots and clickbait get involved. Spreading bad ideas is very profitable, even if you don't believe them yourself!

  • FabledDan says:

    Ironic how TED has a trigger warning before this video… These guys are right, we should not be censoring the internet just so you can live in your own personal bubble. The politically correct SJWs coming out of the US have been spreading like the plague lately. All you're doing is promoting ignorance and propaganda.

  • Kevin NYC says:

    I would trust the computers over the humans. We now know Google is full of far-left ideologues.

  • Person says:

    Freedom or Security?
    We'll never have Security, life is not safe. So give us at least Freedom.

  • Xirpzy says:

    Honestly it should be up to the user to protect themselves from bad content. The platform should also be able to set rules and guidelines. What shouldnt be done is giving government control. Just look at China, you cant upload or say anything that doesnt fit. Even if you dont intend harm, someone else might think so and you are gone. The positives with a free internet are greater than the negatives. Why punish people that follow the rules and give more reasons to take the illegal route?

    Someone uploads horrific content, report that content and have it removed, move on. Why would we need a more regulated and limiting system that in the end harms millions more?

  • Idaho Potato says:

    Good-looking fascists trying to explain the necessity of modern-day book burning–replete with German accents, nonetheless. How very 1938.

  • Greg Mauchline says:

    Necessity is the tool of tyrants and the Creed of slaves.

  • whuzzzup says:

    There should be no "clean" internet. Simple as that.

  • Johnnie Saghian says:

    They also might change past "scientifically proven" claims that were wrong and replace them with the correct or newer theory. Like kale,
    like egg yolks, and like nicotine or marijuana being good for stroke, dementia, Alzheimer's patients, etc…

  • SuperAtheist says:

    Internet purity brought to you by the Ministry of Truth.

  • SCEzeric says:

    The stuff that doesn't get deleted is bad enough, the stuff that does is terrible.
    Edit: I feel dirty after watching this, which is sad considering the subject matter. Hopefully one day we can really start to fix things, one day sooner rather than later. Also, the subject of back room dealings shouldn't be a thing, matters need to be discussed openly in public is something I agree on completely.

  • Dokta Whawee says:

    We MUST fight the authoritarian internet! Don’t let our internet become like China’s! We live in a better society!

  • Adymn Sani says:

    very sad generally, news talking heads and reporters in general suffer the same trauma…honestly no one should be a witness to disturbing content and each of us should shut it out of our lives…you think that to allow the right organizations or people to have this evil content is good as evidence, no it is not good…we need to go after the thinking first so that we don't have to "punish" evil doers after the facts…violence causes more violence no matter who does it or what level of violence or the type…we will have to suffer a lot before generations are finally over the desire for revenge of some evil violence done to them or country or family

  • Валентин Капчин says:

    So basically they are 4chan users, but also getting paid for seeing all the shitposting?

  • MultiBorsch says:

    Blah blah sad people look at bad pictures online. You can like… Get another job.

  • TJ Tampa says:

    So where. Is. This. Clean. Internet you speak of? The one I am familiar with is a disaster, a reflection of the broken ppl in the world dumping their agenda like a giant garbage truck.

  • David A says:

    The opposite of what you know and believe is ALSO true

  • Ryan B says:

    The internet was never meant to be commercialized, regulated or utilized the way it is today. No one owns the internet; no one has the supreme right to control it. Of course, we know what the extreme nature of this can be, perfectly illustrated by the usual goings-on of the dark web/net. The internet was made by the people who used it, usually for communicating across it with others. Unfortunately, the nature and/or conditions of human beings are condemnable, and rather than government going out of its way doing old-school investigation and policing work, they would rather destroy privacy and human rights, setting a precedent for the inevitable dystopia to come. This is why it's important to never allow government to get too big: it'll corrupt and degenerate into totalitarianism. We should look to our individual responsibilities as citizens of a civil society, as family, and accept that we can't keep running to government to do everything for us. There is a price to freedom; if you aren't willing to pay it, expect to swallow the black pill and polish your social credit score.

  • Ricca Shaps says:

    We might not be able to identify the company they work with but I can tell what nationality the content moderator is with the accent alone.

  • Captain zac says:

    This is a very odd video they start off telling the story of when moderators fail and a rape post goes viral. Then eventually they seem to settle on the idea that content mods are bad because they stifle free speech and debate.
    Couple of points (many of these may be pedantic):

    First: you don't have to see an air strike to know it happened that's why we have journalists. They made it clear that this stuff has real affects on people then say we should all see it.

    Two: content warnings are not censorship they're advice that you are free to ignore.

    Three: The implications they make at the end are all based on conjecture and hear-say. Seriously their point was "this is a Catholic country and some guy said he felt like their dodgy leader so content moderators here are bad"

    There are some good points in here but I really feel they let their biases skewed it towards the end.

  • JG77 Northeast says:

    Was this talk a joke? The internet has already been wiped. Who searches Google anymore? Nothing outside their narrative. We won't accept censorship. TED talk to love your jailers.

  • Demosthenes says:

    I could teach you the frameworks for more comprehensive artificial intelligence.

  • Belus Traveller Foreign Once says:

    Leave it pure of it’s communism, Are you the dictator, You people hid the truth and nothing gets done,

  • Lara Smith says:

    🦋so sad, it’s because humans are drawn to negativity. We need to love others instead of spreading negatively

  • Danial Huls says:

    Removing media of crime doesn't stop the crime from being committed, and is indeed, merely the tip of the iceberg. It is only a symptom of an underlying criminal culture. Yes it should be removed by being flagged and reported, and passed to criminal investigations.

    Nearly 500,000 children go missing in America every year, less in most other smaller nations.
    An estimated 8 million children world wide

    This is what happens when people protect criminals and prevent people from protecting themselves and their families while having short sighted misconstrued priorities…
    Parents no longer seem to raise their children, and instead rely on internet and government which were never meant to do such a task, and further perpetuates criminal culture via the lack of morality…

    While evil has and will always exist, it is the spreading degeneration of bloated societal populations that are no longer able to self-regulate virtuous socio-cultural norms and strong family values that seems to perpetually propagate more and more criminal issues.

    Censorship is a false sense of security and tacit allowance of evil by ignoring its existence under the guise of "modernity"…it takes more than just laws and police to root out cannibalistic predators of humanity…

  • Zoé V says:

    Been a long time since I have seen such a great Ted talk, this is really smtg no one knew about plus they are asking questions and leaving the conversation open it's really great

  • Cyan Diaz says:

    Like TED being TedX Hmm??? i guess i won't sub then??? guys????

  • tHE wHITE hOT mA mA'SS says:

    TEDTalks or TedX who are you???

  • REG3305 says:

    Too many are advocating for someone to moderate everything we see…. … …

    No Mr. Orwell.. I know 2+2=4 and I will not comply!

  • ADEB SIZ says:

    This is so true there is so much that we don’t see.

  • REG3305 says:

    "Moderators" remove religious messages while boosting those about 5 Years old being guided toward gender dysphoria…

  • Cerberus x47 says:

    A CHILD would NOT be searching for such things without provocation or simply bad parenting

  • Lindsay4182 says:

    Reminds me of the giver.

  • Vortecus says:

    I mean, I've seen some absolutely vile stuff on the internet; I watch most of it on 4chan, the rest elsewhere. I wouldn't say it has affected me as severely as people would think; I just understand the fact that this world and a lot of humanity is extremely disgusting and horrific. It doesn't surprise me that these content moderators commit suicide; some of the things I have seen online have made me quiver or feel sick and even reconsider parts of my life. It's a shame that these jobs are outsourced for the lowest wage and that employees aren't told exactly what they're getting themselves into. I wouldn't actually mind working as a content moderator for minimum wage, working from home in an environment where I feel safest. I think the biggest issue is that these companies hide the information about their content moderators and treat them so harshly, which is also a good reason a lot of them are probably depressed. If they opened up these careers and made them safer and sustainable for the employees, I doubt the employees would be that fazed by the material if they already knew what the job entailed and how to cope with the worst stuff online.

  • Warminster100 says:

    We have laws clearly identifying what hate or pedophilia crimes are. Let the government monitor the internet according to those laws, not some biased people on social media!

  • Warminster100 says:

    Never give up freedom for security!!

  • Grace Clague says:

    Thank you.

  • Binladen TheDon says:

    We Need a Spiritual SO TedTalk

  • Black Dot says:

    No, if you want a clean internet don't go to those sites.

  • Ryan Payne says:

    I feel like this would work better if they were mimes and they just played CNN clips

  • Black Dot says:

    Just because people can't control themselves they want to control everyone.

  • Bob Frog says:

    "Clean Internet"? Right, nothing but cat vids and "approved" thought. Sod off, COMMIES!

  • Alok Sen says:

    Internet is the medium for exchange of ideas and information. Only Adults should have the right to decide what they can and can't see on the individual level.

  • Cielo Batingal says:

    fighting for freedom too much will only lead to undisciplined actions and rebels. i'm from ph and still don't know how to react constructively.

  • Hayden H says:

    I want a cleaner internet as long as I can decide what you watch. Good?
    1. No curse words
    2. No far left ideas
    3. No gross stuff
    4. ONLY 1980s music is allowed
    5. No fat people like me should be on my, our internet

  • Devon Levack says:

    anyone down voting this loves the visuals of decrepit society. OR they just dont want to believe this happens OR wants to ignore the reality of the world and stay in their White Picket Fences and Secure Compound Suburbs.

    As a soldier in 3rd world peacekeeping we have seen this. Don't be desensitized, be empowered and take action against this.

  • alarcon99 says:

    Oscar Wilde: “Everything in moderation, including moderation.”

  • siegfried greding says:

    In my studies I have seen privacy is already dead.
    That's not necessarily a bad thing. The problem is ignorant people.
    I would say don't censor any of it. Sweeping these problems under the rug doesn't usually fix it.
    But then we see what lies on the internet can do. Like What it has cost for the Muslim population in India. But just like privacy it's a problem of ignorance.
    This will further change once we get artificial intelligence that can tell what the video is before it's ever even uploaded and stop it.
    Let alone when everyone has an augmented reality glasses. How much will that change everything?

  • Jeff Lafferty says:

    I don't understand why Zuckerberg says that user discretion options in settings is something in the future. Why not now?

  • xingx355 says:

    This ideology is spreading worldwide because it's called information control. The user will not get to see whatever content the admin chooses to hide, even if it is good.

  • 2garv2 says:

    Content should not be moderated! Let those who are stupid show us all just how stupid they are. Things that are explicitly illegal like inciting violence or CP should be removed.

  • Paxus says:

    i can't believe this is even a thing. How stupid. (A) It's technically impossible. (B) Restrictions and censorship lead to abuse. <— Just leave the WWW alone. People will self regulate. As to children – they should be regulated by parents anyway. Consider the WWW, what it is: the world outside your home. You shouldn't let your children roam to dangerous or disgusting or illegal places… well, same with the WWW. But to censor the WWW or regulate it… that is as retarded as trying to make a 'War on Terrorism' <— It's NOT POSSIBLE!!! It's a fantasy designed by ppl who can make $$$ from it!!!
    I don't want to live on Earth anymore 😥

  • Deepak Gaba says:

    Love TED talks but sigh, just watched a talk where there was no suggested solution. I think people who spend time thinking about a problem should always share a suggested solution or two, it's so much more effective.

  • antlerman says:

    I really agree with this message.
    Good job TEDx for showing the other side.

  • Pilbaran00b says:

    Never really thought about this issue. Thanks for opening my mind to it with this video.

  • Yulin Chao says:

    Can someone give me a TL;DW?
