There are reportedly more than five hundred full-time employees working in Facebook’s P.R. department. These days, their primary job is to insist that Facebook is a fun place to share baby photos and sell old couches, not a vector for hate speech, misinformation, and violent extremist propaganda.

In its early years, Facebook weathered periodic waves of bad press, usually occasioned by incidents of bullying or violence on the platform. Yet none of this seemed to cause lasting damage to the company’s reputation, or to its valuation. Facebook’s representatives repeatedly claimed that they took the spread of harmful content seriously, indicating that they could manage the problem if they were only given more time. Rashad Robinson, the president of the racial-justice group Color of Change, told me, “I don’t want to sound naïve, but until recently I was willing to believe that they were committed to making real progress. But then the hate speech and the toxicity keeps multiplying, and at a certain point you go, Oh, maybe, despite what they say, getting rid of this stuff just isn’t a priority for them.”

Facebook’s stated mission is to “bring the world closer together.” It considers itself a neutral platform, not a publisher, and so has resisted censoring its users’ speech, even when that speech is ugly or unpopular. The document available to Facebook’s users, the Community Standards, is a condensed, sanitized version of the guidelines. The rule about graphic content, for example, begins, “We remove content that glorifies violence.” The internal version, by contrast, enumerates several dozen types of graphic images (“charred or burning human beings,” “the detachment of non-generating body parts,” “toddlers smoking”) that content moderators are instructed to mark as “disturbing,” but not to remove.
These days, the Implementation Standards comprise an ever-changing wiki, roughly twelve thousand words long, with twenty-four headings (“Hate Speech,” “Bullying,” “Harassment,” and so on), each of which contains dozens of subcategories, technical definitions, and links to supplementary materials. These are located on an internal software system that only content moderators and select employees can access.
When Facebook was founded, in 2004, the company had few codified rules about what was allowed on the platform and what was not. Charlotte Willner joined three years later, as one of the company’s first employees to moderate content on the site. At the time, she said, the written guidelines were about a page long; around the office, they were often summarized as, “If something makes you feel bad in your gut, take it down.” Her husband, Dave, was hired the following year, becoming one of twelve full-time content moderators. He later became the company’s head of content policy. The guidelines, he told me, “were just a bunch of examples, with no one articulating the reasoning behind them. ‘We delete nudity.’ ‘People aren’t allowed to say nice things about Hitler.’ It was a list, not a framework.” So he wrote a framework. He called the document the Abuse Standards. A few years later, it was given a more innocuous-sounding title: the Implementation Standards.