Google and its subsidiary companies, such as YouTube, have removed or omitted information from their services in order to comply with company policies, legal demands, and government censorship laws.[1]
Numerous governments have asked Google to censor content. In 2012, Google complied with more than half of the removal requests it received via court orders and phone calls. This figure did not include China or Iran, which had blocked Google or one of its subsidiary services entirely.[2]
In February 2003, Google stopped showing advertisements from Oceana, a non-profit organization protesting against a major cruise ship operation's sewage treatment practices. Google, citing its editorial policy, stated that "Google does not accept advertising if the ad or site advocates against other individuals, groups, or organizations."[3]
In April 2014, Google accepted ads from the pro-choice abortion lobbying group NARAL, but removed ads for some anti-abortion crisis pregnancy centers. Google removed the web search ads after an investigation by NARAL found evidence that the ads violated Google's policy against deceptive advertising. According to NARAL, people using Google to search for abortion clinics found advertisements for anti-abortion crisis pregnancy centers. Google stated that it had followed company procedures in applying its ad policy standards related to ad relevance, clarity, and accuracy.[4]
In September 2018, Google removed a paid advertisement from YouTube made by supporters of the Russian opposition urging Russians to participate in a protest planned for September 9. Russia's Central Election Commission had earlier asked Google to remove the advertisement, saying it violated election laws that require a "day of silence" on election matters ahead of voting, but the advertisement was blocked even in regions where no voting was scheduled for September 9 and in regions where authorities had authorized the pension-reform protests.[5]
In March 2007, the lower-resolution satellite imagery on Google Maps showing post-Hurricane Katrina damage in Louisiana, US, was allegedly replaced with higher resolution images from before the storm.[6] Google's official blog post in April revealed that the imagery was still available in KML format on Google Earth or Google Maps.[7][8][9]
To protect the privacy and anonymity of individuals, Google selectively blurred photographs containing car license number plates and faces in Google Street View. Users may request further blurring of images that feature them, their family, their car, or their home. Users can also request the removal of images that feature what Google terms "inappropriate content," which falls under their categories of intellectual property violations; sexually explicit content; illegal, dangerous, or violent content; child endangerment; hate speech; harassment and threats; and personal or confidential information.[11] In some countries (e.g. Germany), Google modifies images of specific buildings.[12] In the United States, Google Street View adjusts or omits certain images deemed of interest to national security by the federal government.[10]
In the United States, Google commonly filters search results to comply with Digital Millennium Copyright Act (DMCA)-related legal complaints.[13]
In the United Kingdom, it was reported that Google had "delisted" Inquisition 21, a website that claims to challenge moral authoritarian and sexually absolutist ideas. Google later released a press statement suggesting Inquisition 21 had attempted to manipulate search results.[14] In Germany and France, a study reported that approximately 113 white nationalist, Nazi, antisemitic, Islamic extremist, and other similar websites had been removed from the German and French versions of Google.[15] Google has complied with these laws by not including sites containing such material in its search results. However, Google does list the number of excluded results at the bottom of the search result page and links to Lumen (formerly Chilling Effects) for an explanation.[1]
Lolicon content
As of 18 April 2010, Google censors "lolicon", a Japanese term meaning "attractive young girls",[16][17][18] in its search results, hiding results for lolicon material even when the user adds other words that would typically return explicit content; the terms "loli" and "lolita" are also filtered in this context.[19][20]
SafeSearch
The removal of SafeSearch options refers to changes in how Google filters search results for different audiences. SafeSearch is a feature that blocks explicit content, such as pornography, violence, and other graphic material, from appearing in search results. It is aimed at parents, educators, and institutions that want a safer online experience for children or other specific groups of users.
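A minimal sketch of how a client can request SafeSearch filtering, assuming Python and the publicly documented "safe=active" URL parameter; the filtering itself is applied server-side by Google, so this is illustrative only:

```python
# Build a Google Search URL that requests strict SafeSearch filtering via the
# documented "safe=active" query parameter (illustrative sketch; Google applies
# the actual filtering on its servers, not in this code).
from urllib.parse import urlencode

def build_search_url(query: str, safesearch: bool = True) -> str:
    """Return a Google Search URL, optionally requesting strict SafeSearch."""
    params = {"q": query}
    if safesearch:
        params["safe"] = "active"  # ask Google to filter explicit results
    return "https://www.google.com/search?" + urlencode(params)

print(build_search_url("example query"))
# -> https://www.google.com/search?q=example+query&safe=active
```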
Google SafeSearch was first introduced in 1999 as a tool to help users filter explicit content out of search results, and has since evolved into a key feature for maintaining a family-friendly browsing environment. A timeline of significant changes to SafeSearch follows:
1999 – Initial Launch of SafeSearch
SafeSearch was rolled out as an optional feature that let users filter adult content, including sexually explicit material and violent content, from their search results.
Early 2000s – Gradual Improvements:
Over the early 2000s, Google improved its content-filtering algorithms, making SafeSearch more accurate at identifying inappropriate content as the web and the variety of explicit material grew, particularly in Google Images.
2010 – SafeSearch Locked Feature:
In 2010, Google introduced a feature allowing parents and schools to lock SafeSearch on shared devices or accounts, ensuring that explicit content would not be displayed even if someone attempted to disable the filter.
2012 – SafeSearch Update and Removal of the Moderate Filter:
In December 2012, Google made significant changes to SafeSearch. Previously, users could select from three settings: Off, Moderate, and Strict. Google removed the Moderate option, making SafeSearch either on (strict) or off, as part of an effort to filter explicit images more effectively.
Along with this update, Google also blurred explicit images that appeared in search results even when SafeSearch was off; users needed to explicitly click to view the image, adding an additional layer of protection.
2018 – SafeSearch Default for Minors via Family Link:
By 2018, Google's Family Link app allowed parents to create accounts for their children under 13, with SafeSearch automatically enabled for those accounts.[21] This was part of a broader push by Google to protect children's privacy and online safety, in compliance with regulations such as the Children's Online Privacy Protection Act (COPPA).
2021 – Automatic SafeSearch for Users Under 18:
In August 2021, Google rolled out policy changes designed to better protect children and teenagers online, including automatically enabling SafeSearch for all users under the age of 18, whether they were using Google Search or Google Assistant.
The move was part of a larger shift toward protecting minors' digital privacy, including disabling location history and limiting the visibility of minors' personal information across Google's platforms.
2023 – SafeSearch Filtering Enhancements:
By 2023, Google took further steps to enforce SafeSearch by default for all users, particularly in regions with more stringent content regulations; for example, Google enforced SafeSearch more rigorously in countries with strong internet censorship laws or where governments mandated stricter content control.[22][23][24]
Some users have stated that the lack of a completely unfiltered option amounts to censorship by Google. A Google spokesperson disagreed, saying that Google is "not censoring any adult content" and wants "to show users exactly what they are looking for—but [Google policies] aim not to show sexually-explicit results unless a user is specifically searching for them".[25]
Online pharmacies
Following a settlement with the Food and Drug Administration (FDA) ending Google AdWords' advertising of Canadian pharmacies that permitted Americans to access cheaper prescriptions, Google agreed to several compliance and reporting measures to limit the visibility of "rogue pharmacies". Google and other members of the Center for Safe Internet Pharmacies are collaborating to remove illegal pharmacies from search results and participating in "Operation Pangea" with the FDA and Interpol.[26][27]
In January 2010, Google was reported to have stopped providing automatic suggestions for any search beginning with the term "Islam is", while it continued to do so for other major religions. According to Wired.com, a Google spokesperson stated, "This is a bug and we're working to fix it as quickly as we can."[28] Suggestions for "Islam is" were available later that month. Nonetheless, Google continues to filter certain words from autocomplete suggestions,[29] describing them as "potentially inappropriate".[30]
The publication 2600: The Hacker Quarterly has compiled a list of words that are restricted by Google Instant.[31] These are terms that the company's Instant Search feature will not search.[32][33] Most of the terms are vulgar or derogatory in nature, but some terms with no obvious offensive meaning, including "Myleak", are also removed.[33]
As of 26 January 2011, Google's Autocomplete feature would not complete certain words such as "BitTorrent", "Torrent", "uTorrent", "Megaupload", and "Rapidshare", and Google actively censored search terms or phrases that its algorithm considered likely to constitute spam or to be attempts to manipulate search results.[34]
In September 2012, multiple sources reported that Google had removed "bisexual" from its list of blacklisted terms for Instant Search.[35]
In December 2022, Google was reported to have stopped providing automatic suggestions for any search with the term "protests in China", while it continued to do so for other countries.[citation needed]
Ungoogleable
In 2013, the Language Council of Sweden included the Swedish version of the word ungoogleable (ogooglebar) in its list of new words,[36] defining the term as something "that cannot be found with any search engine".[37] Google objected to this definition, wanting it to refer only to Google searches, and the Council removed the word in order to avoid a legal confrontation,[38] while accusing Google of trying to "control the Swedish language".[39]
On 31 August 2014, almost 200 private pictures of various celebrities, containing nudity and explicit content, were made public on certain websites. Shortly afterwards, Google removed most search results that linked users directly to such content.[40]
COVID-19 pandemic-related content
An Australian study found Google search results relating to COVID-19 were heavily curated, with no indication given to users that such curation was happening.[41] Google removed autocomplete suggestions for searches related to the COVID-19 lab leak theory.[42] Google also censored a public Google Docs document on the efficacy of the drug hydroxychloroquine as a COVID-19 treatment, in favour of WHO recommendations—some of which were themselves based on fraudulent data.[43][unreliable source?]
International
Australia
In January 2010, Google Australia removed links to satirical website Encyclopedia Dramatica's "Aboriginal" article, citing it as a violation of Australia's Racial Discrimination Act.[44] After the website's domain change in 2011, the article resurfaced in Google Australia's search results.
Canada
On 19 June 2014, Google was ordered by the Supreme Court of British Columbia to remove search results that linked to websites of a company called Datalink. The websites in question sold network device technology that Datalink was alleged to have stolen from Equustek Solutions. Google voluntarily removed links from google.ca, the main site used by Canadians, but the court granted a temporary injunction applying to all Google sites across the world.[45] Google argued that Canadian law could not be imposed worldwide but was given until June 17, 2014, to comply with the court's ruling.[46]
China
As of 2009, Google was the only major China-based search engine to explicitly inform the user when search results were blocked or hidden. As of December 2012, Google no longer informs the user of possible censorship for certain queries during a search.[50] The Chinese government had restricted citizens' access to popular search engines such as AltaVista, Yahoo, and Google in the past, though the complete ban has since been lifted[when?]. However, the government remains active in filtering Internet content. In October 2005, the Blogger platform and access to the Google cache were made available in mainland China; however, in December 2005, some mainland Chinese Blogger users reported that their access to the site was once again restricted[who?].
In January 2006, Google agreed that China's version of Google, Google.cn, would filter certain keywords given to it by the Chinese government.[51] Google pledged to tell users when search results are censored and said that it would not "maintain any services that involve personal or confidential data, such as Gmail or Blogger, on the mainland".[52] Google said that it does not plan to give the government information about users who search for blocked content and will inform users that content has been restricted if they attempt to search for it. Searchers may encounter a message which states: "In accordance with local laws and policies, some of the results have not been displayed."[49] Google issued a statement saying that "removing search results is inconsistent with Google's mission" but that the alternative—being shut down entirely and thereby "providing no information (or a heavily degraded user experience that amounts to no information) is more inconsistent with our mission."[51] Initially, both the censored Google.cn and the uncensored Chinese-language Google.com were available. In June 2006, however, China blocked Google.com again.[52]
Some Chinese Internet users were critical of Google for assisting the Chinese government in repressing its own citizens, particularly those dissenting against the government and advocating for human rights.[53] Furthermore, Google had been denounced and called hypocritical by the Free Media Movement and Reporters Without Borders for agreeing to China's demands while simultaneously fighting the United States government's requests for similar information.[54] Google China had also been condemned by Reporters Without Borders,[54] Human Rights Watch,[55] and Amnesty International.[56]
On 14 February 2006, protesters organized a "mass breakup with Google" whereby users agreed to boycott Google on Valentine's Day to show their disapproval of the Google China policy.[57][58]
In June 2009, Google was ordered by the Chinese government to block various overseas websites, including some with sexually explicit content. Google was criticized by the China Illegal Information Reporting Center (CIIRC) for allowing search results that included sexual content; the CIIRC claimed the company was a dissemination channel for a "huge amount of pornography and lewd content".[59]
On 12 January 2010, in response to an apparent hacking of Google's servers in an attempt to access information about Chinese dissidents, Google announced that "we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all."[60]
On 22 March 2010, after talks with Chinese authorities failed to reach an agreement, the company redirected its censor-complying Google China service to its Google Hong Kong service, which is outside the jurisdiction of Chinese censorship laws. However, at least as of March 23, 2010, "The Great Firewall" continued to censor search results from the Hong Kong portal, www.google.com.hk (as it did with the US portal, www.google.com), for controversial terms such as "Falun Gong" and "the June 4th incident" (the 1989 Tiananmen Square protests and massacre).[61][62][63]
In August 2018, it was revealed that Google was working on a version of its search engine for use in China, which would censor content according to the restrictions placed by the Chinese government. This project was worked on by a small percentage of the company and was codenamed Dragonfly. A number of Google employees expressed their concern about the project, and several resigned.[64][65] In 2019, Google's vice president of public policy, Karan Bhatia, testified before the U.S. Senate Judiciary Committee that the Dragonfly project had been terminated.[66]
In February 2023, Radio Free Asia reported that YouTube content satirizing CCP General Secretary Xi Jinping is routinely targeted for takedowns using YouTube's copyright infringement reporting system.[67]
European Union
In July 2014, Google began removing certain search results from its search engines in the European Union in response to requests under the right to be forgotten. Articles whose links were removed, when searching for specific personal names, included a 2007 blog post by the BBC journalist Robert Peston about Stanley O'Neal, a former chairman of investment bank Merrill Lynch, being forced out after the bank made huge losses.[68] Peston criticized Google for "cast[ing him] into oblivion".[69]
The Guardian reported that six of its articles, including three relating to a former Scottish football referee, had been "hidden".[70] Other articles, including one about French office workers using post-it notes and another about a collapsed fraud trial of a solicitor standing for election to the Law Society's ruling body, were affected.[71][72]
The Oxford Mail reported that its publishers had been notified by Google about the removal of links to the story of a conviction for shoplifting in 2006. The paper said it was not known who had asked Google to remove the search result, but noted that a previous complaint to the Press Complaints Commission (PCC) in 2010 had questioned the article's accuracy, claimed that the report was causing "embarrassment", and requested that the story be taken off the paper's website. The paper said two factual amendments were made to the article and the PCC dismissed the complaint.[75][76]
An article about the conversion to Islam of the brother of George Osborne, the Chancellor of the Exchequer, was removed after a request to Google from an unknown person under the right-to-be-forgotten ruling.[77]
The Telegraph reported that links to a report on its website about claims that a former Law Society chief faked complaints against his deputy were hidden.[78][79] The search results for the articles for the same story in the Guardian and The Independent were also removed.[80][81] The Independent reported that its article, together with an article on the Indian Ocean tsunami in 2004 and one on new trends in sofa design in 1998, had been removed.[82] The Telegraph also reported that links to articles concerning a student's 2008 drink-driving conviction and a 2001 case that resulted in two brothers each receiving nine-month jail terms for affray had been removed.[83]
The Spanish newspaper El Mundo reported that some results were hidden over a 2008 news report[84] of a Spanish Supreme Court ruling involving executives of Riviera Coast Invest who were involved in a mortgage mis-selling scandal.[85]
On 5 July 2014, German news magazine Der Spiegel reported removal of a search result to an article about Scientology.[86][87]
On 19 August 2014, the BBC reported that Google had removed 12 links to stories on BBC News.[88]
Sweden
In March 2018, Google delisted a WordPress-hosted site from search results in Sweden,[90] following an intense media campaign against Google, YouTube, and Facebook by the tabloid Expressen and the daily newspaper Dagens Nyheter.[91] The WordPress site listed Swedish Jews in the public sphere and also agitated against the dominant publishing house Bonnier Group, the owner of both newspapers.
Although perfectly legal in Sweden, the WordPress site was described as antisemitic.[92] The Bonnier papers argued that Google should not promote such content, and above all not at a high rank. Ministers in the Swedish green-left government agreed with this sentiment and threatened national and EU regulation unless Google adapted its algorithms and delisted content containing "threats and hate" (hot och hat).[93] Google eventually delisted the site in Sweden due to copyright claims.[when?]
The same newspapers also targeted the YouTube channel Granskning Sverige (Scrutiny Sweden) for its alleged extreme right-wing content.[94] The channel was described as a "troll factory" whose members called authorities, journalists, and other public figures, then recut the recorded interviews to fit the channel's right-wing extremist world view.[95] The interviews were broadcast against a black backdrop with the channel logotype, occasionally using screenshots from newspaper articles related to the interviews.[96] Google eventually complied with the demands[when?] and closed the channel, citing copyright infringement and violation of its terms of service.[97]
On April 13, 2018, Google took part in a meeting with the Swedish government to discuss the search company's role in the media landscape.[98] The Minister of Justice, Morgan Johansson (Social Democrats), and the Minister of Digitization, Peter Eriksson (Green Party), expressed concerns that "unlawful" and "harmful" content was being facilitated by Google, and that "trolls" could have a negative impact on the upcoming Swedish parliamentary election. Google agreed to refine its algorithms and to hire more staff to ensure that "threats and hate" were eliminated from Google Search and YouTube videos.[99] Critics have voiced concerns that private international companies are being made to carry out censorship to comply with local regulations without guidance from courts, and that free speech is deteriorating at an accelerating rate.[100][101][102]
Israel
Since 2015, Google has removed certain search results that were defamatory in nature[104] from its search engine in Israel, following gag orders.[105]
United Kingdom
On 21 September 2006,[14] it was reported that Google had "delisted" Inquisition 21, a website that claims to challenge moral authoritarian and sexually absolutist ideas in the United Kingdom. According to Inquisition 21, Google was acting "in support of a campaign by law enforcement agencies in the US and the UK to suppress emerging information about their involvement in major malpractice", allegedly exposed by its own investigation of, and legal action against, those who carried out Operation Ore, a far-reaching and much-criticized law enforcement campaign against the viewers of child pornography.[106][107] Google released a press statement suggesting Inquisition 21 had attempted to manipulate search results.[14]
In 2002, "in an apparent response to criticism of its handling of a threatening letter from a Church of Scientology lawyer," Google began to make DMCA "takedown" letters public, posting such notices on the Chilling Effects archive (now Lumen), which archives legal threats made against Internet users and Internet sites.[109]
In mid-2016, Google engaged in a two-month standoff with writer Dennis Cooper after deleting his Blogger and Gmail accounts without warning or explanation following a single anonymous complaint. The case drew worldwide media attention and ultimately resulted in Google returning Cooper's content to him.[110][111]
In mid-2018, Google permanently barred conspiracy theorist Alex Jones from using its subsidiary company YouTube. Jones' channel InfoWars responded by "accusing the companies of censorship".[112]
In mid-2019, Google allegedly suspended Tulsi Gabbard's advertisements for her presidential campaign while public interest in the candidate was at its height.[113] Gabbard sued Google for $50 million in damages.[114][needs update]
In 2024, Google, Bing, DuckDuckGo, Qwant, and Gibiru all blacklisted the website secretservicepolygraph.com, which dealt with federal law enforcement corruption. The site is only discoverable using the Russian search engine Yandex.
In June 2017, the Supreme Court of Canada ruled that Google can be forced to remove search results worldwide. Civil liberties groups including Human Rights Watch, the BC Civil Liberties Association, and the Electronic Frontier Foundation argued that this would set a precedent for Internet censorship. In an appeal, Google argued that the global reach of the order was unnecessary and that it raised concerns over freedom of expression. While the court wrote that "[They] have not, to date, accepted that freedom of expression requires the facilitation of the unlawful sale of goods", OpenMedia spokesman David Christopher warned that "there is great risk that governments and commercial entities will see this ruling as justifying censorship requests that could result in perfectly legal and legitimate content disappearing off the web because of a court order in the opposite corner of the globe".[115][116]
On September 17, 2021, Google removed the Smart Voting app used by the Russian opposition to coordinate its voting strategy against the ruling United Russia party during elections. The app was removed following threats from the Russian government.[117][118]
YouTube
YouTube, a video-sharing website and subsidiary of Google, prohibits in its Terms of Service the posting of videos that violate copyrights or depict pornography, illegal acts, gratuitous violence, hate speech, or what it deems to be misinformation about COVID-19.[119] User-posted videos that violate such terms may be removed and replaced with a message reading, "This video has been removed due to a violation of our Terms of Service."
In September 2007, YouTube blocked the account of Wael Abbas, an Egyptian activist who posted videos of police brutality, voting irregularities and antigovernment demonstrations under the Mubarak regime.[120] His account was subsequently restored,[121] along with 187 of his videos.[122]
In 2006, Thailand blocked access to YouTube after identifying 20 offensive videos it ordered the site to remove.[1] In 2007, a Turkish judge ordered YouTube to be blocked in the country due to videos insulting Mustafa Kemal Atatürk, the founder of the Republic of Turkey (which falls under Article 301 prohibitions on insulting the Turkish nation).[1]
In February 2008, the Pakistan Telecommunications Authority banned YouTube in the country, but the manner in which it performed the block accidentally prevented access to the website worldwide for several hours.[123] The ban was lifted after YouTube removed controversial religious comments made by a Dutch government official concerning Islam.[124][125]
In October 2008, YouTube removed a video by Pat Condell titled "Welcome to Saudi Britain"; in response, his fans re-uploaded the video themselves and the National Secular Society wrote to YouTube in protest.[126]
In 2016, YouTube launched a localized Pakistani version of its website for users in Pakistan, in order to censor content considered blasphemous by the Pakistani government as part of a deal with it. As a result, the three-year ban on YouTube by the Pakistani government was lifted.[127][128]
In July 2017, YouTube began modifying suggested videos to debunk terrorist ideologies.[129] In August 2017, YouTube wrote a blog post explaining a new "limited state" for religious and controversial videos, which would not allow comments, likes, monetization, and suggested videos.[130]
In October 2017, PragerU sued YouTube, alleging violations of its freedom of speech under the First Amendment via YouTube's "arbitrary and capricious use of 'restricted mode' and 'demonetization' viewer restriction filters" to suppress its content. A U.S. federal appeals court threw out the suit in February 2020, stating that despite "[its] ubiquity and its role as a public-facing platform", YouTube was still considered a private platform (the First Amendment applies only to state actors).[131]
In December 2017, what YouTubers referred to as the "AdPocalypse" took place, when YouTube's automated content-policing tool began demonetizing content that ran afoul of the company's very broad "Not Advertiser-Friendly" category.[132] The following April, numerous firearm-related channels began encountering additional policing by YouTube when new rules restricting videos "that facilitate private gun sales or link to websites that sell guns" were enacted.[132] As a result, popular firearms vlogger Hickok45's account was deleted (and subsequently reinstated after an outcry).[133]
In March 2018, The Atlantic found that YouTube had delisted a video in which journalist Daniel Lombroso reported on a speech by white nationalist Richard B. Spencer at the 2016 annual conference of the National Policy Institute, where attendees celebrated Donald Trump's win in the presidential election.[134] YouTube relisted the video after The Atlantic sent a complaint.
On June 5, 2019, YouTube updated its hate speech policy to prohibit hateful and supremacist content and to limit the spread of violent extremist content online. The policy extends to content that justifies discrimination, segregation, or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation, or veteran status. It covers videos that, for example, include Nazi ideology, Holocaust denial, Sandy Hook conspiracy theories, or flat Earth theories. The policy also aims to reduce borderline content and harmful misinformation, such as videos promoting phony miracle cures for serious illnesses.[135]
In February 2020, YouTube reportedly began censoring content related to the novel coronavirus (SARS-CoV-2) by removing videos or demonetizing channels, citing its "sensitive topics" advertiser-friendly content guideline in a statement on Twitter.[136][137]
In October 2020, PewDiePie was allegedly shadow-banned by YouTube, with his channel and videos becoming unavailable in search results. YouTube denied shadow-banning him, though human review had been restricted due to the COVID-19 pandemic. YouTube was criticized over the incident by PewDiePie himself, his fans, other YouTubers, and internet users.[139][140]
In early February 2021, YouTube removed raw footage of the 2021 storming of the United States Capitol taken by independent journalists such as Ford Fischer of News2Share and by progressive media outlets such as Status Coup, stating that the videos violated its policies on misinformation.[141][142][143] The same footage from these outlets was reused by large media organizations and remained up on their YouTube accounts.[142][143] Some independent journalists, including Fischer, and other progressive outlets such as The Progressive Soap Box (host Jamarl Thomas), Political Vigilante (Graham Elwood), Franc Analysis, and The Convo Couch were demonetized by YouTube, with some having their Super Chat feature blocked.[141][142] Fischer was later remonetized by YouTube after it acknowledged "over-enforcement".[142]
Since at least October 2019, YouTube has been automatically deleting comments that contain the Chinese terms for "50 Cent Party" (五毛党) and its shortened form "50 Cent" (五毛). It has also been deleting comments referring to the Chinese Communist Party (CCP) as "bandits" (共匪). In May 2020, YouTube said in a statement to The Verge that these deletions were made "in error".[144][145]
YouTube policies restrict certain forms of content from being included in videos being monetized with advertising, including strong violence, language, sexual content, and "controversial or sensitive subjects and events, including subjects related to wars, political conflicts, natural disasters, and tragedies, even if graphic imagery is not shown", unless the content is "usually newsworthy or comedic and the creator's intent is to inform or entertain".[149]
On August 31, 2016, YouTube introduced a new system to notify users of violations of the "advertiser-friendly content" rules, and allow them to appeal. Following its introduction, many prominent YouTube users began to accuse the site of engaging in de facto censorship, arbitrarily disabling monetization on videos discussing various topics such as skincare, politics, and LGBTQ history. Philip DeFranco argued that not being able to earn money from a video was "censorship by a different name", while Vlogbrothers similarly pointed out that YouTube had flagged both "Zaatari: thoughts from a refugee camp" and "Vegetables that look like penises" (although the flagging on the former was eventually overturned).[149] The hashtag "#YouTubeIsOverParty" was prominently used on Twitter as a means of discussing the controversy. A YouTube spokesperson stated that "[w]hile [their] policy of demonetizing videos due to advertiser-friendly concerns hasn't changed, [they've] recently improved the notification and appeal process to ensure better communication to [their] creators."[150][151][152]
In March 2017, a number of major advertisers and prominent companies began to pull their advertising campaigns from YouTube over concerns that their ads were appearing on objectionable and/or extremist content, in what the YouTube community began referring to as a "boycott".[153][154] YouTube personality PewDiePie described these boycotts as an "adpocalypse", noting that his video revenue had fallen to the point that he was generating more revenue from YouTube Red subscription profit sharing (which is divided based on views by subscribers) than advertising.[155] On 6 April 2017, YouTube announced planned changes to its Partner Program, restricting new membership to vetted channels with a total of at least 10,000 video views. YouTube stated that the changes were made in order to "ensure revenue only flows to creators who are playing by the rules".[156]
Censorship of LGBT content in Restricted Mode
In March 2017, the "Restricted Mode" feature was criticized by YouTube's LGBT community for filtering videos that discuss issues of human sexuality and sexual and gender identity, even when there are no explicit references to sexual intercourse or otherwise inappropriate content.[157][149][158] Rapper Mykki Blanco told The Guardian that such restrictions make LGBT vloggers feel "policed and demeaned" and that the practice "sends a clear homophobic message that the fact that my video displays unapologetic queer imagery means it's slapped with an 'age restriction', while other cis, overly sexualised heteronormative work" remains uncensored.[158] Musicians Tegan and Sara similarly argued that LGBT people "shouldn't be restricted", after acknowledging that the mode had censored several of their music videos.[159]
YouTube later stated that a technical error in Restricted Mode had wrongfully impacted "hundreds of thousands" of LGBT-related videos.[160]
False positives
In February 2019, automated filters accidentally flagged several channels with videos discussing the AR mobile game Pokémon Go and the massively multiplayer online game Club Penguin for containing prohibited sexual content, because some of their videos contained references to "CP" in their titles. In Pokémon Go, "CP" is an abbreviation of "Combat Power", a level system in the game, while for Club Penguin it abbreviates the game's name, but it was believed that YouTube's filters had interpreted the term as referring to child pornography. The affected channels were restored, and YouTube apologized for the inconvenience.[161][162]
Anti-censorship shareholder proposal
On May 10, 2007, shareholders of Google voted down an anti-censorship proposal for the company. The text of the failed proposal, submitted by the New York City comptroller's office, which controls a significant number of shares on behalf of retirement funds, stated that:
Data that can identify individual users should not be hosted in Internet-restricting countries, where political speech can be treated as a crime by the legal system.
The company will not engage in pro-active censorship.
The company will use all legal means to resist demands for censorship. The company will only comply with such demands if required to do so through legally binding procedures.
Users will be clearly informed when the company has acceded to legally binding government requests to filter or otherwise censor content that the user is trying to access.
Users should be informed about the company's data retention practices and the ways in which their data is shared with third parties.
The company will document all cases where legally-binding censorship requests have been complied with, and that information will be publicly available.
David Drummond, senior vice president for corporate development, said "Pulling out of China, shutting down Google.cn, is just not the right thing to do at this point... but that's exactly what this proposal would do."[164]
CEO Eric Schmidt and founders Larry Page and Sergey Brin recommended that shareholders vote against the proposal. Together, they hold 66.2 percent of Google's total shareholder voting power, meaning that they could defeat the proposal by themselves.[165]
Russian invasion of Ukraine
In early March 2022, contractors who were working for Google and preparing translations for the Russian market received instructions from Google that, effective immediately, the ongoing Russian war against Ukraine could no longer be referred to as a war but only vaguely as "extraordinary circumstances".[166][167] Google was reportedly trying to protect itself from Russian sanctions, and its employees from prosecution within Russia, in connection with a new law that provided up to 15 years in prison for spreading information about the war against Ukraine other than what was officially announced by the Kremlin.[168]
Since the beginning of the invasion, Google has blocked Russian state-funded media such as RT and Sputnik,[169] and has extended the blocks to non-state-funded outlets such as RBK, banning them entirely from the video-hosting platform YouTube. In effect, Google has been blocking all Russian news outlets, citing violations of its terms of service. Google also acted upon a request from the European Union.[170]
^Galbraith 2016, pp. 113–114: "Given its importance, it is not surprising that lolicon has been well researched in Japan over the course of decades, which has led to numerous insights. [...] Characters are not compensating for something more 'real,' but rather are in their fiction the object of affection. This has been described as 'finding sexual objects in fiction in itself', which in discussions of lolicon is made explicitly distinct from desire for and abuse of children."
^McLelland 2011b, p. 16: "Japanese scholarship has, on the whole, argued that, in the case of Japanese fans, neither the Loli nor the BL fandom represent the interests of paedophiles since moe characters are not objectified in the same manner that actual images of children can be, rather they express aspects of their creators' or consumers' own identities."
^Kittredge 2014, p. 524: "The majority of the cultural critics responding to the Japanese otaku's erotic response to lolicon images emphasize, like Keller, that no children are harmed in the production of these images and that looking with desire at a stylized drawing of a young girl is not the same as lusting after an actual child."
^ "3. Google, Inc." Archived 2017-03-12 at the Wayback Machine, in "Race to the Bottom": Corporate Complicity in Chinese Internet Censorship, Part IV. How Multinational Internet Companies assist Government Censorship in China, Human Rights Watch, Vol. 18 No. 8(C), August 2006
^ Error page Archived 2018-12-25 at the Wayback Machine, Google France (in French): "Aucun document ne correspond aux termes de recherche spécifiés (site:jesus-is-lord.com). En réponse à une demande légale adressée à Google, nous avons retiré 391 résultat(s) de cette page. Si vous souhaitez en savoir plus sur cette demande (Archived 2013-09-27 at the Wayback Machine), vous pouvez consulter le site ChillingEffects.org." ("No documents match the specified search (site: jesus-is-lord.com). In response to a legal request submitted to Google, we have removed 391 result(s) from this page. If you want to know more about this request, you can consult the ChillingEffects.org site."). Retrieved 27 September 2013.