YouTube has an estimated 14 billion videos,<ref name=":1" /> about 5% of which have never been viewed and just over 85% of which have fewer than 1,000 views.<ref>{{Cite journal |last1=McGrady |first1=Ryan |last2=Zheng |first2=Kevin |last3=Curran |first3=Rebecca |last4=Baumgartner |first4=Jason |last5=Zuckerman |first5=Ethan |date=2023-12-20 |title=Dialing for Videos: A Random Sample of YouTube |url=https://journalqd.org/article/view/4066/3766 |journal=Journal of Quantitative Description: Digital Media |language=en |volume=3 |doi=10.51685/jqd.2023.022 |issn=2673-8813}}</ref>
=== Copyright issues ===
{{Main|YouTube copyright issues}}
{{further|#Revenue to copyright holders}}
YouTube has faced numerous challenges and criticisms in its attempts to deal with copyright, including the site's first viral video, "[[Lazy Sunday (The Lonely Island song)|Lazy Sunday]]", which had to be taken down due to copyright concerns.<ref name="First Launched" /> At the time of uploading a video, YouTube users are shown a message asking them not to violate copyright laws.<ref>{{cite news |last=Marsden |first=Rhodri |date=August 12, 2009 |title=Why did my YouTube account get closed down? |work=The Independent |location=London |url=https://www.independent.co.uk/life-style/gadgets-and-tech/features/rhodri-marsden-why-did-my-youtube-account-get-closed-down-1770618.html |archive-url=https://ghostarchive.org/archive/20220507/https://www.independent.co.uk/life-style/gadgets-and-tech/features/rhodri-marsden-why-did-my-youtube-account-get-closed-down-1770618.html |archive-date=May 7, 2022 |url-access=subscription |url-status=live |access-date=August 12, 2009}}{{cbignore}}</ref> Despite this advice, many unauthorized clips of copyrighted material remain on YouTube. YouTube does not view videos before they are posted online, and it is left to copyright holders to issue a [[Digital Millennium Copyright Act|DMCA]] [[takedown notice]] pursuant to the terms of the [[Online Copyright Infringement Liability Limitation Act]]. Any successful complaint about copyright infringement results in a [[YouTube copyright strike]]. Three successful complaints for [[copyright infringement]] against a user account will result in the account and all of its uploaded videos being deleted.<ref>[https://www.youtube.com/t/copyright_strike Why do I have a sanction on my account?] {{Webarchive|url=https://web.archive.org/web/20130120143234/http://www.youtube.com/t/copyright_strike |date=January 20, 2013 }} YouTube. Retrieved February 5, 2012.</ref><ref>{{cite news |date=May 21, 2010 |title=Is YouTube's three-strike rule fair to users? 
|work=BBC News |location=London |url=https://news.bbc.co.uk/1/hi/programmes/click_online/8696716.stm |access-date=February 5, 2012 |archive-date=July 4, 2018 |archive-url=https://web.archive.org/web/20180704094039/http://news.bbc.co.uk/1/hi/programmes/click_online/8696716.stm |url-status=live }}</ref> From 2007 to 2009, organizations including [[Viacom (2005–2019)|Viacom]], [[Mediaset]], and the English [[Premier League]] filed lawsuits against YouTube, claiming that it had done too little to prevent the uploading of copyrighted material.<ref>{{cite news |date=March 13, 2007 |title=Viacom will sue YouTube for $1bn|url=https://news.bbc.co.uk/1/hi/business/6446193.stm|work=[[BBC News]]|archive-date=January 15, 2009 |archive-url=https://web.archive.org/web/20090115123246/http://news.bbc.co.uk/1/hi/business/6446193.stm |url-status=live|access-date=May 26, 2008}}</ref><ref>{{cite news |date=July 30, 2008 |title=Mediaset Files EUR500 Million Suit Vs Google's YouTube |publisher=[[CNNMoney.com]] |url=https://money.cnn.com/news/newsfeeds/articles/djf500/200807301025DOWJONESDJONLINE000654_FORTUNE5.htm |access-date=August 19, 2009 |archive-date=September 8, 2008 |archive-url=https://web.archive.org/web/20080908122120/http://money.cnn.com/news/newsfeeds/articles/djf500/200807301025DOWJONESDJONLINE000654_FORTUNE5.htm |url-status=live }}</ref><ref>{{cite news |date=May 5, 2007 |title=Premier League to take action against YouTube |website=[[The Daily Telegraph]] |url=https://www.telegraph.co.uk/sport/football/2312532/Premier-League-to-take-action-against-YouTube.html |archive-url=https://ghostarchive.org/archive/20220110/https://www.telegraph.co.uk/sport/football/2312532/Premier-League-to-take-action-against-YouTube.html |archive-date=January 10, 2022 |url-access=subscription |url-status=live |access-date=March 26, 2017}}{{cbignore}}</ref>
In August 2008, a US court ruled in ''[[Lenz v. Universal Music Corp.]]'' that copyright holders cannot order the removal of an online file without first determining whether the posting reflected [[fair use]] of the material.<ref>{{cite news |last=Egelko |first=Bob |date=August 20, 2008 |title=Woman can sue over YouTube clip de-posting |work=San Francisco Chronicle |url=https://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/08/20/MNU412FKRL.DTL |access-date=August 25, 2008 |archive-date=August 25, 2008 |archive-url=https://web.archive.org/web/20080825003638/http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/08/20/MNU412FKRL.DTL |url-status=live }}</ref> YouTube's owner Google announced in November 2015 that it would help cover legal costs in select cases where it believed fair use defenses applied.<ref>{{cite magazine |last=Finley |first=Klint |date=November 19, 2015 |title=Google Pledges to Help Fight Bogus YouTube Copyright Claims—for a Few |url=https://www.wired.com/2015/11/google-pledges-to-help-fight-bogus-youtube-copyright-claims-for-a-few/ |magazine=Wired |access-date=March 25, 2017 |archive-date=March 20, 2017 |archive-url=https://web.archive.org/web/20170320144102/https://www.wired.com/2015/11/google-pledges-to-help-fight-bogus-youtube-copyright-claims-for-a-few/ |url-status=live }}</ref>
In the 2011 case of ''[[Smith v. Summit Entertainment LLC]]'', professional singer Matt Smith sued Summit Entertainment for the wrongful use of copyright takedown notices on YouTube.<ref>{{cite web |publisher=Ohio Northern District Court |date=July 18, 2013 |url=https://www.docketalarm.com/cases/Ohio_Northern_District_Court/3--11-cv-00348/Smith__v_Summit_Entertainment_LLC/#q= |access-date=October 21, 2014 |title=Smith v. Summit Entertainment LLC |website=Docket Alarm, Inc. |archive-date=June 19, 2024 |archive-url=https://web.archive.org/web/20240619012909/https://www.docketalarm.com/cases/Ohio_Northern_District_Court/3--11-cv-00348/Smith__v._Summit_Entertainment_LLC/#q= |url-status=live }}</ref> He asserted seven [[causes of action]], and four were ruled in Smith's favor.<ref>{{cite web |author=District Judge James G. Carr |date=June 6, 2011 |title=Order |url=https://scholar.google.com/scholar_case?case=4653165041580834913 |access-date=November 7, 2011 |work=Smith v. Summit Entertainment LLC |publisher=United States District Court, N.D. Ohio, Western Division |archive-date=January 30, 2016 |archive-url=https://web.archive.org/web/20160130083207/http://scholar.google.com/scholar_case?case=4653165041580834913 |url-status=live }}</ref> In April 2012, a court in Hamburg ruled that YouTube could be held responsible for copyrighted material posted by its users.<ref>{{cite news |date=April 20, 2012 |title=YouTube loses court battle over music clips |work=[[BBC News]] |location=London |url=https://www.bbc.co.uk/news/technology-17785613 |access-date=April 20, 2012 |archive-date=October 16, 2012 |archive-url=https://web.archive.org/web/20121016014454/http://www.bbc.co.uk/news/technology-17785613 |url-status=live }}</ref> On November 1, 2016, YouTube's long-running dispute with the German music rights organization GEMA was resolved, with Content ID being used to allow advertisements to be added to videos containing material protected by GEMA.<ref>{{cite news |date=November 1, 2016 |title=YouTube's seven-year stand-off ends |work=[[BBC News]] |location=London |url=https://www.bbc.co.uk/news/technology-37839038 |access-date=November 2, 2016 |archive-date=November 3, 2016 |archive-url=https://web.archive.org/web/20161103103021/http://www.bbc.co.uk/news/technology-37839038 |url-status=live }}</ref>
In April 2013, it was reported that [[Universal Music Group]] and YouTube have a contractual agreement that prevents content blocked on YouTube by a request from UMG from being restored, even if the uploader of the video files a DMCA counter-notice.<ref>{{cite web |title=YouTube's Deal With Universal Blocks DMCA Counter Notices |url=https://torrentfreak.com/youtube-deal-with-universal-blocks-dmca-counter-notices-130405/|publisher=TorrentFreak |archive-url=https://web.archive.org/web/20130407164748/http://torrentfreak.com/youtube-deal-with-universal-blocks-dmca-counter-notices-130405/ |url-status=live|archive-date=April 7, 2013|date=April 5, 2013|access-date=April 5, 2013}}</ref><ref>{{cite web |title=Videos removed or blocked due to YouTube's contractual obligations |url=https://support.google.com/youtube/bin/answer.py?hl=en&answer=3045545 |access-date=April 5, 2013 |archive-date=May 14, 2013 |archive-url=https://web.archive.org/web/20130514115738/http://support.google.com/youtube/bin/answer.py?hl=en&answer=3045545 |url-status=live }}</ref> As part of YouTube Music, Universal and YouTube signed an agreement in 2017, which was followed by separate agreements with other major labels, giving the labels the right to advertising revenue when their music was played on YouTube.<ref>{{cite web |last1=Aswad |first1=Jem |date=December 19, 2017 |title=YouTube Strikes New Deals With Universal and Sony Music |url=https://variety.com/2017/biz/news/universal-music-group-and-youtube-reach-new-global-multi-year-agreement-1202644815/ |access-date=April 22, 2021 |website=Variety |language=en-US |archive-date=April 22, 2021 |archive-url=https://web.archive.org/web/20210422152635/https://variety.com/2017/biz/news/universal-music-group-and-youtube-reach-new-global-multi-year-agreement-1202644815/ |url-status=live }}</ref> By 2019, creators were having videos taken down or demonetized when Content ID identified even short segments of copyrighted music within a much longer video, with different levels of enforcement depending on the record label.<ref name="fighting">{{cite web |last=Alexander |first=Julia |date=May 24, 2019 |title=YouTubers and record labels are fighting, and record labels keep winning |url=https://www.theverge.com/2019/5/24/18635904/copyright-youtube-creators-dmca-takedown-fair-use-music-cover |access-date=April 22, 2021 |website=The Verge |language=en |archive-date=April 22, 2021 |archive-url=https://web.archive.org/web/20210422152639/https://www.theverge.com/2019/5/24/18635904/copyright-youtube-creators-dmca-takedown-fair-use-music-cover |url-status=live}}</ref> Experts noted that some of these clips likely qualified as fair use.<ref name="fighting" />
==== Content ID ====
{{Main|Content ID}}
In June 2007, YouTube began trials of a system for automatic detection of uploaded videos that infringe copyright. Google CEO Eric Schmidt regarded this system as necessary for resolving lawsuits such as the one from [[Viacom (2005–2019)|Viacom]], which alleged that YouTube profited from content that it did not have the right to distribute.<ref>{{cite news |last=Delaney |first=Kevin J. |date=June 12, 2007 |title=YouTube to Test Software To Ease Licensing Fights |work=The Wall Street Journal |url=https://online.wsj.com/article/SB118161295626932114.html |access-date=December 4, 2011 |archive-date=February 20, 2012 |archive-url=https://web.archive.org/web/20120220085307/http://online.wsj.com/article/SB118161295626932114.html |url-status=live }}</ref> The system, which was initially called "Video Identification"<ref>{{Citation|last=YouTube Advertisers|title=Video Identification|date=February 4, 2008|url=https://www.youtube.com/watch?v=xWizsV5Le7s|access-date=August 29, 2018}}{{cbignore}}{{Dead YouTube link|date=February 2022}}</ref><ref>{{cite news |last=King |first=David |date=December 2, 2010 |title=Content ID turns three |language=en-US |work=Official YouTube Blog |url=https://youtube.googleblog.com/2010/12/content-id-turns-three.html |access-date=August 29, 2018}}</ref> and later became known as Content ID,<ref>{{cite web |date=September 28, 2010 |title=YouTube Content ID |url=https://www.youtube.com/watch?v=9g2U12SsRns |archive-url=https://ghostarchive.org/varchive/youtube/20211221/9g2U12SsRns |archive-date=December 21, 2021 |url-status=live |access-date=May 25, 2015 |via=YouTube}}{{cbignore}}</ref> creates an ID file for copyrighted audio and video material and stores it in a database. When a video is uploaded, it is checked against the database and flagged as a copyright violation if a match is found.<ref name="youtube">[https://www.youtube.com/t/contentid_more More about Content ID] YouTube. Retrieved December 4, 2011.</ref> When this occurs, the content owner has the choice of blocking the video to make it unviewable, tracking the viewing statistics of the video, or adding advertisements to the video.
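The matching step can be illustrated with a minimal sketch. The database, function names, and use of a cryptographic hash below are illustrative assumptions only; YouTube's actual perceptual-fingerprinting system is proprietary and far more sophisticated.

<syntaxhighlight lang="python">
import hashlib

# Hypothetical reference database mapping fingerprints of copyrighted material
# to the rights holder and the policy they chose ("block", "track", or "monetize").
REFERENCE_DB: dict[str, tuple[str, str]] = {}


def fingerprint(segment: bytes) -> str:
    """Reduce a media segment to a compact identifier.

    A cryptographic hash stands in here for perceptual fingerprinting,
    which matches similar (not just identical) audio and video.
    """
    return hashlib.sha256(segment).hexdigest()


def check_upload(segments: list[bytes]) -> list[dict]:
    """Compare each segment of an upload against the reference database."""
    claims = []
    for segment in segments:
        match = REFERENCE_DB.get(fingerprint(segment))
        if match is not None:
            owner, policy = match
            claims.append({"owner": owner, "action": policy})
    return claims  # an empty list means no match was found
</syntaxhighlight>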
An independent test in 2009 uploaded multiple versions of the same song to YouTube and concluded that while the system was "surprisingly resilient" in finding copyright violations in the audio tracks of videos, it was not infallible.<ref>{{cite news |last1=Von Lohmann |first1=Fred |date=April 23, 2009 |title=Testing YouTube's Audio Content ID System |newspaper=Electronic Frontier Foundation |url=https://www.eff.org/deeplinks/2009/04/testing-youtubes-aud |access-date=December 4, 2011}}</ref> The use of Content ID to remove material automatically has led to [[YouTube copyright issues|controversy]] in some cases, as the videos have not been checked by a human for fair use.<ref>{{cite news |last1=Von Lohmann |first1=Fred |date=February 3, 2009 |title=YouTube's January Fair Use Massacre |newspaper=Electronic Frontier Foundation |url=https://www.eff.org/deeplinks/2009/01/youtubes-january-fair-use-massacre |access-date=December 4, 2011}}</ref> If a YouTube user disagrees with a decision by Content ID, it is possible to fill in a form disputing the decision.<ref>[https://www.youtube.com/t/contentid_dispute Content ID disputes] YouTube. Retrieved December 4, 2011.</ref>
Before 2016, videos were not monetized until the dispute was resolved. Since April 2016, videos continue to be monetized while the dispute is in progress, and the money goes to whoever wins the dispute.<ref>{{cite web |last1=Hernandez |first1=Patricia |title=YouTube's Content ID System Gets One Much-Needed Fix |url=https://kotaku.com/youtubes-content-id-system-gets-one-much-needed-fix-1773643254 |access-date=September 16, 2017 |website=Kotaku |date=April 28, 2016}}</ref> Should the uploader want to monetize the video again, they may remove the disputed audio in the "Video Manager".<ref>{{cite web |title=Remove Content ID claimed songs from my videos – YouTube Help |url=https://support.google.com/youtube/answer/2902117?hl=en |access-date=September 17, 2017 |publisher=Google Inc. |language=en}}</ref> YouTube has cited the effectiveness of Content ID as one of the reasons why the site's rules were modified in December 2010 to allow some users to upload videos of unlimited length.<ref>{{cite web |last1=Siegel |first1=Joshua |last2=Mayle |first2=Doug |date=December 9, 2010 |title=Up, Up and Away – Long videos for more users |url=https://youtube.googleblog.com/2010/12/up-up-and-away-long-videos-for-more.html |access-date=March 25, 2017 |website=Official YouTube Blog}}</ref>
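The post-2016 arrangement can be sketched as a simple escrow model; the class and field names below are hypothetical and only illustrate the flow of withheld revenue described above, not YouTube's actual payout system.

<syntaxhighlight lang="python">
from dataclasses import dataclass


@dataclass
class Dispute:
    """Ad revenue accrued while a Content ID dispute is open is held back
    and paid to whichever party the dispute is resolved in favor of."""
    claimant: str           # party that filed the Content ID claim
    uploader: str           # party that uploaded the disputed video
    escrow_cents: int = 0   # revenue accrued while the dispute is open
    resolved: bool = False

    def accrue(self, cents: int) -> None:
        # Ads keep running during the dispute; the revenue is held, not paid out.
        if not self.resolved:
            self.escrow_cents += cents

    def resolve(self, winner: str) -> tuple[str, int]:
        # Release the held revenue to whichever party prevailed.
        self.resolved = True
        payout, self.escrow_cents = self.escrow_cents, 0
        return winner, payout


dispute = Dispute(claimant="record_label", uploader="video_creator")
dispute.accrue(1250)                      # cents earned while the claim is contested
print(dispute.resolve("video_creator"))   # ('video_creator', 1250)
</syntaxhighlight>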
=== Moderation and offensive content ===
{{Main|YouTube moderation}}
{{See also|Criticism of Google#YouTube|Censorship by Google#YouTube|Content moderation}}
YouTube has a set of community guidelines aimed at reducing abuse of the site's features. The uploading of videos containing defamation, pornography, and material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines".<ref name="guidelines">{{cite web |title=YouTube Community Guidelines |url=https://www.youtube.com/t/community_guidelines |archive-url=https://web.archive.org/web/20170304150155/https://www.youtube.com/yt/policyandsafety/communityguidelines.html |archive-date=March 4, 2017 |access-date=November 30, 2008 |via=YouTube}}</ref>{{better source needed|date=August 2019|reason=The current source is a primary source}} Generally prohibited material includes sexually explicit content, videos of animal abuse, [[shock site|shock videos]], content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior.<ref name="guidelines" /> YouTube relies on its users to flag the content of videos as inappropriate, and a YouTube employee will view a flagged video to determine whether it violates the site's guidelines.<ref name="guidelines" /> Despite the guidelines, YouTube has faced criticism over aspects of its operations,<ref name="demonetization">{{cite web |last=Alexander |first=Julia |date=May 10, 2018 |title=The Yellow $: a comprehensive history of demonetization and YouTube's war with creators |url=https://www.polygon.com/2018/5/10/17268102/youtube-demonetization-pewdiepie-logan-paul-casey-neistat-philip-defranco |access-date=November 3, 2019 |website=Polygon |language=en}}</ref> including its [[recommender system|recommendation algorithms]] perpetuating [[#Promotion of conspiracy theories and fringe discourse|videos that promote conspiracy theories]] and falsehoods,<ref>{{cite news |last1=Wong |first1=Julia Carrie |author-link=Julia Carrie Wong |last2=Levin |first2=Sam |date=January 25, 2019 |title=YouTube vows to recommend fewer conspiracy theory videos |language=en-GB |work=The Guardian |url=https://www.theguardian.com/technology/2019/jan/25/youtube-conspiracy-theory-videos-recommendations |access-date=November 3, 2019 |issn=0261-3077}}</ref> its hosting of videos ostensibly targeting children but containing [[Elsagate|violent or sexually suggestive content involving popular characters]],<ref>{{cite news |last=Orphanides |first=K. G. |date=March 23, 2018 |title=Children's YouTube is still churning out blood, suicide and cannibalism |magazine=Wired UK |url=https://www.wired.co.uk/article/youtube-for-kids-videos-problems-algorithm-recommend |access-date=November 3, 2019 |issn=1357-0978}}</ref> videos of minors attracting [[Pedophilia|pedophilic]] activity in their comment sections,<ref>{{cite news |last=Orphanides |first=K. G. |date=February 20, 2019 |title=On YouTube, a network of paedophiles is hiding in plain sight |magazine=Wired UK |url=https://www.wired.co.uk/article/youtube-pedophile-videos-advertising |access-date=November 3, 2019 |issn=1357-0978}}</ref> and fluctuating policies on the types of content that are eligible to be monetized with advertising.<ref name="demonetization" />
YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing [[post-traumatic stress disorder]] (PTSD) after an 18-month period on the job.<ref>{{cite web |last=Kimball |first=Whitney |date=September 22, 2020 |title=Content Moderator Exposed to Child Assault and Animal Torture Sues YouTube |url=https://gizmodo.com/youtube-moderator-sues-over-ptsd-symptoms-lack-of-work-1845143110 |access-date=October 11, 2020 |work=Gizmodo}}</ref><ref>{{cite news |last=Vincent |first=James |date=September 22, 2020 |title=Former YouTube content moderator sues the company after developing symptoms of PTSD |url=https://www.theverge.com/2020/9/22/21450477/youtube-content-moderator-sues-lawsuit-ptsd-graphic-content-exposure |access-date=October 11, 2020 |work=The Verge}}</ref><ref>{{cite web |last=Elias |first=Jennifer |date=September 22, 2020 |title=Former YouTube content moderator describes horrors of the job in new lawsuit |url=https://www.cnbc.com/2020/09/22/former-youtube-content-moderator-describes-horrors-of-the-job-in-lawsuit.html |access-date=October 11, 2020 |publisher=CNBC}}</ref>
Controversial moderation decisions have included material relating to [[Holocaust denial]],<ref>{{cite news |title=YouTube criticized in Germany over anti-Semitic Nazi videos |url=https://www.haaretz.com/hasen/spages/898004.html |url-status=dead |archive-url=https://web.archive.org/web/20080517001126/http://www.haaretz.com/hasen/spages/898004.html |archive-date=May 17, 2008 |access-date=May 28, 2008 |agency=Reuters}}</ref> the [[Hillsborough disaster]],<ref>{{cite web |title=Fury as YouTube carries sick Hillsboro video insult |url=https://icliverpool.icnetwork.co.uk/0100news/0100regionalnews/tm_headline=fury-as-youtube-carries-sick-hillsboro-video-insult%26method=full%26objectid=18729523%26page=1%26siteid=50061-name_page.html |url-status=dead |archive-url=https://web.archive.org/web/20120320021147/https://icliverpool.icnetwork.co.uk/0100news/0100regionalnews/tm_headline%3Dfury-as-youtube-carries-sick-hillsboro-video-insult%26method%3Dfull%26objectid%3D18729523%26page%3D1%26siteid%3D50061-name_page.html |archive-date=March 20, 2012 |access-date=November 29, 2015 |publisher=icLiverpool}}</ref> [[Anthony Bourdain]]'s death,<ref>{{cite news |last=Alba |first=Davey |date=June 16, 2018 |title=YouTube Is Spreading Conspiracy Theories about Anthony Bourdain's Death |url=https://www.buzzfeednews.com/article/daveyalba/conspiracy-theories-about-anthony-bourdains-death-are |access-date=June 16, 2018 |work=[[BuzzFeed News]] |language=en}}</ref> and the [[Notre-Dame fire]].<ref>{{cite news |last=Bergen |first=Mark |date=April 15, 2019 |title=YouTube Flags Notre-Dame Fire as 9/11 Conspiracy, Says System Made 'Wrong Call' |url=https://www.bloomberg.com/news/articles/2019-04-15/youtube-flags-notre-dame-fire-as-9-11-conspiracy-in-wrong-call?srnd=technology-vp |access-date=April 15, 2019 |publisher=[[Bloomberg L.P.]]}}</ref> In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content".<ref>{{cite news |last1=Kirkup |first1=James |last2=Martin |first2=Nicole |date=July 31, 2008 |title=YouTube attacked by MPs over sex and violence footage |url=https://www.telegraph.co.uk/technology/3358061/YouTube-attacked-by-MPs-over-sex-and-violence-footage.html |url-access=subscription |url-status=live |archive-url=https://ghostarchive.org/archive/20220110/https://www.telegraph.co.uk/technology/3358061/YouTube-attacked-by-MPs-over-sex-and-violence-footage.html |archive-date=2022-01-10 |access-date=March 26, 2017 |website=[[The Daily Telegraph]]}}
{{cbignore}}</ref>
In June 2022, [[Media Matters]], a media watchdog group, reported that [[homophobic]] and [[transphobic]] content calling LGBT people [[LGBT grooming conspiracy theory|"predators" and "groomers"]] was becoming more common on YouTube.<ref name="lawton_20220623">{{cite web |url=https://www.mediamatters.org/google/right-wing-clickbait-pushing-anti-lgbtq-groomer-smears-are-increasingly-popular-youtube |title=Right-wing clickbait pushing anti-LGBTQ 'groomer' smears are increasingly popular on YouTube |website=Media Matters |last1=Lawton |first1=Sophie |date=June 23, 2022 |access-date=October 23, 2022}}</ref> The report also referred to common accusations in YouTube videos that LGBT people are [[mental illness|mentally ill]].<ref name="lawton_20220623" /> The report stated the content appeared to be in violation of YouTube's hate speech policy.<ref name="lawton_20220623" />
An August 2022 report by the [[Center for Countering Digital Hate]], a British think tank, found that harassment against women was flourishing on YouTube.<ref name="misogyny">{{cite news |last=Lorenz |first=Taylor |author-link=Taylor Lorenz |date=September 18, 2022 |title=YouTube remains rife with misogyny and harassment, creators say |url=https://www.washingtonpost.com/technology/2022/09/18/you-tube-mysogyny-women-hate/ |access-date=December 26, 2022 |newspaper=[[The Washington Post]] |language=en-US |issn=0190-8286}}</ref> In his 2022 book ''Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination'', [[Bloomberg News|Bloomberg]] reporter Mark Bergen said that many female content creators were dealing with harassment, bullying, and stalking.<ref name="misogyny" />
==== Conspiracy theories and far-right content{{anchor|Promotion_of_conspiracy_theories_and_fringe_discourse|Conspiracy_theories_and_fringe_discourse}} ====
YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods and incendiary fringe discourse.<ref name="Darkest">{{cite news |last=Nicas |first=Jack |date=February 7, 2018 |title=How YouTube Drives People to the Internet's Darkest Corners |language=en-US |work=The Wall Street Journal |url=https://www.wsj.com/articles/how-youtube-drives-viewers-to-the-internets-darkest-corners-1518020478 |access-date=June 16, 2018 |issn=0099-9660}}</ref><ref>{{cite news |title=As Germans Seek News, YouTube Delivers Far-Right Tirades |newspaper=The New York Times |date=September 7, 2018 |language=en |url=https://www.nytimes.com/2018/09/07/world/europe/youtube-far-right-extremism.html |access-date=September 8, 2018 |last1=Fisher |first1=Max |last2=Bennhold |first2=Katrin}}</ref><ref name="secret life">{{cite news |last1=Ingram |first1=Matthew |title=YouTube's secret life as an engine for right-wing radicalization |language=en |work=Columbia Journalism Review |issue=September 19, 2018 |url=https://www.cjr.org/the_media_today/youtube-conspiracy-radicalization.php |access-date=March 26, 2019}}</ref><ref>{{cite news |title=YouTube wants the news audience, but not the responsibility |url=https://www.cjr.org/innovations/youtube-wants-the-news-audience-but-not-the-responsibility.php |access-date=September 23, 2018 |work=Columbia Journalism Review |language=en}}</ref> According to an investigation by ''The Wall Street Journal'', "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints."<ref name="Darkest" /><ref>{{cite web |last1=Lewis |first1=Rebecca |date=September 2018 |title=Alternative Influence: Broadcasting the Reactionary Right on YouTube |url=https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf |access-date=March 26, 2019 |website=datasociety.net |publisher=Data and Society}}</ref> After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy when people made breaking-news queries during the [[2017 Las Vegas shooting]], YouTube changed its algorithm to give greater prominence to mainstream media sources.<ref name="Darkest" /><ref>{{cite news |last=Nicas |first=Jack |date=October 6, 2017 |title=YouTube Tweaks Search Results as Las Vegas Conspiracy Theories Rise to Top |language=en-US |work=The Wall Street Journal |url=https://www.wsj.com/articles/youtube-tweaks-its-search-results-after-rise-of-las-vegas-conspiracy-theories-1507219180 |access-date=June 16, 2018 |issn=0099-9660}}</ref><ref>{{cite news |title=Here's How YouTube Is Spreading Conspiracy Theories About The Vegas Shooting |language=en |work=BuzzFeed |url=https://www.buzzfeed.com/charliewarzel/heres-how-youtube-is-spreading-conspiracy-theories-about |access-date=June 16, 2018}}</ref><ref>{{cite news |title=The Big Tech Platforms Still Suck During Breaking News |language=en |work=BuzzFeed |url=https://www.buzzfeed.com/charliewarzel/the-big-tech-platforms-are-still-botching-breaking-news |access-date=June 16, 2018}}</ref>
In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts.<ref name="apologises">{{cite news |date=March 20, 2017 |title=Google apologises as M&S pulls ads |language=en-GB |work=BBC News |url=https://www.bbc.com/news/business-39325916 |access-date=June 16, 2018}}</ref> After firms began pulling their advertising from YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where their ads were placed.<ref name="apologises" />
University of North Carolina professor [[Zeynep Tufekci]] has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century."<ref>{{cite news |title=Opinion {{!}} YouTube, the Great Radicalizer |newspaper=The New York Times |date=March 10, 2018 |language=en |url=https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html |access-date=June 16, 2018 |last1=Tufekci |first1=Zeynep |id={{ProQuest|2610860590}}}}</ref> Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem".<ref name="secret life" /><ref>{{cite news |title=Parkland shooting 'crisis actor' videos lead users to a 'conspiracy ecosystem' on YouTube, new research shows |url=https://www.washingtonpost.com/news/the-switch/wp/2018/02/25/parkland-shooting-crisis-actor-videos-lead-users-to-a-conspiracy-ecosystem-on-youtube-new-research-shows/ |access-date=September 23, 2018 |newspaper=The Washington Post |language=en}}</ref>
===== Use among white supremacists =====
Before 2019, YouTube took steps to remove specific videos or channels related to [[Supremacism|supremacist]] content that had violated its acceptable use policies but otherwise did not have site-wide policies against [[hate speech]].<ref name="youtubeblog june2019">{{cite web |date=June 5, 2019 |title=Our ongoing work to tackle hate |url=https://youtube.googleblog.com/2019/06/our-ongoing-work-to-tackle-hate.html |access-date=April 9, 2020 |via=YouTube}}</ref>
In the wake of the March 2019 [[Christchurch mosque attacks]], YouTube and other sites like Facebook and Twitter that allowed user-submitted content drew criticism for doing little to moderate and control the spread of hate speech, which was considered a factor in the rationale for the attacks.<ref>{{cite web |last=Robertson |first=Adi |date=March 15, 2019 |title=Questions about policing online hate are much bigger than Facebook and YouTube |url=https://www.theverge.com/2019/3/15/18267638/new-zealand-christchurch-mass-shooting-online-hate-facebook-youtube |access-date=April 9, 2020 |work=[[The Verge]]}}</ref><ref>{{cite news |last1=Timberg |first1=Craig |last2=Harwell |first2=Drew |last3=Shaban |first3=Hamza |last4=Ba Tran |first4=Andrew |last5=Fung |first5=Brian |date=March 15, 2020 |title=The New Zealand shooting shows how YouTube and Facebook spread hate and violent images – yet again |url=https://www.washingtonpost.com/technology/2019/03/15/facebook-youtube-twitter-amplified-video-christchurch-mosque-shooting/ |access-date=April 9, 2020 |newspaper=[[The Washington Post]]}}</ref> These platforms were pressured to remove such content, but in an interview with ''The New York Times'', YouTube's then chief product officer Neal Mohan said that unlike content such as [[ISIS]] videos, which follow a particular format and are thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and the platform could not readily remove it without human intervention.<ref>{{cite web |last=Roose |first=Kevin |date=March 29, 2019 |title=YouTube's Product Chief on Online Radicalization and Algorithmic Rabbit Holes |url=https://www.nytimes.com/2019/03/29/technology/youtube-online-extremism.html |access-date=April 9, 2020 |work=The New York Times}}</ref>
In May 2019, YouTube joined an initiative led by France and New Zealand, alongside other countries and tech companies, to develop tools to block [[online hate speech]] and to draft regulations, to be implemented at the national level, penalizing technology firms that failed to take steps to remove such speech, though the United States declined to participate.<ref>{{cite web |last=Browne |first=Ryan |date=May 15, 2019 |title=New Zealand and France unveil plans to tackle online extremism without the US on board |url=https://www.cnbc.com/2019/05/15/new-zealand-france-unveil-plans-to-tackle-online-extremism-without-us.html |access-date=April 9, 2020 |publisher=[[CNBC]]}}</ref><ref>{{cite web |last=Willsher |first=Kim |date=May 15, 2019 |title=Leaders and tech firms pledge to tackle extremist violence online |url=https://www.theguardian.com/world/2019/may/15/jacinda-ardern-emmanuel-macron-christchurch-call-summit-extremist-violence-online |access-date=April 9, 2020 |work=[[The Guardian]]}}</ref> Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, stating that it would "remove content denying that well-documented violent events, like the Holocaust or [[Sandy Hook Elementary School shooting|the shooting at Sandy Hook Elementary]], took place."<ref name="youtubeblog june2019" /><ref>{{cite web |last=Newton |first=Casey |date=June 5, 2019 |title=YouTube just banned supremacist content, and thousands of channels are about to be removed |url=https://www.theverge.com/2019/6/5/18652576/youtube-supremacist-content-ban-borderline-extremist-terms-of-service |access-date=April 9, 2020 |work=[[The Verge]]}}</ref>
In June 2020, YouTube was criticized for allowing white supremacist content on its platform for years after it announced it would be pledging $1 million to fight racial injustice.<ref>{{Cite web |last=Hamilton |first=Isobel Asher |date=June 1, 2020 |title=YouTube has pledged $1 million in solidarity with Black Lives Matter protesters, but critics note the site has allowed white supremacist videos for years |url=https://www.businessinsider.com/youtube-pledges-1-million-to-fight-racial-injustice-draws-criticism-2020-6 |access-date=May 11, 2024 |website=Business Insider |language=en-US}}</ref> Later that month, it banned several channels associated with white supremacy, including those of [[Stefan Molyneux]], [[David Duke]], and [[Richard B. Spencer]], asserting that these channels violated its policies on hate speech.<ref>{{cite web |last=Alexander |first=Julia |date=June 29, 2020 |title=YouTube bans Stefan Molyneux, David Duke, Richard Spencer, and more for hate speech |url=https://www.theverge.com/2020/6/29/21307303/youtube-bans-molyneux-duke-richard-spencer-conduct-hate-speech |access-date=June 29, 2020 |work=[[The Verge]]}}</ref>
==== Misinformation and handling of the COVID-19 pandemic ====
Multiple research studies have investigated cases of misinformation on YouTube. A July 2019 study based on ten YouTube searches related to climate and climate change, performed using the [[Tor Browser]], found that the majority of the videos returned communicated views contrary to the [[scientific consensus on climate change]].<ref>{{cite journal |last=Allgaier |first=Joachim |date=July 25, 2019 |title=Science and Environmental Communication on YouTube: Strategically Distorted Communications in Online Videos on Climate Change and Climate Engineering |journal=Frontiers in Communication |volume=4 |doi=10.3389/fcomm.2019.00036 |issn=2297-900X |doi-access=free}}</ref> A May 2023 study found that YouTube was monetizing and profiting from videos that included misinformation about climate change.<ref>{{Cite web |date=May 4, 2023 |title=Google profiting from climate misinformation on YouTube, report finds |url=https://www.independent.co.uk/climate-change/news/google-youtube-climate-disinformation-ads-b2331573.html |access-date=August 27, 2023 |website=The Independent |language=en}}</ref> A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures.<ref>{{cite news |last1=Carmichael |first1=Flora |last2=Gragani |first2=Juliana |date=September 12, 2019 |others=Beyond Fake News & B.B.C. Monitoring |title=How YouTube makes money from fake cancer cure videos |url=https://www.bbc.com/news/blogs-trending-49483681 |access-date=September 27, 2019 |work=BBC News |language=en}}</ref> In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevated far-right fringe discourse and conspiracy theories.<ref>{{cite news |last1=Fisher |first1=Max |last2=Taub |first2=Amanda |date=August 11, 2019 |title=How YouTube Radicalized Brazil |url=https://www.nytimes.com/2019/08/11/world/americas/youtube-brazil.html |access-date=August 12, 2019 |work=The New York Times |language=en-US |issn=0362-4331}}</ref> In the Philippines, numerous channels disseminated misinformation related to the [[2022 Philippine general election|2022 Philippine elections]].<ref>{{cite news |last=Tuquero |first=Loreben |date=September 22, 2021 |title=Red flag for 2022: Political lies go unchecked on YouTube showbiz channels |url=https://www.rappler.com/nation/elections/political-lies-unchecked-youtube-showbiz-channels-red-flag-candidates-2022 |access-date=September 23, 2021 |work=[[Rappler]] |publisher=Rappler Inc. |location=[[Manila]], Philippines}}</ref> Additionally, research on the dissemination of [[Modern flat Earth beliefs|Flat Earth]] beliefs on social media has shown that networks of YouTube channels form an echo chamber that polarizes audiences by appearing to confirm preexisting beliefs.<ref>{{cite journal |last1=Diaz Ruiz |first1=Carlos |last2=Nilsson |first2=Tomas |date=August 8, 2022 |title=Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies |journal=Journal of Public Policy & Marketing |language=en |volume=42 |pages=18–35 |doi=10.1177/07439156221103852 |issn=0743-9156 |s2cid=248934562 |doi-access=}}</ref>
In 2018, YouTube introduced a system that would automatically add information boxes to videos that its algorithms determined may present conspiracy theories and other [[fake news]], filling the infobox with content from [[Encyclopædia Britannica]] and [[Wikipedia]] as a means of informing users and minimizing the spread of misinformation without impacting freedom of speech.<ref>{{cite web |last=Newton |first=Casey |date=March 13, 2018 |title=YouTube will add information from Wikipedia to videos about conspiracies |url=https://www.theverge.com/2018/3/13/17117344/youtube-information-cues-conspiracy-theories-susan-wojcicki-sxsw |access-date=April 15, 2019 |work=[[The Verge]]}}</ref><ref>{{Cite news |last=Brown |first=David |date=March 14, 2018 |title=YouTube uses Wikipedia to fight fake news |url=https://www.thetimes.co.uk/article/youtube-fights-fake-news-with-wikipedia-frkpc8nm2 |url-status=live |archive-url=https://archive.today/20210927105159/https://www.thetimes.co.uk/article/youtube-fights-fake-news-with-wikipedia-frkpc8nm2 |archive-date=September 27, 2021 |access-date=July 13, 2023 |work=[[The Times]] |language=en |issn=0140-0460}}</ref> In 2023, YouTube announced changes to its handling of content associated with [[eating disorder]]s; the platform's Community Guidelines now prohibit content that could encourage imitation by at-risk users.<ref>{{cite news |title=YouTube rolls out new policies for eating disorder content |url=https://edition.cnn.com/2023/04/18/tech/youtube-eating-disorder-policies/index.html |publisher=CNN}}</ref>
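The topic-triggered information panels introduced in 2018 can be illustrated with a minimal sketch; the topic list and helper function below are hypothetical, as YouTube's actual topic classifiers are not public.

<syntaxhighlight lang="python">
from typing import Optional

# Hypothetical list of topics that trigger an information panel, mapped to a
# third-party encyclopedia article used as the panel's source.
TOPICS_WITH_PANELS = {
    "moon landing": "https://en.wikipedia.org/wiki/Apollo_11",
    "chemtrails": "https://en.wikipedia.org/wiki/Chemtrail_conspiracy_theory",
}


def info_panel_for(video_title: str) -> Optional[dict]:
    """Return an information-panel payload if the title matches a flagged topic."""
    title = video_title.lower()
    for topic, source_url in TOPICS_WITH_PANELS.items():
        if topic in title:
            return {"topic": topic, "source": source_url}
    return None  # no panel; the video is shown unchanged


print(info_panel_for("PROOF the moon landing was staged"))
</syntaxhighlight>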
In January 2019, YouTube said that it had introduced a new policy starting in the United States intended to stop recommending videos containing "content that could misinform users in harmful ways." YouTube gave [[Modern flat Earth societies|flat Earth theories]], miracle cures, and [[9/11 Truth movement|9/11 trutherism]] as examples.<ref>{{cite news |last=Weill |first=Kelly |date=January 25, 2019 |title=YouTube Tweaks Algorithm to Fight 9/11 Truthers, Flat Earthers, Miracle Cures |url=https://www.thedailybeast.com/youtube-tweaks-algorithm-to-fight-911-truthers-flat-earthers-miracle-cures |access-date=January 29, 2019 |language=en}}</ref> Earlier efforts within YouTube engineering to stop recommending borderline extremist videos that fell just short of forbidden hate speech, and to track their popularity, had been rejected because doing so could interfere with viewer engagement.<ref>{{cite news |last1=Bergen |first1=Mark |date=April 2, 2019 |title=YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant |url=https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant |access-date=April 2, 2019 |work=Bloomberg News}}</ref> In July 2022, YouTube announced policies to combat misinformation surrounding [[abortion]], such as videos with instructions to perform abortion methods that are considered unsafe and videos that contain misinformation about the [[safety of abortion]].<ref>{{cite web |last=Elias |first=Jennifer |date=July 21, 2022 |title=YouTube says it will crack down on abortion misinformation and remove videos with false claims |url=https://www.cnbc.com/2022/07/21/youtube-says-it-will-crack-down-on-abortion-misinformation.html |access-date=July 21, 2022 |publisher=CNBC |language=en}}</ref> Google and YouTube implemented policies in October 2021 to deny monetization or revenue to advertisers or content creators that promoted [[climate change denial]].<ref>{{cite web |last=Peters |first=Jay |date=October 7, 2021 |title=Google and YouTube will cut off ad money for climate change deniers |url=https://www.theverge.com/2021/10/7/22715102/google-youtube-climate-change-deniers-ads-monetization |access-date=October 7, 2021 |work=[[The Verge]]}}</ref> In January 2024, the [[Center for Countering Digital Hate]] reported that climate change deniers were instead pushing other forms of climate change denial that had not yet been banned by YouTube.<ref>{{Cite web |last=Belanger |first=Ashley |date=January 16, 2024 |title=Climate denialists find new ways to monetize disinformation on YouTube |url=https://arstechnica.com/tech-policy/2024/01/youtube-profits-from-videos-claiming-global-warming-is-beneficial/ |access-date=January 31, 2024 |website=Ars Technica}}</ref><ref>{{Cite news |date=January 17, 2024 |title=YouTube making money off new breed of climate denial, monitoring group says |url=https://www.reuters.com/sustainability/climate-energy/youtube-making-money-off-new-breed-climate-denial-monitoring-group-says-2024-01-16/ |access-date=January 31, 2024 |work=Reuters}}</ref>
After [[misinformation related to the COVID-19 pandemic]] claiming that [[5G]] communications technology was responsible for the spread of [[coronavirus disease 2019]] was disseminated via YouTube and led to multiple 5G towers in the United Kingdom being attacked by arsonists, YouTube removed all videos linking 5G to the coronavirus in this manner.<ref name="guardian-youtube-to-suppress-content-spreading-coronavirus-5g-conspiracy-theory">{{cite news |last=Hern |first=Alex |date=April 5, 2020 |title=YouTube moves to limit spread of false coronavirus 5G theory |url=https://www.theguardian.com/world/2020/apr/05/youtube-to-suppress-content-spreading-coronavirus-5g-conspiracy-theory |access-date=April 5, 2020 |newspaper=[[The Guardian]]}}</ref>
In September 2021, YouTube extended this policy to cover videos disseminating misinformation related to any vaccine that had received approval from local health authorities or the [[World Health Organization]], including those long approved against measles or hepatitis B.<ref name="WaPo20210929">{{cite news |last=Pannett |first=Rachel |date=January 29, 2021 |title=Russia threatens to block YouTube after German channels are deleted over coronavirus misinformation |url=https://www.washingtonpost.com/world/2021/09/29/russia-ban-youtube-german-coronavirus/ |access-date=September 30, 2021 |newspaper=The Washington Post}}</ref><ref name="NYT20210929">{{cite news |last=Alba |first=Davey |author-link=Davey Alba |date=September 29, 2021 |title=YouTube bans all anti-vaccine misinformation |url=https://www.nytimes.com/2021/09/29/technology/youtube-anti-vaxx-ban.html |url-access=limited |archive-url=https://ghostarchive.org/archive/20211228/https://www.nytimes.com/2021/09/29/technology/youtube-anti-vaxx-ban.html |archive-date=December 28, 2021 |access-date=September 30, 2021 |work=The New York Times}}{{cbignore}}</ref> The platform proceeded to remove the accounts of anti-vaccine campaigners such as [[Robert F. Kennedy Jr.]] and [[Joseph Mercola]].<ref name="NYT20210929" /> YouTube also extended this moderation to non-medical areas. In the weeks following the [[2020 United States presidential election]], the site added policies to remove or label videos promoting election fraud claims;<ref>{{cite news |last=Ortutay |first=Barbara |date=December 9, 2020 |title=Weeks after election, YouTube cracks down on misinformation |url=https://apnews.com/article/youtube-election-misinformation-removal-74ca3738e2774c9a4cf8fbd1e977710f |access-date=June 2, 2023 |work=[[Associated Press News]]}}</ref><ref>{{Cite web |last=Lee |first=Timothy B. |date=December 9, 2020 |title=YouTube bans videos claiming Trump won |url=https://arstechnica.com/tech-policy/2020/12/youtube-bans-videos-claiming-trump-won/ |access-date=January 31, 2024 |website=[[Ars Technica]] |language=en-us}}</ref> however, it reversed this policy in June 2023, citing the need to "openly debate political ideas, even those that are controversial or based on disproven assumptions".<ref>{{cite news |date=June 1, 2023 |title=YouTube changes policy to allow false claims about past US presidential elections |url=https://apnews.com/article/youtube-election-misinformation-policy-42a6c1b7623c485dbc04eb76ad443247 |access-date=June 2, 2023 |work=Associated Press}}</ref><ref>{{Cite web |last=Brodkin |first=Jon |date=June 2, 2023 |title=YouTube now allows videos that falsely claim Trump won 2020 election |url=https://arstechnica.com/tech-policy/2023/06/youtube-now-allows-videos-that-falsely-claim-trump-won-2020-election/ |access-date=January 31, 2024 |website=Ars Technica |language=en-us}}</ref>
==== Child safety and wellbeing ====
{{See also|FamilyOFive|Fantastic Adventures scandal|Elsagate}}
Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by content creators moving away from material that was often criticized or demonetized and toward family-friendly content. In 2017, YouTube reported that time watching family vloggers had increased by 90%.<ref>{{cite magazine |last=Luscombe |first=Belinda |date=May 18, 2017 |title=The YouTube Parents Who are Turning Family Moments into Big Bucks |url=https://time.com/4783215/growing-up-in-public/ |access-date=June 21, 2019 |magazine=[[Time (magazine)|Time]]}}</ref><ref>{{cite web |last=Alexander |first=Julia |date=June 21, 2019 |title=YouTube can't remove kid videos without tearing a hole in the entire creator ecosystem |url=https://www.theverge.com/2019/6/21/18651223/youtube-kids-harmful-content-predator-comments-family-vlogging |access-date=June 21, 2019 |work=[[The Verge]]}}</ref> However, with the increase in videos featuring children, the site began to face several controversies related to [[Child protection|child safety]], including with popular channels [[FamilyOFive]] and [[Fantastic Adventures scandal|Fantastic Adventures]].<ref name="Ohlheiser2017">{{cite news |last=Ohlheiser |first=Abby |date=April 26, 2017 |title=The saga of a YouTube family who pulled disturbing pranks on their own kids |url=https://www.washingtonpost.com/news/the-intersect/wp/2017/04/25/the-saga-of-a-youtube-family-who-pulled-disturbing-pranks-on-their-own-kids/ |newspaper=[[The Washington Post]]}}</ref><ref name="Cresci2017">{{cite news |last=Cresci |first=Elena |date=May 7, 2017 |title=Mean stream: how YouTube prank channel DaddyOFive enraged the internet |language=en-GB |work=[[The Guardian]] |url=https://www.theguardian.com/technology/shortcuts/2017/may/07/when-youtube-pranks-go-horribly-wrong |access-date=June 7, 2017 |issn=0261-3077}}</ref><ref name="Dunphy2017">{{cite web |last=Dunphy |first=Rachel |date=April 28, 2017 |title=The Abusive 'Pranks' of YouTube Family Vloggers |url=https://nymag.com/selectall/2017/04/daddyofive-youtube-abuse-controversy-explained.html|work=[[New York Magazine]]|access-date=July 9, 2017}}</ref><ref name="Gajanan2017">{{cite magazine |last=Gajanan |first=Mahita |date=May 3, 2017 |title=YouTube Star DaddyOFive Loses Custody of 2 Children Shown in 'Prank' Videos |url=https://time.com/4763981/daddyofive-mike-martin-heather-martin-youtube-prank-custody/ |access-date=July 9, 2017 |magazine=[[Time (magazine)|Time]]}}</ref><ref>{{cite web |first1=Eric |last1=Levenson |first2=Mel |last2=Alonso |title=A mom on a popular YouTube show is accused of pepper-spraying her kids when they flubbed their lines |url=https://www.cnn.com/2019/03/20/us/youtube-fantastic-adventures-mom-arrest-trnd/index.html |publisher=CNN |date=March 20, 2019}}</ref>
Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on [[YouTube Kids]] and attracted millions of views. The term "[[Elsagate]]" was coined on the Internet and then used by various news outlets to refer to this controversy.<ref>Ben Popper, [https://www.theverge.com/2017/2/20/14489052/youtube-kids-videos-superheroes-disney-characters-fart-jokes Adults dressed as superheroes is YouTube's new, strange, and massively popular genre], ''The Verge'', February 4, 2017</ref><ref>{{cite web |author=<!--Staff writer(s); no by-line.--> |date=March 31, 2017 |title=Report: Thousands of videos mimicking popular cartoons on YouTube Kids contain inappropriate content |url=https://news10.com/2017/03/31/report-thousands-of-videos-mimicking-popular-cartoons-on-youtube-kids-contain-inappropriate-content/ |access-date=April 30, 2017 |website=NEWS10 ABC |archive-date=August 19, 2017 |archive-url=https://web.archive.org/web/20170819234642/http://news10.com/2017/03/31/report-thousands-of-videos-mimicking-popular-cartoons-on-youtube-kids-contain-inappropriate-content/ |url-status=dead }}</ref><ref name="NYT">{{cite web |last=Maheshwari |first=Sapna |date=November 4, 2017 |title=Child Friendly? Startling Videos Slip Past Filters |url=https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html |url-access=limited |website=The New York Times |id={{ProQuest|2463387110}}}}</ref><ref name="forbes">Dani Di Placido, [https://www.forbes.com/sites/danidiplacido/2017/11/28/youtubes-elsagate-illuminates-the-unintended-horrors-of-the-digital-age/ YouTube's "Elsagate" Illuminates The Unintended Horrors Of The Digital Age], ''[[Forbes (magazine)|Forbes]]'', November 28, 2017</ref> Following the criticism, YouTube announced it was strengthening site security to protect children from unsuitable content and the company started to mass delete videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults.<ref name="auto">Todd Spangler, [https://variety.com/2017/digital/news/youtube-toy-freaks-channel-terminated-1202617834/ YouTube Terminates Toy Freaks Channel Amid Broader Crackdown on Disturbing Kids' Content], ''[[Variety (magazine)|Variety]]'', November 17, 2017</ref><ref name="verge">{{cite news |last=Popper |first=Ben |date=November 9, 2017 |title=YouTube says it will crack down on bizarre videos targeting children |work=[[The Verge]] |url=https://www.theverge.com/2017/11/9/16629788/youtube-kids-distrubing-inappropriate-flag-age-restrict |url-status=live |archive-url=https://web.archive.org/web/20171116090955/https://www.theverge.com/2017/11/9/16629788/youtube-kids-distrubing-inappropriate-flag-age-restrict |archive-date=November 16, 2017 |quote=In August of this year, YouTube announced that it would no longer allow creators to monetize videos which "made inappropriate use of family-friendly characters." 
Today it's taking another step to try to police this genre.}}</ref><ref>Sarah Templeton, [https://www.newshub.co.nz/home/entertainment/2017/11/disturbing-elsagate-toy-freaks-videos-removed-from-youtube-after-abuse-allegations.html Disturbing 'ElsaGate', 'Toy Freaks' videos removed from YouTube after abuse allegations], ''[[Newshub]]'', November 22, 2017</ref><ref>[https://abcnews.go.com/US/youtube-crack-videos-showing-child-endangerment/story?id=51336368 YouTube to crack down on videos showing child endangerment], ''[[ABC News (United States)|ABC News]]'', November 22, 2017</ref><ref>Charlie Warzel, [https://www.buzzfeed.com/charliewarzel/youtube-is-addressing-its-massive-child-exploitation-problem YouTube Is Addressing Its Massive Child Exploitation Problem] [[BuzzFeed]], November 22, 2017</ref><ref>{{cite news |last1=Bridge |first1=Mark |last2=Mostrous |first2=Alexi |date=November 18, 2017 |title=Child abuse on YouTube |newspaper=The Times |url=https://www.thetimes.co.uk/article/child-abuse-on-youtube-q3x9zfkch |url-access=subscription |access-date=November 28, 2017}}</ref>
Even for videos that appear to be aimed at children and to contain only child-friendly material, YouTube's system allows uploaders to remain anonymous. Such questions have been raised before, as YouTube has had to remove channels with children's content which, after becoming popular, suddenly included inappropriate content masked as children's content.<ref name="WSJ kids love">{{cite news |last1=Koh |first1=Yoree |last2=Morris |first2=Betsy |date=April 11, 2019 |title=Kids Love These YouTube Channels. Who Creates Them Is a Mystery. |newspaper=The Wall Street Journal |url=https://www.wsj.com/articles/kids-love-these-youtube-channels-who-creates-them-is-a-mystery-11554975000 |url-access=registration |url-status=live |archive-url=https://web.archive.org/web/20190814180500/https://www.wsj.com/articles/kids-love-these-youtube-channels-who-creates-them-is-a-mystery-11554975000 |archive-date=August 14, 2019 |access-date=August 14, 2019}}</ref> The anonymity of such channels raises concerns because it is unclear what purpose they are trying to serve.<ref name="vice kids content">{{cite web |last=Haskins |first=Caroline |date=March 19, 2019 |title=YouTubers Are Fighting Algorithms to Make Good Content for Kids |url=https://www.vice.com/en_us/article/mbznpy/youtubers-are-fighting-algorithms-to-make-good-content-for-kids |url-status=live |archive-url=https://web.archive.org/web/20190814182839/https://www.vice.com/en_us/article/mbznpy/youtubers-are-fighting-algorithms-to-make-good-content-for-kids |archive-date=August 14, 2019 |access-date=August 14, 2019 |website=[[Vice Media|Vice]]}}</ref> The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the [[Campaign for a Commercial-Free Childhood]], and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading critics and parents to worry about children becoming too enraptured by the content from these channels.<ref name="WSJ kids love" /> Content creators who earnestly make child-friendly videos have found it difficult to compete with larger channels, as they are unable to produce content at the same rate and lack the promotion through YouTube's recommendation algorithms that the larger animated channel networks share.<ref name="vice kids content" />
In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as the [[Consumption of Tide Pods|Tide Pod Challenge]]) and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children.<ref>{{cite web |last=Palladino |first=Valentina |date=January 16, 2019 |title=YouTube updates policies to explicitly ban dangerous pranks, challenges |url=https://arstechnica.com/gadgets/2019/01/youtube-updates-policies-to-explicitly-ban-dangerous-pranks-challenges/ |access-date=January 16, 2019 |website=Ars Technica |language=en-us}}</ref>
==== Sexualization of children and pedophilia ====
{{See also|Elsagate}}
In November 2017, it was revealed in the media that many videos featuring children—often uploaded by the minors themselves, and showing innocent content such as the children playing with toys or performing gymnastics—were attracting comments from [[Pedophilia|pedophiles]],<ref>{{cite news |date=November 15, 2017 |title=YouTube videos of children are plagued by sexual comments |url=https://www.theverge.com/2017/11/15/16656706/youtube-videos-children-comments |work=[[The Verge]]}}</ref><ref name="habits">{{cite news |last1=Mostrous |first1=Alexi |last2=Bridge |first2=Mark |last3=Gibbons |first3=Katie |date=November 24, 2017 |title=YouTube adverts fund paedophile habits |newspaper=The Times |url=https://www.thetimes.co.uk/article/youtube-adverts-fund-paedophile-habits-fdzfmqlr5 |url-access=subscription |access-date=November 28, 2017}}</ref> with predators finding the videos through private YouTube playlists or by typing certain keywords in Russian.<ref name="habits" /> Other child-centric videos originally uploaded to YouTube began propagating on the [[dark web]] and were uploaded to or embedded on forums known to be used by pedophiles.<ref>{{cite news |last=Tait |first=Amelia |date=April 24, 2016 |title=Why YouTube mums are taking their kids offline |url=https://www.newstatesman.com/culture/observations/2016/04/why-youtube-mums-are-taking-their-kids-offline |access-date=June 21, 2019 |work=[[New Statesman]]}}</ref>
As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube.<ref name="forbes" /><ref>{{cite news |last=Spangler |first=Todd |date=November 25, 2017 |title=YouTube Faces Advertiser Boycott Over Videos With Kids That Attracted Sexual Predators |url=https://variety.com/2017/digital/news/youtube-ad-boycott-pedophile-sexual-children-videos-1202622790/ |work=[[Variety (magazine)|Variety]]}}</ref> In December 2018, ''[[The Times]]'' found more than 100 grooming cases in which children were manipulated by strangers into sexually suggestive behavior (such as taking off clothes, adopting overtly sexual poses and touching other children inappropriately).<ref name="Paedophiles">{{cite news |author1=Harry Shukman |author2=Mark Bridge |date=December 10, 2018 |title=Paedophiles grooming children live on YouTube |language=en |work=The Times |url=https://www.thetimes.co.uk/article/paedophiles-grooming-children-live-on-youtube-3fv8gt730 |issn=0140-0460 |url-status=dead |archive-url=https://web.archive.org/web/20181210055232/https://www.thetimes.co.uk/article/paedophiles-grooming-children-live-on-youtube-3fv8gt730 |archive-date=December 10, 2018 |access-date=February 3, 2024}}</ref>
In February 2019, YouTube vlogger Matt Watson identified a "wormhole" in the YouTube recommendation algorithm that drew users into this type of video content and then filled their recommendations exclusively with such videos.<ref>{{cite web |last1=Lieber |first1=Chavie |title=YouTube has a pedophilia problem, and its advertisers are jumping ship |url=https://www.vox.com/the-goods/2019/2/27/18241961/youtube-pedophile-ring-child-safety-advertisers-pulling-ads |website=vox.com |date=March 1, 2019}}</ref> Most of these videos had comments from sexual predators giving timestamps of when the children were shown in compromising positions or otherwise making indecent remarks.<ref name="bloomberg mwatson">{{cite news |last1=Bergen |first1=Mark |last2=de Vynck |first2=Gerrit |last3=Palmeri |first3=Christopher |date=February 20, 2019 |title=Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos |url=https://www.bloomberg.com/news/articles/2019-02-20/disney-pulls-youtube-ads-amid-concerns-over-child-video-voyeurs |access-date=February 20, 2019 |work=[[Bloomberg News]]}}</ref> In the wake of the controversy, YouTube reported that it had deleted over 400 channels and tens of millions of comments, and had reported the offending users to law enforcement and the [[National Center for Missing and Exploited Children]].<ref>{{cite web |last=Alexander |first=Julia |date=February 21, 2019 |title=YouTube terminates more than 400 channels following child exploitation controversy |url=https://www.theverge.com/2019/2/21/18234494/youtube-child-exploitation-channel-termination-comments-philip-defranco-creators |access-date=February 21, 2019 |work=[[The Verge]]}}</ref><ref>{{cite web |last=Brodkin |first=Jon |date=February 21, 2019 |title=YouTube loses advertisers over 'wormhole into pedophilia ring' |url=https://arstechnica.com/tech-policy/2019/02/youtube-loses-advertisers-over-wormhole-into-pedophilia-ring/ |access-date=February 22, 2019 |website=Ars Technica |language=en-us}}</ref> Despite these measures, several large advertisers pulled their advertising from YouTube.<ref name="bloomberg mwatson" /><ref>{{cite web |last1=Haselton |first1=Todd |last2=Salinas |first2=Sara |date=February 21, 2019 |title=As fallout over pedophilia content on YouTube continues, AT&T pulls all advertisements |url=https://www.cnbc.com/2019/02/21/att-pulls-all-ads-from-youtube-pedophilia-controversy.html |access-date=February 21, 2019 |publisher=[[CNBC]]}}</ref>
Subsequently, YouTube began to demonetize and block advertising on the types of videos that had drawn these predatory comments.<ref>{{cite web |last=Ingraham |first=Nathan |date=February 22, 2019 |title=YouTube is proactively blocking ads on videos prone to predatory comments |url=https://www.engadget.com/2019/02/22/youtube-blocking-ads-on-videos-predatory-comments/ |access-date=February 22, 2019 |work=[[Engadget]]}}</ref> YouTube also began to flag channels that predominantly feature children and to preemptively disable their comments sections.<ref>{{cite news |last=Fox |first=Chris |date=February 28, 2019 |title=YouTube bans comments on all videos of kids |language=en-GB |url=https://www.bbc.com/news/technology-47408969 |access-date=March 2, 2019}}</ref><ref>{{cite web |last=Alexander |first=Julia |date=February 28, 2019 |title=YouTube is disabling comments on almost all videos featuring children |url=https://www.theverge.com/2019/2/28/18244954/youtube-comments-minor-children-exploitation-monetization-creators |access-date=February 28, 2019 |work=[[The Verge]]}}</ref>
A related attempt to algorithmically flag videos containing references to the string "CP" (an abbreviation of [[child pornography]]) resulted in some prominent false positives involving unrelated topics using the same abbreviation. YouTube apologized for the errors and reinstated the affected videos.<ref>{{cite web |last=Gerken |first=Tom |date=February 19, 2019 |title=YouTube backtracks after Pokemon 'child abuse' ban |url=https://www.bbc.com/news/technology-47278362|work=[[BBC News]]|access-date=February 20, 2019}}</ref>
In June 2019, ''The New York Times'' cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children.<ref>{{cite web |last1=Fisher |first1=Max |last2=Taub |first2=Amanda |date=June 3, 2019 |title=On YouTube's Digital Playground, an Open Gate for Pedophiles |url=https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html |access-date=June 6, 2019 |work=The New York Times}}</ref>
=== Russia ===
In September 2021, YouTube removed two channels linked to [[RT Deutsch]], the German-language arm of the Russian [[RT (TV network)|RT]] network, for breaching its policies relating to COVID-19 misinformation.<ref name="WaPo20210929" /> Russia threatened to ban YouTube in response.<ref>{{cite news |date=September 29, 2021 |title=Russia threatens YouTube ban for deleting RT channels |language=en-GB |work=BBC News |url=https://www.bbc.com/news/technology-58737433 |access-date=February 27, 2022}}</ref>
Shortly after the [[Russian invasion of Ukraine]] in 2022, YouTube blocked all Russian state-funded media channels globally.<ref>{{cite news |date=March 3, 2022 |title=YouTube blocks Russian state-funded media channels globally |language=en-US |work=Reuters |url=https://www.reuters.com/business/media-telecom/youtube-blocks-russian-state-funded-media-channels-globally-2022-03-11/ |access-date=December 5, 2023}}</ref> It later expanded these removals to channels described as "pro-Russian". In June 2022, the ''War Gonzo'' channel run by Russian military blogger and journalist [[Semyon Pegov]] was deleted.<ref>{{cite news |date=June 21, 2022 |title=Youtube deletes Wargonzo channel |url=https://news.am/eng/news/708387.html|access-date=December 5, 2023}}</ref>
In July 2023, YouTube removed the channel of British journalist [[Graham Phillips (journalist)|Graham Phillips]], who had covered the [[War in Donbas]] since 2014.<ref>{{cite news |title=British Pro-Russian YouTuber vows his assets shouldn't be frozen for promoting invasion |url=https://www.mirror.co.uk/news/uk-news/british-pro-russian-youtuber-vows-31457188 |access-date=December 5, 2023 |work=[[Daily Mirror|The Mirror]] |date=November 16, 2023 }}</ref>
In August 2023, a Moscow court fined Google 3&nbsp;million rubles, around $35,000, for not deleting what it said was "fake news about the war in Ukraine".<ref>{{cite news |title=Russia fines Google for failing to delete 'false content' about Ukraine war |url=https://www.politico.eu/article/russia-fine-google-ukraine-war/ |access-date=December 10, 2023 |work=[[Politico]] |date=August 17, 2023 }}</ref>
In October 2024, a Russian court fined YouTube's parent company Google a total of 2&nbsp;undecillion [[Russian ruble|rubles]] (approximately US$20 decillion) for restricting Russian state media channels on YouTube.<ref>{{Cite news |last=Fraser |first=Graham |date=31 October 2024 |title=Russia fines Google more money than there is in entire world |url=https://www.bbc.com/news/articles/cdxvnwkl5kgo |access-date=8 November 2024 |work=[[BBC]]}}</ref> The fine far exceeds the world's total GDP, estimated at US$110 trillion by the [[International Monetary Fund]].<ref>{{Cite news |last=Cairns |first=Dan |date=31 October 2024 |title=Russia fines Google more than world's entire GDP for blocking YouTube accounts |url=https://news.sky.com/story/russia-fines-google-more-than-worlds-entire-gdp-for-blocking-youtube-accounts-13245208 |access-date=8 November 2024 |work=[[Sky News]]}}</ref> News agency [[TASS]] reported that Google would be allowed to return to the Russian market only if it complied with the court's decision.<ref>{{Cite news |date=29 October 2024 |title=Google's fines in Russia reach stratospheric levels — lawyer |url=https://tass.com/economy/1864291 |work=[[TASS]]}}</ref> Kremlin spokesperson [[Dmitry Peskov]] called the decision "symbolic" and warned Google that it "should not be restricting the actions of our broadcasters on its platform".<ref>{{Cite news |date=31 October 2024 |title=Russia says $20 decillion fine against Google is 'symbolic' |url=https://www.theguardian.com/world/2024/oct/31/russia-20-decillion-fine-against-google-symbolic-youtube-ban-pro-kremlin-media |access-date=8 November 2024 |work=[[The Guardian]] |agency=[[Agence France-Presse]]}}</ref>
=== April Fools gags ===
{{See also|List of Google April Fools' Day jokes}}
YouTube featured an [[April Fools' Day|April Fools]] prank on its site every April 1 from 2008 to 2016. In 2008, all links to videos on the main page were redirected to [[Rick Astley]]'s music video "[[Never Gonna Give You Up]]", a prank known as "[[rickrolling]]".<ref>{{cite web |last=Arrington |first=Michael |date=March 31, 2008 |title=YouTube RickRolls Users |url=https://techcrunch.com/2008/03/31/youtube-rickrolls-users/ |access-date=March 26, 2017 |website=[[TechCrunch]] |publisher=[[AOL]]}}</ref><ref>{{cite magazine |last=Wortham |first=Jenna |date=April 1, 2008 |title=YouTube 'Rickrolls' Everyone |url=https://www.wired.com/2008/04/youtube-rickrol/ |magazine=Wired |access-date=March 26, 2017}}</ref> The next year, clicking on a video on the main page caused the whole page to turn upside down, which YouTube claimed was a "new layout".<ref>{{cite web |author=Bas van den Beld |date=April 1, 2009 |title=April fools: YouTube turns the world up-side-down |url=https://www.searchcowboys.com/news/453 |archive-url=https://web.archive.org/web/20090403054721/https://www.searchcowboys.com/news/453 |archive-date=April 3, 2009 |access-date=April 2, 2010 |publisher=searchcowboys.com}}</ref> In 2010, YouTube temporarily released a "TEXTp" mode which rendered video imagery into [[ASCII art]] letters "in order to reduce bandwidth costs by $1 per second."<ref>{{cite web |last=Pichette |first=Patrick |date=March 31, 2010 |title=TEXTp saves YouTube bandwidth, money |url=https://youtube.googleblog.com/2010/03/textp-saves-youtube-bandwidth-money.html |access-date=March 25, 2017 |website=Official YouTube Blog}}</ref>
The next year, the site celebrated its "100th anniversary" with a range of sepia-toned silent, early 1900s-style films, including a parody of [[Keyboard Cat]].<ref>{{cite news |last=Richmond |first=Shane |date=April 1, 2011 |title=YouTube goes back to 1911 for April Fools' Day |website=[[The Daily Telegraph]] |url=https://www.telegraph.co.uk/technology/google/8421394/YouTube-goes-back-to-1911-for-April-Fools-Day.html |archive-url=https://ghostarchive.org/archive/20220110/https://www.telegraph.co.uk/technology/google/8421394/YouTube-goes-back-to-1911-for-April-Fools-Day.html |archive-date=January 10, 2022 |url-access=subscription |url-status=live |access-date=March 26, 2017}}{{cbignore}}</ref> In 2012, clicking on the image of a DVD next to the site logo led to a video about a purported option to order every YouTube video for home delivery on DVD.<ref>{{cite magazine |last=Carbone |first=Nick |date=April 1, 2012 |title=April Fools' Day 2012: The Best Pranks from Around the Web |url=https://newsfeed.time.com/2012/04/01/april-fools-day-2012-the-best-pranks-from-around-the-web/ |magazine=Time |access-date=March 26, 2017}}</ref>
In 2013, YouTube teamed up with satirical newspaper company ''[[The Onion]]'' to claim in an uploaded video that the video-sharing website was launched as a contest which had finally come to an end, and would shut down for ten years before being re-launched in 2023, featuring only the winning video. The video starred several [[Internet celebrity|YouTube celebrities]], including [[Antoine Dodson]]. A video of two presenters announcing the nominated videos streamed live for 12 hours.<ref>{{cite magazine |last=Quan |first=Kristene |date=April 1, 2013 |title=WATCH: YouTube Announces It Will Shut Down |url=https://newsfeed.time.com/2013/04/01/watch-youtube-announces-it-will-shut-down/ |magazine=Time |access-date=March 26, 2017}}</ref><ref>{{cite web |last=Murphy |first=Samantha |date=March 31, 2013 |title=YouTube Says It's Shutting Down in April Fools' Day Prank |url=https://mashable.com/2013/03/31/youtube-april-fools-day/?europe=true |access-date=November 8, 2019 |publisher=[[Mashable]]}}</ref>
In 2014, YouTube announced that it was responsible for the creation of all viral video trends, and revealed previews of upcoming trends, such as "Clocking", "Kissing Dad", and "Glub Glub Water Dance".<ref>{{cite news |last=Kleinman |first=Alexis |date=April 1, 2014 |title=YouTube Reveals Its Viral Secrets in April Fools' Day Video |newspaper=HuffPost |url=https://www.huffingtonpost.com/2014/04/01/youtube-april-fools_n_5068694.html |access-date=April 1, 2014}}</ref> The next year, YouTube added a music button to the video bar that played samples from "[[Sandstorm (instrumental)|Sandstorm]]" by [[Darude]].<ref>{{cite web |last=Alba |first=Alejandro |date=April 1, 2015 |title=17 April Fools' pranks from tech brands, tech giants today |url=https://www.nydailynews.com/news/national/17-april-fool-pranks-tech-brands-tech-giants-today-article-1.2169557 |access-date=June 12, 2016 |website=Daily News|location=New York}}</ref> In 2016, YouTube introduced an option to watch every video on the platform in 360-degree mode with [[Snoop Dogg]].<ref>{{cite news |last=Sini |first=Rozina |date=April 1, 2016 |title=Snoopavision and other April Fools jokes going viral |work=BBC News |url=https://www.bbc.co.uk/news/uk-35941866 |access-date=April 1, 2016}}</ref>


== Services ==