Alt-right pipeline

Source: Wikipedia, the free encyclopedia.

Graphic of interactions between mostly right-wing personalities on YouTube from January 2017 to April 2018. Each line indicates a shared appearance in a YouTube video, allowing audiences of one personality to discover another.[1]

The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model describing how internet users become radicalized toward alt-right and other far-right politics. It posits that recommendation algorithms on various social media platforms, which function by suggesting content similar to what users already engage with, can quickly lead users down ideological rabbit holes.[2][3][4]

Many political movements have been associated with the pipeline concept. The intellectual dark web,[2] libertarianism,[5] the men's rights movement,[6] and the alt-lite movement[2] have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning.[7] Message boards rife with hard-right social commentary, such as 4chan and 8chan, offer such a sense of community and belonging, and their importance in the radicalization process has been well documented.[8]

The alt-right pipeline may be a contributing factor to domestic terrorism.[9][10] Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation.[3][7] Left-wing movements, such as BreadTube, also oppose the alt-right pipeline and "seek to create a 'leftist pipeline' as a counterforce to the alt-right pipeline."[11]

The effect of YouTube's algorithmic bias in radicalizing users has been replicated by one study,[2][12][13][14] although two other studies found little or no evidence of a radicalization process.[3][15][16]

Process

Use of the internet allows individuals with fringe ideologies to connect with like-minded people, forming communities that reinforce their views. The format of internet memes means they can easily be recreated and spread to many different internet communities.[17][18]

YouTube has been identified as a major element in the alt-right pipeline. This is facilitated through an "Alternative Influence Network", in which various right-wing scholars, pundits, and internet personalities interact with one another to boost performance of their content. These figures may vary in their ideologies between conservatism, libertarianism, or white nationalism, but they share a common opposition to feminism, progressivism, and social justice that allows viewers of one figure to quickly acclimate to another.[1] They often prioritize right-wing social issues over right-wing economic issues, with little discussion of fiscal conservatism. Some individuals in this network may not interact with one another, but a collection of interviews, internet debates, and other interactions create pathways for users to be introduced to new content.[2]

YouTube's algorithmic system for recommending videos allows users to quickly access content similar to what they have previously viewed, allowing them to more deeply explore an idea once they have expressed interest. This allows newer audiences to be exposed to extreme content when videos that promote misinformation and conspiracy theories gain traction.[10][7] When a user is exposed to content featuring certain political or culture-war issues, the recommendation system may lead them to different ideas or issues, including Islamophobia, opposition to immigration, antifeminism, or reproduction rates.[10][19] Recommended content is often somewhat related, which creates an effect of gradual radicalization between multiple issues, referred to as a pipeline.[10]
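The gradual-drift dynamic described above can be illustrated with a toy model. This is a hypothetical sketch, not a model of YouTube's actual system: it assumes items sit on a single "topic" axis, that the recommender only suggests items near the last-watched one, and that ties are broken toward higher-engagement items (here, engagement is assumed to grow with position on the axis).

```python
# Toy illustration (not any real platform's algorithm): a recommender that
# suggests "nearby" content, breaking ties toward higher-engagement items,
# can walk a user step by step away from their starting point.

# Each catalog item: (position on a topic axis, engagement score).
# Assumption of the sketch: engagement grows with position.
CATALOG = [(pos, pos) for pos in range(11)]

def recommend(history, catalog, radius=2):
    """Pick the most engaging unwatched item within `radius` of the last watched."""
    last_pos = history[-1]
    candidates = [
        (pos, engagement)
        for pos, engagement in catalog
        if pos not in history and abs(pos - last_pos) <= radius
    ]
    if not candidates:
        return None
    return max(candidates, key=lambda item: item[1])[0]

def simulate(start, steps):
    """Repeatedly watch whatever the recommender suggests next."""
    history = [start]
    for _ in range(steps):
        nxt = recommend(history, CATALOG)
        if nxt is None:
            break
        history.append(nxt)
    return history

print(simulate(0, 5))  # [0, 2, 4, 6, 8, 10]: each hop is "similar enough",
                       # yet the path ends far from where it began
```

Each individual recommendation stays within the similarity radius, so no single step looks extreme; the cumulative path is what moves the user across the axis, which is the "pipeline" effect the sources describe.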

At times, the platform will also recommend these videos to users that had not indicated interest in these viewpoints.[4][19] Radicalization also takes place in interactions with other radicalized users online, on varied platforms such as Gab, Reddit, 4chan, or Discord.[10] Major personalities in this chain often have a presence on Facebook and Twitter, though YouTube is typically their primary platform for messaging and earning income.[7]

The alt-right pipeline mainly targets young men, particularly those experiencing alienation, frustration, or self-doubt.[20]

Content

The alt-right pipeline has been found to begin with the intellectual dark web community, which is made up of internet personalities that are unified by an opposition to identity politics and political correctness, such as Joe Rogan, Ben Shapiro, Dave Rubin, and Jordan Peterson.[2] The intellectual dark web community overlaps and interacts with the alt-lite community, such as Steven Crowder, Paul Joseph Watson, Mark Dice, and Sargon of Akkad.[2] This community in turn overlaps and interacts with the alt-right community, such as James Allsup, Black Pigeon Speaks, Varg Vikernes, and Red Ice.[2] The most extreme endpoint often involves fascism or belief in an international Jewish conspiracy,[17] though the severity of extremism can vary between individuals.[7]

Alt-right content on the internet spreads ideology that is similar to earlier white supremacist and fascist movements. The internet packages the ideology differently, often in a way that is more palatable and thus is more successful in delivering it to a larger number of people.[21] Due to the conservative nature of the alt-right, much of the ideology is associated with the preservation of traditional values and ways of living. This creates a susceptibility toward conspiracy theories about secret forces that seek to destroy traditional ways of life.[22]

The antifeminist Manosphere has been identified as another early point in the alt-right pipeline.[6] The men's rights movement often discusses men's issues more visibly than other groups, attracting young men with interest in such issues when no alternative is made available. Many right-wing internet personalities have developed a method to expand their audiences by commenting on popular media; videos that criticize movies or video games for supporting left-wing ideas are more likely to attract fans of the respective franchises.[7]

The format presented by YouTube has allowed various ideologies to access new audiences through this means.[7] The same process has also been used to facilitate access to anti-capitalist politics through the internet community BreadTube. This community was developed through the use of this pipeline process to introduce users to left-wing content and mitigate exposure to right-wing content,[7][11] though the pipeline process has been found to be less effective for left-wing politics due to the larger variety of opposing left-wing groups, which limits interaction and overlap.[11] This dichotomy can also cause a "whiplash polarization" in which individuals are converted between far-right and far-left politics.[7]

Psychological factors

The psychological factors of radicalization through the alt-right pipeline are similar to other forms of radicalization, including normalization, acclimation, and dehumanization.[10] Normalization involves the trivialization of racist and antisemitic rhetoric; individuals early in the alt-right pipeline may not willingly embrace such rhetoric, but adopt it under the guise of dark humor, causing it to be less shocking over time.[10] This may sometimes be engineered intentionally by members of the alt-right to make their beliefs more palatable and provide plausible deniability for extreme beliefs.[17][18] Acclimation is the process of being conditioned to seeing bigoted content. By acclimating to controversial content, individuals become more open to slightly more extreme content. Over time, conservative figures appear too moderate and users seek out more extreme voices. Dehumanization is the final step of the alt-right pipeline, where minorities are seen as lesser or undeserving of life and dehumanizing language is used to refer to people that disagree with far-right beliefs.[10]

The process is associated with young men who experience loneliness, meaninglessness, or a lack of belonging.[7] An openness to unpopular views is necessary for individuals to accept beliefs associated with the alt-right pipeline. It has been associated with contrarianism, in which an individual uses the working assumption that the worldviews of most people are entirely wrong. From this assumption, individuals are more inclined to adopt beliefs that are unpopular or fringe. This makes several entry points of the alt-right pipeline, such as libertarianism, effective: such ideologies attract individuals whose traits make them susceptible to radicalization when exposed to other fringe ideas.[5] Motivation for pursuing these communities varies, with some people finding them by chance while others seek them out. Interest in video games is associated with the early stages of the alt-right pipeline.[7]

Along with algorithms, online communities can also play a large part in radicalization. People with fringe and radical ideologies can meet other people who share, validate, and reinforce those ideologies. Because people can control who and what they engage with online, they can avoid hearing any opinion or idea that conflicts with their prior beliefs. This creates an echo chamber that upholds and reinforces radical beliefs. The strong sense of community and belonging that comes with it is a large contributing factor for people joining the alt-right and adopting it as an identity.[23]

Concerns and prevention

Internet radicalization correlates with an increase in lone-wolf attacks and domestic terrorism.[24][25]

Harassment campaigns against perceived opponents of the alt-right movement are another common effect of radicalization.[10]

Many social media platforms have recognized the potential for radicalization and have implemented measures to limit its prevalence. High-profile extremist commentators such as Alex Jones have been banned from several platforms, and platforms often have rules against hate speech and misinformation.[7] In 2019, YouTube announced a change to its recommendation algorithm to reduce conspiracy theory related content.[7][19] Some extreme content, such as explicit depictions of violence, is typically removed on most social media platforms. On YouTube, content that expresses support of extremism may have monetization features removed, may be flagged for review, or may have public user comments disabled.[3]

Studies

A September 2018 study published by the Data & Society Research Institute found that 65 right-wing political influencers formed an interconnected "Alternative Influence Network" on YouTube capable of steering viewers toward progressively more extreme content. A subsequent analysis of YouTube comments found that users consistently migrated from milder to more extreme channels, supporting the existence of a "radicalization pipeline".[2][12][13][14]

A 2020 study published in The International Journal of Press/Politics argued that the "emerging journalistic consensus" that YouTube's algorithm radicalizes users to the far-right "is premature." Instead, the study proposes a "'Supply and Demand' framework for analyzing politics on YouTube."[28]

A 2021 study published in the Proceedings of the National Academy of Sciences found little evidence that YouTube's recommendation algorithm was driving users toward the far right. Instead, the study found that "consumption of political content on YouTube appears to reflect individual preferences that extend across the web as a whole."[15] A 2022 study published by the City University of New York found that "little systematic evidence exists to support" the claim that YouTube's algorithm radicalizes users, adding that exposure to extremist views "on YouTube is heavily concentrated among a small group of people with high prior levels of gender and racial resentment", and that "non-subscribers are rarely recommended videos from alternative and extremist channels and seldom follow such recommendations when offered."[16]

See also

References

  1. ^ a b c Lewis, Rebecca (18 September 2018). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Report). Data & Society. Archived from the original on 25 May 2022. Retrieved 14 July 2022.
  2. ^ S2CID 201316434.
  3. ^ from the original on 28 October 2022. Retrieved 28 October 2022.
  4. ^ a b "Mozilla Investigation: YouTube Algorithm Recommends Videos that Violate the Platform's Very Own Policies". Mozilla Foundation. 7 July 2021. Archived from the original on 25 March 2023. Retrieved 25 March 2023.
  5. ^ from the original on 25 July 2023. Retrieved 21 September 2022.
  6. ^ .
  7. ^ from the original on 17 May 2023. Retrieved 26 October 2022.
  8. ^ Hughes, Terwyn (26 January 2021). "Canada's alt-right pipeline". The Pigeon. Archived from the original on 25 March 2023. Retrieved 25 March 2023.
  9. ^ from the original on 25 July 2023. Retrieved 4 November 2022.
  10. ^ from the original on 24 May 2022. Retrieved 14 July 2022.
  11. ^ .
  12. ^ a b Lomas, Natasha (28 January 2020). "Study of YouTube comments finds evidence of radicalization effect". TechCrunch. Retrieved 17 July 2021.
  13. ^ a b Newton, Casey (28 August 2019). "YouTube may push users to more radical views over time, a new paper argues". The Verge. Archived from the original on 27 July 2023. Retrieved 17 July 2021.
  14. ^ ].
  15. ^ .
  16. ^ ].
  17. ^ a b c d Evans, Robert (11 October 2018). "From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled"". Bellingcat. Archived from the original on 21 November 2018. Retrieved 27 October 2022.
  18. ^ a b Wilson, Jason (23 May 2017). "Hiding in plain sight: how the 'alt-right' is weaponizing irony to spread fascism". The Guardian. Archived from the original on 28 October 2022. Retrieved 28 October 2022.
  19. ^ a b c Bennhold, Katrin; Fisher, Max (7 September 2018). "As Germans Seek News, YouTube Delivers Far-Right Tirades". The New York Times. Archived from the original on 14 March 2023. Retrieved 25 March 2023.
  20. ^ Scully, Aidan (10 October 2021). "The Dangerous Subtlety of the Alt-Right Pipeline". Harvard Political Review. Archived from the original on 27 July 2023. Retrieved 27 July 2023.
  21. S2CID 196005328.
  22. from the original on 30 May 2023. Retrieved 7 May 2023.
  23. from the original on 12 December 2022. Retrieved 7 May 2023.
  24. from the original on 25 July 2023. Retrieved 4 November 2022.
  25. ^ Veilleux-Lepage, Yannick; Daymon, Chelsea; Amarasingam, Amarnath (2020). The Christchurch attack report: key takeaways on Tarrant's radicalization and attack planning (PDF) (Report). International Centre for Counter-Terrorism. Archived from the original (PDF) on 4 November 2022. Retrieved 4 November 2022.
  26. S2CID 240343107.
  27. ^ Ingram, Mathew (19 September 2018). "YouTube's secret life as an engine for right-wing radicalization". Columbia Journalism Review. Archived from the original on 23 September 2018.
  28. from the original on 13 October 2022. Retrieved 28 October 2022.