Echo chambers and filter bubbles don't reflect our media environment
Elizabeth Dubois and Guillermo Renna
Introduction1
Filter bubble and echo chamber metaphors are no longer, and likely never were, particularly useful for describing most individuals’ experiences of our Internet-enabled media environment, or for understanding and responding to fears of political polarization, commonly described as an online harm. Both metaphors describe a media environment where people end up, whether by algorithmic design or personal choice, primarily receiving information that confirms their existing beliefs and connecting with people who share them. The fear is that, without exposure to other ideas, people will be unable or unwilling to understand other perspectives, leading to polarization, or will be more easily manipulated, for example through disinformation campaigns.
But, empirical evidence suggests very few people get caught in filter bubbles or echo chambers. Moreover, we argue that political polarization, often the feared outcome of filter bubbles and echo chambers, is unlikely to be addressed by focusing on these metaphors. We also argue that the technological affordances and features some fear lead to polarization actually make it easier for people to share specific information, connect with particular communities, and mobilize toward social change. These things are of great societal value, but are far too often overlooked.
Evidence about filter bubbles and echo chambers
The filter bubble and echo chamber metaphors have become very popular in a world where the Internet enables a high-choice media environment that gives individuals access to a vast number of information sources.2 The metaphors are easy to grasp and often seem intuitively true. We have all had experiences where a recommender system keeps feeding us the same suggestion over and over, or where it seems like every friend on our social media feeds has the same view on a given topic. These are possible examples of what Pariser (2011) coined “the filter bubble.”
We have also all likely experienced moments where it seems like no matter what app we open or what television or radio station we turn to, we keep hearing about the same topic. This illustrates the idea of an echo chamber. In 2001, Sunstein cautioned that echo chambers might emerge as people gained more opportunities to choose their sources and channels of communication. He worried people would be motivated to select sources and channels that confirm their views and would be less likely to serendipitously come across information they disagree with or on topics they did not intentionally opt in to.
The problem is that, while we can each think of anecdotal examples, empirical evidence does not suggest that the majority of people are caught in filter bubbles or echo chambers. While we do often find ourselves in communities of people like us (what sociologists call homophily), and while search and social media algorithms are designed to personalize the information they prioritize in our feeds (called algorithmic curation), these processes do not make us unable to access other information, ideas, or people. This is because most people rely on many different sources and channels of communication to find information and connect with people, depending on their needs and contexts.3
A note on methods
One of the major issues with studying these concepts is that they are often measured very differently from one study to the next. Researchers do not always agree on what constitutes a filter bubble or echo chamber, and this lack of agreed-upon, consistent measurement makes it difficult to know whether studies are measuring the same thing. Beyond that, researchers often have to rely on whatever data they can get, access to which is restricted by platforms, and even when they can gain access to data, it is often messy and difficult to manage. This is particularly concerning since platforms do not have strong records of algorithmic transparency, which impedes academics, the public, regulators, and legislators in their ability to evaluate the potential harms or unintended consequences of these algorithms. Finally, single-platform studies are common even though we know people rarely rely on a single platform: they gather information, and are exposed to information that could affect their beliefs and behaviors, across a range of platforms.
Evidence about filter bubbles
The filter bubble metaphor describes a situation where our social media feeds and news aggregation suggestions are populated only with content that is similar to what we have previously viewed or interacted with.4 Filter bubbles arise when the distribution of information is shaped by decisions made outside an individual’s control.5 The filter bubble idea assumes that algorithmic curation6 is designed to personalize content, optimizing for future clicks, likes, and favorites. But it is important to recognize that this is a design choice and not the way algorithmic curation must be designed.7 For example, Ovadya (2022) suggests that optimizing for bridging links – prioritizing content and relationships that connect across groups – could make people’s feeds more diverse.8 So, while filter bubbles could emerge, it is not a foregone conclusion. It is also important to remember that any personalization or algorithmic curation happens in a wider social context:9 regardless of the algorithms underpinning Facebook, we are already likely to be friends with folks whose views and experiences are similar to our own.
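To make this design choice concrete, here is a minimal sketch (in Python) contrasting an engagement-optimized ranker with a bridging-based one in the spirit of Ovadya (2022). The data, field names, and scoring rules are invented for illustration; this is not any platform’s actual system.

```python
# Illustrative sketch only: two ways a feed could score the same content.
# All numbers and field names are invented assumptions.

posts = [
    # predicted_clicks: expected engagement for this user
    # approval_a / approval_b: predicted reception in two opposing groups (0-1)
    {"id": "outrage-clip",   "predicted_clicks": 0.9, "approval_a": 0.9, "approval_b": 0.1},
    {"id": "local-news",     "predicted_clicks": 0.4, "approval_a": 0.7, "approval_b": 0.7},
    {"id": "shared-concern", "predicted_clicks": 0.5, "approval_a": 0.8, "approval_b": 0.6},
]

def engagement_score(post):
    # Optimizes for future clicks and likes: the design choice
    # the filter bubble critique worries about.
    return post["predicted_clicks"]

def bridging_score(post):
    # Rewards content predicted to be received well *across* groups,
    # so divisive content scores poorly even when highly engaging.
    return min(post["approval_a"], post["approval_b"])

print([p["id"] for p in sorted(posts, key=engagement_score, reverse=True)])
# ['outrage-clip', 'shared-concern', 'local-news']
print([p["id"] for p in sorted(posts, key=bridging_score, reverse=True)])
# ['local-news', 'shared-concern', 'outrage-clip']
```

The same three posts come out in nearly opposite orders, which is the point: whether feeds narrow or diversify depends on what the ranking function is told to optimize.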
When scholars have systematically tested for filter bubbles, very few have found support for them. In 2016, a review study found little evidence of their existence, but warned that increased personalization from tools like news aggregators could change this reality.10 That did not manifest, and there remains little evidence in support of filter bubbles.11 Rather than creating individual ‘bubbles,’ Nechushtai and Lewis (2019) find that, “the news agenda constructed on Google News replicates traditional industry structures more than disrupts them.”12 Moreover, social media use actually increases exposure to news, especially for those with little interest in it.13 So, rather than creating information silos, social media use increases exposure to information, which can include cross-cutting information and is, at the very least, tied to practices that help people avoid echo chambers.14
That said, there are examples where specific uses of single social media platforms can lead to an increasingly narrow set of information, what some call a ‘rabbit hole.’15 For example, experiments on YouTube have shown that its recommendation system typically suggests increasingly extreme content,16 and the longer you follow YouTube’s recommendations, the more ideologically narrow the recommended content becomes.17
However, there are two important caveats here. First, those most susceptible to falling into extremist rabbit holes tend to already be predisposed to extremist ideas.18 Second, very few people consume information on only a single topic from a single platform. The vast majority of people have much more diverse media diets. Even those who go down rabbit holes rarely do so in the absence of other input.
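As a toy illustration of the rabbit-hole mechanism described above, the hypothetical sketch below (all parameters invented) models a recommender that always suggests items similar to the last one consumed. Following only its suggestions confines the user to a narrow band of content even though the full catalog spans the whole spectrum; real recommendation systems, and the drift toward extremity reported in the studies cited above, are of course far more complex.

```python
import random

# Toy model of a recommendation 'rabbit hole'. Each item's position on a
# hypothetical ideology scale is a number in [-1, 1]; all values invented.
random.seed(1)
catalog = [random.uniform(-1, 1) for _ in range(10_000)]

def recommend(last_item, k=20):
    # Crude stand-in for similarity-based recommendation: shortlist the
    # k catalog items closest to what was just consumed, pick one.
    shortlist = sorted(catalog, key=lambda item: abs(item - last_item))[:k]
    return random.choice(shortlist)

position = 0.3  # the user's first video, slightly off-centre
trail = [position]
for _ in range(50):
    position = recommend(position)
    trail.append(position)

# Despite a catalog spanning [-1, 1], fifty clicks of 'recommended next'
# stay within a narrow band around the starting point.
print(f"catalog range: [-1, 1]; trail range: [{min(trail):+.3f}, {max(trail):+.3f}]")
```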
Evidence about echo chambers
The echo chamber metaphor helps explain people’s exposure to different ideas in the context of their wider media environments. Rather than focusing on the choices embedded in the algorithm of a single platform, the metaphor calls on us to look at the decisions people make in a high-choice media environment. An echo chamber would exist when an individual, confronted with many options for accessing information, chooses only information that is in line with their pre-existing beliefs.19
But, the empirical evidence to date is, at best, mixed.20 Some suggest echo chambers may be a reality for some people but only for certain topics, specifically those that are more political,21 and some find evidence of echo chambers for only the most partisan/extreme individuals.22 Yet others point out that even among those who are politically interested, simply using a variety of social media tools is related to greater avoidance of echo chambers.23
Importantly, we can also draw insight from the closely related concepts of selective exposure and selective avoidance.24 Evidence suggests that while people prefer information they agree with, they don’t necessarily avoid information they disagree with.25 And remember, even with algorithmic curation happening, people tend to be exposed to a greater variety of ideas when they use social media.26 Taken together, this suggests very few people are likely to be caught in echo chambers.
That said, there appears to be a small portion of individuals who may be unable to avoid echo chambers,27 and some already radicalized individuals may form echo chambers.28
Taking stock of the evidence
We believe concerns that individuals exist in filter bubbles or echo chambers are overblown. Rather than being sorted into filter bubbles or sorting ourselves into echo chambers, most people consume information and connect with others across multiple channels of communication. Human beings are multi-faceted. Some live in very homogeneous communities, online or offline; others actively seek out and engage with opposing viewpoints. These dynamics change over time and are context dependent; consequently, we exist and interact with media differently in different contexts. Importantly, people are learning to work within an information environment that assumes algorithmic curation is happening. For example, 70% of Canadians say they took actions out of concern for the accuracy of the news they were reading, including 39% who said they checked several news sources to see if a story was portrayed in different ways.29 Ultimately, the filter bubble and echo chamber metaphors do not seem to reflect the relationship most individuals have with their media environments; only a small minority fits these patterns. For that group, it is probably a more effective use of resources to deploy policy tools that address this specific concern, rather than treating it as a systemic, societal one.
The misguided link between filter bubbles, echo chambers, and polarization
Despite the fact that filter bubbles and echo chambers are unlikely to accurately describe most people’s experience of the current media environment, the metaphors are often invoked to explain online harms such as polarization and radicalization. But even if we had evidence to support filter bubbles and echo chambers, these metaphors cannot actually explain what is happening and why. Focusing on them can make it harder to develop policy responses to those harms.
On a societal level, one of the main concerns around filter bubbles and echo chambers is that they create an environment where there is less discussion of conflicting ideas, which violates one of the principles of deliberative democracy. If everyone is only consuming information they agree with and not being exposed to other perspectives, then society divides into groups with no shared understanding or common ground. When the space between divided groups grows, it is called political polarization.30 If large swathes of the population aren’t engaging with each other, then society becomes difficult to govern because there are fewer common understandings or opportunities to develop compromises. This concern has been raised by politicians, journalists, and activists who are grappling with a rise in polarization in several Western democracies.31 For instance, President Obama has argued that one of the “dangers of the internet is that people can have entirely different realities. They can be cocooned in information that reinforces their current biases,”32 leading to the fragmentation of society into possibly conflicting groups.
The implied argument is that people are using the Internet to embed themselves in echo chambers (which could be reinforced by filter bubbles). There is some experimental evidence showing that when people are placed in groups of like-minded individuals to discuss political topics, they leave with more hardened and extreme positions.33 The fear with filter bubbles and echo chambers is that they recreate this on a societal scale. For instance, following the Freedom Convoy protests in Canada in February 2022, many politicians across the political spectrum attributed the strong opposing views they saw to filter bubbles and/or echo chambers, arguing that, “it has been unbelievable to see how people are getting their information from information silos. I think it is one of the biggest contributing factors to the division we are seeing in this country.”34
In particular, there is a concern that people will end up in partisan or ideological echo chambers. But, in the US, where polarization is arguably more evident than in Canada,35 studies have found that the same news programming is often consumed by Democrats and Republicans,36 including programming from more partisan outlets.37 And, in Canada, evidence suggests social media use does not lead to polarization.38 So, while we may see societies polarize, echo chambers don’t seem to be the culprit.
Though most individuals aren’t in echo chambers or filter bubbles, we know that people prefer to read information they agree with, and a high-choice media environment makes this easier. But the situation is more dynamic and less one-directional than this suggests. It is often assumed that online spaces, by sorting individuals into distinct homogeneous communities, lead to polarization; there is evidence, however, that polarization drives social media use, rather than the other way around.39 Unsurprisingly, many offline social dynamics are reproduced in online spaces, and it becomes quite difficult to disentangle the dynamic relationship between them.40 Given that people don’t actually tend to exist in filter bubbles or echo chambers, we should focus on the underlying causes of polarization and radicalization in society.41
Moreover, political polarization certainly can exist in the absence of an echo chamber. A common argument is that if people were exposed to information that goes against their existing beliefs, they would be less polarized. But the motivations for engaging with that content, and the contexts in which people engage with others who hold different views, matter. A study of Twitter conversations about the Catalonian referendum to secede from Spain found that individuals did engage with the other side, but the conversations were confrontational.42 This is important because if interactions are aggressive, and thus do not fulfill the democratic ideal of a debate of ideas, they could contribute to further anger and resentment, and thereby increase polarization. Similarly, if individuals consume information only to refute it, that exposure is unlikely to be de-polarizing.43 So, exposure to a wider variety of information alone does not necessarily lead to lower levels of polarization.
Beyond polarization, there are concerns that social media’s algorithmic curation and people’s choices in a high-choice environment could lead to radicalization, manipulation of public opinion, increased spread of disinformation, and political apathy, among other harms. By focusing on filter bubbles and echo chambers instead of the actual underlying causes of these kinds of online harms, we risk developing responses and policies that are ineffective or, worse, that make it impossible to benefit from finding like-minded others, specific information, and communities online.
The overlooked value of filtering, sorting, and organizing
The filtering, sorting, and organizing roles of platforms can have great value in society, value that is often overlooked because of fears about filter bubbles and echo chambers. It is important to remember that people have always needed to filter, sort, and organize information, and have done so in a number of ways, from the Dewey Decimal System to group membership, such as belonging to a political party or living in a particular city or country. We regularly rely on tools and systems to help us sort and organize. There is a definite need for greater transparency in how these tools help us sort, organize, and filter. But when search engines and social media platforms make it easier for people to share specific information, connect with communities, and mobilize toward social change, there is important societal value.
Sharing information
A worry associated with filter bubbles and echo chambers is that people will only receive information on particular topics they are already interested in. But rather than a bug, this can be a feature. Search engines help us home in on the specific information we want. The algorithms underpinning what shows up on TikTok put information on the screen that you might be interested in based on past use, without requiring a specific, intentional search. Facebook groups for niche topics allow people to opt in to information on those topics. These are ways that platforms help us gain access to specific information, reducing information overload and making information on niche topics more accessible.
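A minimal sketch of this kind of interest-based filtering, assuming a made-up topic taxonomy, watch history, and candidate pool (illustrative only; the actual recommendation systems of TikTok and similar platforms are proprietary and far more sophisticated):

```python
from collections import Counter

# Past use becomes a topic profile; candidates are ranked by how often
# the user has engaged with each topic. Everything here is invented.
watch_history = ["knitting", "knitting", "rare-disease-support", "knitting"]
interest = Counter(watch_history)

candidates = [
    {"title": "Cable-knit tutorial",       "topic": "knitting"},
    {"title": "Celebrity gossip roundup",  "topic": "gossip"},
    {"title": "Living with a rare disease", "topic": "rare-disease-support"},
]

# Surface what matches demonstrated interests first: niche content reaches
# the people who opted into it, reducing information overload.
ranked = sorted(candidates, key=lambda c: interest[c["topic"]], reverse=True)
for c in ranked:
    print(c["title"])
```

Even this crude version shows the overlooked value the section describes: information on niche topics finds its way to exactly the people who have signaled interest in it, with no intentional search required.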
Finding community
Humans often spend time with those who have similar life experiences and interests, or who are broadly like them. This idea is known as homophily.44 Being a member of a community is important for social development and happiness.45 Social media platforms have been particularly useful for individuals to find communities they can feel a part of and spaces where they feel safe to express themselves.46 For example, many communities of support exist for people dealing with rare diseases, so people can share their experiences and learn from one another. Queer-friendly groups have emerged online, which can be a haven for folks living in regions where being queer is illegal or generally not accepted by those around them.47 In other words, there are times when being served information that conforms to your existing beliefs and being connected with people like you can be extremely helpful.
Mobilizing toward social change
Enabling people to find communities and to share information within them is also essential for mobilization toward social change. An early example is the series of pro-democracy uprisings across the Middle East and North Africa beginning in late 2010, known as the Arab Spring. In the lead-up to these uprisings, communities formed on and offline, and during the uprisings, text messages and social media were used to spread messages and information quickly, which allowed for the coordination and planning of protests.48 More recently, the Black Lives Matter movement grew in a social media context, where the tools helped people connect, share information, and coordinate action,49 and provided space for challenging existing power structures.50 Of course, in these examples, social media is one piece of a much larger puzzle. Further, the distinction between efforts toward beneficial social change and disruption leading to harm is often a matter of perspective. Nevertheless, the point remains: finding spaces in which people can mobilize is essential to social change.
Conclusion
Ultimately, we believe that filter bubbles and echo chambers are not metaphors we should use to try to understand people’s experiences of their media environments. But we can take two important lessons from the last two decades of study into these metaphors. First, it is essential to understand how search and social media platforms design their tools and how those tools filter, organize, sort, and prioritize content. While we have always relied on tools to help us sift through information and to understand our relationships to others, the systems technology platforms put in place are more complex and more opaque, often intentionally so. Much more transparency and accountability around this is needed. Second, it is essential to consider how people choose their channels of communication in a high-choice media environment. But we must not assume those processes will lead people to receive only information that confirms their existing beliefs, nor that they will necessarily lead to online harms.
Given the lack of convincing empirical evidence that echo chambers and filter bubbles accurately reflect the online environment, and because of the negative connotations associated with these metaphors,51 we should stop using them: they distract from solving the actual problems that democracies face. If it is not filter bubbles or echo chambers driving increasing political polarization, then policies designed to ensure people do not find themselves in filter bubbles or echo chambers will not be fruitful.
We suggest focusing on policy responses that promote platform transparency and accountability, which will help us better understand when and how information and relationships are organized and prioritized through algorithmic curation. This can also support better digital literacy and help equip people with the skills they need to critically reflect on what comes across their screens. We also suggest homing in on the underlying causes of perceived online harms and addressing those with targeted solutions.
Endnotes
Thank you to the Berkman Klein Center community members who contributed to early discussions about this essay and offered valuable insight and support. Thank you to Michelle Bartleman, a Research Assistant on this project.
Van Aelst, P., Strömbäck, J., Aalberg, T., Esser, F., de Vreese, C., Matthes, J.,..., Stanyer, J. (2017). Political communication in a high-choice media environment: A challenge for democracy? Annals of the International Communication Association, 41(1), 3–27.
Sharot, T., & Sunstein, C. R. (2020). How people decide what they want to know. Nature Human Behaviour, 4(1), 14-19.
Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin Books.
Arguedas, A. R., Robertson, C.T., Fletcher, R., & Nielsen, R. K. (2022). Echo chambers, filter bubbles, and polarization: A literature review. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/echo-chambers-filter-bubbles-and-polarisation-literature-review
Rader, E., & Gray, R. (2015, April 18-23). Understanding user beliefs about algorithmic curation in the Facebook news feed. In Proceedings of the 33rd annual ACM conference on human factors in computing systems, Seoul, Republic of Korea (pp. 173-182).
Bozdag, E., & Van Den Hoven, J. (2015). Breaking the filter bubble: Democracy and design. Ethics and information technology, 17(4), 249-265.
Ovadya, A. (2022). Bridging-based ranking: How platform recommendation systems might reduce division and strengthen democracy. Belfer Centre for International Affairs, Harvard Kennedy School. https://www.belfercenter.org/publication/bridging-based-ranking
Zuckerman, E. (2013). Rewire: Digital cosmopolitans in the age of connection. W W Norton & Co.
Borgesius, F., Trilling, D., Möller, J., Bodó, B., de Vreese, C., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). https://doi.org/10.14763/2016.1.401
Bruns, A. (2019). Are filter bubbles real? John Wiley & Sons; Haim, M., Graefe, A., & Brosius, H-B. (2018). Burst of the filter bubble? Effects of personalization on the diversity of Google News. Digital Journalism, 6(3), 330–43. https://doi.org/10.1080/21670811.2017.1338145; Nechushtai, E., & Lewis, S. C. (2019). What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior, 90, 298–307. https://doi.org/10.1016/j.chb.2018.07.043; Duggan, M., & Smith, A. (2016, October 25). The political environment on social media. Pew Research Center. https://assets.pewresearch.org/wp-content/uploads/sites/14/2016/10/24160747/PI_2016.10.25_Politics-and-Social-Media_FINAL.pdf
Nechushtai, E., & Lewis, S. C. (2019). What kind of news gatekeepers do we want machines to be? Filter bubbles, fragmentation, and the normative dimensions of algorithmic recommendations. Computers in Human Behavior, 90, 298–307. https://doi.org/10.1016/j.chb.2018.07.043
Fletcher, R., & Nielsen, R. (2018). Are people incidentally exposed to news on social media? A comparative analysis. New Media & Society 20(7), 2450–68. https://doi.org/10.1177/1461444817724170.
Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729-745. https://doi.org/10.1080/1369118X.2018.1428656
Ledwich, M., & Zaitsev, A. (2019). Algorithmic extremism: Examining YouTube's rabbit hole of radicalization. arXiv preprint arXiv:1912.11211.
O’Callaghan, D., Greene, D., Conway, M., Carthy, J., & Cunningham, P. (2015). Down the (White) Rabbit Hole: The Extreme Right and Online Recommender Systems. Social Science Computer Review, 33(4), 459-478; Whittaker, J., Looney, S., & Votta, F. (2021). Recommender systems and the amplification of extremist content. Internet Policy Review 10(2), 1-29. https://doi.org/10.14763/2021.2.1565
Brown, M., Bisbee, J., Lai, A., Bonneau, R., Nagler, J., & Tucker, J. (2022, October 13). Echo chambers, rabbit holes, and algorithmic bias: How Youtube recommends content to real users. SSRN Electronic Journal. https://dx.doi.org/10.2139/ssrn.4114905
Chen, A. Y., Nyhan, B., Reifler, J., Robertson, R. E., & Wilson, C. (2022). Subscriptions and external links help drive resentful users to alternative and extremist Youtube videos. arXiv:2204.10921 [cs.SI]. https://doi.org/10.48550/arXiv.2204.10921
Sunstein, C. (2004, December). Democracy and filtering. Communications of the ACM, 47(12), 57-59. https://doi.org/10.1145/1035134.1035166
Arguedas, A. R., Robertson, C.T., Fletcher, R., & Nielsen, R. K. (2022). Echo chambers, filter bubbles, and polarization: A literature review. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/echo-chambers-filter-bubbles-and-polarisation-literature-review; Bruns, A. (2017). Echo chamber? What echo chamber? Reviewing the evidence. Paper presented at Future of Journalism 2017, Cardiff, 15 Sep. 2017. http://snurb.info/files/2017/Echo%20Chamber.pdf; Terren, L., & Borge, R. (2021). Echo chambers on social media: A systematic review of the literature. Review of Communication Research, 9, 99-118. https://doi.org/10.12840/ISSN.2255-4165.028; Guess, A., Nyhan, B., Lyons, B., & Reifler, J. (2018). Avoiding the echo chamber about echo chambers: Why selective exposure to like-minded political news is less prevalent than you think. Knight Foundation White Paper.
Barberá, P., Jost, J.T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1-12. https://doi.org/10.1177/0956797615594620
Eady, G., Nagler, J., Guess, A., Zilinsky, J., & Tucker, J. (2019). How many people live in political bubbles on social media? Evidence from linked survey and Twitter data. SAGE Open, 9(1), 1-21. https://doi.org/10.1177/2158244019832705; Boutyline, A., & Willer, R. (2017). The social structure of political echo chambers: Variation in ideological homophily in online networks. Political Psychology, (38)3, 551-569.
Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729-745. https://doi.org/10.1080/1369118X.2018.1428656
Guess, A., Nyhan, B., Lyons, B., & Reifler, J. (2018). Avoiding the echo chamber about echo chambers: Why selective exposure to like-minded political news is less prevalent than you think. Knight Foundation White Paper; Kim, M., & Lu, Y. (2020). Testing partisan selective exposure in a multidimensional choice context: Evidence from a conjoint experiment. Mass Communication and Society, 23(1), 107-127. https://doi.org/10.1080/15205436.2019.1636283
Garrett, R. K., Carnahan, D., & Lynch, E. K. (2013). A turn toward avoidance? Selective exposure to online political information, 2004-2008. Political Behavior, 35, 113-134; Jang, S. M. (2014). Seeking congruency or incongruency online? Examining selective exposure to four controversial science issues. Science Communication, 36(2), 143-167; Weeks, B. E., Ksiazek, T. B., & Holbert, R. L. (2016). Partisan enclaves or shared media experiences? A network approach to understanding citizens’ political news environments. Journal of Broadcasting & Electronic Media, 60(2), 248-268; Kim & Lu, 2020.
Fletcher, R., & Nielsen, R. (2018). Are people incidentally exposed to news on social media? A comparative analysis. New Media & Society 20(7), 2450–68. https://doi.org/10.1177/1461444817724170.
Arguedas, A. R., Robertson, C.T., Fletcher, R., & Nielsen, R. K. (2022). Echo chambers, filter bubbles, and polarization: A literature review. Reuters Institute. https://reutersinstitute.politics.ox.ac.uk/echo-chambers-filter-bubbles-and-polarisation-literature-review; Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729-745. https://doi.org/10.1080/1369118X.2018.1428656
Boutyline, A., & Willer, R. (2017). The social structure of political echo chambers: Variation in ideological homophily in online networks. Political Psychology, (38)3, 551-569.
The Digital News Report is based on an online survey with 2055 respondents. The results have been weighted to be representative of the population of Canadian adults. See the original work for details on methodology; Charlton, S., & Leclair, K. (2019). Digital News Report Canada. Digital News Report. https://www.cem.ulaval.ca/wp-content/uploads/2019/06/dnr19_can_eng.pdf
Prior, M. (2013). Media and political polarization. Annual Review of Political Science, 16, 101-127.
Gidron, N., Adams, J., & Horne, W. (2020). American Affective Polarization in Comparative Perspective. Cambridge: Cambridge University Press. https://doi.org/10.1017/9781108914123
Lee, S. (2017, December 27). Obama warns of social media echo chamber, in a Prince Harry interview. The Mercury News. https://www.mercurynews.com/2017/12/27/obama-warns-of-social-media-echo-chamber-in-a-prince-harry-interview/
Schkade, D., Sunstein, C. R., & Hastie, R. (2007). What happened on deliberation day? California Law Review, 95(3), 915-940.
Ferreri, M. (2022, February 17). “Emergencies Act.” Canada. Parliament. House of Commons. Debates, 151(33). 44th Parliament, 1st Session. https://www.ourcommons.ca/DocumentViewer/en/44-1/house/sitting-33/hansard
Adams, M., & Parkin, A. (2022, December 10). Surveys show Canadian are less polarized and angry than Americans. Toronto Star. https://www.thestar.com/opinion/contributors/2022/12/10/surveys-show-canadian-are-less-polarized-and-angry-than-americans.html
Webster, J. G., & Ksiazek, T. B. (2012). The dynamics of audience fragmentation: Public attention in an age of digital media. Journal of Communication, 62(1), 39-56. https://doi.org/10.1111/j.1460-2466.2011.01616.x
Nelson, J. L., & Webster, J. G. (2017). The Myth of Partisan Selective Exposure: A Portrait of the Online Political News Audience. Social Media + Society, 3(3), 1-13. https://doi.org/10.1177/2056305117729314
Owen, T., Loewen, P., Ruths, D., Bridgman, A., Gorwa, R., MacLellan, S., Merkley, E., Potter, A., Skazinetsky, B., & Zhilin, O. (2019). Digital Democracy Project. Research Memo #3: Polarization and its discontents. Public Policy Forum. https://ppforum.ca/wp-content/uploads/2019/09/DDP-Research-Memo-3-Sept2019.pdf
Nordbrandt, M. (2021). Affective polarization in the digital age: Testing the direction of the relationship between social media and users’ feelings for out-group parties. New Media & Society, 1-20. https://doi.org/10.1177/14614448211044393
Metzler, H., & Garcia, D. (2022). Social drivers and algorithmic mechanisms on digital media. PsyArXiv. https://doi.org/10.31234/osf.io/cxa9u
Bruns, A. (2017). Echo chamber? What echo chamber? Reviewing the evidence. Paper presented at Future of Journalism 2017, Cardiff, 15 Sep. 2017. http://snurb.info/files/2017/Echo%20Chamber.pdf.
Balcells, J., & Padró-Solanet, A. (2016). Tweeting on Catalonia’s independence: The dynamics of political discussion and group polarisation. Medijske Studije, 7(14), 124-141.
Garrett, R. K., Gvirsman, S. D., Johnson, B. K., Tsfati, Y., Neo, R., & Dal, A. (2014). Implications of pro- and counterattitudinal information exposure for affective polarization. Human Communication Research, 40, 309-332. https://doi.org/10.1111/hcre.12028
McPherson, M., Smith-Lovin, L., & Cook, J.M. (2001). Birds of a feather: Homophily in social networks. Annual Review of Sociology, 27, 415-444. https://doi.org/10.1146/annurev.soc.27.1.415
Haller, M., & Hadler, M. (2006). How social relations and structures can produce happiness and unhappiness: An international comparative analysis. Social Indicators Research, 75, 169-215.
Miño-Puigcercós, R., Rivera-Vargas, P., & Cobo Romaní, C. (2019). Virtual communities as safe spaces created by young feminists: Identity, mobility and sense of belonging. Identities, Youth and Belonging: International Perspectives, 123-140.
Lucero, L. (2017). Safe spaces in online places: Social media and LGBTQ youth. Multicultural Education Review, 9(2), 117-128.
Frangonikolopoulos, C. A., & Chapsos, I. (2012). Explaining the role and the impact of the social media in the Arab Spring. Global Media Journal: Mediterranean Edition, 7(2), 10-20.
Mundt, M., Ross, K., & Burnett, C. M. (2018). Scaling social movements through social media: The case of black lives matter. Social Media + Society, 1-14. https://doi.org/10.1177/2056305118807911
Richardson, A. V. (2020). Bearing witness while Black: African Americans, smartphones, and the new protest #journalism. Oxford University Press, USA.
Bruns, A. (2017). Echo chamber? What echo chamber? Reviewing the evidence. Paper presented at Future of Journalism 2017, Cardiff, 15 Sep. 2017. http://snurb.info/files/2017/Echo%20Chamber.pdf; Bruns, A. (2019). Are filter bubbles real? John Wiley & Sons.