One central concern in current academic and public policy debates is how the personalized recommendations used by search engines, social media, and more traditional news media affect media and information diversity. The algorithmic filtering and adaptation of online content to personal preferences is often associated with a decrease in the diversity of information to which users are exposed, raising fears about ‘filter bubbles’ and ‘echo chambers’. However, filtering and recommender systems are not all alike, and their impact is not beyond human control. At least in principle, recommender systems can also be designed to stimulate more diverse exposure and to confront viewers with opposing viewpoints and contextual information. While companies like Facebook and Google have reacted to recent debates on ‘fake news’ with steps in this direction, introducing features that promote ‘diverse perspectives’, questions remain about the commercial and strategic incentives of platforms and news providers, the principles on which such ‘diversity-sensitive design’ should be based, and how policy-makers might ‘nudge’ companies to promote more diverse exposure.

See Helberger, N., Karppinen, K., & D’Acunto, L. (2018). Exposure Diversity as a Design Principle for Recommender Systems. Information, Communication & Society, 21(2), 191–207.
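As a loose illustration of what diversity-sensitive design could mean in practice (a minimal sketch, not the design proposed by Helberger et al.), the Python snippet below re-ranks a base recommender’s candidates in the spirit of maximal marginal relevance: each pick trades personal relevance off against similarity to items already selected. The Item structure, the one-dimensional viewpoint scale, and the diversity_weight parameter are all hypothetical.

```python
# A minimal sketch of diversity-sensitive re-ranking, in the spirit of
# maximal marginal relevance (MMR). Illustration only, not the method of
# Helberger et al. (2018); the item fields, the similarity measure, and
# diversity_weight are assumptions made for this example.

from dataclasses import dataclass


@dataclass
class Item:
    id: str
    relevance: float   # personal-preference score from a base recommender
    viewpoint: float   # assumed stance on a -1..1 scale


def similarity(a: Item, b: Item) -> float:
    """Similarity of two items' viewpoints; 1.0 = identical, 0.0 = opposite."""
    return 1.0 - abs(a.viewpoint - b.viewpoint) / 2.0


def rerank(candidates: list[Item], k: int, diversity_weight: float = 0.5) -> list[Item]:
    """Greedily pick k items, trading relevance off against similarity
    to what has already been selected (higher weight = more diverse)."""
    selected: list[Item] = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(item: Item) -> float:
            # Penalize items that echo an already-selected viewpoint.
            redundancy = max((similarity(item, s) for s in selected), default=0.0)
            return (1 - diversity_weight) * item.relevance - diversity_weight * redundancy
        best = max(pool, key=score)
        selected.append(best)
        pool.remove(best)
    return selected


if __name__ == "__main__":
    items = [
        Item("a", 0.90, -0.8), Item("b", 0.85, -0.7),
        Item("c", 0.60, 0.1), Item("d", 0.50, 0.9),
    ]
    for item in rerank(items, k=3, diversity_weight=0.6):
        print(item.id, item.relevance, item.viewpoint)
```

Setting diversity_weight to zero recovers plain preference-based ranking; raising it pulls under-represented viewpoints into the top of the list, which is one concrete way a designer could ‘nudge’ exposure toward diversity.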