Many conservatives will tell you not to read the New York Times or listen to the “lamestream media.” Their liberal counterparts dismiss Fox News as a propaganda arm of the Republican Party. Conspiracy theorists and adherents of more niche ideologies may tell you to distrust all major media, arguing that they exist to preserve the status quo. Although people prefer to point out the biases of their political sparring partners rather than their own, we are all aware that the individuals and organizations that analyze and curate the world’s information impose the lens of their own narrative framework.

Could we say the same about the company whose mission is “to organize the world’s information and make it universally accessible and useful”? In other words, could we say the same about Google?

It may seem like a strange comparison. We can imagine Bill O’Reilly and his staff routinely ascribing dips in the stock market to the latest moves of the Democratic Party. We can imagine a writer at the New York Times choosing to write an article about improvements made to the Healthcare.gov website rather than its remaining glitches. But Google presents information according to an algorithm. There’s no man behind the curtain imposing his own view. The code treats searches about Obamacare the same way it treats searches for McDonald’s near Scranton, Pennsylvania.

But just because no one at Google plots to present information with a certain ideological bent does not mean that no biases exist in how Google presents information.

Search engine algorithms surface highly rated web pages: pages, for example, that many other prominent web pages link to. It’s a useful feature that ensures someone searching for “George W. Bush” will see the former president’s Wikipedia page rather than a term paper written by a B- high school student. However, it also has a bias toward the mainstream and the status quo (the New York Times over the Boston Review) and perhaps at times toward the sensational over the sound. We could call it the “SEO bias.”
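
To make the mechanics concrete, here is a minimal sketch of that link-based ranking idea, in the spirit of PageRank. The toy link graph, the damping factor, and the page names are our own illustrative assumptions, not Google’s actual algorithm or data:

```python
# A toy version of link-based ranking in the spirit of PageRank.
# The link graph, damping factor, and page names are illustrative
# assumptions, not Google's actual algorithm or data.

DAMPING = 0.85
ITERATIONS = 30

# Hypothetical web: each page maps to the pages it links to.
links = {
    "wikipedia.org/George_W._Bush": ["whitehouse.gov"],
    "nytimes.com": ["wikipedia.org/George_W._Bush", "whitehouse.gov"],
    "whitehouse.gov": ["wikipedia.org/George_W._Bush"],
    "term-paper.example.com": ["wikipedia.org/George_W._Bush"],
}

pages = set(links) | {p for targets in links.values() for p in targets}
rank = {page: 1.0 / len(pages) for page in pages}

for _ in range(ITERATIONS):
    new_rank = {page: (1 - DAMPING) / len(pages) for page in pages}
    for page, targets in links.items():
        # Each page splits its current rank among the pages it links to.
        share = DAMPING * rank[page] / len(targets)
        for target in targets:
            new_rank[target] += share
    rank = new_rank

# Pages that prominent pages link to float to the top; the obscure
# term paper, which nothing links to, sinks to the bottom.
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {page}")
```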

The articles, photos, and commentary people post to Facebook also represent a news source and a lens through which we understand current events. The fact that newsfeeds are filled by the posts of friends who are likely to share one’s own views has contributed to the Balkanization of the Internet. This is exacerbated by Facebook’s personalization of newsfeeds: The more a user clicks on and “Likes” articles and posts from a certain friend, the more that friend will dominate his or her newsfeed. So if someone keeps liking left-leaning statuses and articles from a few friends, soon those friends’ liberal articles will crowd out any other viewpoints.
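
A few lines of code make the feedback loop easy to see. This is a rough sketch of the general mechanism as we understand it, with made-up friends and engagement counts; it is not Facebook’s actual ranking code:

```python
from collections import defaultdict

# Hypothetical engagement history: past clicks/likes on each friend's posts.
affinity = defaultdict(int, {"left_leaning_friend": 42, "college_roommate": 3})

posts = [
    {"author": "left_leaning_friend", "text": "Op-ed on healthcare"},
    {"author": "college_roommate", "text": "Vacation photos"},
    {"author": "coworker", "text": "Local election results"},
]

def feed_score(post):
    # More past engagement with an author means higher placement for their posts.
    return affinity[post["author"]]

def on_like(post):
    # Each "Like" feeds back into the ranking, so favored friends
    # gradually crowd out everyone else.
    affinity[post["author"]] += 1

newsfeed = sorted(posts, key=feed_score, reverse=True)
on_like(newsfeed[0])  # liking the top post only entrenches its author further
```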

Since Google personalizes search results in a similar fashion, search engines suffer from the same “filter bubble” problem, in which personalization results in Internet users seeing only the views with which they already agree. Eli Pariser, who coined the term, gave the example of two people searching for information about BP: One would see search results about the Deepwater Horizon oil spill, while the other’s results would focus on investment news.
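
A sketch of that scenario might look like the following, where a single query yields two different result pages. The topic tags and the boost rule are assumptions for illustration, not Google’s actual personalization signals:

```python
# Hypothetical candidate results for the query "BP".
results_for_bp = [
    {"title": "Deepwater Horizon oil spill", "topics": {"environment", "news"}},
    {"title": "BP investor relations", "topics": {"finance", "investing"}},
    {"title": "BP gas station locator", "topics": {"travel"}},
]

def personalized_ranking(results, user_interests):
    # Boost results that overlap with topics the user already engages with.
    def score(result):
        return len(result["topics"] & user_interests)
    return sorted(results, key=score, reverse=True)

# Same query, two users, two different top results.
environmentalist = personalized_ranking(results_for_bp, {"environment", "activism"})
investor = personalized_ranking(results_for_bp, {"finance", "investing"})
```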

The BP oil disaster also reveals how the existence of Google’s advertising, or “sponsored links,” can shape perceptions. After the oil spill, BP purchased ads for search terms on Google and Yahoo like “oil spill” that linked Internet users to BP’s own press releases and news on the spill. Since many users don’t recognize the difference between sponsored links and normal results, BP effectively bought control of a large share of public understanding of the spill.

These biases in social media and on search engines are not inevitable. Two researchers at Yahoo Labs, for example, have proposed a way to tackle the “filter bubble” in social media. As Technology Review writes of their research:

Their idea [is] that although people may have opposing views on sensitive topics, they may also share interests in other areas. And they’ve built a recommendation engine that points these kinds of people towards each other based on their own preferences.

The result is that individuals are exposed to a much wider range of opinions, ideas and people than they would otherwise experience. And because this is done using their own interests, they end up being equally satisfied with the results (although not without a period of acclimatisation). “We nudge users to read content from people who may have opposite views, or high view gaps, in those issues, while still being relevant according to their preferences,” say Graells-Garrido and co.

The researchers tested their idea by recruiting Chileans with different views on abortion, a divisive topic in the country, to use a modified version of Twitter. The full paper is publicly available here; it describes how users saw more opposing viewpoints on abortion after being nudged toward following users who shared other interests, like soccer.
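
The core of the nudge can be sketched in a few lines. The field names and scoring rule below are our own simplification rather than the method in the paper: candidates are rewarded for disagreeing on the divisive topic while sharing everyday interests:

```python
# Hypothetical users, each with a stance on the divisive topic
# and a set of other interests.
me = {"stance": "pro", "interests": {"soccer", "cooking", "music"}}

candidates = [
    {"name": "user_a", "stance": "con", "interests": {"soccer", "music"}},
    {"name": "user_b", "stance": "con", "interests": {"chess"}},
    {"name": "user_c", "stance": "pro", "interests": {"soccer", "cooking"}},
]

def nudge_score(user, candidate):
    # Relevance: shared non-political interests keep the suggestion palatable.
    shared = len(user["interests"] & candidate["interests"])
    # Diversity: only candidates with an opposing stance earn a score.
    view_gap = 1 if candidate["stance"] != user["stance"] else 0
    return shared * view_gap

recommendations = sorted(candidates, key=lambda c: nudge_score(me, c), reverse=True)
# user_a (opposing view, shared love of soccer and music) tops the list.
```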

The bias toward those willing to buy control of a narrative the way BP did is likely here to stay, as search engines will not eliminate ads. (Although a nonprofit, open-source search engine could eschew them.) But we imagine that a group of motivated engineers could create a solution to the SEO bias toward the mainstream. Perhaps the answer would be to randomly populate a share of search results with pages that rank less strongly.
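
As a rough sketch of what that fix might look like, the function below reserves a random slice of each results page for pages outside the usual top slots. The 20 percent quota is an arbitrary illustrative choice:

```python
import random

def diversified_results(ranked_pages, page_size=10, wildcard_share=0.2):
    # Keep most of the page for the strongest results...
    n_wildcards = int(page_size * wildcard_share)
    head = ranked_pages[: page_size - n_wildcards]
    # ...but fill the remaining slots with a random sample drawn
    # from outside the usual top results.
    tail_pool = ranked_pages[page_size - n_wildcards :]
    wildcards = random.sample(tail_pool, min(n_wildcards, len(tail_pool)))
    return head + wildcards

# e.g. eight mainstream results plus two long-tail pages per page of ten.
```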

The important point is that these are choices, and they have important consequences for the news, commentary, and information that the world sees. Because an algorithm is responsible, no single person may control how we get information the way a domineering media mogul strives to. But the engineers behind the algorithms make choices that determine the lens through which we see the world.

Media companies understand this role and responsibility; the industry has civic-minded codes of conduct. News providers get pushback for squashing articles that would piss off advertisers. They have an ethos of impartiality and a norm of representing a diversity of opinion (even if they often fail to meet it). And they appoint public editors to serve as watchdogs of the journalistic ethics of the newspaper or news program on behalf of the public it aims to inform.

Social media and search engines play an increasing role in shaping our understanding of world events, yet we generally don’t consider these technology companies as having a similar responsibility to the public.

Not that these companies are shying away from the responsibility. Google’s mission statement reads “Google’s mission is to organize the world’s information and make it universally accessible and useful.” Facebook’s reads: “Facebook’s mission is to give people the power to share and make the world more open and connected.” With such grandiose missions, it’s up to these companies to think seriously about how they fulfill this role.

This post was written by Alex Mayyasi. Follow him on Twitter here or Google Plus. To get occasional notifications when we write blog posts, sign up for our email list.