In February of 2011, pizza delivery driver Susan Guy saved the life of customer Jean Wilson after she went 3 days without ordering a pizza. 

Wilson had ordered one large pepperoni pizza every day for 3 years. When Guy’s boss mentioned that the 82-year-old Wilson hadn’t called, Guy decided to drive to Wilson’s house and check on her.

No one answered Guy’s knocking, and a neighbor told Guy that she never saw Wilson leave the house. Guy called 911, and police broke down the door and found Wilson inside, incapacitated. “She treats us really well,” Guy said of Wilson to a regional news station. “She appreciates us, and that’s something we don’t get in customers a lot.” Emergency personnel took Wilson to the hospital, where she made a quick recovery.

In what may be a sign that this once-globetrotting writer has gone too Silicon Valley, his first thought was to wonder what would have happened if Wilson had ordered her daily pepperoni pizza through an app like Seamless, Grubhub, or Caviar instead of over the phone. Would Guy’s boss have noticed? Would someone have known that Wilson was an octogenarian who might need help?

This line of questioning fits a perspective about how apps and new technology are replacing human connections with algorithms, making modern life more impersonal.

The perspective reflects a widespread tendency to describe the effects of “technology” — as if every new technology conspires to change our lives in the same way: Technology is driving us apart. Technology is killing our ability to focus. Technology is making the world flat. Technology is shrinking the globe. Technology is turning us all into self-centered narcissists. People describe technology as singular, not plural, with a zeitgeist moving in one easily surmised direction.

When people use the word ‘technology’ this way, they seem to mean consumer products, especially those that use the Internet, or the Internet as a whole. (People blame the demise of dating on Tinder and texting, not solar panels.) The Shallows — a book that asks, “Is Google making us stupid?” to argue that “we are becoming ever more adept at scanning and skimming, but what we are losing is our capacity for concentration, contemplation, and reflection” — takes on the entire Internet.

But as the author of The Shallows himself acknowledges, people have a tendency (dating back to Plato’s worry that the adoption of writing would make people forgetful) to either glorify or expect the worst of every new technology. Whether from a lack of evidence or an impatience to wait for it, compounded by a universal bias toward overestimating the impact of future events, people tend to make grand pronouncements about the meaning of new technology. But what if you look at the evidence?

—–

One response to the type of argument raised in The Shallows is that the Internet may represent a fair trade-off: less capacity for deep concentration, but greater ability to multitask and make disparate connections. Maybe a shallower but wider awareness is a useful skill in the Internet age?

But the work of three Stanford researchers makes it hard to buy this argument. They recruited students who self-identified as either heavy or light “media multitaskers” to perform a series of tasks. The students who habitually juggled papers, email, chat services, and YouTube performed worse both at ignoring irrelevant information and at switching between tasks. The researchers expected the first result, but the fact that the multitaskers also did worse at switching between tasks suggests that the more you multitask, the worse you get at it.

Another common technology trope is that people spend more time at home, alone, fixated on a screen and less time in public engaging with others. This January, The New York Times Magazine reported on the efforts of Keith Hampton to measure whether this is true. Hampton tracked down films of urban spaces made in the 1960s and 1970s by the sociologist William H. Whyte. By filming the same public spaces today and comparing the footage, Hampton and his research team attempted to quantify how mobile phones have changed public space.

The results will disappoint Luddites. Hampton found that cell phone use in these spaces was a low 3-10%, and that it was extremely rare for a pair or group of people to ignore each other in favor of their touchscreens. He also found more people spending time in public squares in all but one location. In fact, the most striking finding was about gender equity. The Times Magazine writes:

Across the board, Hampton found that the story of public spaces in the last 30 years has not been aloneness, or digital distraction, but gender equity. “I mean, who would’ve thought that, in America, 30 years ago, women were not in public the same way they are now?” Hampton said. “We don’t think about that.” 

Our blind spots extend to politics. Writing in the Boston Review, Evgeny Morozov describes the cyber-utopianism he sees in political analysis of the Internet:

In 1989 Ronald Reagan proclaimed that “The Goliath of totalitarianism will be brought down by the David of the microchip”; later, Bill Clinton compared Internet censorship to “trying to nail Jell-O to the wall”; and in 1999 George W. Bush (not John Lennon) asked us to “imagine if the Internet took hold in China. Imagine how freedom would spread.”

From helping Kenyans monitor electoral violence to allowing Egyptians to share videos of military abuses on YouTube, the Internet has been a boon to democracy activism. But critics like Morozov counter that the Internet is not inherently pro-democracy. China’s Great Firewall still functions — if imperfectly — and the Internet can seem to have a pro-dictator bias. In Libya and Iran, for example, facial recognition software identified protest participants from pictures on social media, allowing police to arrest or intimidate them. Dictators have also found it easier to eavesdrop on digital communications between activists and the opposition than on secret, in-person meetings.

—–

It’s a worthwhile enterprise to ask, as Nicholas G. Carr does in The Shallows, how the Internet may shape our thinking and experiences. Political scientists should ask how information and communication technologies affect international relations and economic development. The Internet is too big to ignore.

The problem lies in expecting the Internet, which has innumerable applications, to have one uniform effect. The aforementioned Morozov, a vociferous critic of Silicon Valley and TED Talk technobabble, despises these one-size-fits-all narratives about the Internet. He writes of Clay Shirky, NYU’s voice on the social and economic effects of the Internet:

Shirky’s rhetorical moves are well-known: assume that “the Internet” has similar effects everywhere and then use “the Internet” to claim expertise over various fields it disrupts. Magically, an expert on “the Internet,” Shirky also becomes an expert on everything it disrupts. A foundational myth of “disruption” – in Shirky’s case, it’s the appearance of Napster – does the rest.

Thus, Shirky’s job, as a consultant-cum-intellectual, is to warn of the impending “Napster moment” in one industry after another: journalism, democracy promotion, education.

Morozov’s critique is that too many people treat the Internet like some primordial force unleashed from Pandora’s box. In reality, the Internet has many parts and applications, created by men and women according to their goals and interests. Yet all too many experts and laymen treat it as one supernatural force working through its own inevitable agenda.

We’d be better served searching for the specific effects of new apps, companies, and products instead of lumping them into clichéd stories about the Internet.

Take the example of Twitter, which easily fits into the thesis of The Shallows. Co-founder Jack Dorsey imagined Twitter as a way to announce your “current status” to friends, from the potentially useful “going to the park” to the banal “in bed.” A million haters (this author included — mea culpa) immediately derided it as narcissistic, while Bill Keller of the New York Times described Twitter and other social media as “the epitome of in-one-ear-and-out-the-other.”

But Twitter is a tool, and it inspired many other uses that defy that narrative. Professionals use it to share and follow industry news, athletes and celebrities use it to build their brands, and people debate politics and political correctness in digital town halls. Twitter also brought the voices of protesters around the world into everyone’s living room, at which point journalists decided that social media and technology-savvy youth were actually responsible for every revolution of the 21st century.

Coursera, a provider of Massive Open Online Courses (MOOCs), allows anyone with an Internet connection to take a class on machine learning from a Stanford professor or a business management class from a Wharton professor. The founders describe their goal as making education a human right. As a result, Coursera fits easily, alongside Wikipedia, into narratives about the Internet democratizing access to information. Innumerable commentators took the Shirky approach of expecting it to “disrupt” higher education à la Napster.

Battushig Myanganbayar, a young Mongolian student, fits this picture well. While living in a remote area, he aced a sophomore-level MIT circuits class at age 15. The Times covered his story, calling him “The Boy Genius of Ulan Bator.” His success, however, vaulted him to MIT itself; universities say that they will look to MOOCs to recruit promising students from remote areas like his. Forming a new pipeline of talent to the ivory tower is valuable, but it is not necessarily the same as democratizing higher education.

Further, studies of MOOC users have found that they mostly benefit middle- and upper-class students who already have a degree, rather than less affluent students who lack the means to benefit from MOOCs or the habits to complete a class effectively (habits often learned in college). Findings like these led Sebastian Thrun, founder of the MOOC provider Udacity, to pivot from tackling the education divide to providing corporate training. Whether MOOCs move the needle on democratizing access to education remains to be seen, and it will be decided by the actions of men and women in the industry.

A final, non-Internet-centric example is the overturning of the stereotype of video game players as social outcasts. A study in the Journal of Computer-Mediated Communication documented how gamers augment their social lives by playing together. In the words of one of the study’s authors:

“Gamers aren’t the antisocial basement-dwellers we see in pop culture stereotypes, they’re highly social people. This won’t be a surprise to the gaming community, but it’s worth telling everyone else. Loners are the outliers in gaming, not the norm.”

There are other pros and cons to debate, of course: the question of exercise; the impact of games on young children, since some educators believe the constraints of games inhibit how children develop by creating their own worlds with their imaginations; and, in another illustration of the limits of our intuitions, the fact that before people pointed to The Matrix and violent video games as causes of the Columbine shootings, many believed that violent television and games could be calming, letting players act out violent impulses in a safe environment.

—–

In sum, be wary of anyone explaining where technology is taking us whose pronouncements seem one-sided or who concludes that “tech” is having the same results everywhere, with no role for human agency. The truth lies somewhere between “the medium is the message” and the view that technology is merely a tool that can be used in innumerable ways, good and bad. Different tools will not all fit one-size-fits-all stories about “The Internet,” and their effects are neither easy to predict nor written in stone. (An obvious point that thought leaders and plebeians alike eagerly ignore when reviewing a new product or application.)

Calls for more nuance and fewer clichés and snap judgments are a dime a dozen. But nuance is desperately needed in this case to save us from polarized debates of “The Internet will fix everything” versus “The Internet will turn us all into distracted narcissists.” To do anything else is as silly as opposing new food delivery apps and pining for the good ole days when we could trust our pizza delivery crews to look after senior citizens.

This post was written by Alex Mayyasi. Follow him on Twitter here or on Google Plus. To get occasional notifications when we write blog posts, sign up for our email list.