Three Numbers About Silicon Valley That Should Scare The Shit Out Of Us — And One That Should Give Us Hope

Jumana Abu-Ghazaleh
8 min read · Aug 14, 2019


Photo by Matthew Henry on Unsplash

There’s an old saying about how people go bankrupt: slowly, and then all at once.

Slowly, and then all at once, could also describe the way that social tech has changed the world over the last 30 years.

There was no singular moment it all changed — the day after you bought your first smartphone was not too terribly different from the day before.

Smash cut to today: a world where the momentary whims of a Zuckerberg or a Bezos could tomorrow bring the economy, democracy, or shared truth as we know it to a standstill.

It’s a world in which many of us now live permanently awash in outrage fatigue, anxiety fatigue, fatigue fatigue. It’s an understandable response. It’s also likely part of why multiple prominent academic psychologists are concerned that the devices in our pockets are precipitating a mental health emergency.

So how do we change course? Stop the fear?

I’d like to suggest that the best way to stop a recurring fear is to address what’s causing it, and we can’t do that effectively until we’re brave enough to examine and understand what we’re afraid of. Only then can we take meaningful action.

So let’s put some facts to the fears.

These three numbers stopped me in my tracks. I hope they stop us all in our tracks — and suggest a change in direction.

2,617, or The Number Of Times Per Day The Average American Smartphone User Touches Their Device

That is the last reported average as of 2016 — it could well be higher now. Again, that’s the average, not the ultra-heavy-user number.

Let’s put this number in perspective: the average person takes between 12 and 18 breaths per minute.

That’s between 11,520 and 17,280 breaths while awake (assuming a generous 8 hours of sleep).

Which means that, on average, we touch our phones roughly once every five breaths we take. (Cue The Police.) And that’s not counting the hours the phone rests under the pillow.
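If you want to sanity-check that figure, here’s a rough back-of-the-envelope sketch in Python, using only the numbers above (2,617 touches a day, 12 to 18 breaths per minute, 16 waking hours):

# Back-of-the-envelope check of the "every five breaths" figure
touches_per_day = 2617                # reported average daily touches (2016)
waking_minutes = 16 * 60              # assuming a generous 8 hours of sleep
for breaths_per_minute in (12, 18):
    breaths_while_awake = breaths_per_minute * waking_minutes
    print(round(breaths_while_awake / touches_per_day, 1))   # prints 4.4, then 6.6

One touch every four to seven waking breaths, in other words: call it five.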

75,000, or The Number of Data Points Collected and Sold by Private Companies About The Average American Consumer

Do you see a therapist?

If that seems like an overly personal question, consider that Apple, Google, Amazon and Facebook all quite likely know already.

And if you do have a therapist, do you think they could, right now, write down 75 distinct facts about you? How about 750? 7,500?

The tech giants and their data vendors could. By another factor of 10. Easily. As of 2017, private companies collect and sell 75,000 individual data points about the average American consumer. (And, yet again, note that that is the average.)

Think of everything that’s learnable about you from your online behavior — or simply the GPS location of your cell phone as you run your errands — or even your mouse clicks. (Time to re-cue The Police: every Amazon bill, every prescribed pill…)

It’s all being collected. And it’s all for sale.

15,000,000,000, or The Number Of Times YouTube Recommended Alex Jones Videos To Users Before Shutting Down His Account

That’s 15 Billion. With a B.

Alex Jones is merely the most visible face of a phenomenon in which YouTube endlessly auto-suggests a stream of extremist content to viewers who weren’t searching for it. In recent years, much of the growth of the most reckless fringe voices has taken place quietly, one YouTube click at a time: when an academic researcher interviewed attendees at a major flat-earth conference in 2018, she found that a stunning 97% cited YouTube videos as the catalyst for becoming flat-earthers.

All of it is made possible by a YouTube-aided phenomenon called rabbit-holing: people sit down to watch a single video and look up four hours later to find that an endless wave of algorithmically determined, auto-playing suggestions has eaten up their afternoon.

But it’s not just the initial suggestions of content — it’s what happens next. An MIT study found that on Twitter alone, false stories spread roughly six times faster than true stories — and that a kind of compound interest takes hold over time: real news loses shareability as it ages, while fake news only grows.

What does all this add up to?

As an average user, I touch my smartphone roughly 2,617 times a day. Those touches help feed a profile of some 75,000 data points. These data points are used by others to customize my entire experience of truth and the world at large — sometimes by tech giants who have shown a willingness to experiment with inciting depression in their user base, i.e., humanity at large.

“Humanity at large” isn’t my attempt at being dramatic: Facebook has 2.4 Billion human users. That’s about one out of every three humans — just about as many people as live in Europe, Africa, and North America combined.

Google’s YouTube, by itself, has just under 2 Billion users. That’s more people than live in North and South America combined — times two.

That’s a lot of people, spending a lot of time, touching a lot of devices.

What we’re talking about here, really, is the power of a select few to transform shared reality into a set of hyper-atomized, hyper-individualized private realities. Both in perception (what vision of “truth” we’re automatically presented with in our rabbit holes and ever-present advertisements) and in reality (even in something as mundane as the varying shampoo prices different people see), all without our explicit consent.

This is an unfathomably immense power: in an age of a shrinking commons, unstable governments, and growing consolidated corporate power, the titans of social tech hold our collective fates in their hands.

Wars. Elections. Genocides. Who goes to jail. Who gets the job. Who gets to live. Who gets to die. The fate of entire nation-states — all have turned, and will continue to turn, on the private decisions of a select few, unelected engineers, in a select few, deeply guarded rooms.

What It’s Like To Be A Captive Audience

Who is looking out for us? Very few of us believe the tech giants are. Last year, a supermajority of Americans surveyed by Pew (72%) said they believed the tech companies could be trusted to do the right thing only “some of the time” or “hardly ever.”

How many thought they could always be trusted? 3%.

And yet we still use these tools. Because to simply take part in the modern economy — to not be a hermit — we have to. These companies realize they have a more or less captive audience; they know we rely on their tools simply to exist in today’s civil society. And in turn, we know that when our government bothers to fine or regulate them, its half-hearted efforts somehow leave the tech giants even richer. (The day after the Federal Trade Commission fined Facebook $5 Billion over the Cambridge Analytica scandal, Facebook’s market capitalization grew by approximately $6 Billion.)

Where Do We Find Hope Then?

This is not about muting notifications. It’s not about cookies. It’s not about addiction. Or misinformation. Or inciting violence. It’s about all of these and more. All at once.

The industry needs a new north star to replace the default of “move fast and break things,” which moved so fast and broke so much. The industry needs a unifying vision that all technologists can commit to, be held accountable to, and take pride in.

The industry must grow up. The industry must undergo professionalization. Like doctors did. Like engineers did. Like any trade that impacts the well-being of massive groups of people.

And I think it will. Because it’s not all bad news out there.

The good news is that tech workers themselves are starting to demand change.

Here’s one number that gives me hope.

20,000, or The Number Of Google Workers Who Walked Out In Protest Of The Company’s Sexual Harassment Policy

In an age where tech giants are routinely handed the power of life and death once typically held only by state actors, building ground-level tech worker power to dissent and object — the power of the people making the day-to-day changes to code, and those providing the backbone of the company beyond a computer terminal — is really encouraging. And vital.

Virtually every public tech crisis of the last decade, from Cambridge Analytica, to the serial problems behind Google+, to social media’s impact on teenage mental health, has had the same basic backstory: workers told management about the problem, begged and pleaded with them to listen — and were told, subtly or not so subtly, that management had no interest in listening.

Last November, 20,000 Google workers in 50 cities took part in a coordinated walkout in protest of the company’s sexual harassment policy, among other workplace concerns. A few months earlier, thousands of rank-and-file Google engineers signed a letter demanding Google drop an ethically questionable Pentagon contract — and won. Meanwhile, Amazon is facing its own internal tech-worker revolt over its work with ICE under the Trump administration — and that’s not even counting the Prime Day strike by its warehouse workers.

A “see something, say something” poster in a lounge is no match for a KPI document screaming at you about user growth. But structures that allow workers to organize are a match, and the status of conscientious objector at these companies must be respected and protected.

At the same time, we know that increased worker power at a single company won’t do the job: when Google workers revolted internally against morally questionable Pentagon contracts, Google was eventually forced to acquiesce to their demands — only to have Jeff Bezos brag days later about his intention to take those contracts from Google, no matter what his workers thought of that decision.

No. Fixing the industry is going to take a holistic, top-to-bottom set of changes — of which building worker power is a necessary, but insufficient-by-itself, step.

The writing is on the wall: Silicon Valley’s next generation is searching for a more thoughtful and responsible way forward; venerable academic institutions like MIT are reinventing themselves to better prepare students for responsible development and deployment of technology; more and more tech entrepreneurs are rejecting “toxic V.C.s”; some V.C.s themselves are issuing warnings about the dangers of investing in toxic business models; and even the giants are asking for regulation.

The truth is, Silicon Valley already wants to professionalize — it just doesn’t know it yet.

For the sake of our mental health, our civil society and our economy, let’s get on with it already.
