Photo by Simon Rae on Unsplash

(Note: this essay is adapted from a presentation made by the author at IEEE’s ISTAS2020 — it can be viewed here.)

If you’re reading this in the United States, it’s statistically quite likely that you were traveling for the Thanksgiving holiday around this time last year, and are now staying home.

First of all, thank you for the sacrifice and responsibility you've shown in not risking the further spread of Covid-19 or contributing to a second wave.

I also want to ask you a question that might not seem like it has much to do with Thanksgiving, but I promise you, it does. …

Photo by Markus Spiske on Unsplash

When I heard Epic Games was going to war with Apple over the app store, I didn’t think of Fortnite.

I thought of Travis Kalanick, the co-founder of Uber.

Kalanick has recently been portrayed as the embodied avatar of the worst traits of Silicon Valley upstart CEOs — crudely misogynistic, utterly without respect for privacy, encouraging a deeply toxic workplace — with a startling God complex.

Literally: in Kalanick’s Uber, employees infamously had access to a “God mode” they could use to spy in real-time on the movements of everyone from ex-partners, to politicians, to celebrities.

Mike Isaac’s page-turner, Super Pumped, details Kalanick’s rise and fall painstakingly. But one thing Isaac’s book uncovered that received shockingly little attention is the single person in front of whom Kalanick’s God complex entirely deflated: Apple CEO Tim Cook. …

The most important election this year has only one voter

Photo: Chip Somodevilla/Getty Images

If it’s not the single most powerful individual position in the world, it only has a few rivals. Think general secretary of the Chinese Communist Party. Or the Pope. Or chair of the U.S. Federal Reserve. Or maybe the crown prince of Saudi Arabia.

It’s a position that’s been embroiled in a contentious conversation for years now about the scope, and possibly criminal misuse, of the current incumbent’s power: the ability to encourage wars between nuclear powers. To rally or slow down global markets. To end, or initiate, horrifying human rights crises around the world.

The conversation has reached a fever pitch this year — just in time for election season. …

Photo by Lysander Yuen on Unsplash

Do you know a shorter way to write this sentence?

“A podiatrist, an obstetrician, and a heart surgeon walk into a bar.”

Well, that’s pretty easy: “Three doctors walk into a bar.”


Let’s try this one on for size:

“An estate attorney, a public defender, and a Harvard Law professor pick up some sorbet together.”

Well, sure. “Three lawyers get some sorbet.” Asked and answered.

One more: a software engineer, a UX designer, and Sheryl Sandberg go for a bike ride together.

What do you have?

Three computer scientists? Absolutely not. Computer science is a defined academic field, and while it’s helpful for working on websites and apps, it’s 100% not required. …

Tech companies must take responsibility for their products’ harmful impacts, even if unintended

Demonstrators protesting police violence on June 2, 2020, in New York City. Photo: Ira L. Black/Corbis via Getty Images

Countering anti-Blackness within our society is a target-rich environment; white supremacist thinking is ingrained in every industry, which means each one needs to be reformed or rebuilt entirely. That includes Silicon Valley and the technology it’s building to fuel our future.

Big Tech’s complicity with structural, systemic anti-Black racism and harming disadvantaged communities has long been evident but little improved. Recent protests over the murders of George Floyd, Breonna Taylor, and too many other Black Americans at the hands of police have forced tech workers and executives alike to take it seriously.


Photo by Lukas Blazek on Unsplash

I was having a conversation the other week with a friend who is on the mend from non-Covid pneumonia. My friend lives in New York, which, at the time of this writing, is the epicenter of a global respiratory pandemic. It was not exactly, as you might imagine, a cheerful conversation.

But there was something he said offhandedly that turned my ear and, with permission, I’d like to share it here:

“Well, at least the future is not canceled.”

“Are you sure about that?”

“Yeah, pretty sure. Seems like the future is on hiatus.”

The future is on hiatus.

All the markers we use to conceptualize the passing of time — the public holidays, the releases of new films in theaters, the festivals and parades, even primaries and political conventions — are, at best, provisionally rescheduled. Not canceled — yet. But waiting for some resolution we can’t quite put a timestamp on. Postponed. On pause, to return at some later, albeit indeterminate, date. What kind of timeline are we talking about as it pertains to our actual future? It could be June, or September, or next year once a vaccine is developed. We can vaguely imagine life resuming over the summer, before reentering a state of suspended animation once winter comes back. …

It’s time to imagine an alternative future for the tech industry

Photo: Steve Proehl/Getty Images

Right now, my Silicon Valley friends are feeling a lot of things.

Quite understandably — we all are. This is a perilous, slippery moment, one where straightforward precedents are hard to find. We stay home to the greatest extent possible. We hope the ground doesn’t give way from under us. We just don’t know. We wait and see. We work from home if we can.

Many tech workers are fortunate enough to be able to do so. (Though certainly not all of them.) Those who are currently working away from their supervisor’s physical presence and far from the influence of a campus that encourages maximum productivity have an opportunity to think about what they’re building — for the first time in perhaps a very long time — in real depth. …

It’s time to stop letting ourselves be the ‘abandoned carcass’ of our data

Photo: Drew Angerer/Getty Images

“We shine the light on whatever’s worst
Tryna fix something
But you can’t fix what you can’t see…”
— Beyoncé, “Pretty Hurts”

Just say what pops into your head first, no Googling. When you hear Amazon, who do you think of?

Jeff Bezos, right?

Let’s try this again. When I say Apple, you think…

Steve Jobs or Tim Cook. Automatic. Maybe you threw in Steve Wozniak. (Look at you!)

And Facebook?

Well, duh: Mark Zuckerberg, with Sheryl Sandberg trailing behind.

Now let’s try a slightly harder one: who first comes to mind when I say Google?

It’s possible you said Sergey Brin or Larry Page, but unlikely — neither Google co-founder has ever become a household name like Jobs or Bezos or Zuckerberg. …

First we have to understand what innovation really is and is not

Photo: Terry Vlisidis/Unsplash

When you’re representing a point of view that hasn’t been widely propagated in a given debate, it’s only natural to encounter skepticism and resistance. When that point of view challenges “conventional wisdom” around creativity, ingenuity, and human progress, the criticism is fast and furious.

So when I posit that the best way for Silicon Valley to address its problems is through a process that industrial sociologists call “professionalization,” it comes as no surprise that I am more often than not met with a mixture of befuddlement and derision.

Professionalization is the step-by-step process by which a trade or vocation — a wild west comprised of freelance practitioners — becomes a profession, with universal and, crucially, enforceable standards and norms. An industry, pre-professionalization, has no uniform standards, no formalized body of knowledge, and, therefore, no ability to sanction or eject a malpractitioner from its ranks. …

Why we need to start challenging the prevailing conversations around tech

Photo: Westend61/Getty Images

“What I find [ominous] is how seldom, today, we see the phrase ‘the 22nd century.’ Almost never. Compare this with the frequency with which the 21st century was evoked in popular culture during, say, the 1920s.”

—William Gibson, famed science-fiction author, in an interview on dystopian fiction.

The 2010s are almost over. And it doesn’t quite feel right.

When the end of 2009 came into view, the end of the 2000s felt like a relatively innocuous milestone. The current moment feels so much more, what’s the word?

Ah, yes: dystopian.

Looking back, “dystopia” might have been the watchword of the 2010s. Black Mirror debuted close to the beginning of the decade, and early in its run, it was sometimes critiqued for how over-the-top it all felt. Now, at the end of the decade, it’s regularly critiqued as made obsolete by reality. …


Jumana Abu-Ghazaleh

Jumana is the founder of Pivot For Humanity, a non-profit on a mission to professionalize Silicon Valley.
