If predictions are like baseball, I’m bound to have a bad year in 2019, given how well things went the last time around. And given how my own interests, work life, and physical location have changed of late, I’m not entirely sure what might spring from this particular session at the keyboard.
But as I’ve noted in previous versions of this post (all 15 of them are linked at the bottom), I do these predictions in something of a fugue state – I don’t prepare in advance. I just sit down, stare at a blank page, and start to write.
Every year I write predictions for the year ahead. And at the end of that year, I grade myself on how I did. I love writing this post, and thankfully you all love reading it as well. These “How I Did” posts are usually the most popular of the year, beating even the original predictions in readership and engagement.
What’s that about, anyway? Is it the spectacle of watching a guy admit he got things wrong? Cheering when I get it right? Perhaps it’s just a chance to pull back and review the year that was, all the while marveling at how much happened in twelve short months. And 2018 does not disappoint.
Those of us fortunate enough to have lived through the birth of the web have a habit of stewing in our own nostalgia. We’ll recall some cool site from ten or more years back, then think to ourselves (or sometimes out loud on Twitter): “Well damn, things were way better back then.”
Then we shut up. After all, we’re likely out of touch, given most of us have never hung out on Twitch. But I’m seeing more and more of this kind of oldster wistfulness, what with Facebook’s current unraveling and the overall implosion of the tech-as-savior narrative in our society.
Mark Zuckerberg is in a crisis of leadership. Will he grasp its opportunity?
It seems like an eternity, but just about a year ago this fall, Uber had kicked its iconic founding CEO to the curb, and he responded by attempting a boardroom coup. Meanwhile, Facebook was at least a year into crisis mode, clumsily dealing with a spreading contagion that culminated in a Yom Kippur apology from CEO Mark Zuckerberg. “For those I hurt this year, I ask forgiveness and I will try to be better,” he posted. “For the ways my work was used to divide people rather than bring us together, I ask for forgiveness and I will work to do better.”
More than a year after that work reputedly began, what lesson can we draw from Facebook’s still-rolling catastrophe? I think it’s pretty clear: Mark Zuckerberg needs to do a lot more than publish blog posts someone else has written for him.
A year and a half ago I reviewed Yuval Noah Harari’s Homo Deus, recommending it to the entire industry with this subhead: “No one in tech is talking about Homo Deus. We most certainly should be.”
Eighteen months later, Harari is finally having his technology industry moment. The author of a trio of increasingly disturbing books – Sapiens, which made his name as a popular historian-philosopher, the aforementioned Homo Deus, which introduced a dark strain of tech futurism to his work, and the recent 21 Lessons for the 21st Century – Harari has cemented his place in the Valley as tech’s favorite self-flagellant. So it’s only fitting that this weekend Harari was the subject of a New York Times profile featuring this provocative title: Tech C.E.O.s Are in Love With Their Principal Doomsayer. The subhead continues: “The futurist philosopher Yuval Noah Harari thinks Silicon Valley is an engine of dystopian ruin. So why do the digital elite adore him so?”
If you pull far enough back from the day to day debate over technology’s impact on society – far enough that Facebook’s destabilization of democracy, Amazon’s conquering of capitalism, and Google’s domination of our data flows start to blend into one broader, more cohesive picture – what does that picture communicate about the state of humanity today?
The Los Angeles Times was the first newspaper I ever read – I even attended a grammar school named for its founding family (the Chandlers). Later in life I worked at the Times for a summer – and found that even back then, the great brand had begun to lose its way.
I began reading The Atlantic as a high schooler in the early 1980s, and in college I dreamt of writing long form narratives for its editors. In graduate school, I even started a publication modeled on The Atlantic’s brand – I called it The Pacific. My big idea: The west coast was a huge story in desperate need of high-quality narrative journalism. (Yes, this was before Wired.)
“We weren’t expecting any of this when we created Twitter over 12 years ago, and we acknowledge the real world negative consequences of what happened and we take the full responsibility to fix it.”
That’s the most important line from Twitter CEO Jack Dorsey’s testimony yesterday – and in many ways it’s also the most frustrating. But I agree with Ben Thompson, who this morning points out (sub required) that Dorsey’s philosophy on how to “fix it” was strikingly different from that of Facebook COO Sheryl Sandberg (or Google, which failed to send a C-level executive to the hearings). To quote Dorsey (emphasis mine): “Today we’re committing to the people and this committee to do that work and do it openly. We’re here to contribute to a healthy public square, not compete to have the only one. We know that’s the only way our business thrives and helps us all defend against these new threats.”
I’ve been covering Google’s rather tortured relationship with China for more than 15 years now. The company’s off-again, on-again approach to the Internet’s largest “untapped” market has proven vexing, but as today’s Intercept scoop informs us, it looks like Google has yielded to its own growth imperative, and will once again stand up its search services for the Chinese market. To wit:
GOOGLE IS PLANNING to launch a censored version of its search engine in China that will blacklist websites and search terms about human rights, democracy, religion, and peaceful protest, The Intercept can reveal.
I first moved to the Bay area in 1983. I graduated from high school, spent my summer as an exchange student/day laborer in England (long story), then began studies at Berkeley, where I had a Navy scholarship (another long story).
1983. 35 years ago.
1983 was one year before the introduction of the Macintosh (my first job was covering Apple and the Mac). Ten years before the debut of Wired magazine. Twenty years before I began writing The Search, launching Web 2.0, and imagining what became Federated Media. And thirty years before we launched NewCo and the Shift Forum. It’s a … long fucking time ago.