The Flawed Humanity of Silicon Valley

Behind the scenes of the surveillance economy.

Mr. Warzel is an opinion writer at large.

“Uncanny Valley,” a memoir by Anna Wiener.

Every week brings a fresh hell in the tech world. As news of the latest scandals piles up over weeks, months and eventually years, narratives shift. Friendly tech companies become “Big Tech.” The story is flattened; the tech giants become monolithic and their employees become caricatures — often of villains.

The truth is always messier, more interesting and more human. It is a central tension animating Anna Wiener’s excellent memoir, “Uncanny Valley.” The book traces Ms. Wiener’s path through the tech world as a start-up employee in the mid-2010s — what might be thought of as the last years before Silicon Valley’s fall from darling status. Ms. Wiener said she was drawn into the tech world by its propulsive qualities. For someone who graduated into a recession and spent her early 20s in publishing, tech offered opportunities: jobs, the seductive feeling of creating something and, of course, good money.

But what makes “Uncanny Valley” so valuable is the way it humanizes the tech industry without letting it off the hook. The book allows us to see the way that flawed technology is made and marketed: not by villains, but by blind spots, uncritical thinking and armies of ambivalent people coming into work each day trying their best — all while, sometimes unwittingly, laying the foundation of the surveillance economy.

From a privacy standpoint, “Uncanny Valley” is helpful in understanding what it’s like being on the other end of the torrent of information that streams from our devices each minute. Early on, Ms. Wiener recounts working for a successful data analytics company and the gold rush toward big data, noting that “not everyone knew what they needed from big data, but everyone knew that they needed it.”

When confronted with the mass of information her company collected, Ms. Wiener describes feeling uncomfortable with the “God Mode” view that granted employees full access to user data. “This was a privileged vantage point from which to observe the tech industry, and we tried not to talk about it,” she writes. This, she notes, becomes a pattern. When Edward Snowden blew the whistle on the National Security Agency’s Prism program in 2013, employees at her own data company never discussed the news.

What she describes is a familiar dissociation for anyone who spends time interrogating tech companies on their privacy policies. Her company simply didn’t consider itself part of the surveillance economy:

“We weren’t thinking about our role in facilitating and normalizing the creation of unregulated, privately held databases on human behavior. We were just allowing product managers to run better A/B tests. We were just helping developers make better apps. It was all so simple: people loved our product and leveraged it to improve their own products, so that people would love them, too. There was nothing nefarious about it. Besides, if we didn’t do it, someone else would. We were far from the only third-party analytics tool on the market. The sole moral quandary in our space that we acknowledged outright was the question of whether or not to sell data to advertisers. This was something we did not do, and we were righteous about it. We were just a neutral platform, a conduit. If anyone raised concerns about the information our users were collecting, or the potential for abuse of our product, the solutions manager would try to bring us back to earth by reminding us that we weren’t data brokers. We did not build cross-platform profiles. We didn’t involve third parties. Users might not know they were being tracked, but that was between them and our customer companies.”

They were, in other words, just doing their jobs.

Ms. Wiener frequently returns to this reluctance to question the product, the end goals of the technology and the Silicon Valley ethos as a whole.

At her next job, working on the terms-of-service team for a large open-source code platform, she reveals how the evolution of the internet pushed her and her co-workers into becoming “reluctant content moderators.” Soon it became her team’s job to strike a balance between preserving free speech on the platform and protecting it from trolls and neo-Nazis:

“We wanted to tread lightly: core participants in the open-source software community were sensitive to corporate oversight, and we didn’t want to undercut anyone’s techno-utopianism by becoming an overreaching arm of the company-state. We wanted to be on the side of human rights, free speech and free expression, creativity and equality. At the same time, it was an international platform, and who among us could have articulated a coherent stance on international human rights?”

As a journalist who has covered content moderation issues for the better part of a decade, I found the perspective somewhat clarifying. Decisions that feel ad hoc or made by one or two people in the belly of a large company often are. What looks from the outside like a conspiracy or nefarious techno-authoritarianism is often just confusion caused by poor management, poor communication and dizzying growth. “Most of the company did not seem aware of how common it was for our tools to be abused,” Ms. Wiener writes of her group of de facto moderators. “They did not even seem to know that our team existed. It wasn’t their fault — we were easy to miss. There were four of us for the platform’s nine million users.”

In this instance, “Uncanny Valley” shows how the internet can thrust ordinary people into extraordinary positions of power — usually without qualifications or a how-to guide. This is not to say that the book excuses any of the industry’s reckless behavior. Like a good travel writer, Ms. Wiener positions herself as an insider-outsider, “participating in something bigger than myself and still feeling apart from it.” And she is sufficiently critical of her and her peers’ participation in the industry. She writes near the end of the memoir that she would “wonder whether the N.S.A. whistle-blower had been the first moral test for my generation of entrepreneurs and tech workers, and we had blown it.”

Ms. Wiener’s memoir comes at a point when the backlash against Silicon Valley is strong enough to have earned its own name. Narratives have hardened and aggrieved tech employees are adopting a “bunker mentality.” As Ranjan Roy of the newsletter Margins wrote recently of Facebook, “the rank and file are seeing that they are the villains, and will increasingly become so.” As so much of the reporting shows, the increased scrutiny and criticism of the techlash is important, and almost all of it is warranted. Big Tech has amassed wild power that has grown unchecked.

Still, it’s easy to get conspiratorial and to fall comfortably into black-and-white notions of good versus evil. “Uncanny Valley” is a reminder that the reality is far more muddled but no less damning. Our dystopia isn’t just the product of mustache-twirling billionaires drunk with power and fueled by greed — though it is that, too, sometimes. It’s also the result of uncritical thinking, blind spots caused by an overwhelmingly white male work force and a pathological reluctance to ask the bigger questions: Where is this all going? What am I building?

