In 1986, political theorist Langdon Winner published The Whale and the Reactor: A Search for Limits in an Age of High Technology, a rather thin but deeply troubling book that offered a warning not just about technology itself, but about our failure to notice it. Among the many resonant concepts he introduced, one stands out with particular relevance to the challenges facing our society today: technological somnambulism.
“Somnambulism” is just a big word for sleepwalking, but Winner’s original phrase deserves our attention. In it, he evokes the image of a society sleepwalking through the most profound transformations of its day. These are not changes happening on the margins, tiny tweaks to background code or systems. These are seismic shifts: technologies that remake how we live, communicate, govern, think, and relate to one another. And yet, astonishingly, these changes unfold with barely a whisper of meaningful public discussion (let alone deliberation). We are, in Winner’s view, not resisting new technologies, nor are we fully embracing them. Instead, we are simply sleepily drifting into their embrace, unconscious of the consequences.
Winner is no enemy of technology. His concern is not that we have built machines, but that we have failed to build a politics around them. Winner believes strongly that technologies are not neutral tools dropped into the world for us to pick up or put down as we please. They arrive with values embedded in them. They create “affordances,” ways of acting and organizing, that become more than options; they become our habits, our “default” settings, sometimes even our laws. When we treat these shifts as purely technical rather than social, we abdicate something essential: our role as citizens in shaping the structure of our shared life.
“The question to ask is not whether to accept or reject a particular technology, but rather what kinds of social and political life are likely to result from its adoption.” — Langdon Winner
Consider Facebook as an example.
Its early promise was charming, almost banal: “a way to stay in touch with friends and family.” Who could object? But when it opened in 2006 to anyone over the age of 13 with an email account, that seemingly minor shift marked a cultural turning point. What if we had paused then to ask harder questions? Might we have foreseen the rise of disinformation economies, the corrosion of civil discourse, or the normalization of mass surveillance as a business model? Could we have anticipated threats to data governance, the psychological toll on users, the vulnerabilities of children and youth, or the strain on democratic institutions? Perhaps.
But we didn’t.
We were too busy logging in, too ready to accept, unexamined, the seductive assumptions that connection is inherently good, that scale is neutral, and that anything technically possible must be socially desirable. Had we examined the assumptions seriously, we might have insisted on rules, rights, and responsibilities. We might have held some hearings. We might have drafted some laws.
Winner didn’t offer a prophecy, but a method: look at the politics of the machine. Who does new technology empower? Who does it displace? What assumptions does it carry about the nature of human relationships, about privacy, about work, about truth? These should not be questions for engineers alone. They should be civic questions, questions for all of us.
“To invent a new device or technique is to bring a particular form of power and authority into the world” — Langdon Winner
It is striking that Winner wrote all of this before the rise of the internet, before social media, before smartphones, before algorithmic recommendation systems or generative AI. But his concerns could hardly be more timely.
Winner would urge us not to squander the opportunity to ask such questions now. We were asleep at the switch through Facebook’s rise, but the age of generative AI, automated decision-making, biometric surveillance, and brain-computer interfaces is still unfolding. Winner would want us to see these new technologies not just as things we use, but as things we live within. They shape our environment, our choices, our very selves. And that is why we need to be awake to them.
So, how do we wake up?
Firstly, we must reject the myth of technological determinism: the idea that technological change is inevitable and inherently good, and that our only role is to adapt to it as quickly as we can. This is wrong. Technologies are human-made and human-shaped. We choose which ones to develop, fund, build, regulate, and adopt. We choose how they are governed, or whether they are governed at all.
Secondly, we need education: not just technological literacy, but community literacy about technology. People need to understand not only how technological tools work, but how they are designed to work on us! We need opportunities for public discussion that aren’t controlled by industry advocates.
Thirdly, we need scholars, journalists, artists, politicians, policymakers and, most importantly, ordinary citizens to ask the hard questions and to participate in the debate about what kind of technological future we want. And this is the hard part. “Technological somnambulism” suggests not just ignorance, but a kind of willful negligence or inattention. It’s not merely that people don’t know what’s happening; it’s that we often choose not to think about it. And it’s not due to stupidity or incompetence; it’s more that we’re all busy. Or tired. Or reasonably focused on the more immediate concerns of work, family, or health. Meanwhile, that humming we hear in the background is the constant drone of technological change, often driven by actors whose interests are not aligned with the public good.
“The basic question—who will be the master and who the slave?—is posed every time a new device, technique, or system is introduced.” — Langdon Winner
It is easy to feel powerless in the face of all this. But we are not powerless. We are only asleep. And sleep is something we can wake from.
So, the next time you see a new app, a new wearable, a new AI tool … before you sign in, scroll down, or accept the terms of service, ask yourself: Who built this? What do they want from me? What does this really achieve? What does this replace in my life?
The acts of paying attention, of asking questions, of refusing to drift aren’t all that dramatic, but together they radically realign the all-too-usual power of the technology over the user. These are the questions any responsible citizen should ask. And by asking them we are awakening; we are starting, once again, to live with our eyes wide open.