About this ebook
A NATIONAL BESTSELLER
A programmer, musician, and father of virtual reality technology, Jaron Lanier was a pioneer in digital media, and among the first to predict the revolutionary changes it would bring to our commerce and culture. Now, with the Web influencing virtually every aspect of our lives, he offers this provocative critique of how digital design is shaping society, for better and for worse.
Informed by Lanier’s experience and expertise as a computer scientist, You Are Not a Gadget discusses the technical and cultural problems that have unwittingly risen from programming choices—such as the nature of user identity—that were “locked-in” at the birth of digital media and considers what a future based on current design philosophies will bring. With the proliferation of social networks, cloud-based data storage systems, and Web 2.0 designs that elevate the “wisdom” of mobs and computer algorithms over the intelligence and wisdom of individuals, his message has never been more urgent.
Jaron Lanier
Jaron Lanier is a computer scientist, musician, graphic artist, and writer. Named one of the world's hundred most influential people in 2011 by Time magazine, he is best known in computing for his work on virtual reality, a term he coined, which earned him the IEEE's Lifetime Career award in 2009. A Wired magazine article described him as "the first technology figure to achieve stardom in contemporary culture." He has worked in academia, notably in connection with Internet2, and in the private sector, helping to found companies that were later acquired by Oracle, Adobe, and Google. He received an honorary doctorate from the New Jersey Institute of Technology in 2006 and currently works at Microsoft on projects that remain under wraps. The Encyclopaedia Britannica has included him on its list of the three hundred most important inventors in history. His book Contra el rebaño digital (Debate, 2011), the Spanish edition of You Are Not a Gadget, was an international success.
Read more from Jaron Lanier
Who Owns the Future? Rating: 4 out of 5 stars
Radical Markets: Uprooting Capitalism and Democracy for a Just Society Rating: 4 out of 5 stars
291 ratings · 30 reviews
- Rating: 2 out of 5 stars
Aug 20, 2023
I've checked this book out of the library three times and have yet to finish it. This most recent time, I did not even crack it open, even though I had an inordinate amount of free time for one reason or another. I think the reason really boils down to this - it's not well written. On top of that, it's boring.
I think the ideas that Mr. Lanier brings up are interesting enough. The thought that current technology is affecting how people interact with each other and the world and perhaps even how they think is getting wide attention. But that is probably the book's problem. It needs to stand out in a sea of similar speculation. But all that this book has to offer is speculation. It's a series of disjointed anecdotes and observations by the author - an extended opinion essay. I made it almost halfway through the book and I do not remember being offered a shred of substantive supporting science for the allegations being made.
And did I mention it was boring? It's almost as if the author expects the reader to pay attention and put up with any and all rambling simply because he is such an interesting person. Sorry. This book goes back to the library again, this time for good. I think I know where I saw an 'executive summary' of it. Maybe I'll make it through that. - Rating: 5 out of 5 stars
Aug 2, 2022
eye opening read. one of those books that makes you think why didn't i think of that before. and then i get to the bit where he does write about something i thought about recently. it's always great when somebody else backs up your own theories. this book backed up some of my theories and opened my eyes to many many more possibilities and pathways that technology may be headed down, but doesn't necessarily have to. - Rating: 2 out of 5 stars
Sep 22, 2021
A fairly thoughtful book on tech policy-related issues, but I think Neil Postman does this better. Lanier is too pessimistic in his assessment of the "mob mentality" of the Internet, but makes some valid cautionary points. He also overreaches in his criticism of the Open Source Software (OSS) movement, and doesn't seem to me to have enough appreciation for the true cultural value of Linux and other projects. That's unfortunate, because Lanier moves on to denigrate modern culture as "retro", but fails to note how subjective critiques like his can be. For these reasons and more, I found his attack on social networking and digital advertising pretty unpersuasive. This is still a good book to read for diehard geeks or those in tech policy positions in government-- it may provoke thought, even if Lanier's conclusions prove unsatisfying. - Rating: 3 out of 5 stars
Jul 31, 2020
Some interesting observations here, don't agree with it all (especially his rant about modern music, and his seemingly almost complete ignorance of electronic music) but he's on to a couple of things in terms of how culture is not catching up with technology. Have to admit much of this, especially near the end, went way over my head, and the short sections aren't my ideal way to read, but an interesting book nonetheless. - Rating: 3 out of 5 stars
Jul 29, 2019
Didn’t hold up to a re-read.
Jaron has some interesting thoughts, and I share his broad pessimism, but this reads like a series of fragmentary blog posts and anecdotes, which is a bit funny, considering his beef with fragmentary modes of communication. - Rating: 3 out of 5 stars
Jun 21, 2018
A critique, by a technophile, of the technophilic attraction to the hive mind and the piracy of artists' and writers' works. - Rating: 3 out of 5 stars
Sep 18, 2014
I tend to align myself with many of those that Lanier directly criticizes, but I found a lot that is valuable in this book. I was a little disappointed that Lanier's arguments hadn't taken into account the responses to his digital Maoism essay but that disappointment was greatly outweighed by a very excellent book.
While I do not buy into Lanier's negative arguments, I do buy into his positive arguments for humanism and that we can control the direction that technology grows in. I wouldn't have thought myself to be a technological determinist, but Lanier's line of reasoning about lowering our standards of humanity and intelligence in order to make machines seem more human gave me pause.
While I don't blame Lessig, Shirky, or the fine minds at the Berkman Center for dehumanizing technology, I do buy the idea that we can infuse our technological choices w/ our deepest values and affect the outcome. - Rating: 3 out of 5 stars
May 6, 2013
I liked it too much to rate it any lower than 3 stars, but it also pissed me off a lot more than any other book I haven't physically thrown across a room. (I have only ever thrown a very few books across a room.) - Rating: 4 out of 5 stars
Apr 27, 2013
Lanier crafts an argument with his experience as a technologist, and as a fellow humanist, I agree with him. He thinks that we are indulging in lazy thinking when we compare a computer with a person, or cede personhood to the cloud. He discusses the need to hold on to the ability for creative people, especially musicians, writers, and filmmakers, to get paid for their work. There are also some fuzzy ideas that make less sense to me, but mostly I enjoy his writing about issues that seem extremely important, especially how computer programs have taken over Wall Street. People don't even know how to explain the financial instruments they are selling. - Rating: 5 out of 5 stars
Apr 6, 2013
Lanier is not only an innovator, he ponders deep questions. This is simply the most satisfying critique of the current shape of the digital revolution that I have encountered. Lanier makes a case for reconsidering the conventional wisdom about open source software, the future of digital culture, and much more. His basic assumption that human beings as individuals have dignity and worth in themselves doesn't sound revolutionary until one explores the limitations of Facebook, Wikipedia and similar phenomena. - Rating: 3 out of 5 stars
Apr 1, 2013
"It might at first seem that the experience of youth is now sharply divided between the old world of school and parents, and the new world of social networking on the internet, but actually school now belongs on the new side of the ledger. Education has gone through a parallel transformation, and for similar reasons.
Information systems need to have information in order to run, but information underrepresents reality. Demand more from information than it can give, and you end up with monstrous designs. Under the No Child Left Behind Act of 2002, for example, US teachers are forced to choose between teaching general knowledge and "teaching to the test." The best teachers are thus often disenfranchised by the improper use of educational information systems.
What computerized analysis of all the country's school tests has done to education is exactly what Facebook has done to friendships. In both cases, life is turned into a database. Both degradations are based on the same philosophical mistake, which is the belief that computers can presently represent human thought or human relationships. These are things computers cannot currently do."
-pg. 69 - Rating: 3 out of 5 stars
Apr 1, 2013
I was so blown away by the excerpt of this book in Harper's that I deleted my facebook account before I even finished it. I also enjoyed the book, though I got lost about one exit after the 'songle'. But this is to be expected. Manifestos have a good first half or however much is apportioned to stating the problem. Therein lies recognition and the endorphin bath of connected dots. Solutions are harder to express. There's maybe one* person born per generation who can show us those unseen patterns that open the invisible doors to the future. But for illustrating a new variable of the human condition that we ignore at our own risk, Lanier is tops and I read this with close attention.
*this number comes from the ass-o-meter. I don't really know, nor do I have an opinionated estimate. Though I wish I did. - Rating: 4 out of 5 stars
Jul 19, 2012
An excellent, honest and informed look at online trends and the state of computing technology and its effect on our culture today. I recommend it to anyone who works even vaguely close to IT. - Rating: 3 out of 5 stars
Jun 19, 2012
You Are Not a Gadget is Jaron Lanier's critique of the modern internet's tendency to favor the wisdom of crowds over the individual. Basically, he's warning against how we devalue our human uniqueness while pursuing an increasingly smarter computer mind. That's the gist, I think. The topics within the chapters often stray, so it's not always clear what point he's trying to make.
Mr. Lanier is no doubt intelligent - best known for being one of the main contributors to the technology of virtual reality - and I more or less agree with his entire premise. Unfortunately, his arguments come across like the rants of an older man who doesn't like the change he sees coming. You half expect him to start lamenting about not understanding kids these days.
The truly frustrating part is that he's probably right, but his convoluted delivery makes me doubt him. - Rating: 5 out of 5 stars
Dec 22, 2011
Jaron Lanier criticizes the received wisdom about how the internet and Web 2.0 relate to true creativity, freedom and authorship. Lanier zeroes in on how our present technologies and software lock us into certain patterns of thought and belief that are inherently limited and limiting. Lanier, who was a tech pioneer in the early days of the internet, finds the present configuration of the web to be an impoverished place for people to grow as artists and citizens. This book was very provocative, and very deep. Lanier is a corrective to Clay Shirky, who argues the other side: Web 2.0 = utopian forms of social organization, of action and creation (pardon my shorthand, but that's the gist of Shirky as I remember his book "Here Comes Everybody"). - Rating: 3 out of 5 stars
Jul 30, 2011
I like Jaron, I think his ideas are insightful and passionate, but I don't think this book did justice to his nuanced romantic perspective. I wish this book was better structured and, frankly, better written, but I still enjoyed Jaron's ideas so I'll briefly address them.
The main thesis of 'You Are Not a Gadget' is that if we are not careful in designing our information technology systems with the proper reverence for the ineffable individual human person, then we risk creating techno-social architectures which may mold our culture into acquiescing to a diminished human experience. He lambasts the free/open-source/creative-commons/web2.0 movements that have come to dominate the mainstream ideology of most technologists these days, blaming them for preventing the formation of a large middle class of artists and creative professionals that was hoped for in the early days of the internet. He also derides the increasingly popular futurist ideas of the technological singularity, holding in contempt what he views as simplistic notions of intelligence and our primitive abilities to model them in software. What those groups all have in common is their threat to subjugate the individual to the hive mind, influencing us into making our most important decisions for the benefit of the machines or the collective instead of the human.
Jaron doesn't only criticize, he also offers up his own suggestions for fixing the current state of affairs and has his own future scenario of how he'd like to see humans using information technology. I think some of his suggestions are exciting and worthy of discussion, but the limited and superficial treatment of them in the book doesn't amount to much. Again, I wish this book were better written because I really believe in many of these ideas, but as it stands I have to recommend reading Jaron's essays online or listening to his talks instead of reading this book in order to get a better appreciation of his ideology. - Rating: 3 out of 5 stars
Jun 28, 2011
Jaron Lanier is a musician who by circumstance ended up in the world of new technologies. His book is his cry from the heart about the dehumanizing effects of cyber technology. Naturally the interaction between technology and us is two way. Hence it is easy to let technology dominate our lives and even change our attitudes and behaviours. In this context, Lanier lays bare the human impact of this technology’s culture on the artistic community. He worries that the reach of the internet will turn us into simple peripheral units, and society will not sufficiently reward people for their creative work.
Certainly read this book for his critique, feelings and impressions. There were parts that I especially liked. However, be warned that, as a digital humanist, he opines that one can dispense with rational arguments. Indeed the presented post-rationalization of his feelings is flimsy at best. Furthermore his coverage of issues is somewhat narrow, shallow and lopsided. Indeed many of his more careless assertions have obvious counter examples. Essentially he intuits a colourful broad-brush picture of issues. Thus this picture should be viewed from afar without attention to localized detail. - Rating: 2 out of 5 stars
Apr 20, 2011
This is a book which argues passionately for sustained thought, aesthetics, and relationships, in a world which is becoming increasingly "fragmentary" due in part to the philosophical lock-in created by the way we use and design our information tools.
Ironically, the book is written in a frustrating series of loosely-connected one to three page micro-essays.
Following Lanier's thought is made more difficult by this; he hops from gripe to gripe, expecting the user to follow his train of thought with little aid other than the force of his invective. It is a "manifesto," and as such it is sorely lacking in the trappings of deep thought like attributed sources, or clear definitions of concepts.
I agree with a lot of what Lanier says, which is one of the things which makes this sloppy book so frustrating. For example, he quite rightly states that anonymity is a de-personalizing force which enables much of the worst, most harmful behavior on the Internet. However, he never credits Facebook for fighting this trend by opening their authentication system to other web sites. Instead, he attacks Facebook for its oppressive, impersonal graphic design - a fair criticism but, it seems, a missed opportunity for balance in this "manifesto".
In a similar example, in one chapter he criticizes the idea that "open science" could lead to discoveries in evolutionary biology, and then in the next he praises the advances in computational linguistics made possible by large electronic text collections. Again, both criticisms seem fair based on my limited knowledge of the disciplines... but Lanier doesn't address that aspect, or even the similarity of the two efforts.
At the end of the day, I find myself wishing for a more scholarly book by Lanier on a smaller subset of these topics. Particularly in the computing areas where I respect his vast knowledge - the nature of the operating system, the possibilities of virtual reality, the very nature of "intelligence", artificial or not, and the way all of these areas have social effects - I know Lanier has a lot to teach me. Unfortunately, this book isn't about "teaching." It's about "convincing." - Rating: 2 out of 5 stars
Mar 12, 2011
I didn't get it. It appears as though the author is attempting to philosophize about technology/internet as if it had a mind of its own or something.
I dunno. Technology isn't people. And I don't need to read a book to tell me this, especially when his arguments don't really make much sense.
And, on top of that, I can't stand the way the pages are broken out with headers and subheaders - as if it were an academic paper with subsections. It ain't an academic paper so don't go trying to pretend it's more than some musings with half-baked "evidence." - Rating: 2 out of 5 stars
Feb 13, 2011
Lanier has been around a while, so what he has to say about UNIX and asynchronous time, the shaping of consciousness and "cybernetic totalism", and the early stuff of codes and commands, is pretty interesting. But he likes to pick on Wikipedia and MIDI too often and in fairly superficial ways (not that I would stand to defend either). The redundancy only goes to show that I think he is best at being the experienced computer scientist and not the half-baked musicologist, sociologist, or cultural critic. And he's a terrible philosopher, despite his humanist bent and good intentions. I enjoyed the beginning, where Lanier flies as the informed and thoughtful historian. But 3/4 of the way, his points become horribly oblique and just plain weird in that Wired magazine kind of way (just examine the headings). I wanted to like this book, but the end just fell apart for me. - Rating: 2 out of 5 stars
Jan 1, 2011
Intriguing and provocative, but I thought many of the arguments were not well made. - Rating: 4 out of 5 stars
Dec 21, 2010
The book You Are Not a Gadget contains musings of the author, Jaron Lanier, about how technology has and will continue to affect individuals as well as human society in general. Lanier is a computer scientist, visual artist and musician, and has worked with associates in a variety of science fields. He pulls historical and contemporary examples from philosophy, psychology, religion, music, art and technology and applies those concepts to today's technology.
This five-part (fourteen chapters) book raises important questions and provides useful discussion on the topics at hand. It is a well written book that is both thought provoking and quick reading, and will make readers pause to evaluate how they incorporate technology in their own lives. This book might be useful for promoting classroom discussion about technological issues at either the undergraduate or graduate level. - Rating: 4 out of 5 stars
Aug 10, 2010
An interesting manifesto about the changing face of the internet and digital culture vis a vis its relation to human beings. I appreciate Lanier's humanism, as I agree that the value of computers lies only in their relation to the humans who use them. I'm not so sure if I agree with his statements that the internet is culturally reactive rather than active. While this may be true in the general sense (and definitely the reason why I avoid YouTube if I can *rimshot*) I've seen some powerful activism and creativity online. In these cases, crowds and the hive mind are able to draw together otherwise isolated and silenced individuals. There are other things that I feel he overlooks, and there are certain presumptions he makes about people in general when talking about what is better for people. But overall, an interesting book. I'm not overly educated in philosophical discussions of the web so I'm not the best person to process Lanier's arguments, but for someone who is, it's worth reading. - Rating: 4 out of 5 stars
Jul 10, 2010
requires a reply; i've got considerable notes on this book; i like someone who offers "triangulation" as lanier does; my daughter recommended this to me as sort of a rebuttal to nicholas carr's "big switch". for some time, i've been among the digerati, saying "it's over, we won." thanks to lanier, i can indefinitely postpone that kindle purchase. there are things worth retaining. while he's an analog person -- heck, the whole virtual reality thing is based on voltage, and not bits -- and i'm still digital, i'm quite comfy with his critique of my neighborhood.
literarily, his last few chapters are of the and-let-me-get-this-off-my-chest variety. so, they're not up to the sharp invective of his earlier ones. compared to carr, he gives unix its larger place in the pantheon, and it's not without a proportionate share of the problems, according to lanier.
lanier woke me up to how software tends to "lock down" certain decisions. in his case, the midi music format is one of those: "how do keypresses model rich tonality" is his challenge as a music maker. well enough. any bit of software makes choices for us in ways we don't always see.
even with taleb's black swan, challenging "the bell curve", i still insist any individual's rating _must_ follow the binomial relationship. giving lanier's "gadget, U R ~" a 4.0 doesn't disturb the balance of my ratings. - Rating: 1 out of 5 stars
Jul 9, 2010
The writing was absolutely atrocious and I couldn't make it past the second chapter. Manifesto indeed. - Rating: 2 out of 5 stars
Jun 29, 2010
I agree with many of Lanier's points, especially regarding the damaging effects of technology that reduces our humanity. But his superficial, frantic ramblings are lamentable: this is a manifesto against the crowd-based technophilia exemplified by Wired magazine, written precisely in Wired's hyped, glitzy, mock-philosophical style. - Rating: 5 out of 5 stars
Apr 15, 2010
Everyone who uses social media should read this book. Lanier, an insider of the Silicon Valley community, describes the development of our modern day computing tools and speaks to their implications for our society. Anyone who has ever been frustrated by default settings, or forced to do more typing when Microsoft Word incorrectly intuits the next tab space in a document, will appreciate Lanier's critique. Read this and share the ideas with your colleagues. - Rating: 4 out of 5 stars
Apr 6, 2010
You Are Not A Gadget is a fantastic manifesto. Jaron states his positions firmly in some cases. In other cases it is obvious he has thought deeply on the issues but hasn't come to a workable conclusion on how to better processes and software. He admits he doesn't have the answers but has some hope that technology, especially the internet, will improve in the future before it gets locked into inanity.
He is critical of Web 2.0 (including FaceBook) and the online hive mind. Crowds are not always wise; sometimes they are pretty stupid. He takes on the ability of artists, with a focus on musicians, to earn a living in this environment. It hasn't worked out well for them. He rails against the Singularitarians and their rabid zeal that compares to fundamentalists' eager anticipation of the rapture. He touches on many other areas of technology and the ways he thinks it is bereft of the possibilities that could exist with greater vision.
A manifesto doesn't have to be completely agreed with by the reader. Very few are. A manifesto should decree the manifesto writer's position on the issues written about. In this instance he has created a manifesto that will be discussed and referred back to for quite some time. It's going to fester under some technologists' skin and inspire others to create software 'that doesn't suck'. Some will write Lanier off as a 'goofball'. I found he made me think more deeply about the internet and to continue to think about its direction. - Rating: 3 out of 5 stars
Mar 28, 2010
I admit that about 25% of the book went over my head; I'm not a software engineer. But, a lot of what Lanier (the father of virtual reality) says resonates with me. The unquestioning acceptance of the "wisdom of the crowd," for one, being a tenet that definitely needs questioning. He explains why programming decisions made decades ago (like making the individual's presence anonymous) now have consequences -- such as trolling and the trivialization of online discussion. Lanier makes a strong and eloquent case for replacing the anonymous Wikipedia collaboration model with a more human, individual, creative and context-rich model. Lanier also points out that we have experienced a paucity of musical and popular cultural innovations in the past several decades (with some notable exceptions like Pixar films (yay!)) and he blames the ubiquity of file sharing which demotivates artists. As someone who isn't that much of a devotee of popular culture and music, and not a software engineer, I'm not sure whether I really am able to judge or critique Lanier's book effectively. But as a human who uses Web 2.0 tools a lot, and an information seeker, much of what he said resonated. Thought provoking. - Rating: 3 out of 5 stars
Jan 30, 2010
What Jaron Lanier does is take us up 50,000 feet and allow us to view things with perspective. We have been overwhelmed by the unnoticed "lock-in" and simply adjust and reduce ourselves to fit the requirements of online dating, social media, forums, and the software we employ. Web 2.0 is homogenizing humanity, taking us down to the lowest common denominator instead of encouraging us to bloom in different directions. Everything we now "enjoy" seems to be backward looking - music is sampled and retro, news is criticized mercilessly, but very few are creating it any more, relationships are Tweets...
Friends don't let friends communicate via Facebook - they do it on the phone or in person. But the direction we are taking instead reduces interaction, kills creativity, journalism, music, science....it's not as pretty as predicted.
These are truly valuable criticisms, and this is an important, if flawed, book. Flawed because after a hundred page pounding of logic and evidence, Lanier spends the second hundred pages telling us how wonderful it is to be a scientist and play with humans and cuttlefish. I was particularly annoyed with an unnecessary couple of paragraphs devoted to swearing, which he did not link to social media, and which he says might be connected to the parts of the brain controlling orifices and obscenity. Swearing is purely cultural, not physiological. In Quebec, the worst swearing is against the Catholic Church. Translated into English, "Christ Tabernacle" sounds like something W. C. Fields said to skirt the censors. But it's the most vile thing you can say in polite conversation in Montreal. On the other hand Motherf----r doesn't translate into French at all. And what's any of this got to do with online reductionism? Zilch.
Others have pointed to other sections they disagree with, and they all seem to occur in the last half of the book. But don't let that deter you, as it distracted him. The original message is important. People create. Software does not. Software restricts. Don't leave anonymous contributions. Build a creative website of your own design. Probe deeply and uniquely - beyond Wikipedia. Reflect before you blog.
Our humanity and creativity are being put at risk by the miasma foisted on us by the incredible leveling machine of the internet. Instead of becoming exciting, the internet has become boring. Instead of creating new music, it has assassinated the entire industry. Instead of bringing people together, it lets them off the hook. That's worth exploring, and for about 100 pages, Lanier does a grand job of it.
Book preview
You Are Not a Gadget - Jaron Lanier
THE RECEPTION of You Are Not a Gadget has been extraordinary, far exceeding my expectations. One incident in particular stands out because it epitomized one of the main points of the book. I was speaking at the South By Southwest conference in Austin, Texas, a principal festival of emerging digital culture, and I had been warned that I might be booed or harassed by some in the audience who disagreed with my ideas.
After I took the stage, the first thing I said, after playing a little music on a khaen,* was that it would be a worthy experiment for the audience to not tweet or blog while I was talking. Not out of respect for me, I explained, but out of respect for themselves. If something I said was memorable enough to be worthy of a tweet or blog post later on—even if it was to register violent disagreement—then that meant what I said would have had the time to be weighed, judged, and filtered by someone’s brain.
Instead of just being a passive relay for me, I went on, what was tweeted, blogged, or posted on a Facebook wall would then be you. Giving yourself the time and space to think and feel is crucial to your existence. Personhood requires encapsulation. You have to find a way to be yourself before you can share yourself.
To my pleasant surprise, they applauded! It’s really that simple. This book is not antitechnology in any sense. It is prohuman.
You Are Not a Gadget argues that certain specific, popular internet designs of the moment—not the internet as a whole—tend to pull us into life patterns that gradually degrade the ways in which each of us exists as an individual. These unfortunate designs are more oriented toward treating people as relays in a global brain. Deemphasizing personhood, and the intrinsic value of an individual's unique internal experience and creativity, leads to all sorts of maladies, many of which are explored in these pages. While the core argument might be described as "spiritual," there are also profound political and economic implications.
For instance, the idea that information should be free sounds good at first. But the unintended result is that all the clout and money generated online has begun to accumulate around the people close to only certain highly secretive computers, many of which are essentially spying operations designed to gain information to sell advertising and access or to pull money out of a marketplace as if by black magic. The motives of the people who comprise the online elites aren't necessarily bad—I count many of them as friends—but nevertheless the structure of the online economy as it has developed is hurting the middle class, and the viability of capitalism for everyone in the long term.
The implications of the rise of "digital serfdom" couldn't be more profound. As technology gets better and better, and civilization becomes more and more digital, one of the major questions we will have to address is: Will a sufficiently large middle class of people be able to make a living from what they do with their hearts and heads? Or will they be left behind, distracted by empty gusts of ego-boosting puffery?
It isn’t just about musicians and journalists, who are already being impoverished. Suppose you drive a truck or a taxi. Have you noticed that experimental computerized cars are starting to drive themselves? What will your children do for a living as computers get better and better?
Those of us close to privileged internet nodes might come to enjoy extraordinary benefits as medicine and other technologies progress into the digital realm. Maybe we’ll live many times longer than the children of former truck drivers.
But the notion that cheaper computers, smartphones, etc., will compensate for the growing economic gap is just not true. Ultimately mounting poverty will outpace cost savings and everyone will suffer. We can’t count on anything but a strong middle class to maintain many things dear to us: widespread self-determination and liberty, a dynamic commercial market filled with surprises, and a democracy that can’t be bought because ordinary people have enough clout to stand up for themselves. Some of the current popular online designs, as appealing as they might seem at first, are leading us away from these wonderful things.
What effect technology will have on the distribution of wealth and opportunity in our society is not a new question, of course. It was widely obsessed over in the nineteenth century, as it became clear that industrialization was transforming the world. We remember some of the stressful, early takes on the great question: Luddite riots, Marx's writings, H. G. Wells's The Time Machine, "The Ballad of John Henry," and, a little later, that preternatural oracle of internet culture, E. M. Forster's short story "The Machine Stops."
Here we are, many generations later, experiencing the start of the long-anticipated end game. Hopefully this book can help shake the dominant internet culture and wake it from its stupor.
*A Laotian mouth organ
IT’S EARLY in the twenty-first century, and that means that these words will mostly be read by nonpersons—automatons or numb mobs composed of people who are no longer acting as individuals. The words will be minced into atomized search-engine keywords within industrial cloud computing facilities located in remote, often secret locations around the world. They will be copied millions of times by algorithms designed to send an advertisement to some person somewhere who happens to resonate with some fragment of what I say. They will be scanned, rehashed, and misrepresented by crowds of quick and sloppy readers into wikis and automatically aggregated wireless text message streams.
Reactions will repeatedly degenerate into mindless chains of anonymous insults and inarticulate controversies. Algorithms will find correlations between those who read my words and their purchases, their romantic adventures, their debts, and, soon, their genes. Ultimately these words will contribute to the fortunes of those few who have been able to position themselves as lords of the computing clouds.
The vast fanning out of the fates of these words will take place almost entirely in the lifeless world of pure information. Real human eyes will read these words in only a tiny minority of the cases.
And yet it is you, the person, the rarity among my readers, I hope to reach.
The words in this book are written for people, not computers.
I want to say: You have to be somebody before you can share yourself.
CHAPTER 1
SOFTWARE EXPRESSES IDEAS about everything from the nature of a musical note to the nature of personhood. Software is also subject to an exceptionally rigid process of lock-in.
Therefore, ideas (in the present era, when human affairs are increasingly software driven) have become more subject to lock-in than in previous eras. Most of the ideas that have been locked in so far are not so bad, but some of the so-called web 2.0 ideas are stinkers, so we ought to reject them while we still can.
Speech is the mirror of the soul; as a man speaks, so is he.
PUBLILIUS SYRUS
Fragments Are Not People
Something started to go wrong with the digital revolution around the turn of the twenty-first century. The World Wide Web was flooded by a torrent of petty designs sometimes called web 2.0. This ideology promotes radical freedom on the surface of the web, but that freedom, ironically, is more for machines than people. Nevertheless, it is sometimes referred to as open culture.
Anonymous blog comments, vapid video pranks, and lightweight mashups may seem trivial and harmless, but as a whole, this widespread practice of fragmentary, impersonal communication has demeaned interpersonal interaction.
Communication is now often experienced as a superhuman phenomenon that towers above individuals. A new generation has come of age with a reduced expectation of what a person can be, and of who each person might become.
The Most Important Thing About a Technology Is How It Changes People
When I work with experimental digital gadgets, like new variations on virtual reality, in a lab environment, I am always reminded of how small changes in the details of a digital design can have profound unforeseen effects on the experiences of the humans who are playing with it. The slightest change in something as seemingly trivial as the ease of use of a button can sometimes completely alter behavior patterns.
For instance, Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one’s avatar in immersive virtual reality transforms self-esteem and social self-perception. Technologies are extensions of ourselves, and, like the avatars in Jeremy’s lab, our identities can be shifted by the quirks of gadgets. It is impossible to work with information technology without also engaging in social engineering.
One might ask, "If I am blogging, twittering, and wikiing a lot, how does that change who I am?" or "If the 'hive mind' is my audience, who am I?"
We inventors of digital technologies are like stand-up comedians or neurosurgeons, in that our work resonates with deep philosophical questions; unfortunately, we’ve proven to be poor philosophers lately.
When developers of digital technologies design a program that requires you to interact with a computer as if it were a person, they ask you to accept in some corner of your brain that you might also be conceived of as a program. When they design an internet service that is edited by a vast anonymous crowd, they are suggesting that a random crowd of humans is an organism with a legitimate point of view.
Different media designs stimulate different potentials in human nature. We shouldn’t seek to make the pack mentality as efficient as possible. We should instead seek to inspire the phenomenon of individual intelligence.
What is a person?
If I knew the answer to that, I might be able to program an artificial person in a computer. But I can’t. Being a person is not a pat formula, but a quest, a mystery, a leap of faith.
Optimism
It would be hard for anyone, let alone a technologist, to get up in the morning without the faith that the future can be better than the past.
Back in the 1980s, when the internet was only available to a small number of pioneers, I was often confronted by people who feared that the strange technologies I was working on, like virtual reality, might unleash the demons of human nature. For instance, would people become addicted to virtual reality as if it were a drug? Would they become trapped in it, unable to escape back to the physical world where the rest of us live? Some of the questions were silly, and others were prescient.
How Politics Influences Information Technology
I was part of a merry band of idealists back then. If you had dropped in on, say, me and John Perry Barlow, who would become a cofounder of the Electronic Frontier Foundation, or Kevin Kelly, who would become the founding editor of Wired magazine, for lunch in the 1980s, these are the sorts of ideas we were bouncing around and arguing about. Ideals are important in the world of technology, but the mechanism by which ideals influence events is different than in other spheres of life. Technologists don’t use persuasion to influence you—or, at least, we don’t do it very well. There are a few master communicators among us (like Steve Jobs), but for the most part we aren’t particularly seductive.
We make up extensions to your being, like remote eyes and ears (web-cams and mobile phones) and expanded memory (the world of details you can search for online). These become the structures by which you connect to the world and other people. These structures in turn can change how you conceive of yourself and the world. We tinker with your philosophy by direct manipulation of your cognitive experience, not indirectly, through argument. It takes only a tiny group of engineers to create technology that can shape the entire future of human experience with incredible speed. Therefore, crucial arguments about the human relationship with technology should take place between developers and users before such direct manipulations are designed. This book is about those arguments.
The design of the web as it appears today was not inevitable. In the early 1990s, there were perhaps dozens of credible efforts to come up with a design for presenting networked digital information in a way that would attract more popular use. Companies like General Magic and Xanadu developed alternative designs with fundamentally different qualities that never got out the door.
A single person, Tim Berners-Lee, came to invent the particular design of today’s web. The web as it was introduced was minimalist, in that it assumed just about as little as possible about what a web page would be like. It was also open, in that no page was preferred by the architecture over another, and all pages were accessible to all. It also emphasized responsibility, because only the owner of a website was able to make sure that their site was available to be visited.
Berners-Lee’s initial motivation was to serve a community of physicists, not the whole world. Even so, the atmosphere in which the design of the web was embraced by early adopters was influenced by idealistic discussions. In the period before the web was born, the ideas in play were radically optimistic and gained traction in the community, and then in the world at large.
Since we make up so much from scratch when we build information technologies, how do we think about which ones are best? With the kind of radical freedom we find in digital systems comes a disorienting moral challenge. We make it all up—so what shall we make up? Alas, that dilemma—of having so much freedom—is chimerical.
As a program grows in size and complexity, the software can become a cruel maze. When other programmers get involved, it can feel like a labyrinth. If you are clever enough, you can write any small program from scratch, but it takes a huge amount of effort (and more than a little luck) to successfully modify a large program, especially if other programs are already depending on it. Even the best software development groups periodically find themselves caught in a swarm of bugs and design conundrums.
Little programs are delightful to write in isolation, but the process of maintaining large-scale software is always miserable. Because of this, digital technology tempts the programmer’s psyche into a kind of schizophrenia. There is constant confusion between real and ideal computers. Technologists wish every program behaved like a brand-new, playful little program, and will use any available psychological strategy to avoid thinking about computers realistically.
The brittle character of maturing computer programs can cause digital designs to get frozen into place by a process known as lock-in. This happens when many software programs are designed to work with an existing one. The process of significantly changing software in a situation in which a lot of other software is dependent on it is the hardest thing to do. So it almost never happens.
Occasionally, a Digital Eden Appears
One day in the early 1980s, a music synthesizer designer named Dave Smith casually made up a way to represent musical notes. It was called MIDI. His approach conceived of music from a keyboard player's point of view. MIDI was made of digital patterns that represented keyboard events like "key-down" and "key-up." That meant it could not describe the curvy, transient expressions a singer or a saxophone player can produce. It could only describe the tile mosaic world of the keyboardist, not the watercolor world of the violin. But there was no reason for MIDI to be concerned with the whole of musical expression, since Dave only wanted to connect some synthesizers together so that he could have a larger palette of sounds while playing a single keyboard.
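To make the "tile mosaic" point concrete, here is a minimal sketch, in Python and not taken from the book, of what a note looks like on the MIDI wire: nothing but a key-down message and a key-up message, each a few bytes carrying a discrete pitch and velocity.

# A minimal sketch of a MIDI note as raw bytes. A "note" is only a
# key-down and a key-up event; pitch and velocity are integers from
# 0 to 127, and the continuous gestures of a voice or a violin have
# no place in this vocabulary.

def note_on(channel: int, pitch: int, velocity: int) -> bytes:
    """Key-down: status byte 0x90 | channel, then pitch and velocity."""
    return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

def note_off(channel: int, pitch: int) -> bytes:
    """Key-up: status byte 0x80 | channel; the note simply stops."""
    return bytes([0x80 | (channel & 0x0F), pitch & 0x7F, 0])

# Middle C (note 60), struck moderately hard on channel 0, then released:
print(note_on(0, 60, 96).hex())   # 903c60
print(note_off(0, 60).hex())      # 803c00

Everything a keyboard cannot do in three bytes, MIDI cannot say.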
In spite of its limitations, MIDI became the standard scheme to represent music in software. Music programs and synthesizers were designed to work with it, and it quickly proved impractical to change or dispose of all that software and hardware. MIDI became entrenched, and despite Herculean efforts to reform it on many occasions by a multi-decade-long parade of powerful international commercial, academic, and professional organizations, it remains so.
Standards and their inevitable lack of prescience posed a nuisance before computers, of course. Railroad gauges—the dimensions of the tracks—are one example. The London Tube was designed with narrow tracks and matching tunnels that, on several of the lines, cannot accommodate air-conditioning, because there is no room to ventilate the hot air from the trains. Thus, tens of thousands of modern-day residents in one of the world’s richest cities must suffer a stifling commute because of an inflexible design decision made more than one hundred years ago.
But software is worse than railroads, because it must always adhere with absolute perfection to a boundlessly particular, arbitrary, tangled, intractable messiness. The engineering requirements are so stringent and perverse that adapting to shifting standards can be an endless struggle. So while lock-in may be a gangster in the world of railroads, it is an absolute tyrant in the digital world.
Life on the Curved Surface of Moore’s Law
The fateful, unnerving aspect of information technology is that a particular design will occasionally happen to fill a niche and, once implemented, turn out to be unalterable. It becomes a permanent fixture from then on, even though a better design might just as well have taken its place before the moment of entrenchment. A mere annoyance then explodes into a cataclysmic challenge because the raw power of computers grows exponentially. In the world of computers, this is known as Moore’s law.
Computers have gotten millions of times more powerful, and immensely more common and more connected, since my career began—which was not so very long ago. It’s as if you kneel to plant a seed of a tree and it grows so fast that it swallows your whole village before you can even rise to your feet.
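As a rough back-of-the-envelope sketch, my own illustration rather than a figure from the book, an assumed doubling of computing power roughly every eighteen months turns the three decades or so since the early 1980s into the "millions of times more powerful" described above.

# Back-of-the-envelope sketch of exponential, Moore's-law-style growth.
# The 18-month doubling period is an assumption for illustration only.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    # How many times more powerful computers become over the given span.
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

print(f"{growth_factor(30):,.0f}x")  # 1,048,576x -- about a million-fold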
So software presents what often feels like an unfair level of responsibility to technologists. Because computers are growing more powerful at an exponential rate, the designers and programmers of technology must be extremely careful when they make design choices. The consequences of tiny, initially inconsequential decisions often are amplified to become defining, unchangeable rules of our lives.
MIDI now exists in your phone and in billions of other devices. It is the lattice on which almost all the popular music you hear is built. Much of the sound around us—the ambient music and audio beeps, the ringtones and alarms—is conceived in MIDI. The whole of the human auditory experience has become filled with discrete notes that fit in a grid.
Someday a digital design for describing speech, allowing computers to sound better than they do now when they speak to us, will get locked in. That design might then be adapted to music, and perhaps a more fluid and expressive sort of digital music will be developed. But even if that happens, a thousand years from now, when a descendant of ours is traveling at relativistic speeds to explore a new star system, she will probably be annoyed by some awful beepy MIDI-driven music to alert her that the antimatter filter needs to be recalibrated.
Lock-in Turns Thoughts into Facts
Before MIDI, a musical note was a bottomless idea that transcended absolute definition. It was a way for a musician to think, or a way to teach and document music. It was a mental tool distinguishable from the music itself. Different people could make transcriptions of the same musical recording, for instance, and come up with slightly different scores.
After MIDI, a musical note was no longer just an idea, but a rigid, mandatory structure you couldn’t avoid in the aspects of life that had gone digital. The process of lock-in is like a wave gradually washing over the rulebook of life, culling the ambiguities of flexible thoughts as more and more thought structures are solidified into effectively permanent reality.
We can compare lock-in to the scientific method. The philosopher Karl Popper was correct when he claimed that science is a process that disqualifies thoughts as it proceeds—one can, for example, no longer reasonably believe in a flat Earth that sprang into being some thousands of years ago. Science removes ideas from play empirically, for good reason. Lock-in, however, removes design options based on what is easiest to program, what is politically feasible, what is fashionable, or what is created by chance.
Lock-in removes ideas that do not fit into the winning digital representation scheme, but it also reduces or narrows the ideas it immortalizes, by cutting away the unfathomable penumbra of meaning that distinguishes a word in natural language from a command in a computer program.
The criteria that guide science might be more admirable than those that guide lock-in, but unless we come up with an entirely different way to make software, further lock-ins are guaranteed. Scientific progress, by contrast, always requires determination and can stall because of
