More important than "What" and "How" is "Why"?

http://www.nadin.ws/archives/2844

Enslaved by Digital Technology

Interview with Roberto Simanowski in Digital Humanities and Digital Media. Conversations on Politics, Culture, Aesthetics, and Literacy, pp. 184-205 (Open Humanities Press, 2016, Series Fibreculture Books)
PDF

Mihai Nadin is a scholar and researcher in electrical engineering, computer science, aesthetics, semiotics, human-computer interaction, computational design, post-industrial society, and anticipatory systems. He developed several computer-aided educational tools prior to the widespread use of the Internet and was one of the first proponents in the United States of integrating computers in education. In several publications Nadin investigates the implications of the digital paradigm, and he discusses in depth the new civilization resulting from it in his 1997 book, The Civilization of Illiteracy. Mihai Nadin holds advanced degrees in Electrical Engineering and Computer Science and a post-doctoral degree in Philosophy, Logic and the Theory of Science; he has served as Endowed Professor at the University of Texas at Dallas since 2004.
Mihai Nadin sees the human condition at stake in the Gold Rush obsession of digital technology entrepreneurs; he considers big data the ‘ultimate surrender to the technology of brute force’ and the age of information ‘by definition an age of total transparency.’ He detects a new Faustian deal where Faust trades better judgment for perfect calculation; he unmasks social media as the ‘background for conformity’ and revolutionary technology as the underlying foundation of the ruling economic system.
Prelude
Roberto Simanowski: What is your favored neologism of digital media culture and why?
Mihai Nadin: “Followed”/“Follower:” It fully expresses how the past overtook the present. “Crowd” anything: self-delusional slogans for the daisy brain.
RS: If you could go back in the history of new media and digital culture in order to prevent something from happening or somebody from doing something, what or who would it be?
MN: I would eliminate any word that starts with “hyper” and “super,” and every scoring facility. Alternatively, I would prevent Bill Gates from developing DOS, and Apple from giving up on its language (iOS 7 cries out as an example of failing to live up to the company’s foundations). Yes, I would eliminate Term Coord, the European Union’s attempt to standardize terminology. More important: I would establish a framework for reciprocal responsibility. No company should be immunized against liability procedures. If your product causes damage due to sloppy design, insufficient testing, or perpetuation of known defects, you are liable. Forget the legal disclaimers that protect disruptive technologies that disrupt our lives. And no user should be allowed to further the disruption. A simple analogy: carmakers are liable for anything that systematically leads to accidents; drivers are liable for using cars irresponsibly. Does the analogy of the technology of the industrial age extend to that of the digital age? On an ethical level, of course. Innovation does not legitimize discarding ethics.
RS: What comes to mind if you hear “Digital Media Studies”?
MN: Opportunism. The unwillingness to think about a totally different age.
RS: If you were a minister of education, what would you do about media literacy?
MN: I would introduce “Literacies” (corresponding to all senses and to cognitive abilities) as the ubiquitous foundation of everyone’s education. “Vive la différence” would be the common denominator.
Politics and Government
RS: While in the 1990s Internet pioneers such as John Perry Barlow declared the independence of Cyberspace from the governments of the old world, now it seems people hope for governments to protect privacy online and to intervene in the takeover and commercialization of the Internet by huge corporations such as Google and Facebook?
MN: Pioneers are always mercenaries. Of course, to open a new path is a daring act—so much can go wrong. There is a lot of romanticism in what the Internet forerunners were saying. Most of the time, their words were far louder than their accomplishments were meaningful or significant. Declaring the Internet an expression of independence from the government when you are actually captive to DARPA is comical at best. MILNET (split from ARPANET), which further morphed into the classified and non-classified Internet Protocol Router Networks, should have warned us all about what we would eventually surrender. Was Minitel (France, 1978) better? It offered little functionality, but it was not dependent on the private data of its users. DOS—the operating system that even in our days underlies the world of PCs (since 1981)—was adopted without any consideration for the integrity of the individual. Apple stole from Xerox something that, even today, the company does not fully understand. But Xerox does data management in our days (it took over the tollways in Texas), and Apple sells music and whatnot—sometimes in collusion with publishers. You have to keep the competition vigilant. In the early years, everybody was in a hurry. This was the second coming of the California Gold Rush, in which college dropouts found opportunity. Indeed, in no university did anybody—academic or not—know enough about the future that the pioneers were promising to turn into paradise on Earth. When the blind lead the blind, you will never know when you arrive, because you really don’t know where you are going.
RS: This is a strong, devastating statement: The pioneers of digital media and culture as mercenaries, comics, thieves, dropouts, and blind persons without ideas and beliefs?
MN: Without idealizing the past or demonizing the beginners, let’s take note of the fact that Lullus understood that with new means of expression we can better understand the universe. And we can ask more interesting questions about the human being and its own understanding of the world. Pascal would not miss the value of feelings in the human perception of reality, and in the attempt to subject it to calculations. Leibniz, with whose name computation is associated, would seek no less than a universal language for making possible, for example, the understanding of history from a perspective of accomplishments. He was not interested in translating Chinese philosophy word-by-word. He was interested in ideas. (If you want to ask “What’s that?”—i.e., what are “ideas”—this interview is not for you!)
College dropouts should not be vilified, but also not idealized. It helps to start something free of the constraints of cultural conventions. It does not help to realize that what is at stake is not a circuit board, a communication protocol, or a new piece of software, but the human condition. The spectacular success of those whom we associate with the beginnings lies in monetizing opportunities. They found gold! The spectacular failure lies in the emergence of individuals who accept a level of dependence on technology that is pitiful. This dependence explains why, instead of liberating the human being, digital technology has enslaved everyone—including those who might never touch a keyboard or look at a monitor. To complain about the lack of privacy is at best disingenuous. Those who rushed into the digital age gave it up! In Web 2.0, profits were made not by producing anything, but by profiling everyone. The nouvelle vague activism of our days is a mantra for legitimizing new profitable transactions, not a form of resistance. If everyone really cared for their rights, we would have them back. All that everyone really wants is a bigger piece of the pie (while starting the nth diet).
RS: I am not sure about the diet, but I completely agree that the implications of digital culture also affect those staying away from digital media, if such staying away is possible at all. But how guilty are those giving up privacy, their own and as a concept in general, by rushing into the digital age? Considering the conviction from the McLuhan camp that first we shape our tools and afterwards our tools shape us, and that any medium has the power to impose its own assumptions on the unwary, I wonder how deliberate the acceptance of the new media’s assumptions is. Hand in hand with the human being’s unwariness goes the triumph as homo faber, which Hans Jonas, in his book The Imperative of Responsibility: In Search of an Ethics for the Technological Age, calls the human fatality. We are entrapped by our success, Jonas states, with respect to the human’s belief in technology. Our power over nature has become self-acting and made man into a “compulsive executer of his capacity.” What would be required now is a power over that power. Did we really expect Gold Rush entrepreneurs to develop this kind of self-discipline?
MN: The echo chamber metaphor has so far been used mainly to describe politics. It simply says that feedback of a narcissistic nature reinforces prejudices. Under Hitler, Stalin, Mao, and current Islamic extremism, masses tend towards hysterics. Self-induced delusions and political idolatry are twins. Does it look any different within the objective, rational domain of science and technology?
The expectation of objectivity is sometimes rewarded: there are scientific and technological developments of authentic novelty. But let’s be clear: revolution means to turn things around, full circle, and in this respect, the information age is such a development. Technologically, this is a time of amazement. Conceptually, it is rather the reinvention of the wheel in digital format. For a long time, no new idea has percolated. The innovators aligned themselves with those in power and those with money. When the profit potential of the typewriter—the front end of IBM computers in the attempt to be free of perforated cards—was exhausted, word processing emerged. The X-acto knife gave way to the cut-and-paste procedure. It was not a new way of thinking, but rather a continuation of old patterns.
I am deeply convinced that computation (not only in its digital format) will eventually open up new opportunities and break from the past. The self-discipline in your question—how to keep a lid on the obsession with profit at any price—should actually become the determination to give free rein to creativity. Under the pressure of profit-making, there is no authentic freedom. In the echo chamber of science, celebration of one adopted perspective—the deterministic machine—leads to the automatic rejection of any alternative.
RS: Big Data is the buzzword of our time and the title of many articles and books, such as Big Data: A Revolution That Will Transform How We Live, Work, and Think by Viktor Mayer-Schönberger and Kenneth Cukier (2013). The embracing response to the digitization and datafication of everything is Data Love, as the 2011 title of the conference series NEXT reads, which informs the business world about ‘how the consumer on the Internet will be evolving.’ It is a well-known fact that big data mining undermines privacy. Is that love mutual, however, given the acceptance and even cooperation of most of the people?
MN: Big data represents the ultimate surrender to the technology of brute force. Wars are big data endeavors; so are the economic wars, not to mention the obsession with power and total control of the so-called “free individual.” Whether we like it or not, “information society” remains the closest description of the age of computers, networks, smartphones, sensors, and everything else that shapes life and work today. We are leaving behind huge amounts of data—some significant, some insignificant. Babbage’s machine, like the first recording devices, the abacus, and so many pneumatic and hydraulic contraptions, is of documentary importance. I am sure that if entrepreneurs of our days could find any value in them, they would not hesitate to make them their own and add them to the IP portfolio of their new ventures. What cannot be monetized is the human condition expressed in such previous accomplishments. You cannot resuscitate Babbage or Peirce, except maybe for some Hollywood production or some new game.
Data becomes information only when it is associated with meaning. However, our age is one of unreflected data generation, not one of a quest for meaning. Data production (“Give me the numbers!”) is the new religion. Politics, economics, and science are all reduced to data production. Ownership of data has replaced ownership of land, tools, and machines. Human interaction is also reduced to data production: what we buy, where we buy, whom we talk to, for how long, how often, etc. The Internet as the conduit for data is boring and deceiving. This is not what Vinton Cerf, to whose name the global transmission protocol TCP/IP is attached, had in mind. Instead of becoming a medium for interaction, the Internet got stuck in the model of pipes (sewage pipes, oil pipes, water pipes, and gas distribution pipes) and pumps (servers being engines that pump data from one place to another). Berners-Lee’s World Wide Web made it easier to become part of the network: the browser is the peephole through which anyone can peek, and everyone’s eyeballs become a commodity. Great pronouncements will not change this reality any more than radical criticism (sometimes, I confess, a bit exaggerated) will. But we should at least know what we are referring to.
By the way: creative work—of artists, scientists, craftsmen (and women)—takes place on account of sparse data. Survival is a matter of minimal data, but of relevant information.
RS: Again, your account is quite radical and disillusioning, though not unjustified. In response, let me ask to what extent the browser reduces people to commodified eyeballs. Hasn’t the Internet (or the Web 2.0) rather turned every viewer and listener into a potential sender, thus weaving a network of “wreaders,” as George P. Landow termed the reader-authors in hypertext in 1994, or “prosumers,” as the corresponding Web 2.0 concept reads? Isn’t this the spread of the microphone, to allude to Bertolt Brecht’s demand in his radio essays 85 years ago? Isn’t this dogma of interaction the actual problem?
MN: In the evolution from centralized computing (the “big iron” of the not so remote past) to workstations, to client-server architecture, to the Cloud (real-time) re-centralization, we have not come close to establishing the premise for a human knowledge project. “The knowledge economy” is a slogan more than anything else. Computation made possible the replacement of living knowledge by automated procedures. However, most of the time, computation has remained in its syntax-dominated infancy. On a few occasions, it started to expand into the semantic space: consider the diligent work of the ontology engineers. The time for reaching the pragmatic level of authentic interactions has not yet come. The ontology engineers do not even realize that there is such a dimension. If and when it comes, we will end the infancy stage of computation. “Eyeballs” are not for interaction in meaningful activities, but rather for enticing consumers. Interaction engages more than what we see.
RS: One basic tool of data accumulation and mining is Google, which through every search query not only learns more about what people want and how society works, but also centralizes and controls knowledge through projects such as Google Books. How do you see this development?
MN: There is no tragedy in digitizing all the world’s books, or making a library of all music, all movies, etc. After all, we want to gain access to them. This is their reason for being: to be read, listened to, experienced. The tragedy begins when the only reason for doing so is to monetize our desire to know and to do something with that knowledge. I remember shaking hands with that young fellow to whom Terry Winograd introduced me (May 1999). Larry Page was totally enthusiastic upon hearing from me about something called “semiotics.” At that time (let me repeat, 1999) none of my friends knew what Google was, and even less how it worked. They knew of Mosaic (later Netscape Navigator), of the browser wars, even of AltaVista, Gopher, and Lycos (some survived until recently). Today, none can avoid “Googling.” (Lucky us, we don’t have to “Yahoo!”) The act of searching is the beginning of pragmatics. Yes, we search in the first place because we want to do something (not only find quotes). Pragmatics is “doing” something, and in the process recruiting resources related to the purpose pursued. Larry Page is one of the many billionaires who deserve to be celebrated for opening new avenues through searches that are based on the intuitive notion that convergence (of interest) can be used in order to find out what is relevant. But nobody will tell him—as no one will tell Raymond Kurzweil—that the real challenge has yet to be addressed: to provide the pragmatic dimension.
The fact that Google “knows” when the flu season starts (check out searches related to flu) is good. But if you use this knowledge only for selling ads, you miss the opportunity to trigger meaningful activities. Seeking life everlasting is not really a Google endeavor. It is a passion for which many people (some smart, some half-witted) are willing to spend part of their fortunes. They can do what they want with their money. Period! But maybe somebody should tell them that it makes more sense to initiate a course of action focused on the betterment of the human condition. Or at least (if betterment sounds too socialist) on more awareness, on a higher sense of responsibility. Properly conceived, Facebook (or any of the many similar attempts) could have been one possible learning environment, way better adapted to education than the fashionable new MOOCs [massive open online courses]. Instead, it is an agent of a new form of addiction and human debasement.
Algorithms and Censorship
RS: Speaking of Facebook: here and in any social network where the number of views, likes, and comments is counted, the cultural consequence of computation seems to be comparison and ranking. Has communication shifted from ambiguous words to the excitement of who or what wins the competition?
MN: “Scoring” is the American obsession of the rather insecure beginnings projected upon the whole world. In the meaningless universe of scoring, your phone call with a provider is followed by the automatic message eliciting a score: “How did we do?” Less than 2% of users fall into the trap. The vast majority scores only when the experience was extremely bad—and way too often, it is bad. This is one example of how, in the age of communication (telling someone how to improve, for example), there is no communication: the score data is machine processed. The best we can do when we want to achieve something is to talk to a machine. One-way channels have replaced the two-way dialog that was meant to be the great opportunity of the digital age. We don’t want to pay for talking with a human being. In reality, we don’t want to pay for anything. As the scale expands, everything becomes cheap, but nothing has become really better. To speak about influence in social media, for example, is to be self-delusional. Scoring is not only inconsequential, but also meaningless.
Indeed, in this broad context, the human condition changes to the extent that the notion of responsibility vanishes. Consequences associated with our direct acts and decisions are by now projected at the end of a long chain of subsequent steps. At best, we only trigger processes: “Let’s warm up a frozen pizza.” You press a button; the rest is no longer your doing in any form or shape. More than ever before has the human being been rendered a captive receiver under the promise of being empowered. The variety of offerings has expanded to the extent that, instead of informed choices, we are left with the randomness of the instant. As a matter of fact, the “living” in the living is neutralized. The age of machines is making us behave more like machines than machines themselves. The excitement and energy of anticipation are replaced by quasi-instinctual reactions. By no means do I wish to suggest an image of the end of humanity, or of humanness. There is so much to this age of information that one can only expect and predict the better. For the better to happen, we should realize that dependence on technology is not the same as empowerment through technology. The secular “Church of Computation” (as yet another Church of Machines) is at best an expression of ignorance. If you experience quantum computation, genetic computation, intelligent agents, or massive neural networks, you realize how limiting the deterministic view of the Turing machine is. And you learn something else: There is a price to everything we want or feel entitled to.
RS: During the debate over the NSA scandal in the summer of 2013, Evgeny Morozov titled an essay in the German newspaper Frankfurter Allgemeine Zeitung “The Price of Hypocrisy,” holding that not only the secret service or the government undermines the privacy of the citizen, but the citizens themselves do so by participating in information consumerism. Morozov points to the Internet of things, which will require even more private information in order to work, i.e., to facilitate and automate processes in everyday life that until now we had to take upon ourselves. The price for this kind of extension of man’s brain is its deterioration through lack of use. On the other hand, some claim that if the swimming pool heats up automatically after discovering a BBQ scheduled in our calendar, our brains are freed up for more important things.
MN: Yes, we want to automate everything—under the illusion that this will free us from being responsible for our own lives. For those shocked by the revelation that there is no privacy on the Internet of data, I can only say: This is a good measure of your level of ignorance, and acceptance of a condition in which we surrender to the system. Walk into Taste Tea in San Francisco, where the credit card app “Square” registers your iPhone presence and logs into your account. This is not government intrusion, but the convenience of automated payment. We cannot have it both ways—privacy and no privacy at all. Fundamentally, the age of information is by definition an age of total transparency. That Internet imaginaire that some pioneers crowed about—it will make us all more creative, freer than ever, more concerned about each other—was only in their heads. And not even there. The “innocents” were already in bed with the government—and with the big money.
It is not the government that betrayed the Internet. The “innocents” volunteered back doors as they became the world’s largest contracting workforce for spy agencies. The hotness IQ ranking for university studies in our days (cybersecurity, anyone? data-mining?) reflects the situation described above: “Follow the money!” Total transparency is difficult. A new human condition that accepts total transparency will not miraculously emerge, neither in San Francisco, nor in India, China, or Singapore. Government will have to be transparent. Who is prepared for this giant step? The government could make it clear: We observe you all (and they do, regardless of whether they make it known or not). Those hiding something will try to outsmart the system. The rest will probably be entitled to ask the government: Since you are keeping track of everything, why not provide a service? My files are lost; you have them; provide help when I need it. We pay for being observed, so why not get something in return?
RS: Let’s talk more about automated decisions and vanishing responsibility, a central topic of your work during the last decade. In your article “Antecapere ergo sum: what price knowledge” you foresee a rather bleak future in which responsibility is transferred from humans to machines by calculation and algorithmic data mining. You also speak of a new Faustian deal where Faust conjures the Universal Computer: “I am willing to give up better Judgment for the Calculation that will make the future the present of all my wishes and desires fulfilled.” How do anticipation, computation, Goethe’s Faust and Descartes’ ergo sum relate to each other?
MN: In order to understand the profound consequences of the Information Revolution, one has to juxtapose the characteristics of previous pragmatic frameworks. I did this in my book, The Civilization of Illiteracy (a work begun in 1981 and published in 1997), available for free download on the Internet. There are books that age fast (almost before publication), others that age well, and others still waiting for reality to catch up. Look at the cover of my book. I conceived that image as part of the book in 1996/97: something that might remind you of Google Books and of what many years later became the iPad. The image from the Vatican Library is indicative of what my book describes in detail: that is, making the libraries of the world available to everyone.
This book is more than ever the book of our time. I don’t want to rehash ideas from the book, but I’d like to make as many people as possible aware of the fact that we are transitioning from a pragmatics of centralism, hierarchy, sequentiality, and linearity to a framework in which configuration, distribution, parallelism, and non-linearity become necessary. The theocracy of determinism (cause→effect) gives way to non-determinism (cause→effect→cause). It is not an easy process because, for a long time, we (in western civilization, at least) have been shaped by views of a deterministic nature.
To understand the transition, we must get our hands dirty in pulling things apart—pretty much like children trying to figure out how toys work. Well, some of those toys are no longer the cars and trains that my generation broke to pieces, convinced that what made them run was hidden down there, in the screws and gears forming part of their physical makeup. Search engines, algorithms, and rankings—the new toys of our time—are only epiphenomenal aspects. At this moment, nobody can stop people from Googling (or from tearing apart the code behind Google), and even less from believing that what the search comes up with is what they are looking for.
We rarely, if ever, learn from the success of a bigger machine, a larger database, a more functional robot, or a more engaging game. We usually learn from breakdowns. It is in this respect that any medium becomes social to the extent that it is “socialized.” The so-called “social media” are top-down phenomena. None is the outcome of social phenomena characteristic of what we know as “revolutions” (scientific, technological, political, economic, etc.). They are the victory of “We can” over “What do we want?” or “Why?” And as usual, I go for questions instead of appropriating the slogans of others.
RS: As a remark on how we capitulate to our capabilities: During the anti-NSA protests in summer 2013, somebody presented a poster stating “Yes we scan.” This of course alluded to the famous slogan in Obama’s election campaign, articulating disappointment in the new president and perhaps also calling for a new movement. Read together, both slogans symbolize the determinism at least of this technological part of society: We scan because we can.
MN: Without accepting even a hint of a dark plot, we need to understand what is called “social media” as an outcome of the transaction economy. It was embodied in, among other things, new businesses. Uber and Lyft disrupt taxi services; Airbnb and HomeAway disrupt the hotel business. The disruption had many dimensions, for instance efficiency, but also ethics. In the transaction economy, ethics is most of the time compromised. The transaction economy replaces the industrial model, even the post-industrial model. To stick to the toy metaphor: Someone decided that we are all entitled to our little cars, fire engines, and trucks. We get them because they are deemed good for us. And before we even start being curious, the next batch replaces what we just started to examine. It is no longer our time, as inquisitive children, that counts. Others prescribe the rhythm for our inquisitive instincts. And for this they redistribute wealth. In rich and poor countries, phones are given away. You need to keep the automated machines busy. Money is not made on the phones but on the transmission of data. This is a new age in the evolution of humankind. Its defining entity is the transaction, carried out with the expectation of faster cycles of change, not because we are smarter and less inert, but rather because our existence depends on consuming more of everything, even if that means sacrificing integrity.
The faster things move around, the faster the cycle of producing for the sake of consumption. Each cycle is motivated by profit-making. The huge server farms—the toys of those controlling our economic or political identity—are really not at all different from the financial transaction engines. Nothing is produced. A continuous wager on the most primitive instincts is all that happens. Thousands of followers post sexually explicit messages and invitations to mob activity; they trade in gossip and sell illusions. Ignorance sells better and more easily than anything else: not only copper bracelets, cheap Viagra, and diet pills, but everything else that succeeds in a large-scale market. If you Google, you first get what those who have paid for it want you to see, sometimes to the detriment of other (maybe better) options. Fake crowds are engineered for those living in the delusion of crowdsourcing.
The transaction economy, with all its high-risk speculation, is the brainchild of Silicon Valley. San Francisco is far more powerful than Washington DC, New York, and even Hollywood. Chamath Palihapitiya put it bluntly: “We’re in this really interesting shift. The center of power is here, make no mistake. I think we’ve known it now for probably four or five years. But it’s becoming excruciatingly, obviously clear to everyone else that where value is created is no longer in New York, it’s no longer in Washington, it’s no longer in LA. It’s in San Francisco and the Bay Area.” (Palihapitiya is one among many bigwigs going public on such a subject.)
RS: Speaking of the replacement of Hollywood by Silicon Valley: Adorno once accused the culture industry of liberating people from thinking as negation, as addressing the status quo. Being busy learning the status quo, i.e., finding out how all the new toys work—politically upgraded and camouflaged by euphemistic concepts such as “social” and “interactive”—seems to be a clever strategy to achieve the same result. Your new book, Are You Stupid?, describes stupidity as the outcome of a system faking change because it is afraid of it. Who rules that system? Who is behind that strategy?
MN: Let us be very clear: the revolutionary technology that was seen as liberating in so many ways actually became the underlying foundation of the transaction economy. Never before has the public been forced into the rental economy model as it has been by the digital revolution. You no longer own what you buy; you rent the usage, to be screwed by the “landlord” (who makes more money by selling your identity than by providing you with a viable product). And even that is not straightforward. It has become impossible to connect to the Internet without being forced into a new version or a new patch. It has all gone so far that to buy a cell phone means to become captive to a provider. In the USA, a law had to be promulgated in order to allow a person to unlock “your” phone! (Remember: this is the age of the rent economy, of transactions, not production.) Profits have grown exponentially; service never lives up to the promises made and to the shamelessly high prices charged. In this context, social media has become not an opportunity for diversity and resistance, but rather a background for conformity. Once upon a time, within the office model, it was not unusual for women working together to notice that their menstrual periods synchronized. Check out the “Friends” on various social media: they now all “think” the same way, or have the same opinion. That is, they align to fashion and trends; they have their “period” synchronized. All at the lowest common denominator.
In making reference to such aspects of the “social” media, I might sound more critical than their investors would prefer. But as long as we continue to idealize a technology of disenfranchisement and impotence, we will not overcome the limitations of the obsession with data to the detriment of information. The toy train reduced to meaningless pieces entirely lost its meaning. Remember trying to make it move as it did before curiosity got the better of it? Information eventually grew from playing with toys: the realization that things belong together, that the wheel has a special function, etc. Given the fact that in the digital embodiment knowledge is actually hidden, replaced by data, the human condition that results is one of dependence. There is no citizenry in the obsession with the newest gadget, bought on credit and discarded as soon as the next version makes the headlines. The Netizen that we dreamed of is more a sucker than an agent of change.
RS: The discussion of stupidity in the light of new technology recalls Nicholas Carr’s 2008 article “Is Google Making Us Stupid?” and brings up the question of to what extent a search engine leaves time to inquire into and acquire (or take apart and play with) knowledge. How do you see the role that search engines such as Google play in society?
MN: In everything individuals do, they influence the world—and are influenced by the world. Within an authentic democracy, this is an authentic two-way street: you elect and you can be elected. Google, or any other search engine for that matter, reflects the skewed relation between individuals and reality. Some own more than others. This ownership is not just economic. It can take many other forms. If you search for the same word on various sites and at various times, the return will be different. In the naïve phase of data searching, way back in the 1990s, relevance counted most. In the transaction economy, search itself is monetized: many businesses offer SEO [search engine optimization] functions. It pays to “find” data associated with higher rewards. Such rewards are advertisement, political recognition, technical ability, etc. In other words, through the search, a cognitive, economic, political, etc. reality is engineered as the “engine” forces the searcher to receive it.
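The point about monetized search, that what an engine “finds” reflects who paid as much as what is relevant, can be made concrete with a toy ranking. The sketch below, in Python, uses invented names, numbers, and weights; it is not any actual engine’s algorithm, only an illustration of how a paid boost lets a less relevant result reach the searcher first.

# Toy illustration (hypothetical data): a ranking that blends topical relevance
# with a monetized boost, so the engineered result outranks the relevant one.

results = [
    {"url": "small-library.org", "relevance": 0.92, "paid_boost": 0.0},
    {"url": "sponsored-shop.com", "relevance": 0.55, "paid_boost": 0.9},
]

def score(result, ad_weight=0.5):
    # Relevance plus a weighted paid boost decides what the searcher sees first.
    return result["relevance"] + ad_weight * result["paid_boost"]

# With ad_weight > 0, the sponsored page comes out on top (1.00 vs. 0.92).
for result in sorted(results, key=score, reverse=True):
    print(result["url"], round(score(result), 2))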
Of course, social media relies on search engines, because instead of empowering participants, it engineers the nature of their relations. This remark is not meant to demonize anyone. Rather, it is to establish the fact that in post-industrial capitalism, profit-making is accelerated as a condition for economic success. Those who do not keep up with the speed of fast transactions turn into the stories of wasted venture capital and failed start-ups. The cemetery of failed attempts to work for the common good is rarely visited. America, and to a certain extent Germany, England, and France, sucks up talent from the rest of the world instead of rethinking education for the new context of life and work in the information society. When the world learned about the worrisome depth at which privacy was emptied of any meaning by government and business, it was difficult to distinguish between admiration and uproar.
The fascinating Silicon Valley ecology deserves better than unreserved admiration. It is time to debunk the mythology of self-made millionaires and billionaires—and even more the aura of foundations à la Gates, which are mostly self-serving. America has encouraged the rush to the new gold not because it loves the new science and technology, but rather because it recognized new forms of profit-making. Unfortunately, the human condition associated with the information society continues to be ignored. At the scale at which profits are multiplying, crime is also multiplying.
The more recent wars that the USA has carried on would not have been possible without computers and the technology developed for waging them. Moreover, the profoundly dangerous undermining of democracy through vast surveillance of citizens is also the product of digital know-how bordering on the infamous. Computer science programs in many universities are nothing but training facilities for businesses at taxpayer expense. Research is very often a service to the military and the intelligence community, not an avenue towards new science, and even less an expression of ethical responsibility for the long-term consequences of new technologies. We teach the young and less young of our nation (and of other nations) the violence of games, and then wonder why America is the world champion in crime.
Media Literacy
RS: Let me come back to your book The Civilization of Illiteracy, where you depict a civilization unfolding in which media complement literacy, and literacy—the way it is conceptualized in the Gutenberg Galaxy—is undermined by new literacies demanded and developed by digital technology. The general tone of the book is one of excitement and an invitation to be ready for the new challenges. Your answers in this interview so far indicate that this has changed.
MN: Fifteen years after The Civilization of Illiteracy was published (and almost 30 years since I started writing it), I cannot be more optimistic than I was at the time it was published. I already mentioned that I am convinced that it is the book of our time: new developments are still catching up with some of its predictions. It is an 890-page book, which I thought would be the last book of the civilization of literacy. I do not see anything terrifying in the reality that the human condition changes. It is not a curse, but a blessing. Corresponding to the new pragmatic framework, we are all experiencing the need to adapt more rapidly, and sometimes to trade depth for breadth. We do not have enough courage to discard everything that is still based on the structure of the previous pragmatic framework. The program in which I teach just built a new arts and technology teaching and learning facility: the same factory model; the same centralized, hierarchic structure. In reality, such a building should not have been erected. On the one hand, there are the big pronouncements regarding the state of science and technology; on the other, captivity to the past. Conflict does not scare me. I see in conflict the possibility of an authentic revolution in education and in many other societal activities. What scares me is the deeply ingrained conformity to the medieval model of teaching and learning. And the demagoguery associated with monetizing all there is. The Faustian trade-off is skewed: I will give you the illusion of eternity in exchange for your abdicating your desire to discover what it means to live.
RS: How do you see this Faustian trade-off coming into place? Are you talking about the computational, digital turn in the Humanities?
MN: A recent book on Digital Humanities (Anne Burdick, Johanna Drucker, Peter Lunenfeld, Todd Presner, Jeffrey Schnapp, MIT Press, 2012) claims that ‘Digital Humanities is born of the encounter between traditional humanities and computational methods.’ Of course, ‘recent’ does not qualify as ‘significant.’ We learn from the text (and the comments it triggered) that ‘Digital Humanities is a generative practice,’ and that it ‘contributes to the “screen culture”’ of the 21st century. But we do not gain access to the questions of the human condition. We learn about design, but not from an informed perspective of the activity; rather on account of a reactive process of design that lacks a visionary dimension. McLuhan is quoted (again, the echo chamber metaphor is quite well illustrated in the tone of the writing); so are John Berger, Scott McCloud (on comics), and even Charles and Ray Eames. With respect to computation, the discourse is even more muddled. The words are often right; missing is the deeper understanding of the dynamics of human existence and activity. The applied aspect made the book a good candidate for adoption—and explains why it was funded: it promotes a notion of humanity congruent with that of technology.
In reality, “Humanities” is the expression of resistance. Those involved in humanities probe the science and technology instead of automatically accepting them. These remarks should not be construed as a book review. I use the book as an opportunity to recognize those honestly interested in understanding what is happening in our days, but also to point out that the endeavor is complicated by the fact that we are part of the process. You don’t have insight into the earthquake that reshapes the landscape. The hype over big data is of the same nature as the hype over the digital (sic!) humanities. Humanities—i.e., the many disciplines that fit under this heading—is rushing into a territory of methods and perspectives defined for purposes different from those of the humanities. To give up the long view for the immediacy of results is not a good trade-off. I am amused by those great “humanists” who seek out programmers for testing their own ideas. Smiling, we bid farewell to the past (some might recognize behind this formulation an author who saw part of this coming).
RS: Let me bring in another aspect of this. Computation—or algorithmic reading—has been a tool of research in the humanities for some time. Digital Humanities aims at the application of digital processes and resources to text and image analysis, large-scale data mining, and data visualization. The rationale behind it: machines are better at processing data than humans. However, the reading that algorithms carry out is “distant,” in contrast to the close reading by humans. Your comment on Digital Humanities above is quite blunt and critical. In the same spirit you state in your article “Reassessing the Foundations of Semiotics” that quantity does not automatically lead to improved comprehension. The challenging semiotic project is, as you continue, not only to find information in big data, but also meaning in information. What do you expect from Digital Humanities in terms of a reassessed semiotics?
MN: The great assumption is that there is a universal machine: the Turing machine. This assumption has led to the spread of the most insidious forms of determinism. Algorithmic computation became the magic formula for fighting disease, making art, and building rockets. It is forgotten that Turing defined only a specific form of automated mathematics. Universities, as centers of inquiry, were only too happy to replace the thinking of previous ages with the inquiry associated with virtual machines. They housed the big mainframe machines. Everything became Turing computational, and at the same time, as circular as the underlying premise. If you can describe an activity—that is, if you have an algorithm—algorithmic computation would perform that particular operation as many times as you wished, and in every place where that operation is involved. As long as the focus is on algorithmic descriptions, computation is assumed to be universal. Indeed, the arithmetic behind selling tomatoes in a market or exploring the moon became the same.
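The claim that the arithmetic behind selling tomatoes and exploring the moon “became the same” can be pictured in a few lines. A minimal Python sketch, with invented names and figures, shows one small algorithm reused verbatim in two unrelated settings; it illustrates the assumed universality described above, not any actual system.

# Minimal sketch (hypothetical names and numbers): once an activity is captured
# as an algorithm, the same procedure runs unchanged in any domain.

def weighted_total(quantities, unit_values):
    # Sum quantity * value pairs: the "same arithmetic" everywhere.
    return sum(q * v for q, v in zip(quantities, unit_values))

# Selling tomatoes in a market (kilograms times price per kilogram) ...
market_bill = weighted_total([2.5, 1.0], [3.20, 4.50])

# ... or budgeting propellant for a moon mission (stages times kilograms per
# stage): an identical algorithm, only the labels change.
propellant_mass = weighted_total([3, 1], [120000.0, 30000.0])

print(market_bill, propellant_mass)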
It turns out that quite a number of problems—the most interesting ones, actually—are not algorithmic. Protein folding, essential in living processes, is one example. So is computer graphics, involving interactive elements. Furthermore, adaptive processes cannot be described through algorithmic rules. More important, anticipatory processes refuse to fit into neat algorithmic schemes. At the time when I advanced the notion that the computer is a semiotic engine, my enthusiasm was way ahead of my ability to understand that the so-called universal machine is actually one of many others. Today we know of DNA programming, neural network computation, machine learning (including deep learning), and membrane computation, some equivalent to a Turing machine, some not.
We are not yet fully aware that the knowledge domain covered by the universal computation model (the Turing machine) is relatively small. We are less aware of the fact that specific forms of computation are at work in the expression of the complexity characteristic of the living. The university is still “married” to the deterministic model of computation because that’s where the money is. If you want to control individuals, determinism is what you want to instill in everything: machines, people, groups. Once upon a time, the university contributed to a good understanding of the networks. Today, it only delivers the tradespeople for all those start-ups that shape the human condition through their disruptive technologies way more than universities do. Working on a new foundation for semiotics, I am inclined to see semiotics as foundational for the information age. But that is a different subject. If and when my work is done, I would gladly continue the dialog.

