https://www.nadin.ws/archives/800

It’s the user, stupid . . . or maybe not

Digital technology is pervasive and makes so many things possible: from the ubiquitous e-mail to movies downloaded to one’s computer. Computers are part of our cars; they are our cell phones. They are becoming an essential part of the world we live in—and even part of our own bodies. Yet the unintended costs associated with too many information technology products are, at this juncture, far too high. Identity theft is only one example. It costs the victim money, credit rating, time, and a great deal of hassle. We all lose when the economy as a whole is subject to damage reaching hundreds of billions of dollars. The unbearable and growing load of spam and the daily reports of security breaches—affecting millions of people—form the image of an increasingly vulnerable society. Hacking of the computers embedded in our cars, or of the cell phones through which many conduct their business, is already on record. Have you ever thought about the security, or lack thereof, of the new prostheses attached to or implanted in our living bodies and monitored by embedded computers?
How we have gotten to this point, and whether the lack of security in digital data processing is inherent in the technology, are no longer mere academic questions. Way too often, the rush to bring innovation to the market leads to a tacit acceptance of half-baked goods. The high of acquiring the latest gadget overrides the patience required to obtain the best possible product. This rush is only accelerating in the computer industry, and vulnerabilities are becoming more costly. Instead of providing the highest protection for the user, the computer industry settles for “security through obscurity,” a term that has become part of professional parlance. In layman’s terms, security through obscurity means “Let’s hope that no one will notice the product’s deficiencies.” It is like leaving the back door open on the assumption that thieves will only try the front door. And in case someone does notice, lawyers cover the industry with the most egregious terms of use: “We do not warrant that our product will meet your requirements.” (This is the actual text the user has to agree to before using the product.)
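To make the contrast concrete, here is a minimal Python sketch (hypothetical code, not drawn from any actual product) comparing a check that relies only on obscurity with one that relies on a managed secret:

```python
import hashlib
import hmac

# "Security through obscurity": the only protection is that nobody is
# supposed to know the magic parameter. Once it leaks, nothing is left.
def fetch_report_obscure(params: dict) -> str:
    if params.get("debug_backdoor") == "letmein":  # hidden, but not secret
        return "FULL REPORT"
    return "ACCESS DENIED"

# Actual security: access depends on a managed secret, compared in
# constant time. Publishing this code reveals the mechanism, not the key.
SECRET_KEY = b"rotate-me-and-keep-me-in-a-vault"

def fetch_report_secure(token: bytes) -> str:
    expected = hmac.new(SECRET_KEY, b"report", hashlib.sha256).digest()
    if hmac.compare_digest(token, expected):
        return "FULL REPORT"
    return "ACCESS DENIED"

# The obscure version falls to anyone who reads a traffic dump or a
# decompiled client; the secure version does not.
print(fetch_report_obscure({"debug_backdoor": "letmein"}))  # FULL REPORT
print(fetch_report_secure(b"wrong-token"))                  # ACCESS DENIED
```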
Blame the user—for not knowing how to work with the product, or for negligence. Instead of designing secure systems and writing secure code, the industry accepts recycled programs, hacks, or tweaks that perpetuate shortcomings excused as inherent in any beginning. And when some malicious—or creative—person discovers the weak spot, the industry issues yet another patch—Windows™ is notorious for them—along with another warning to the users. Quite often, the user ends up being a guinea pig for products not fully tested before release. Debugging is an expensive operation. Some companies save money by having their clients (purchasers who expect bug-free products) unwittingly do the debugging for them. Can you imagine the same strategy applied to new cars, heating systems, air conditioners, or microwave ovens? How many times have you been required to sign a release form that frees the company of all responsibility if, for example, your microwave explodes? (The jury is still out on whether cell phones damage our health.)
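For illustration only, a minimal hypothetical sketch (standard-library Python) of the difference between a recycled hack that later needs a patch and code that is secure by construction; the classic case is building a database query out of raw user input:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card_number TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-XXXX')")

def find_user_fragile(name: str):
    # The recycled hack: the query is glued together from raw input.
    # A "creative" caller passes  x' OR '1'='1  and reads every row;
    # the vendor later ships a patch and a warning to the users.
    query = "SELECT * FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def find_user_secure(name: str):
    # Secure by construction: the driver binds the parameter, so the
    # weak spot never exists and no patch is ever needed.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_fragile("x' OR '1'='1"))  # leaks the whole table
print(find_user_secure("x' OR '1'='1"))   # returns []
```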
It is time to defend ourselves. How? Let us put our minds together and find the most effective ways.



copyright © 2023 by Mihai Nadin