Tuesday, March 27, 2007

A Chip In Your Shoulder?

Back around 1987 or so, I walked by the bulletin board in the Department of Electrical and Computer Engineering at the University of Massachusetts Amherst and saw a letter with a note scrawled at the bottom, "Anybody want to help Ms. X?" A woman had written the letter to our department chair because we had a reputation for doing research in microwave remote sensing and the detection of radio waves. In the letter, she said that she was convinced the FBI had secretly embedded a radio-wave spying chip in her body. She did not go into details about the circumstances under which this had been done, nor did she say exactly where she thought the chip was. But she knew it was there, and she wanted to know if she could come to our labs to be examined with our sensitive equipment.

Needless to say, nobody took her up on her offer to be "examined," although her letter was the topic of some lunch-table conversation for the next few days. I understand that this sort of belief is not uncommon among individuals whom psychiatry used to term "paranoid," although I don't know what terminology would be used today. Well, yesterday's paranoid fear is today's welcome reality—welcomed by some, at least. The cover of the March 2007 issue of the engineering magazine IEEE Spectrum shows an X-ray montage of a young guy holding both hands up near the camera. In the X-ray images, two little sliver-shaped chips are clearly visible, one in the fleshy part of each hand between the thumb and forefinger.

Inside, the reader finds that Amal Graafstra, an entrepreneur and RFID (radio-frequency identification) enthusiast, thinks having RFID chips in each hand is just great. After convincing a plastic surgeon to insert the chips, which are a kind not officially approved for human use yet (they're sold to veterinarians for pet-tracking purposes), he rewired his house locks, motorcycle ignition, and various other gizmos that used to need keys or passwords. Now he can just make like Mandrake the Magician, waving his hand in front of his door or his motorcycle instead of hauling out keys. When he posted the initial results of his experiments on a website, he got all kinds of reactions ranging from essentially "Way to go, dude!" to negative comments based on religious convictions. As he explains, "Some Christian groups hold that the Antichrist . . . will require followers to be branded with a numeric identifier prior to the end of the world—the 'mark of the beast.' So I got some anxious notes from concerned Christians—including my own mother!"
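The principle behind the trick is simple: a passive RFID tag, when energized by the reader's field, reports a fixed serial number, and the lock controller compares that number against a short list of authorized chips. Here is a minimal sketch of the logic in Python; the reader interface and the tag ID are hypothetical stand-ins I invented for illustration, not Graafstra's actual setup.

```python
# Minimal sketch of RFID access-control logic. The reader object and the
# tag ID below are hypothetical, not Graafstra's actual code.
AUTHORIZED_TAGS = {"E007000012345678"}   # serial numbers of implanted chips

def check_and_unlock(reader, unlock_door) -> None:
    """Poll the reader once; unlock only if the tag in range is on the list."""
    tag_id = reader.poll()               # assumed: returns an ID string or None
    if tag_id is not None and tag_id in AUTHORIZED_TAGS:
        unlock_door()                    # e.g., pulse a relay on the door strike
```

The catch, of course, is that anything the door's reader can read, an eavesdropper's reader can read too, a point that bears on the privacy worries discussed below.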

Right after reading Mr. Graafstra's article, you can turn to a piece by Ken Foster and Jan Jaeger on the ethics of RFID implants in humans. (Full disclosure: I am acquainted with Prof. Foster through my work with IEEE's Society on Social Implications of Technology.) They dutifully point out the potential downsides of the technology, including the chance that what starts out as a purely voluntary thing, even a fashionable style among certain elites, might turn into a job requirement or something imposed by a government on citizens or aliens or both. They mention the grim precedent set by the Nazi regime when its concentration-camp guards forced every prisoner to receive a tattooed number on the arm. RFID chips are not nearly as visible as tattoos, but they can contain vastly more information. Think of a miniature hard drive with your entire work history, your places of residence, your sensitive financial information and passwords, all carried around in your body and possibly accessible to anyone with the right (or perhaps wrong) equipment. Such large amounts of data cannot be stored in RFID chips yet, but if the rate of technical progress keeps up, it soon will be possible. And following the rough-and-ready principle that anything which can be hacked, will be hacked, implanted RFID chips pose a great potential risk to privacy.

While Messrs. Graafstra, Foster, and Jaeger debate the pragmatic consequences of this technology, I would like to bring up something that the "Christian groups" alluded to, although they approached it in a way that is colored by some fairly recent innovations in Christian theology dating only to the mid-19th century.

A deeper theme that dates from the earliest Hebrew traditions of the Old Testament is the idea of the human body as a sacred thing, not to be treated like other material objects. The Old Testament prohibited tattoos, ritual cutting, and other practices common among ancient tribes other than the Israelites. The Christian tradition carried these ideas forward in various ways, but always with a sense that the human body is not simply a collection of atoms, but is a "substance" (a philosophical term) which stands in a unique relation to the soul.

The problem with trying to relate these ideas to modern practices is that hardly anybody, Christian, Jewish, or otherwise, pays any attention to them any more. What with heart transplants, cochlear implants, artificial lenses for cataract surgery, and so on, we are well down the road of messing with the human body to repair or improve its functions. And the fact that something is sacred does not necessarily mean that it cannot be touched or altered in any way. The best that I can extract from this tradition in regard to the question of RFID implants is to encourage people to give the matter special consideration. It's not the same thing as carrying around a fanny pack, or a key ring, or even a nose ring. Once it's in there, you've got it, and it can be anything from a minor annoyance to major surgery to get it out. My bottom line is that with RFID implants, you're messing with the sacred again. And there has to be some meaning in the facts that this sort of marking was first applied on a large scale by one of the most evil governments of the twentieth century, and that until recently it was an imaginary fear latched onto by mentally unbalanced individuals. Only, I don't know what the meaning is.

Sources: The articles "Hands On" by Amal Graafstra and "RFID Inside" by Kenneth Foster and Jan Jaeger appear in the March 2007 issue of IEEE Spectrum, accessible free (as of this writing) at http://spectrum.ieee.org/mar07/4940 and http://spectrum.ieee.org/mar07/4939, respectively.

Tuesday, March 20, 2007

Identities For Sale

Well, here's a way we can solve the trade imbalance between China and the U. S. According to Symantec, the computer-security company, the U. S. harbors more than half of the world's "underground economy servers"—computers that are used for criminal activities, including the control of other computers called "bots" without the knowledge or consent of their owners. And it turns out that about a fourth of all bots are in China. So we're using China's computers to steal money, data, and identities from around the world. And it's even tax-free, if the criminals who organize this sort of thing play their cards right. This market is running so well that you can buy a new electronic identity, complete with Social Security number, credit cards, and a bank account, for less than twenty bucks. Don't like who you are? Become someone else!

Lest anyone take me seriously, the above was written in the spirit of Jonathan Swift's "modest proposal" of 1729 to alleviate poverty in Ireland by encouraging families to sell their babies to be eaten. I do not think it is a good thing that we lead the world in the number of servers devoted to criminal ends. But it's a fact worth pondering, and one question in particular intrigues me: why is computer crime so organized and, well, successful in this country?

Part of the answer has to do with the extraordinary freedom we enjoy compared to many other countries, in both the economic and political realms. While businesspeople complain about Sarbanes-Oxley and other burdensome regulations here, they should compare these relatively mild restrictions with those in China or many countries in Europe, where red tape and bureaucracy, not to mention the occasional corrupt official, can bog down business deals and keep foreign firms away.

Another part of the answer has to do with the relative ease of committing computer crime, and the relative difficulty law enforcement officials have in catching bit-wise criminals. According to the Symantec report, which was summarized in an article on the San Jose Mercury News website, much of the coding needed for criminal work is now done in regular nine-to-five shifts. This indicates that the era of the late-night amateur hacker is giving way to the white-collar criminal who either does his work under the radar of a legitimate business, or simply sets up shop as a company whose activities are purposely vague to outsiders. And nothing could be more in keeping with modern U. S. business practices. It's easy to tell what goes on at a steel mill: there's smoke, flames, and railroad cars full of steel coming in and out. But you can walk into numberless establishments in office parks around this country, look around, even watch over somebody's shoulder, and you'll still have trouble figuring out what many of these outfits actually do.

And that's maybe a third reason the U. S. is so hospitable to computer crime: the ease with which you can hide behind anonymity here. In more traditional cultures, the loner is a rarity, and most people are tied to friends and relatives by networks of interdependent connections, obligations, and moral strictures. But here no one thinks badly of a person who lives alone in an apartment, works at a company called something like United Associated Global Enterprises, and keeps to himself. The fact that he is trading in millions of dollars' worth of stolen identities every week is known only to him and perhaps a few associates who could be scattered around the country or the world. Maybe the lack of distinctive identity that such bland, interchangeable surroundings impose on the people who live and work in them makes it perversely attractive to deal in other people's identities, even for nefarious purposes.

Computer networks were designed in the early years by people who were, if not saints, at least folks who were very good at legitimate uses of computer technology, and they were dealing at first only with other people like themselves. There is a strong streak of idealism in many computer types, and that is one reason many of them worked so hard to realize their ideal of a world community joining together on the Internet. But few of them had extensive experience with criminality, and so the possibility that someone might actually abuse this wonderful new system was not, in some ways, considered very seriously. I speak as an amateur here, not as an expert. But the radically egalitarian structure of the Internet embodies a philosophy as much as it embodies a technical system.

There is no use crying over spilt idealism, and we have to deal with the way the Internet and computers are today, not the way they might have been if the founders had taken a less sanguine view of human nature when they set up the early protocols. I understand that sooner or later the Internet and its basic protocols will have to be overhauled in a far-reaching way. Maybe then we can put in some more sophisticated ways of tracking bad guys down, and of preventing the kinds of attacks that come without warning and shut down whole net-based businesses. But technology can take us only so far. As long as there are people using the Internet and not just machines, some of them are going to try to con, cheat, lie, and steal. The more that future systems are designed with that in mind, the better.

Sources: The Symantec report was summarized by Ryan Blitstein of the San Jose Mercury News on Mar. 19, 2007 at http://www.siliconvalley.com/mld/siliconvalley/16933863.htm. Jonathan Swift's "Modest Proposal," the heavy irony of which was completely missed by some of its first readers, is available complete at http://www.uoregon.edu/~rbear/modest.html.

Wednesday, March 14, 2007

Who Needs a Digital Life?

One day I rescued from the throw-out pile outside another professor's office a book entitled simply Computer Engineering, by C. Gordon Bell and two co-authors, all employees of the Digital Equipment Corporation. Published in 1978, it is a time capsule of the state of the computer art according to DEC, which around then was giving IBM a run for its money by coming out with the VAX series of minicomputers. This was just before the personal computer era changed everything.

This month I ran across the name Gordon Bell again, this time in the pages of Scientific American. By now, Bell is a vigorous-looking nearly bald guy with a strange idea that Microsoft, his current employer, has given him the resources to try out. After struggling to digitize his thirty years' career worth of documents, notes, papers, and books (including, no doubt, Computer Engineering), he decided in 2001 to not only go paperless, but to experiment with recording his life—digitally. The goal is to record and make available for future access everything Bell reads, hears, and sees (taste, touch, and smell weren't addressed, but I'm sure they're working on those too). The article shows Bell with a little digital camera slung around his neck. The camera senses heat from another body's presence or changes in light intensity, and snaps a picture along with time, GPS location, and wind speed and direction too, for all I know. So far this project has accumulated about 150 gigabytes in 300,000 records.
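To give a concrete flavor of what one of those 300,000 records might hold, here is a rough guess at the shape of a single life-log capture. The schema and the trigger rule are entirely my own invention for illustration, not the project's actual format.

```python
# Hypothetical life-log record and capture trigger; my own guess at the
# structure, not the actual format used by Bell's project.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Snapshot:
    taken_at: datetime    # time of capture
    latitude: float       # GPS fix at capture time
    longitude: float
    trigger: str          # what fired the shutter: "body_heat" or "light_change"
    image_path: str       # where the photo was stored

def should_capture(prev_light: float, light: float, warm_body: bool) -> bool:
    """Fire the shutter on a sensed warm body or a sharp change in light level."""
    return warm_body or abs(light - prev_light) > 0.25   # threshold is arbitrary
```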

Two things are surprising about this. Well, more than two things, but two immediately come to mind. One is that 150 gigabytes isn't that much anymore. The computer I'm typing this on has a 75-gigabyte hard drive, and somehow or other I've managed to use 50 or so gigabytes already. Most of it is a single video project, and Bell admits most of his space is used by video. With new compression technologies video won't take up so much room in the future.
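For a sense of scale, a back-of-envelope calculation of my own (not a figure from the article) puts the average record size at about half a megabyte, an average the video files no doubt skew upward:

```python
# Back-of-envelope arithmetic (mine, not the article's): average record size.
total_bytes = 150e9                   # roughly 150 gigabytes accumulated so far
records = 300_000                     # in roughly 300,000 records
print(f"{total_bytes / records / 1e6:.1f} MB per record")   # -> 0.5 MB per record
```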

The second surprising thing is, why would anybody want to do this? Yes, it has become technologically feasible only in the last few years, but only head-in-the-sand nerds automatically assert that because we can do a technical thing, we must do it. I hope Mr. Bell is beyond that stage, but one wonders. One of Bell's main motivations was simply to be able to remember things he would otherwise forget. Things like, "Oh, what was the name of that old boy I worked on the VAX with in 1975?" If it was anywhere in his scanned-in paper archives, I guess he can find out now. But at what cost?

Cost for Bell is not that much of an obstacle, seeing as how Microsoft is behind the project, and in any case, if technology keeps heading in the same general direction, costs for this sort of thing will plummet and everybody down to kindergarteners will be able to carry around their digital lives in their cellphones, or cellphone earrings, or whatever form the gadget takes by then. But since this blog is about ethical implications of technology, let's look at just two for the moment: dependence and deception.

Nobody knows what will happen to a person who grows up never having to memorize anything. I mean, where do you stop? I don't need to remember my phone number, my digital assistant does that. I don't need to know the capital of South Dakota, my digital assistant knows that. I don't need to know the letters of the alphabet, my digital . . . and so on. At the very least, if we go far enough with digital-life technology, it will create a peculiar kind of mental dependence that up to now has been experienced only by people on iron lungs. When a technology becomes a necessity, and something happens to the necessity, you can be in deep trouble. So far the project doesn't seem to have done Gordon Bell any harm, except to have absorbed much of his time and energy for the last several years. But if this sort of thing becomes as commonplace as electric lighting (which did in fact revolutionize our lives in ways that are both good and bad), it would work changes in culture and human relationships that, at the very least, deserve a lot more thought and consideration than they have received up to now.

The second implication concerns deception. For practical purposes, there is no such thing as a networked computer system that is absolutely immune to jimmying of some kind: viruses, worms, falsification of data, and identity theft. Bell and his co-author Jim Gemmell admit as much toward the end of the article when they talk about questions of privacy. If you think someone stealing your Social Security number is bad, wait till somebody steals that photo your digital assistant took of your "escort" on Saturday night in Las Vegas during that convention. Their proposed solution, as is typical with true believers of this kind, is more technology: intelligent systems to "advise" us when sharing information would be stupid. But what technology will keep us from being stupid anyway? And their solution to the storage of what they call "sensitive information that might put someone in legal jeopardy" is to have an "offshore data storage account . . . to place it beyond the reach of U. S. courts." It's so thoughtful of Scientific American to place in the hands of its readers such convenient advice about how to evade the law. This advice betrays an attitude that is increasingly common among certain groups who feel strongly that the digital community trumps all other human institutions, including legal and governmental ones.

Well, I'm glad Mr. Bell is still exploring the wonderful world of computers, even if his interests in the wider ranges of human experience appear not to have changed since his early days on the VAX project. Despite the tone of technological determinism in the article, I assert that the way digital lives will develop and be used is far from predictable, and it is far from certain that the technology will catch on at all. If it does become popular, I hope others will think more deeply than Bell and Gemmell have about the possible dangers and downsides.

Sources: "A Digital Life" by C. Gordon Bell and Jim Gemmell appeared in the March 2007 edition of Scientific American (pp. 58-65, vol. 296, no. 3).

Tuesday, March 06, 2007

The Ethics of Electronic Reproduction

Since so much of what we see, hear, read, and talk about has passed through digitization and cyberspace, it's easy to let that fact fade into the background and ignore the myriad of tricks that engineers have put into the hands of video editors, sound recording experts, and crooks. A story about sound-recording fraud with a neat ironic twist was reported recently by John von Rhein, the Chicago Tribune's music critic. It seems that one William Barrington-Coupe, the man behind a small record label called Concert Artist, wanted to make his concert-pianist wife Joyce Hatto look good, or at least sound good, on recordings issued under her name. So he "borrowed" recordings of famous pianists such as Vladimir Ashkenazy and altered the timing just enough to throw off the suspicion that would arise if anyone noticed that Joyce Hatto's version of Rachmaninoff's Prelude in C sharp minor, for example, lasted two minutes and forty seconds, exactly as long as Vladimir Ashkenazy's. He did this digitally, of course, which is how he got caught.

Seems there is software out there that can compare the bits directly between two digital recordings. Although I don't know the details, I can imagine that a direct bit-by-bit comparison, even with digital time fiddling thrown in, could reveal copying of this kind much more positively than any subjective human judgment. Anyway, somebody tried it out on one of Joyce Hatto's Concert Artist CDs and found that the bits actually originated from the playing of Hungarian pianist Laszlo Simon. Confronted with the evidence, Barrington-Coupe confessed, generating publicity of a kind he probably wasn't hoping for.
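I don't know what that software actually does, but the core idea is easy to sketch: normalize both recordings, try a range of uniform time-stretch factors, and look for an implausibly high cross-correlation peak. Here is a toy version in Python, assuming the two recordings have already been decoded into mono sample arrays; it is my own illustration, not the tool that caught Barrington-Coupe.

```python
# Toy duplicate-recording detector (my illustration, not the actual forensic
# software): look for a near-perfect match under a small uniform time-stretch.
import numpy as np

def peak_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Peak of the normalized cross-correlation between two mono signals."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    n = len(a) + len(b) - 1
    # FFT-based correlation; much faster than a direct sum for long signals.
    corr = np.fft.irfft(np.fft.rfft(a, n) * np.conj(np.fft.rfft(b, n)), n)
    return float(np.max(np.abs(corr)) / min(len(a), len(b)))

def looks_like_a_copy(original: np.ndarray, suspect: np.ndarray,
                      max_stretch: float = 0.03, steps: int = 13,
                      threshold: float = 0.8) -> bool:
    """Try uniform time-stretch factors within +/-3% and flag a high match."""
    best = 0.0
    for factor in np.linspace(1 - max_stretch, 1 + max_stretch, steps):
        positions = np.arange(0, len(suspect) - 1, factor)  # resample suspect
        stretched = np.interp(positions, np.arange(len(suspect)), suspect)
        best = max(best, peak_correlation(original, stretched))
    return best >= threshold
```

Two honest performances of the same piece would never line up this way, sample for sample; a human ear can be fooled by a small tempo change, but a correlation peak near 1.0 cannot.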

The Tribune critic von Rhein makes the point that this is only the most egregious case of the kind of thing that has been going on for generations: electronic manipulation of performances to make them sound better. "Better" can mean anything from editing out mistakes and poorly performed passages to complete voice makeovers that can make a raspy-voiced eight-year-old boy sound like Arnold Schwarzenegger. Von Rhein traces this trend back to the introduction of tape recording and its comparatively convenient razor-blade-and-cement editing techniques, but there's an even earlier example: reproducing piano rolls. In the early decades of the twentieth century, inventors developed a system that recorded not only the timing of keystrokes but their force, in sixteen increments from loud to soft, and reproduced these strokes on a fancy player piano that embodied elements of digital technology implemented with air valves and bellows. Famed artists such as George Gershwin recorded numerous reproducing piano rolls, whose dynamics sounded much better than the ordinary tinkly player pianos of the day. It is well known that these reproducing piano rolls were edited by the performers to remove imperfections and otherwise improve upon the live studio performance.
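Sixteen levels of force is, in modern terms, four bits of loudness information per keystroke. A toy illustration of the encoding, mine rather than anything the historical mechanisms actually computed:

```python
# Toy illustration (mine, not the historical mechanism): sixteen dynamic
# levels amount to four bits of loudness information per keystroke.
def quantize_force(force: float, levels: int = 16) -> int:
    """Map a normalized keystroke force in [0.0, 1.0] to one of 16 steps."""
    return min(levels - 1, max(0, int(force * levels)))

print(quantize_force(0.5))   # -> 8, a middling keystroke punched as level 8
```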

Most people who listen to music these days are at least vaguely aware that even so-called "live" recordings have been doctored somewhat, and few seem to care. When someone strays into outright fraud, as Mr. Barrington-Coupe did, most people would agree that this is wrong. But should we be free to take what naturally comes out of a piano or a horn and transmogrify it digitally any way we wish, while still passing it off as "live" or "original"?

The novelty of this sort of thing is largely illusory. If we pass from the auditory to the visual realm, there is abundant evidence that those who could afford to make themselves look better than reality have done so, all the way back to ancient Rome. Portrait painters, other artists, and craftspeople have always faced the dilemma of whether to be strictly honest or flattering to their subjects. If the subject also pays the bill, and if honesty will not make as many bucks as flattery, flattery often wins out. The fact that flattery can now be done digitally is not a fundamental change in the human condition; it simply shows that as our media change, we take the same old human motivations into new fields of endeavor and capability.

What is truly novel about the story of Barrington-Coupe and his wife Joyce Hatto is not the intent or act of fraud, but the way he was caught. It was said long ago that he who lives by the sword dies by the sword. It often turns out these days that he who attempts to deceive by digital means gets caught by digital means as well. On balance, I don't think we have a lot to worry about concerning musicians who want to sound a little better than reality on their recordings. Those who would forbid them the use of digital improvements are to my mind in the same category as those who want to prohibit the use of makeup by women. Maybe there are good religious reasons for such a prohibition, but it would make the world a little less attractive. The greater danger of digital technology as applied to media appears to me to lie in the area of control by large, powerful interests such as corporations and governments. But that is a discussion for another time.

Sources: John von Rhein's article appeared in the Mar. 4, 2007 online edition of the Chicago Tribune at http://www.chicagotribune.com/technology/chi-0703040408mar04,1,3237470.story?coll=chi-techtopheds-hed (free registration required).