8. The Auto-ID Trajectory: Converging Disciplines
Having studied the past and present applications of manual and automatic identification technology, it is now feasible to investigate the likely future of auto-ID. While this chapter is predictive in nature, it is based on leading-edge research, most of which has not previously been cited collectively in the context of the auto-ID trajectory. As identification techniques and devices have evolved incrementally since the 1900s, the turn of the twenty-first century has witnessed a new breed of auto-ID innovations: traditional devices that have found uses in non-traditional applications, many of which can be considered radical in their novelty. By tracing these developments, the possible trajectories can be determined, shedding light on the short-to-medium term course of auto-ID over the next fifty years. It should be noted that the trend towards digital convergence, shown within the auto-ID industry itself in chapter seven, is also present at a macro level, across different disciplines. This chapter is therefore inextricably linked to showing how auto-ID devices have been utilised in other fields of study, such as medicine, and to the innovative applications born of these newly-formed relationships. This is a significant contribution to auto-ID research: it allows one not only to understand the autonomous nature of auto-ID but also to comprehend where new research dollars are likely to be spent, and so to ponder the implications of subsequent developments. In addition, the chapter will present a view of complementary and supplementary peripheral technologies that are essential parts of this trend toward technological convergence. Finally, the human metaphor will be used to explore the auto-ID paradigm, beginning with auto-ID devices that are carried, moving to those that are worn, then to those that penetrate the skin, and finally to those that would do away with the flesh altogether.
8.1. The Rise of Wearable Computing
According to Siewiorek (1999, p. 82) the first wearable device was prototyped in 1961 at MIT (Massachusetts Institute of Technology) by Edward Thorp and Claude Shannon. The idea for the device had come in 1955, from an attempt to predict the outcome of roulette. However, the term “wearable computer” was first used by a research group at Carnegie Mellon University in 1991, coinciding with the rise of the laptop computer (early models of which were known as “luggables”). Wearable computing can be defined as: “‘anything that can be put on and adds to the user’s awareness of his or her environment.’ Mostly this means wearing electronics which have some computational power” (Sydänheimo et al. 1999, p. 2012). While the term “wearables” is generally used to describe wearable displays and custom computers in the form of necklaces, tie-pins and eyeglasses, it is the opinion of this researcher that the definition should be broadened to incorporate PDAs (personal digital assistants), e-wallets, and other mobile accessories such as cellular phones and smart cards that require the use of belt buckles or satchels attached to conventional clothing.
Before the widespread diffusion of personal computers (PCs) and laptops, it was auto-ID devices in the form of bar code cards, magnetic-stripe cards and smart cards that were ‘luggable’ and, with the aid of an external clip or fastener, to some degree wearable. Contactless smart cards could even be carried in a wallet or purse, or in a trouser or shirt pocket. While they did not have the same processing power as PCs or laptops, auto-ID devices did point to a practical ideal in terms of their size. IBM and other computer manufacturers quickly caught on to the notion of wearable computing; their vision of a portable computer that could be worn instead of carried has been well documented. According to Phil Hester of IBM’s Personal Systems Group, the wearable PC, a hybrid device, would allow a user to walk freely around a building connected to a wireless network and perform all the day-to-day functions, such as sending emails, with the added option of voice navigation/recognition (Wilcox 1999, p. 1). Wearable computing is about to reinvent the way we work and go about our day-to-day business, just as auto-ID devices did in the 1970s and 1980s. It is predicted that highly mobile professionals will soon take advantage of smart devices built into their clothing, so that they will be able to “…check messages, finish a presentation, or browse the Web while sitting on the subway or waiting in line at a bank” (Schiele et al. 2001, p. 44).
8.1.1. First Generation Wearables: Mobile Phones, PDAs and Pagers
Early prototypes of wearable computers throughout the 1980s and 1990s could have been described as outlandish, bizarre, abnormal-looking or even weird. For the greater part, wearable computing efforts focused on head-mounted displays (a visual approach) that unnaturally interfered with human vision and made proximity to others cumbersome (Sawhney & Schmandt 1997, p. 171). But the long-term aim of research groups is to make wearable computing inconspicuous as soon as technical improvements allow for it. The end user should look as ‘normal’ as possible (S. Mann 1997, p. 177). This is where auto-ID techniques like voice recognition have been very useful. One need only consider the first mobile phones of the early 1990s: they were the size and weight of a small brick, were expensive, and few people thought widespread diffusion would be achieved. Yet today, numerous countries have reached in excess of 70 per cent penetration, which equates to a mobile phone for almost every adult in those countries. As Cochrane (1999, p. 1) observed, “[t]oday, mobiles are smaller than a chocolate bar and cost nothing, and we can all afford them. And they are not bolted into vehicles as was originally conceived, but kept in pockets and hung on trouser belts.” In fact, today it is commonplace to find professionals and younger technology-savvy students carrying not only mobile phones but notebooks, PDAs and even smart flash storage cards/keys. To this list Starner (2001a, p. 46) adds a pager, electronic translator and calculator wristwatch. Starner even made the observation that “[s]ome people wear too many computers.” He noted that these separate devices share similar components, such as a microprocessor and memory; in other words, there is a fair amount of redundancy across them. Wearable computers of the future will integrate all these functions into the one unit.
The hope of wearable device developers is that capabilities will converge to such an extent that the user will no longer consider the mobile phone as separate from a PDA, or a PDA as separate from a notebook. Nokia’s 9001 Communicator is an example of this convergence: it combines the functionality of a phone, pager, diary and digital camera in the one unit. See exhibit 8.1 for a diverse range of first generation “wearables”.
8.1.1.1. Industrial Application
Wearable computers should not be considered solely for personal electronics; they are suitable for industrial purposes as well (see exhibit 8.2 on the following page). Several companies like Symbol Technologies, Honeywell and Xerox have researched industrial wearable devices for over a decade, along with newer names devoted entirely to this cause, including Xybernaut and ViA. Perhaps one of the most well-known industrial uses of wearable computing is the United Parcel Service (UPS) case study. In 1995, UPS challenged Symbol Technologies “…to create a Wearable Data Collection device for their package loaders” (Stein et al. 1998, p. 18). Symbol’s goal
“…was to create a wearable system that increased efficiency and productivity through mobility and hands-free computing and scanning. Good ergonomics is essential for any commercially available wearable computer product” (Stein et al. 1998, p. 19).
After considerable feedback between users at UPS and Symbol, and evaluations of possible disease transmission given that the wearable computer would be in contact with the skin, the Wrist Computer was released in 1996. At one point Symbol was shipping about seventeen thousand units per month to UPS, such was the success of the product. What is interesting to note is that Stein et al. (1998, p. 24) report that “[t]he initial response from users who had been using hand-held computers was to not want to give up the wearable once they tried it.” Perhaps the same can be said for other wearable devices. How many can do without their mobile phones today, or PDAs, or smart transaction cards?
8.1.2. Second Generation Wearables: E-Wallets and Wristwatches
As wearable computing devices get smaller and smaller, there has been a conscious effort to create an electronic wallet that combines the traditional wallet, the computer and communication technology. For some time many believed that the Mondex smart card system would revolutionise the way people exchanged money (see exhibit 8.3). AT&T was so convinced that it invested in developing an electronic wallet. The “‘Mondex Wallet’ allows users to perform on-line transactions and view balance and transaction information stored on their card” (L. Cooper 1999, p. 87). The Mondex Wallet has not reached its potential diffusion rates, but this has more to do with market maturity than anything else. While the Wallet is not the sophisticated type of wearable device that S. Mann and others envision, it is an incremental step towards that vision. Swatch has also introduced an electronic wallet in the form of a wristwatch, known as Swatch Access. The wristwatch features a “miniature antenna and a computer chip, similar to those used in conventional smart card payment systems. This allows users to perform transactions using money stored on the chip” (L. Cooper 1999, p. 87). Trials of the watch have taken place in Finland’s transport system. Another, more sophisticated wristwatch solution, known as Digital Angel, “offers a unique combination of GPS, wireless Internet and sensor technologies” (ADS 2002a). The all-in-one unit, which looks like a conventional watch, can monitor temperature, and contains a boundary alert function and a panic button feature. The versatility of the technology is seen in its wide range of formats and configurations, such as a pager-like device, necklace, pendant, bracelet, and even belt buckle (ADS 2002a).
8.1.2.1. Medical Application
Wearables have also found a niche market in medical applications. See the Vivago Home System in exhibit 8.4 below. Hinkers et al. (1995, p. 470) describe a small wearable device that continuously monitors glucose levels so that the right amount of insulin can be calculated for the individual, reducing the incidence of hypoglycaemic episodes. Hinkers et al. also foresaw automated insulin delivery systems, which are currently under development. Medical wearables even have the capability to check and monitor 26 different products in one’s blood (Ferrero 1998, p. 88). Today medical wearable device applications include:
…monitoring of myocardial ischemia, epileptic seizure detection, drowsiness detection… physical therapy feedback, such as for stroke victim rehabilitation, sleep apnea monitoring, long-term monitoring for circadian rhythm analysis of heart rate variability (HRV) (Martin et al. 2000, p. 44)
Some of the current shortcomings of medical wearables are similar to those of conventional wearables, namely that the devices remain too large and too heavy. In addition, wearing the devices for long periods can be irritating due to the number of sensors that may need to be worn for monitoring. The gel applied to reduce contact resistance between the electrode and the skin can also dry up, causing further discomfort. Other obstacles to the widespread diffusion of medical wearables include government regulations and the manufacturers’ requirement for limited liability in the event that an incorrect diagnosis is made by their equipment (Martin et al. 2000, p. 44). More recently the issue of privacy has been raised, especially for medical wearable devices applied within shared hospital facilities where access to results could be abused.
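One of the monitoring applications quoted above, circadian rhythm analysis of heart rate variability (HRV), rests on simple arithmetic over successive RR intervals (the times between heartbeats). The sketch below computes RMSSD, a standard time-domain HRV metric; the interval values are illustrative, not taken from any device cited in this chapter.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD): a common
    time-domain HRV metric over consecutive RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR intervals (ms) from a hypothetical chest-worn sensor
rr = [812, 790, 805, 830, 801, 795]
print(round(rmssd(rr), 1))  # 21.0
```

A wearable would stream such intervals continuously, and long-window trends in a metric like this are the sort of result whose storage and sharing raise the privacy concerns noted above.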
8.1.3. Third Generation Wearables: Smart Clothes and Accessories
There are two things we carry with us everywhere we go: clothes (undergarments, shirts, pants, accessories and so on) and our actual bodies (composed of skin, muscles, nerves, water and so on). Wearable computing experts have always sought a seamless and transparent way to introduce their high-tech devices. Many wearable computing developers believe the answer lies in distributing the equipment evenly throughout the body so that it does not feel excessively heavy for the end-user or look cumbersome. Known as “smart clothes” or “underwearables”, these garments will do more than keep you warm. “With the help of computers and special high-tech fabrics, smart clothes could send and receive information and adjust to give you what you need at any moment” (Kastor 2000, p. 1). A research group in Belgium has been developing the “i-Wear” range (i.e. Intelligent Wear). Siddle (2000, p. 1) reports that the clothes:
will perform many of the current functions of mobile phones, computers and even hospital monitoring equipment… The company [i-Wear] says the range of tasks that the clothes will be able to perform is vast, from taking phone calls to keeping a check on the health of the wearer.
While mass-scale commercial production of such clothes is probably a decade away, shirts with simple memory functions have been developed and tested. Sensors will play a big part in the functionality of the smartware, helping to determine the environmental context, and undergarments closest to the body will be used to measure bodily functions such as temperature, blood pressure, heart and pulse rates. For now, however, the aim is to develop ergonomically-astute wearable computing that is actually useful to the end-user. Head-mounted displays attached to the head with a headband may have prototyped the capabilities of wearable computing, but they were not practical and definitely not attractive. Displays of the next generation will be mounted on, or concealed within, the eyeglasses themselves (see exhibit 8.5) (Spitzer et al. 1997, p. 48). Accessories like ear-rings, cuff-links, tie-pins and pendants are also considered wearables if they contain intelligence. The Gesture Pendant, for instance, can be used in an Aware Home, granting occupants the ability to be recognised and their activities interpreted to improve their quality of life. The wearer can control different house elements, such as the lights, television, radio and telephone, via simple hand gestures that are detected and interpreted by the smart pendant. The target audience for the Gesture Pendant are the elderly or disabled who suffer from particular ailments but who still want to maintain their independence by living in their own homes. The device could also be used for medical monitoring over time.
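The Gesture Pendant’s control loop can be pictured as a simple mapping from recognised gestures to household commands. The sketch below is a minimal illustration; the gesture names and device actions are hypothetical and not drawn from the actual Gesture Pendant project.

```python
# Hypothetical gesture-to-command table for an Aware Home controller
GESTURE_COMMANDS = {
    "raise_hand": ("lights", "on"),
    "lower_hand": ("lights", "off"),
    "circle": ("television", "toggle"),
    "wave": ("telephone", "answer"),
}

def dispatch(gesture):
    """Translate a recognised gesture into a 'device:action' command,
    ignoring gestures the pendant has not been trained on."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return None  # unrecognised gesture: safer to do nothing
    device, action = command
    return f"{device}:{action}"

print(dispatch("raise_hand"))  # lights:on
print(dispatch("shrug"))       # None
```

The deliberate “do nothing on an unknown gesture” choice matters for the elderly or disabled users the text describes, for whom a misfired command could be worse than a missed one.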
8.1.3.1. Military Application
The military is paying particular attention to wearable computing developments. Combatants of the future may look like someone out of a film like “Universal Soldier”. This should not be surprising, since as far back as the 1960s there were attempts to make a “Man Amplifier”: to grant a soldier the added help of an exoskeleton, a sort of first line of defence in protection of the mortal flesh. While the Man Amplifier was unsuccessful due to the obvious technological limitations of the time, today systems like FREFLEX (Force Reflecting Exoskeleton) are being trialled to augment human strength (Repperger et al. 1996, pp. 28-31). The US Army, for instance, has been trying to build a military uniform that utilises wearable computing components. It is seeking a uniform that can make:
…soldiers nearly invisible, grant superhuman strength and provide instant medical care… All this would be achieved by developing particle-sized materials and devices- called “nanotechnology”- nestled into the uniform’s fabric… Supercharged shoes could release energy when soldiers jump… Microreactors could detect bleeding and apply pressure… Light-deflecting material could make the suit blend in with surroundings (LoBaido 2001, p. 1).
This may sound a little far-fetched, but it is not. A British company calling itself the Electronic Shoe Company has developed a pair of walking boots that can be used to power electrical equipment such as a mobile phone. Footwear could also be used to help orientate soldiers, leading them to specific targets through the safest possible route, with the capability even of detecting landmines. In the event of injury to a soldier, it is hoped that smart shirts like the Sensate Liner (into which optical fibre is woven) can help localise life-threatening wounds to the upper torso (Gorlick 1999, p. 121). According to Kellan (2000a, p. 1) each soldier would be equipped with a wearable computer, GPS locator and wireless connections to the military network. This would grant individuals the ability to send signals back to base camp in times of trouble, or allow base camp to send new instructions to the soldier based on more up-to-date intelligence reports. It is not inconceivable for whole divisions to be redirected to areas of safety, minimising the loss of life.
8.2. The Paradigm Shift- From Wearable to Implantable Devices
A new line of “wearables” is now emerging that does not quite fit the definition of the traditional wearable, which assumes a presence outside the human body. Implantable devices such as the RF/ID transponders discussed in section 7.5 cannot exactly be referred to as “wearables” because the component is not worn; rather, it is ingrained, embedded, entrenched in the human body. The implant device is more than an extension; it becomes one with the body, a seamless fusion between flesh and foreign object. Years ago, automated biometric recognition techniques were heralded as a coming together of humans and machines, but today we have something beyond a meeting point: we have the potential for a union of biological proportions on an evolutionary scale. The human who has been implanted with a microchip is an Electrophorus, a bearer of “electric” technology (see exhibit 8.6). One who “bears” (i.e. a phorus) is in some way intrinsically or spiritually connected to that which they are bearing, in the same way an expecting mother is to the child in her womb. The term electrophorus seems much more suitable today than any other, such as cyborg. The term “cyborg” has been hijacked by science fiction novels and movies to mean “part machine, part human”; this is more relevant to bionics than to implantable devices.
So why the requirement for implantable devices when the same devices could apparently be worn? Two opposing arguments have come from the same institution. Chief futurologist Ian Pearson of British Telecom (BT) is not convinced that implants will take the place of wearable components, whereas ex-BT researcher Peter Cochrane is convinced otherwise. Pearson’s argument is that “[t]here is nothing you can do with embedded chips that you can’t do with wearable ones” (LoBaido 2001, part 1, pp. 2f). Pearson does, however, believe in the pervasive nature of the chips, predicting that by 2006 wearable identity chips would be implemented. Only one year prior to this interview, Peter Cochrane told McGinity (2000, p. 17) that there “‘…will come a day when chips are not just worn around the neck, but are actually implanted under a human’s skin.’ When I scoffed at such an idea as merely science fiction, Cochrane offered up that he himself would be testing out such a human chip and looked forward to the opportunity.” And who could doubt such a possibility after Warwick’s 1998 Cyborg 1.0 trial? After the microchip implant, Warwick was able to walk around his rigged-up building in the Cybernetics department and be recognised as “Kevin Warwick.” As he walked through the doorways, the radio signal energised the coil in the chip, produced current, and gave the chip the ability to send out an identifying signal (Witt 1999, p. 2). Warwick and Cochrane are not alone in their efforts.
“Many theorists see people carrying embedded technology as mobile computing’s next “killer application”... Instead of just implanting machines into humans to reconstruct joints or regulate heartbeats, they imagine the addition of sensors and chips in bodies which will make people better, stronger and faster” (Mieszkowski 2000, part 1, p. 2).
8.2.1. The Role of Auto-ID
Shortly after the commotion of the Warwick implant (1998) wore off and Cochrane launched his Tips for Time Travellers (1999), Applied Digital Solutions (ADSX) was founded. The company first announced its VeriChip solution on December 19, 2001. RF/ID, traditionally used in contactless smart cards, tags and keys, and in transponders interwoven into clothing, was now being marketed as a suitable identity verification chip for a variety of security, financial, emergency service and healthcare applications for humans (see exhibit 8.7). In a press release the company announced that the VeriChip would be “…available in several formats, some of which [could] be inserted under the skin” (ADS 2002c, p. 2). The Chief Technology Officer (CTO) of ADSX told Scheeres (2002a, p. 1) that “[t]he chip… is injected into the subject’s forearm or shoulder under local anaesthesia during an outpatient procedure and leaves no mark.” Furthermore, the VeriChip is expected to sell for about two hundred US dollars, with the Digital Angel service packaged at US$29.95 per month on a one-year minimum contract (Associated Press 2002c; Farrell 2002). Scanners that could identify the VeriChip, very similar to those used to identify pet implants, would cost between one thousand and three thousand US dollars. More recently ADSX has begun to aggressively market its products, attracting a lot of publicity as both young and old have opted for the chip implant. The “Get Chipped™” promotion and the ChipMobile™ that roams the US have increased the awareness level of the general public. ADSX has scheduled visits to “recreation and stadium events, health clinics, nursing homes” among other locations (ADS 2002d).
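Functionally, the scanners described above reduce identification to a lookup: the reader’s RF field energises the passive transponder, the chip returns its unique number, and that number is resolved against a registry. A minimal sketch, assuming entirely hypothetical tag IDs and registry entries (none of this is drawn from ADSX documentation):

```python
# Hypothetical registry mapping transponder IDs to record references
REGISTRY = {
    "0x00A1B2C3": "patient record 1 (illustrative)",
    "0x00D4E5F6": "patient record 2 (illustrative)",
}

def on_tag_read(tag_id):
    """Called when the reader's RF field energises a transponder and the
    chip sends back its identifier; unknown IDs are reported as such."""
    return REGISTRY.get(tag_id, "unknown transponder")

print(on_tag_read("0x00A1B2C3"))  # patient record 1 (illustrative)
print(on_tag_read("0xDEADBEEF"))  # unknown transponder
```

The chip itself carries only the number; everything of consequence, such as medical or financial records, lives in the back-end registry, which is why the surrounding infrastructure matters as much as the implant.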
8.2.2. The Impact of Mobility
The added function of networking in wearable computing components and implantable devices has created an extremely powerful platform for monitoring and tracking humans “anywhere, anytime”. Starner (2001b, p. 54) identified three network communication levels: (i) off the body to a fixed network (e.g. a wireless-enabled wristwatch); (ii) between different wearable devices on the body (e.g. between intelligent eyeglasses and a belt buckle); and (iii) near the body, between the user and objects (e.g. between a gesture pendant and a television set). Location has always been an important attribute in people-centric applications, but it is only now that the capability exists to query this data in real-time. Krikelis (1999, p. 13) calls this “context information”, and this is exactly what is set to revolutionise daily consumer and business activities. Future fourth generation (4G) mobile services will base their core value proposition on being able to retrieve this type of data. A typical 4G service example could be as follows (Michael, K. 2002f, p. 293):
An employee who works for a multinational company is travelling from Sydney to China and making a stopover in Singapore. While on his way to Sydney airport, the employee encounters a major traffic accident on the Harbour Bridge. Traffic comes to a standstill while police and ambulance crews treat people at the scene. A camera on the bridge tracks all delays, alerting the roads and traffic authority (RTA). The RTA (with additional police information) estimates that the delay will be in excess of two hours and sends this information to the central information bureau. The employee is alerted by the wireless service provider that he will most likely miss his flight and will have to stay at Sydney Airport’s Hilton overnight, waiting for the next available flight, which is scheduled to depart in the morning. The employee replies to the message and updates are made to his itinerary as detailed in his reply. The panic of having to reorganise everything is removed from the traveller. Though he will end up missing the first meeting in Singapore, he is relieved by the almost instantaneous knowledge that he will be leaving Sydney in time for subsequent meetings.
Throughout this scenario a number of smart devices are being used to execute operations seamlessly. These may include an RF/ID device in the car of the employee travelling to the airport, a wireless mobile phone carried by the individual to send and receive information (by either voice or data), and a smart wristwatch containing itinerary information about flights, hotels and forthcoming meetings. Somewhere amidst all this would be a GPS-enabled trigger that lets the respective service providers know where the individual is located and grants them the ability to calculate estimated times of arrival. This kind of service, however, would require mass cooperation between the various stakeholders.
8.2.3. Global Positioning System (GPS) Tracking
Having established the importance of network communications to wearable and implantable devices let us consider the role of the Global Positioning System (GPS). Ferrero (1998, p. 87) ponders: “[i]magine GPS in your wallet, cell phone, or watch to tell you where you are.” Well, one does not have to imagine that any longer, there are services being offered right now, beyond the smart car navigation systems. Companies like Wherify, Gen-Etics, Pro-Tech, Sky-Eye and Digital Angel/ADSX are taking advantage of what GPS has to offer and using it to track living and non-living things. In terms of people tracking, this is done for a variety of reasons including: child safety, reducing the incidence of kidnapping of high profile persons, for those suffering from Alzheimer’s disease who may become disorientated, for those suffering from mental illness, for parolees, for prison inmates, for military personnel, for emergency services, or just for peace of mind. Wherify’s “GPS Locator for Children” for instance, states:
[c]hildren have a natural urge to explore. Parents have a natural desire to know their children are safe. That’s why Wherify created the world’s first Personal Locator to help you determine your child’s location in minutes. Wherify’s GPS Locator technology helps keep loved ones safe by combining Wherify’s patented technology with the U.S. Department of Defence’s multi-billion dollar Global Positioning System (GPS) satellite plus the largest 100% digital, nationwide PCS wireless network. So relax. Now you can have peace of mind 24 hours a day while your child is the high tech envy of the neighbourhood!
The watch worn by Wherify users contains a built-in pager, an atomically synchronised clock, an emergency 911 button, a lock button and on-board GPS technology (see exhibit 8.8 on the following page). One pitfall of the Wherify technology is that it can be seen, thus alerting a perpetrator to the possibility that their location will be found out. The evolutionary vision, therefore, is a technology that is fully implantable and cannot be seen by an attacker, or anyone else for that matter. “Enter the Digital Angel: according to CEO Richard Sullivan, the solution combines GPS wireless communications with biosensors, powered by body heat in the form of a dime-sized chip, which can be embedded in a watch, bracelet or medallion, even under your flesh…” (Mieszkowski 2000, part 2, p. 2). Given that GPS only works outdoors, other wireless systems must cater for in-building/in-container solutions, most of which take advantage of RF/ID tags and transponders, video cameras, sensors or infrared signals. The WhereNet company offers a number of different configurations for its Real-Time Locating System (RTLS).
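The boundary alert functions mentioned above for devices like Digital Angel and the Wherify locator amount to a geofence check: each GPS fix is compared against the centre of a safe zone using the haversine great-circle distance. A minimal sketch; the coordinates and radius are illustrative and not drawn from any vendor’s specification.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude
    points, using the haversine formula and a mean Earth radius."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def boundary_alert(fix, centre, radius_m):
    """True when a GPS fix falls outside the circular safe zone."""
    return haversine_m(fix[0], fix[1], centre[0], centre[1]) > radius_m

centre = (-33.8688, 151.2093)  # illustrative safe-zone centre
print(boundary_alert((-33.8690, 151.2095), centre, 500))  # False (inside)
print(boundary_alert((-33.9000, 151.2500), centre, 500))  # True (outside)
```

A deployed service would debounce this check over several consecutive fixes before paging a guardian, since a single noisy GPS reading near the boundary would otherwise trigger false alarms.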
8.2.4. Towards a Unique Identifier for Universal Personal Telecommunications
The requirement for integrated messaging services, also known as unified messaging, increasingly demands that an individual be reachable through one unique identifier, though for the time being disparate identifiers are still being utilised. As individuals become used to taking advantage of their idle time ‘on-the-go’ and end-user terminals become more sophisticated, the need for message centralisation will grow. Currently, individuals receive and send messages from numerous clients, including home and work fixed telephones, mobile phones, home and work fixed computers, laptops, PDAs and even facsimiles. Synchronising all these points of interaction is almost impossible without a service that allows the individual to manage their personal needs. For instance, a message to an employee’s work email should be accessible by mobile phone without that individual having to dial into a separate message databank. They should be able to check their mobile messages and have their email messages converted using text-to-speech functionality. Two developments are enabling this capability: the first is the blurring of the distinction between wireline and wireless; the second is that the new IP-based networks have turned the traditional notion of voice and data upside-down. These two changes are setting the stage for a global platform that requires an individual to have a unique lifetime identifier (from the time of birth), either in the form of a number or an email address. Some of the early protocols, such as SIP (Session Initiation Protocol) and H.323, rely on such an identifier. In 2000, Bell Canada launched the “Find-Me Number Service”, which allowed phone calls to follow an individual (Bell 2000). With time this service will extend to incorporate emails and other forms of messaging.
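The centralisation described above can be sketched as a routing table from per-device addresses to a single lifetime identifier, modelled here as a SIP-style address. All names, numbers and addresses below are hypothetical.

```python
from collections import defaultdict

# Hypothetical aliases: every terminal an individual uses maps back to
# one lifetime identifier (here a SIP-style address)
ALIASES = {
    "+61-2-5550-0000": "sip:jane.citizen@example.net",             # fixed line
    "+61-400-000-000": "sip:jane.citizen@example.net",             # mobile
    "j.citizen@work.example.com": "sip:jane.citizen@example.net",  # work email
}

MAILBOX = defaultdict(list)

def deliver(address, message):
    """File an incoming message under the owner's unique identifier,
    regardless of which terminal or network it arrived on."""
    owner = ALIASES.get(address)
    if owner is not None:
        MAILBOX[owner].append(message)

deliver("+61-400-000-000", "voicemail: flight delayed")
deliver("j.citizen@work.example.com", "email: meeting moved to 3pm")
print(len(MAILBOX["sip:jane.citizen@example.net"]))  # 2
```

A text-to-speech front end, as suggested in the text, would then read the unified mailbox back over whichever terminal the individual happens to be carrying.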
While not yet apparent, what we are moving towards is a “universal” Personal Communications Service (PCS) model that is hybrid in approach, utilising the best of both worlds (both GPS and PCS) when required. The question, perhaps, is how interlinked this scheme will be with a potential unique lifetime identifier chip implant; that is, a unique identifier for communications could be just as suitable for personal user authentication. Farrell (2001) predicts, “[o]ther uses could be to replace keys and ATM cards with implanted chips, making it possible for a single implant to unlock your house, start your car and give you money from a cash point.”
8.3. Case Study: Auto-ID Adapted for Medical Implants
Auto-ID devices, in particular implantable devices like smart microchips and passive and active RF/ID transponders, have come to be utilised in medical applications for purposes completely different from those for which they were originally invented. Apart from serving their designated purpose as automatic identification devices, they have become integral components of life-saving devices, in some instances aiding the prevention or cure of particular disabilities, ailments or diseases. The devices have of course been further developed and adapted to match the requirements of the specific medical application. The evolutionary path indicates that, due to developments in auto-ID and microcircuitry in general, today’s medical implantable devices have advanced well beyond the “humble” pacemaker. It is at this point that a landmark study in the field must be mentioned as supporting evidence. Catherine M. Banbury has written an excellent historical account of technological innovation in the pacemaker industry (1959-1990). In her chapter on pacemaker technology she describes the incremental changes that took place in the pacemaker industry, stating that it was in 1963 that “the market for internal pacemakers had settled on a dominant design” (Banbury 1997, p. 52). It is not surprising that Banbury’s findings on innovation in the pacemaker industry are similar to the findings on auto-ID in this thesis (see especially section II on “pacemaker technology”, pp. 47-72). It is worth quoting her in full below (Banbury 1997, p. 54):
Many technological innovations that occurred in the pacemaker industry during the incremental era changed some aspect of the product and how it was used. These changes resulted from innovations in pacing technology or from innovations in input technologies, where the research and development could have been conducted by pacemaker firms or firms and research institutions external to the pacemaker industry. Innovations in semiconductor technology and in surgical procedures used to implant pacers are examples of external innovations that were later adapted to pacing technology. Innovations in electrode and lead technologies are examples of innovations developed by firms both inside and outside of the industry. However, developments in the pacing mode, the core technology, were introduced by pacing firms.
Taking this into consideration, the following innovation cases point to a future path for auto-ID as something more than an identification technology: a new electrophorus paradigm that is set to revolutionise the way humans consider technology, no longer as a separate entity but as a life-enhancing artefact carried within the body. We now turn our attention to the innovations that are paving the way for this paradigm shift.
8.3.1. Biochips for Diagnosis and Smart Pills for Drug Delivery
It is not unlikely that biochips will be implanted at birth in the not-too-distant future. “They will be able to diagnose disease with precision, pre-determine a patient’s response to treatment and make individual patients aware of any pre-disposition to susceptibility” (Wales 2001). With respect to treatment for illness, drug delivery will not require patients to swallow pills or take routine injections; instead, chemicals will be stored on a microprocessor and released as prescribed (see exhibit 8.9 below). The idea, known as “pharmacy-on-a-chip”, originated with scientists at the Massachusetts Institute of Technology (MIT) in 1999 (LoBaido 2001, part 2, p. 2). The following extract is from The Lab (1999):
Doctors prescribing complicated courses of drugs may soon be able to implant microchips into patients to deliver timed drug doses directly into their bodies.
Microchips being developed at Ohio State University (OSU) can be coated with chemical substances such as pain medication, insulin, different treatments for heart disease, or gene therapies, allowing physicians to work at a more detailed level than is possible today (Swissler 2000, p. 1). The breakthroughs have major implications especially for diabetics, who require insulin at regular intervals throughout the day. Researchers at the University of Delaware are working on “smart” implantable insulin pumps that may relieve people with Type I diabetes of routine injections (Bailey 1999, p. 1). The delivery would be based on a mathematical model stored on a microchip, working in connection with glucose sensors that instruct the chip when to release the insulin. The goal is for the model to simulate the activity of the pancreas so that the right dosage is delivered at the right time. The implantable chips are also being considered as a possible solution to clinical depression and for patients who are incapable of keeping to a prescribed regime. Gasson (1998, p. 1) first named this capability “cyberdrugs”/“cybernarcotics”, with well-meaning intent. Professor John Santini of MIT, aware of the possible implications of such an innovation, has repeatedly emphasised that the focus is strictly “therapeutic”, a better way to treat diseases (LoBaido 2001, part 2, p. 2). Scientists at universities are not the only ones researching biochips or smart pills; according to Wales (2001), production is quickly becoming big business as genomic-based medicine becomes the next buzzword. Some of the better-known players include Affymetrix, Motorola Life Sciences Codelink Division, Packard BioScience, Agilent, and Hitachi.
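The sensor-driven release mechanism described above can be sketched in a few lines. The threshold, dose size and reservoir figures below are illustrative assumptions for clarity only, not values from the cited research:

```python
# Minimal sketch of a sensor-driven "pharmacy-on-a-chip" control loop.
# All constants are illustrative assumptions, not clinical values.

TARGET_MMOL_L = 6.0    # hypothetical target blood glucose level
DOSE_UNITS = 2.0       # hypothetical fixed dose released per trigger

class SmartInsulinChip:
    def __init__(self, reservoir_units):
        self.reservoir = reservoir_units
        self.released = []          # log of (reading, dose) events

    def on_sensor_reading(self, glucose_mmol_l):
        """Release a dose only when the sensor reads above target."""
        if glucose_mmol_l > TARGET_MMOL_L and self.reservoir >= DOSE_UNITS:
            self.reservoir -= DOSE_UNITS
            self.released.append((glucose_mmol_l, DOSE_UNITS))
            return DOSE_UNITS
        return 0.0

chip = SmartInsulinChip(reservoir_units=10.0)
doses = [chip.on_sensor_reading(g) for g in [5.2, 7.8, 9.1, 5.9]]
```

A real device would replace the single glucose threshold with the pancreas-simulating mathematical model the researchers describe.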
8.3.2. Cochlear Implants- Helping the Deaf to Hear
More than thirty-two thousand people worldwide already have cochlear implants (Manning 2000, p. 7D). Cochlear implants can restore hearing to people who have severe hearing loss, a form of diagnosed deafness. Unlike a standard hearing aid, which works like an amplifier, the cochlear implant acts like a microphone, changing sound into electronic signals. The signals are sent to the microchip implant via RF, stimulating nerve fibres in the inner ear. The brain then interprets the signals transmitted via the nerves as sound. For a closer look at the cochlear implant, see the Clarion and Nucleus product innovations (exhibit 8.10). Another company, Canadian-based Epic Biosonics, has teamed up with Professor Chris Toumazou of Imperial College. Toumazou has made significant inroads into cutting the costs of cochlear implants and making them more comfortable for the individual. Most cochlear implants today require battery-powered packs worn on belts, with connecting wires, that are generally unaesthetic; Toumazou is trying to change this impracticality (Imperial College 1999, p. 2). For now, cochlear implants are being used to overcome deafness; tomorrow, however, they may be open to the wider public as a performance-enhancing technique. Audiologist Steve Otto of the Auditory Brainstem Implant Project at the House Ear Institute in Los Angeles predicts that some day “implantable devices [will] interface microscopically with parts of the normal system that are still physiologically functional” (Stewart 2000, p. 2). He is quoted as saying that this may equate to “ESP for everyone.” Otto’s prediction that implants will one day be used by persons who do not require them for remedial purposes has been supported by numerous other high-profile scientists. The critical question is whether this is the ultimate trajectory of auto-ID devices.
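The amplifier-versus-implant distinction above comes down to signal processing: rather than making sound louder, the processor splits it into frequency bands and stimulates a different electrode for each band. The following sketch illustrates the principle only; the band edges and the naive DFT are assumptions for clarity, not any manufacturer’s actual scheme:

```python
import cmath
import math

def band_energies(samples, sample_rate, bands):
    """Split a sound frame into per-band energies via a naive DFT.

    In a cochlear processor, each band's energy would set the
    stimulation level of one electrode (low-frequency bands map to
    electrodes deeper in the cochlea).
    """
    n = len(samples)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                        for t in range(n)))
                for k in range(n // 2)]
    hz_per_bin = sample_rate / n
    energies = []
    for lo, hi in bands:
        bins = range(int(lo / hz_per_bin), int(hi / hz_per_bin))
        energies.append(sum(spectrum[k] for k in bins if k < len(spectrum)))
    return energies

# A 500 Hz tone should excite the low band far more than the high one.
rate = 8000
tone = [math.sin(2 * math.pi * 500 * t / rate) for t in range(256)]
low, high = band_energies(tone, rate, bands=[(200, 1000), (2000, 4000)])
```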
8.3.3. Retina Implants- on a Mission to Help the Blind to See
The hope is that retina implants will one day be as successful as cochlear implants. Just as cochlear implants cannot be used by persons suffering from complete deafness, retina implants are not a solution for totally blind persons, but rather for those suffering from age-related macular degeneration (AMD) and retinitis pigmentosa (RP). Retina implants have brought together medical researchers, electronic specialists and software designers to develop a system that can be implanted inside the eye (Ahlstrom 2000, p. 1). A typical retina implant procedure is as follows:
[s]urgeons make a pinpoint opening in the retina to inject fluid in order to lift a portion of the retina from the back of the eye, creating a pocket to accommodate the chip. The retina is resealed over the chip, and doctors inject air into the middle of the eye to force the retina back over the device and close the incisions (Datamaster 2001, p. 1).
Brothers Alan Chow and Vincent Chow, one an engineer, the other an ophthalmologist, developed the artificial silicon retina (ASR) and founded the company Optobionics Corp in 1990. This was a marriage between biology and engineering, first conceived over a Thanksgiving dinner. “In landmark surgeries at the University of Illinois at Chicago Medical Centre on June 28, the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who have lost almost all of their vision because of retinal disease.” In 1993 Branwyn (p. 3) reported that a team at the National Institute of Health (NIH) led by Dr. Hambrecht implanted a 38-electrode array into a blind female’s brain. It was reported that she saw simple light patterns and was able to make out crude letters. The following year the same procedure was conducted by another group on a blind male, resulting in the man seeing a black dot with a yellow ring around it. Joseph Rizzo of Harvard Medical School’s Massachusetts Eye and Ear Infirmary has cautioned that it is better to talk down the possibilities of the retina implant so as not to give false hope. The professor himself has expressed that they are dealing with “science fiction stuff” and that there are no long-term guarantees that the technology will ever fully restore sight, although significant progress is being made by a number of research institutes (Wells n.d., p. 5). Among these pioneers are researchers at The Johns Hopkins University Medical Centre in Baltimore, Maryland. Brooks (2001, pp. 4f) describes how the retina chip developed by the medical centre will work:
…a kind of miniature digital camera… is placed on the surface of the retina. The camera relays information about the light that hits it to a microchip implanted nearby. This chip then delivers a signal that is fed back to the retina, giving it a big kick that stimulates it into action. Then, as normal, a signal goes down the optic nerve and sight is at least partially restored…
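The camera-to-chip step in the extract above amounts to reducing an image to a small grid of stimulation currents, one per electrode. The sketch below is a hypothetical illustration; the electrode count and the brightness-to-current mapping are assumptions, not specifications of any actual implant:

```python
def image_to_electrode_currents(pixels, grid=4, max_current=1.0):
    """Downsample a grayscale image (list of rows, values 0..255) to a
    small grid of stimulation currents, one per retinal electrode.

    Illustrative only: real implants differ in electrode count and in
    how brightness is encoded as stimulation.
    """
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // grid, w // grid
    currents = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            block = [pixels[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            mean = sum(block) / len(block)
            row.append(max_current * mean / 255.0)  # brighter -> stronger pulse
        currents.append(row)
    return currents

# A frame that is bright on the left half and dark on the right.
frame = [[255] * 8 + [0] * 8 for _ in range(16)]
currents = image_to_electrode_currents(frame)
```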
8.3.4. Tapping into the Heart and Brain
If as far back as 1958 two transistors the size of an ice hockey puck were successfully implanted in the heart of a 43-year-old man (Nairne 2000, p. 1), what will become possible by 2058 is bound by the imagination alone. Heart pacemakers are still being further developed today, but for the greater part researchers are turning their attention to the possibilities of brain pacemakers. In the foreseeable future brain implants may help sufferers of Parkinson’s disease, paralysis, nervous system problems and speech impairment, and even cancer patients. While the research is still in its formative years, and the obstacles are great because of the complexity of the brain, scientists are hopeful of major breakthroughs in the next twenty to fifty years. The brain pacemaker endeavours are bringing together even more people from different disciplines, headed mainly by neurosurgeons. Using brain implants, electrical pulses can be sent directly to nerves via electrodes. The signals can be used to interrupt incoherent messages to nerves that cause uncontrollable movements or tremors. By tapping into the right nerves in the brain, particular reactions can be achieved. The technique was first discovered, almost accidentally, in France in 1987; the following extract describes the procedure of “tapping into” the brain:
Rezai and a team of functional neurosurgeons, neurologists and nurses at the Cleveland Clinic Foundation in Ohio had spent the next few hours electronically eavesdropping on single cells in Joan’s brain, attempting to pinpoint the precise trouble spot that caused a persistent, uncontrollable tremor in her right hand. Once confident they had found the spot, the doctors had guided the electrode itself deep into her brain, into a small duchy of nerve cells within the thalamus. The hope was that when an electrical current was sent to the electrode, in a technique known as deep-brain stimulation, her tremor would diminish, and perhaps disappear altogether (Hall 2001, p. 2).
Companies like Medtronic, Incorporated (Minneapolis, Minnesota) have formed that specialise in brain pacemakers. Medtronic’s Activa implant has been designed specifically for sufferers of Parkinson’s disease (Wells n.d., p. 3).
8.3.5. Attempting to Overcome Paralysis
In more speculative research, surgeons believe that brain implants may be a solution for persons suffering from paralysis, such as spinal cord damage. In these instances the nerves in the legs are still theoretically “working” (below the lesion point); they simply cannot make contact with the brain which controls their movement. If signals could somehow be sent to the brain, bypassing the lesion point, paralysed persons could conceivably regain at least part of their ability to move (Dobson 2001, p. 2). In 2000 Reuters (pp. 1f) reported that a paralysed Frenchman [Marc Merger] “took his first steps in 10 years after a revolutionary operation to restore nerve functions using a microchip implant… Merger walks by pressing buttons on a walking frame which acts as a remote control for the chip, sending impulses through fine wires to stimulate leg muscles…” It should be noted, however, that the system only works for paraplegics whose muscles remain alive despite damage to the nerves. Yet there are promising devices like the Bion that may one day be able to control muscle movement using RF commands (D. Smith 2002, p. 2). Researchers at the University of Illinois in Chicago have:
…invented a microcomputer system that sends pulses to a patient’s legs, causing the muscles to contract. Using a walker for balance, people paralysed from the waist down can stand up from a sitting position and walk short distances… Another team, based in Europe… enabled a paraplegic to walk using a chip connected to fine wires in his legs (Brooks 2001, p. 3).
These techniques are known as functional neuromuscular stimulation systems.
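Functional neuromuscular stimulation of this kind delivers trains of short current pulses, where the pulse frequency and width govern how strongly the muscle contracts. Below is a minimal scheduling sketch; the timing values are illustrative assumptions only, not parameters of any of the systems described:

```python
def pulse_schedule(duration_s, frequency_hz, pulse_width_us):
    """Return (start_time_s, width_s) pairs for a stimulation train.

    The 20 Hz / 300 microsecond figures used in the example are in the
    range commonly quoted for sustained (tetanic) muscle contraction,
    but the exact values any given system uses are assumptions here.
    """
    period = 1.0 / frequency_hz          # time between pulse onsets
    width = pulse_width_us / 1_000_000   # convert microseconds to seconds
    n = int(duration_s * frequency_hz)   # pulses in the whole train
    return [(i * period, width) for i in range(n)]

# One second of stimulation at 20 Hz with 300 microsecond pulses.
train = pulse_schedule(duration_s=1.0, frequency_hz=20, pulse_width_us=300)
```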
8.3.6. Granting a Voice to the Speech-impaired
Speech-impairment microchip implants work differently from cochlear and retina implants. Whereas in the latter two hearing and sight are restored, in implants for speech impairment the voice is not restored; rather, an outlet for communication is created, possibly with the aid of a voice synthesizer. At Emory University, neurosurgeon Roy E. Bakay and neuroscientist Phillip R. Kennedy have been responsible for the latest breakthroughs. In 2002, Versweyveld (p. 1) reported two successful implants of a neurotrophic electrode into the brains of a woman and a man who were suffering from Amyotrophic Lateral Sclerosis (ALS) and brainstem stroke, respectively. In a remarkable process, Bakay and Kennedy have captured the patient’s thoughts on a computer screen via the movement of a cursor. “The computer chip is directly connected with the cortical nerve cells… The neural signals are transmitted to a receiver and connected to the computer in order to drive the cursor” (Versweyveld 2002, p. 1). This procedure has major implications for brain-computer interaction (BCI), especially bionics. Bakay predicts that by 2010 prosthetic devices will grant immobile patients the ability to turn on the TV just by thinking about it, and by 2030 will grant severely disabled persons the ability to walk independently (Dominguez 2000, p. 2). Despite early signs that these procedures may offer long-term solutions for hundreds of thousands of people, some research scientists believe that tapping into the human brain is a long shot. The brain is commonly understood to be “wetware”, and plugging hardware into this “wetware” would seem to be a type mismatch, according to Steve Potter, a senior research fellow in biology at the California Institute of Technology’s Biological Imaging Centre in Pasadena. Instead, Potter is pursuing the cranial route as a “digital gateway to the brain” (Stewart 2000, p. 1).
Others believe that it is impossible to figure out exactly what all the millions of neurons in the brain actually do, but it should be remembered that this is the same argument that was presented during initial discussions of the Human Genome Project. See exhibit 8.10 for a diverse range of transponder-based medical applications discussed in section 8.3.
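The neural-signal-to-cursor arrangement reported by Versweyveld can be caricatured as a decoder that turns firing-rate deviations into cursor motion. The mapping below is a hypothetical one-dimensional illustration, not Kennedy and Bakay’s actual method; the baseline and gain values are assumptions:

```python
def decode_cursor(firing_rates_hz, baseline_hz=10.0, gain=0.5):
    """Integrate deviations of a neuron's firing rate from baseline
    into one-dimensional cursor positions (arbitrary screen units).

    Firing above baseline drives the cursor right; below, left.
    """
    x, path = 0.0, []
    for rate in firing_rates_hz:
        x += gain * (rate - baseline_hz)   # rate deviation -> velocity
        path.append(x)
    return path

# Sustained above-baseline firing moves the cursor steadily right;
# a below-baseline sample pulls it back.
path = decode_cursor([14.0, 14.0, 14.0, 6.0])
```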
8.4. Onward the Quest for Immortality
Up until now this chapter has focused on implants that are “orthopaedic replacements”: corrective in nature, required to repair a function that is either lying dormant or has failed altogether. Implants of the future, however, will attempt to add new “functionality” to native human capabilities, either through extensions or additions. Globally acclaimed scientists have pondered the ultimate trajectory of microchip implants. The literature is admittedly mixed in its viewpoints of what will and will not be possible in the future; but one thing history has taught is that if an idea has been conceived, the probability is high that it will come to fruition, perhaps not tomorrow but eventually. One need only consider attempts to clone humans. Warwick’s Cyborg 2.0 project, for instance, intended to prove that two persons with respective implants could communicate sensation and movement by thoughts alone (Dobson 2001, p. 1). The prediction is that terminals like telephones would eventually become obsolete if thought-to-thought communication became possible. Warwick describes this as “putting a plug into the nervous system” (Dobson 2001, p. 1), allowing thoughts to be transferred not only to another person but to the Internet and other media. While Warwick’s Cyborg 2.0 may not have achieved its desired outcomes, it did show that a form of primitive Morse-code-style nervous-system-to-nervous-system communication is realisable (Green 2002, p. 3). Warwick is bound to keep trying to achieve his project goals given his philosophical perspective. And if Warwick does not succeed, he will at least have left behind a legacy and enough stimuli for someone else to succeed in his place; even if, as Berry (1996) says, the prediction comes true 500 years from now, it is ultimately inevitable that it will happen.
8.4.1. Towards the Electrophorus
The predictions being made today can be traced back to the rather crude elucidations of the 1950s and 1960s. The only difference between the pronouncements of today and yesteryear is that today scientists have the means to describe the finer details, because of the technological advancements that have taken place since. Compare Ellul (1964) with Kaku (1998) and Warwick (2002, 2003), for instance, in the following extracts:
Knowledge will be accumulated in “electronic banks” and transmitted directly to the human nervous system by means of coded electronic messages. There will no longer be any need of reading or learning mountains of useless information; everything will be received and registered according to the needs of the moment. There will be no need of attention or effort. What is needed will pass directly from the machine to the brain without going through consciousness… (Ellul 1964, p. 432).
Is it possible to interface directly with the brain, to harness its fantastic capability? Scientists are proceeding to explore this possibility with remarkable speed. The first step in attempting to exploit the human brain is to show that individual neurons can grow and thrive on silicon chips. Then the next step would be to connect silicon chips directly to a living neuron inside an animal, such as a worm. One then has to show that human neurons can be connected to a silicon chip. Last… in order to interface directly with the brain, scientists would have to decode millions of neurons which make up our spinal cord (Kaku 1998, p. 112).
On the 14th of March 2002 a one hundred electrode array was surgically implanted into the median nerve fibres of the left arm of Professor Kevin Warwick. The operation was carried out at Radcliffe Infirmary, Oxford, by a medical team headed by neurosurgeons Amjad Shad and Peter Teddy. The procedure, which took a little over two hours, involved inserting a guiding tube into a two inch incision made above the wrist, inserting the microelectrode array into this tube and firing it into the median nerve fibres below the elbow joint (Warwick).
In particular extra memory and processing capabilities could be a possibility. A person’s brain could be directly linked to a computer network (Warwick).
In essence, Ellul, Kaku and Warwick are pointing to a path which seems to have taken root of its own accord. They are all saying the same thing, except what one can observe is a continual refinement of thought from ‘this is probably what will happen’ (Ellul), to ‘this is how you might go about it’ (Kaku), to ‘this is how the experiment was implemented’ (Warwick).
8.4.2. The Soul Catcher Chip
The Soul Catcher chip was conceived by Peter Cochrane around the time his father died (Coughlin 2000, p. 1). At the time, Cochrane led an R&D staff of 660 biologists, engineers and physicists at British Telecom working on futuristic technologies (Pickering 1999, p. 1). This was the perfect platform from which to launch his ideas to the world. The idea of the Soul Catcher is the preservation of a human way beyond the point of physical debilitation. Cochrane and fellow Extropians believe that there is a way to live forever on Earth. Warwick insists he is “engaged in a noble mission: to save humankind” (C. Jones 2000, p. 1). US Navy Commander Shaun Jones, manager of advanced biotech research programs for DARPA, was quoted as saying: “[o]ur generation may be the last to have to accept death as inevitable” (Stephan n.d., p. 1). And Dr. Chris Winter of British Telecom, who leads an AI team of eight scientists at Martlesham Heath Laboratories near Ipswich, is even more adamant that “[t]his is the end of death.” Winter predicts that by 2030:
…it would be possible to relive other people’s lives by playing back their experiences on a computer. “By combining this information with a record of the person’s genes, we could recreate a person physically, emotionally and spiritually… It would be possible to imbue a new-born baby with a lifetime’s experiences by giving him or her the Soul Catcher chip of a dead person,” Dr. Winter said. The proposal to digitise existence is based on a solid calculation of how much data the brain copes with over a lifetime… Over an eighty-year life, we process 10 terabytes of data (Uhlig 2001).
The Soul Catcher chip, to be implanted in the brain, will act as an access point to the external world (Grossman 1998, p. 1). Consider being able to download the mind onto computer hardware and then creating a global nervous system via the wireless Internet (Fixmer 1998, p. 2). Cochrane has predicted that by 2050 downloading thoughts and emotions will be commonplace (LoBaido 2001, part 2, p. 1). He imagines a world where chip implants are as commonplace as mobile phones (McGrath 2001, p. 1). The chip would also complement human memory. Billinghurst and Starner (1999, p. 64) predict “that artificial intelligence will augment human intelligence to make information management as natural as any other physiological function, freeing human intellect to focus on creative rather than computational function.” For a futuristic scenario of an organic-based Soul Catcher chip implanted in the brain, see table 8.1.
Table 8.1 Brain Implants Futuristic Scenario
 “Unlike the crude relics of 20th century science fiction, the microcomputer exhibited here- not unlike the one implanted in your head- looks organic. Down to the last atom, it seems something only nature could devise. But this chunk of synthetic post-human “brain” was designed in a laboratory. It may not look like much, but this amorphous mass is the springboard for the leap to the level beyond human.”
 “Current implants allow individuals to communicate directly with low-orbiting satellites (with what was once called wireless telecommunications technology) and thus, telepathically, with post-humans anywhere on Earth.”
 “In addition to such information, current implants are able to impart sensory experiences formerly reserved for “real life” encounters. For instance, while you may not have the physical ability to visit the Pacific Ocean, implants interacting with the human nervous system allow you to hear and see the waves, smell the salt air, feel the sand under your feet- all from your static, landlocked location.”
 “Altering body parts was one thing, humans thought at the time. But the brain? The human mind? Consciousness itself? Somehow, our ancestors thought, that was different. Nonetheless, in the late 1990s, some doctors were already using brain implants to stop epileptic seizures.”
 “On the eve of the 22nd century, once computer science was reduced to the atomic level, microscopic computers were designed to monitor the activity of individual neurons, and then to replicate or interact with the neural network at its own scale. This new technology seemingly eliminated the need for silicon chip implants…”
* Source: See http://www.futurefantastic.net/exhibits/implants/implants.htm (2001).
8.5. The Evolutionary Paradigm
You could be forgiven for thinking that all this talk of brain implants belongs to science fiction, but the evidence shows that it certainly does not. When well-known universities in North America and Europe fund brain implant projects, big companies like British Telecom and Nortel Networks support ideas like the Soul Catcher chip and sponsor cyborg experiments, and government departments like DARPA and NASA discuss future possibilities openly, we can be assured that this is not science fiction but increments of science fact. McGrath (2001, p. 1) alludes to the German poet Rainer Maria Rilke, who observed that the “future enters into us long before it happens”.
The future is entering us. We eat genetically modified food. We submit to implanted devices that go well beyond the familiar heart pacemaker. We tinker with human tissue, developing artificial bone and skin for transplantation. We are on the verge of “smart” prosthetics, such as retinal implants that restore vision in damaged eyes. Such devices will ultimately be networked, allowing, say, a subcutaneous chip to transmit a person’s entire medical history to a physician far away… Rodney Brooks, the director of the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology, goes even further. Over time, he says, “we will become our machines” (McGrath 2001, p. 1).
With reference to Kurzweil’s prediction of humans merging with robots, Danny Hillis predicts that the change will happen so gradually that we will sooner or later get used to it as if it had been there all along (Joy 2000, p. 3). Steve Mann (1997d, p. 31) uses an excellent analogy to express this: “Someday, when we’ve become accustomed to clothing-based computing, we will no doubt feel naked, confused, and lost without a computer screen hovering in front of our eyes to guide us”, just as we would feel naked without conventional clothes today. It is in the same light that Warwick remarked of his Cyborg 1.0 implant, “I don’t see it as a separate thing… It’s like an arm or a leg” (Witt 1999, p. 2).
Bartholomew (2000, p. 1) has pointed out this type of evolutionary paradigm in a simplistic yet effective manner: “PalmPilots. Windows CE. Car phones. Cell phones. Armband computers for warehouse management. Bar-code readers. Pagers. Geophysical positioning devices. Where will it all end?” His compelling question “where will it all end?” is rather rhetorical; science believes there is no end. To Bartholomew’s list we could add: RF/ID transponders. VeriChip ID. Cochlear implants. Retina implants. Brain implants. Soul chips… the list can go on and on, bound only by the limits of the imagination. “I think it’s one more step in the evolution of man and technology… There are endless possibilities for this,” says fourteen-year-old Derek Jacobs, who was one of the first to be implanted with the VeriChip (Scheeres 2002b, p. 2). No doubt the teenager is correct in his analysis. But what kind of evolution is Derek really talking about? That which has its foundations in Darwinian theory, perhaps? Darwin’s theory of natural evolution is based on the premise of slow and steady change towards better-adapted creations. But what is happening today certainly cannot be considered “slow and steady change”. Rather, as Kurzweil observes: “[w]e are now entering that explosive part of the technological evolution curve...” (Kurzweil n.d., p. 2). Kurzweil’s Law of Accelerating Returns states that “[t]he evolution of biological life and the evolution of technology have both followed the same pattern: they take a long time to get going, but advances build on one another and progress erupts at an increasingly furious pace.” Fixmer (1998, p. 1) described this plight as humanity’s attempt to accelerate its own evolution (see exhibit 8.11). This paradigm shift is explained well by Steve Mann (1998):
An important observation to make, with regards to the continued innovation, early adopters (military, government, large multinational corporations), and finally ubiquity, is the time scale. While it took hundreds of years for the stirrup to be adopted by the masses, and tens of years for guns to be adopted by the masses, the spread of computer technology must be measured in computer years… this decreasing of the time scale over which technology diffuses through society will have decreased to zero, resulting in a new kind of paradigm shift that society has not yet experienced.
In conclusion, it has been shown that auto-ID devices have a trajectory somewhat different from the intent of their inventors. Initially attached to non-living things and later adapted to be carried by humans, it now seems inevitable that the devices will become one with humans. Converging disciplines are making the realm of the “impossible” possible. For the first time, the attribute of mobility is being linked to automatic identification and wearable computing components, and applied to completely non-traditional areas of electronic commerce. Of course, some resistance will be experienced initially, but as society continues to change, becoming more and more techno-centric, it will decide what auto-ID will be used for, even if that has little to do with what the technology was originally designed for (Branwyn 1993, p. 6). Society continues to be increasingly dependent on the promise of technology, and it is difficult to see who, and how many, will resist the ultimate hope of “living for ever”. It is important to note here that the accomplishment is not in the rise of the computer/information age but, as Grier (2000, p. 83) puts it, in “the vision, it has maintained”. When the ENIAC was publicly announced in 1946, no one could predict its ultimate impact. The founder of IBM forecast a worldwide market of five computers (Coughlin 2000, p. 1)! The same could be said for brain implants today, but we should at least pay due respect to the lessons of history. Perhaps what we really need to do is start afresh, considering the implications that such developments may have without discounting them outright as science fiction (see chapter nine). One reason this chapter depended so heavily on quoting current research was to dispel the myth that this type of dialogue is premature. It obviously is not. The force of the momentum is such that continual attempts will be made to go beyond that which has already been achieved.
It is not enough to begin discussing possible implications when the technology reaches the early adoption stage, for by then the technology will have taken root, as it seems to have done already to some degree. Ultimately humanity will have a choice and, as Warwick has openly stated, hopefully it will be an individual choice: for those who would like to remain mere humans and those who would like to continue to evolve.
 A summation of this chapter was published as a case study in Lawrence et al. (2002), titled “The Automatic Identification Trajectory” (K. Michael 2002c, pp. 131-134, 136).
 For an in-depth discussion on the paradigm of digital convergence see Covell (2000), Baldwin et al. (1996), and Greenstein and Khanna (1997, pp. 201-226).
 In 1968 Ivan Sutherland discussed a head-mounted display, though the device was not mobile and could only be used from a fixed location (S. Mann 1997d, p. 25). Throughout the 1980s and 1990s Steve Mann continued to develop head-mounted ‘wearable’ units. For a brief history of wearable computing, see http://wearables.www.media.mit.edu/projects/wearables/timeline.html (2001). For a list of educational resources on wearable computers see Billinghurst and Starner (1999, p. 60).
 See Starner 2001, p. 44 for an extensive definition. See also Baran (1996, p. 36). “Wearable computing represents an unusual intersection of science, engineering, design and fashion...” (Starner 2001b, p. 60).
 See also (L. Cooper 1999, p. 3) who quotes Bass about wearable computing characteristics: both hands must be free, should be integral to a person’s clothing not just attached, the user must maintain control, and it must be constant. For the ideal attributes of wearable computing, see Starner (2001, p. 46).
 See “The PC in your wallet” (Levin 1994, p. 29).
 See Starner (2001a, p. 44). See also Siewiorek (1999, p. 82) who also believes that wearable computers do in fact incorporate such things as pagers and cell phones that “…have already achieved wide public acceptance.”
 See also http://www.research.philips.com/pressmedia/releases/000801.html (2000).
 McGinity (2000, p. 18) describes how computers are becoming more and more wearable, “more meshed with the body” as “processing power builds while device size shrinks”.
 One of the earliest prototypes of campus-aware wearable computing was the Metronaut. Smailagic & Martin (1997, p. 116) described the Metronaut as “a novel wearable computer which captures information, sense position, provides wide range communications, consumes less than one watt of power, and weighs less than one pound. Metronaut employs a bar code reader [visitor position], a two-way pager for communications, and an ARM processor for computation.”
 Schiele et al. (2001, p. 44) describes “[a] personal computer [that] should be worn like eyeglasses or clothing and continuously interacts with the user on the basis of context”.
 Most believe that these wearable devices will augment human memory by providing access to information when it is needed, from any location. Steve Mann in particular is a proponent of this idea.
 “Most importantly, wearing a computer must be possible without altering or in any noticeable way interfering with the wearer’s appearance and manner toward others” (Lukowicz et al. 2001, p. 16). See the MIThril System using the WearARM processing core. It is recommended that future wearable computers be worn underneath at least the top level of the wearer’s clothing. “Target outfits for the MIThril include a vest for warm weather and casual attire, a jacket for cold weather and more formal dress, and a sash for use in nations where Western attire is inappropriate” (p. 18). See also the VibraVest (i.e. the BlindVision project) that incorporates the VibraTach. The VibraVest is worn as a garment.
 “The goal of an audio environment for a wearable computing system is to convey relevant information to a nomadic listener based on the context of her tasks and the timely nature of her messages…” (Sawhney & Schmandt 1997, p. 171). See also Furui (2000, p. 3735-3738). Another example of how wearable computing has taken advantage of auto-ID techniques is in the Xerox PARCTab solution. “The PARCTab is essentially a PDA with an active badge attached to it, which is in continuous wireless communication with a central server” (Brown et al. 1995, p. 6/1).
 Of course one always needs to keep an objective view of what “we can all afford them” actually means; let us not forget the less developed countries whose inhabitants may be homeless and hungry.
 See Baig (2000, p. 3D).
 See Bloomberg (2001, p. B9).
 See high-tech watches (Alphonso 2000, p. A1).
 Miastkowski (2000, p. 1) does point out, however, that there is a stark difference between today’s mobile devices and the wearable computers of the future. He says, “[y]ou may wear a pager or a cell phone on your belt. And if you’re a genuine gadget freak, you might even wear one of those oh-so-stylish Leatherman multi-purpose tools in its own holster. But are you ready to wear a computer, complete with a head-mounted display?” Rummler (2001, p. 1), on the other hand, believes that we have become so used to carrying automatic devices, and that society has become so fully integrated with information, that “implanting a computer chip/ processor internal to the human body doesn’t seem that strange.”
 See McGahan et al. (1997, pp. 230-235) on the development and evolution of the PDA. It is important to note that the rest of the chapter also highlights convergence across disciplines.
 See Toshiba’s mobile PC-phone (Stoddard 1997). As Gorlick (1999, p. 114) pointed out, however, this does not mean that we should expect to wear only one device: “[i]nstead we will wear a multitude of digital devices, some general purpose and some specialised. All of these devices will require power and their utility is greatly amplified if they can intercommunicate.” In other words, what can be integrated or converged will be; the other pieces, however, will remain separate, a bit like add-ons.
 See also Ebringer et al. (2000, p. 54) and http://www.nokia.com/ (2003). Some of the future wireless applications are also discussed in Martin (1995, pp. 3-4). See also IBM’s wireless handheld device for airline check-in. “About the size of a deck of cards, the handheld marries three different technologies: an IBM badge computer, an AiroNet IEEE 802.11 wireless LAN card, and an RF reader… travellers do not need boarding passes” (Wilcox 2000, p. 1). “Gate readers cause digital photographs of passengers to appear during boarding for security check. Flight attendant PDAs and reception desk laptops receive digital photographs and flight records of passengers…” (Zimmerman 2001, p. 1224). See also Nike’s versatile sporty MPEG-3 player, http://special.scmp.com/mobilecomputing/Mobilemad.html (2000), Compaq’s Pocket PC, and Sony’s Clié.
 See Cohen, J. (1994, pp. 231-232) “from hardware to hardwear”.
 For a variety of industrial wearable device research and development see the following sources. Billinghurst and Starner (1999, p. 59) describe an experiment that took place at Carnegie Mellon University: “In their experiments, a technician wore a head-mounted display and camera that let a remote expert see the technician’s workplace. As the expert viewed what the technician saw, he sent relevant manual pages, which appeared on the technician’s head-mounted display….” See Lewis et al. (1998, p. 1) for Honeywell’s novel displays: “[t]he wireless device can either be worn on the wrist or pocketed for occasional hand held viewing… The second display system… is worn and used like binoculars.” See Boronowsky (2001, p. 163) for an overview of the Winspect project, “an application of wearable computing in an industrial inspection process.” For a case study on applying wearable computing to field archaeology see Baber et al. (2001, pp. 169-170). For a comprehensive overview of industrial wearable computing components see http://www.xybernaut.com/ (2003). “Right now Xybernaut is marketing to corporate users… The current system is tailored to maintenance, inspection and troubleshooting work. A typical user may be a phone technician who spends part of his day up a telephone pole” (Hoper 2000, p. 2). For a more futuristic factory environment that takes advantage of “[u]sing light headsets and hand-held or worn computing equipment, [that allows] users [to] roam their daily work environment while being continuously in contact with their computer systems”, see Klinker (2000, p. 37). This concept is known as augmented reality (AR).
 See McCarthy, J. (1997, p. 43) who showcases the IBM wallet pocket PC.
 See Nechamkus (2000), where Xybernaut announced that Dr. Steve Mann joined their board of advisors.
 Other advanced wristwatches include Matsucom’s OnHand™ PC wristwatch and Nintendo’s Gameboy-turned-prototype WearBoy; see especially the two applications ActiveJewel and BubbleBadge (Ljungstrand et al. 1999). IBM has also launched the WatchPad prototype, a “watch [that] is capable of synchronising data and images with a portable computer or PC via wireless connections… the WatchPad is capable of handling text, photos, and animation” (Wilcox 2000, p. 1).
 For applications of the Digital Angel for consumers, commercial and medical, see http://www.digitalangel.net/consumer.asp (2002). “The purpose of Digital Angel is to monitor the location of a person as well as selected biological functions, find a person, animal or object anywhere in the world at anytime, and to advise subscribers of precise geographical location and biological and other sensory data on a real time basis” (Raimundo 2002, p. 1). For answers to frequently asked questions see Digital (2002a, pp. 1-5). See also Harrison (2000).
 See Fowler (1997, pp. 24-34).
 See also Microgistics that developed WalkMate, “…the device [is] used by college students to alert campus police if they’re in danger…” (Mieszkowski 2000, part 2, p. 3).
 Thinking Tags are an example of low-end wearable platforms that have been built and tested (Borovoy et al. 1999). Passive RF/ID tags and transponders could also be considered low-end wearable devices.
 See also WhereNet technology “that has licensed its technology to companies that make bracelets worn on the wrist or pager-like devices carried in the pocket or purse” (Mieszkowski 2000, part 2, p. 3). See also the Skyaid Watch at http://www.skyaid.org/LifeWatch/life_watch (2003).
 See Small et al. (2000, pp. 355-358), Berger (2002), http://news.bbc.co.uk/1/hi/health/1336840.stm (2001), http://news.bbc.co.uk/1/hi/health/1667050.stm (2001), and http://news.bbc.co.uk/1/hi/health/1631893.stm (2001).
 “Clothing is with us nearly all the time and thus seems like the natural way to carry our computing devices. Once personal imaging is incorporated into our wardrobe and used consistently, our computer system will share our first-person perspective and will begin to take on the role of an independent processor, much like a second brain- or a portable assistant that is no longer carted about in a box… Such computer assistance is not far in the future as it might seem” (S. Mann 1997d, p. 25). See also http://www.wearcam.org/computing.html (Starner 1995).
 For a case study on smart pants see Laerhoven and Cakmakci (2000, p. 77).
 “Ultimately, wearable computers are clothes. A user might wear a display in a pair of sunglasses, keep a computer in a belt buckle, and have a keyboard woven into a jacket” (Starner 2001a, p. 49). Suspenders are also considered by some to be an effective bus for transmitting information between one wearable device and another, e.g. from eyeglasses to belt buckle (Gorlick 1999, pp. 114-121).
 S. Mann calls smart clothes, underwearables. “The ‘underwearable’ is a computer system that is meant to be worn within or under ordinary clothing. The first ‘underwearables’ were built in the early 1980s, and have evolved into a form that very much resembles a tank-top” (S. Mann 1997a, p. 177).
 For some very innovative example applications of smart clothes that are bound to broaden your imagination, see Kastor (2000, p. 1). “Your clothes don’t talk to you now but someday they may.”
 Other sources on wearable computer-related issues include: Farringdon et al. (1999), Newman and Clark (1999), Hahn and Reichl (1999), Martin and Siewiorek (1999), Starner and Maguire (1998), Tan and Pentland (1997), Özer et al. (2001), Starner et al. (1998), Cheng and Robinson (1998), Fickas et al. (1997), Dey et al. (1999), Furui (2000) and http://wearables.www.media.mit.edu/projects/wearables/ (2001).
 For wearable computers mounted in eyeglasses see the MicroOpticals Eyeglass System and Sony GlassTron (Spitzer et al. 1997).
 The infrastructure for the Aware Home has sensors on the floors, RF transmitters that provide location information (i.e. rooms), cameras to pick up movement, and microphones to track sound. The notion of ubiquitous architecture is based on contextual awareness principles. See also Al-Muhtadi et al. (2001) who discuss the smart and active physical space.
 See Starner et al. (2000, p. 87) and Starner (2001, p. 47). See also Toney for the novel method for joint motion sensing on a wearable computer, featuring a data glove. “To be effective, wearable computers require an interface that reacts to the gestures and motions of the person wearing the computer” (Toney 1998). In addition, see Polat et al. (2001, p. 35), “[t]racking body parts of multiple people in a video sequence is very useful for face/gesture recognition systems as well as… HCI interfaces”.
 See the Ring Sensor 24-hour patient monitoring device (Rhee et al. 1999, p. 786).
 See McCarthy, J. (1997, p. 38) for a case study on the Australian Army and its vision for wearable IT which could be taken into battle.
 See http://news.bbc.co.uk/hi/english/sci/tech/newsid_8060000/806440.stm (2000). Computerised shoes were first conceived by Tom Zimmerman (S. Mann 1997d, p. 28).
 Mark Weiser of Xerox PARC first predicted that computerised shoes could be used by shoppers in a store to guide them to the merchandise they needed via an electronic floor plan (S. Mann 1997d, p. 28).
 An electrophorus can also be considered a “carrier of electricity”. The root “electro” comes from the Greek word meaning “amber” and “phorus” means to “wear, to put on, to get into”. To electronise something is “to furnish it with electronic equipment” and electrotechnology is “the science that deals with practical applications of electricity”. The World Book definition of electrophorus is “a simple device for producing charges of electricity by means of induction.” The term “electrophoresis” has been borrowed here, to describe the act that an electrophorus is involved in. The following dictionaries and link were consulted for the above definitions: World Book Dictionary A-K (1973, pp. 672-673), the Penguin Agglo-Ellenikon Lexicon (1981, p. 270), Eleftheroudakes Mega Elleno-Agglikon Lexicon (1960, p. 1614), and www.vdivde-it.de/innonet/doks/multhaupt.pdf (2002). Consider the following: “…electricity is in effect an extension of the nervous system as a kind of global membrane” (McLuhan, E. et al. 1995, p. 94).
 There seems to be some disagreement over the definition of cyborgs and of cyborgology, the study of cyborgs. Chris Hables Gray maintains that anyone who has been immunised to some degree is a cyborg. “Cyborgology is the study of systems that include both living and dead elements, or you could say natural and artificial. Or you could say invented and evolved” (I. Walker 2001, p. 1). Contrast this definition with the early definitions of cyborg in the 1960s given by Manfred Clynes and Nathan Kline, related to a human-computer symbiosis (Starner 2001a, pp. 44f). Contrast the above definitions with that of the term cybernetics: “[t]he study of nervous system controls in the brain as a basis for developing information-processing and communications technology” (Hansen 1999, p. 72).
 Saffo, director of the Institute for the Future, does not doubt that people may become a race of cyborgs- “part man and part machine.” “We put all sorts of implants in [our bodies] today,” says Saffo. “If we have metal hips, it only makes sense to have chips in, too” (Eng. 2002).
 “Peter Lipton, a science history professor at Cambridge, says that Warwick may be overstating the revolutionary magnitude of his body’s bionic component” (Trull 1998, p. 2). For Lipton it seems the textbook “cybernetics” definition should apply rather than Warwick’s assumed cyborg terminology. Only if Warwick were able to successfully demonstrate his Cyborg 2.0 project would Lipton probably acknowledge that Warwick had made a major step forward. Lipton said the Cyborg 1.0 trial was “…similar to the sort of clicker we use to lock and unlock our cars. The fact that the chip now goes under the skin should not be exaggerated as a breakthrough” (Trull 1998, p. 2). Personally, I do not consider Warwick’s Cyborg 1.0 to have been bionics, as Lipton states, but an implantation of an RF/ID transponder, almost identical in nature to pet ID implants. In Cyborg 1.0, Warwick was an electrophorus for less than ten days only. He was advised to remove the chip so that it would not become permanently attached to the subcutaneous layer of skin. See exhibit 8.6 above.
 Another who has offered himself as a “guinea pig” for chip implants is Dr. Richard Seelig, “a former surgeon but now a medical consultant for ADSX, became the first to embed a VeriChip in his arm and hip on Sept. 16. He says his decision to become a willing guinea pig came when he saw World Trade Centre rescue workers scrawl information on their skin as an identifying marker should they get hurt in the wreckage” (Eng. 2002). See also Goldman (2002), Streitfeld (2002), and Gargano et al. (1997).
 “Now having a personal chip is becoming, well, not quite the norm but a ready possibility” (Mieszkowski 2000, part 1, p. 2). See also Wakefield (2002).
 See also Nelson (1999, pp. 1-3).
 One of the earliest papers written on the topic was by Branwyn, appearing in Wired 1.4 in 1993. Another interesting paper is by Hutchins http://www.standford.edu/~gmh/hw8 (2000) on “implantable PDAs”. One may consider the idea far-fetched, i.e. how could a PDA be implanted into the human body? However, the smaller PDAs become, the more possible it will be, especially given that the implant would not require a visual display unit but rather some audio device. According to Hutchins’ work-in-progress paper (2000), “[t]here are no full featured implantable products currently on the market. However, many of the necessary pieces already exist. Most of the devices are currently aimed at replacements for missing senses… First and foremost is our limited understanding of how to interface with the brain… Another problem is long lasting power sources. Current implant devices have rather limited lifetimes.”
 For a full range of applications see ADS (2002d). The VeriChip is even being marketed as a way to withdraw funds from ATMs given the increase in fraudulent activities (p. 3).
 See also ADS (2002a; 2002b) and Newton (2002).
 “A doctor would insert the chip with a large needle-like device” (Associated Press 2002c).
 “The capsule-shaped VeriChip, 11.1 mm long by 2.1 mm in diameter, has a 125 KHz RFID chip, tiny electromagnetic coil and a small memory, all encapsulated in a biocompatible enclosure” (Murray 2002).
 See also Gordon Bell (1997), who should be credited with the idea of the ‘guardian angel’ for emergency services. The Digital Angel is especially being marketed to reduce the incidence of child kidnapping. Matthew Cossolotto of ADSX reports on the advantages of an implanted device: “[w]ith an implanted device, the child doesn’t have to remember to wear it. It can’t be lost or stolen or stripped away. And it’s totally concealed” (Farrell 2002). ADS (2002a) is also working on a new product called the Personal Location Device (PLD).
 Additional fees and charges apply on a transaction basis with the Digital Angel service. It is the opinion of this researcher that the service fee to the end-user almost mimics that of a standard telephone operator or Internet service provider (ISP). A typical revenue model for a service provider is based on a connection fee (i.e. for installation), plus a monthly fee (i.e. for rental), plus a usage fee (i.e. based on the number of transactions). Digital Angel, it seems, may soon be bundled in with other household services.
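 The connection-plus-rental-plus-usage structure of such a revenue model can be sketched as simple arithmetic. All dollar figures below are hypothetical, chosen purely for illustration:

```python
def service_bill(connection_fee, monthly_fee, usage_fee, transactions,
                 first_month=False):
    """Total monthly charge under the connection + rental + usage model:
    a one-off installation fee, a flat rental, and a per-transaction fee."""
    bill = monthly_fee + usage_fee * transactions
    if first_month:
        bill += connection_fee  # installation charged once, up front
    return bill

# Hypothetical fees: $50 installation, $20/month rental, $0.25 per transaction.
print(service_bill(50.0, 20.0, 0.25, 100, first_month=True))  # 95.0
print(service_bill(50.0, 20.0, 0.25, 100))                    # 45.0
```

The same three-part structure describes a telephone operator or ISP bill, which is why the comparison in the text holds.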
 See ADS (2002f). This web page shows a close-up of the VeriChip that is implanted in the subdermal layer of the skin. See also exhibit 8.7 on the previous page.
 See (ADS 2002e) for VeriChip pre-registration and a picture of the ChipMobile on the move.
 It is thus not surprising to see the British Army involved in an experimental microchip ID program called APRIL (Army Personnel Rationalisation Individual Listings). According to LoBaido (2001, p. 1), the army is hoping not only to identify its soldiers but also to track their movement. The chip implant would aid with administrative and bureaucratic matters. “The microchip is placed in the back of the neck… While the chip is active, soldiers would be tracked by the central electronic management system or “ERMS” in Glasgow… Ministry of Defence officials in London told WorldNetDaily that the “entire British Army could be microchipped by the year 2010,” if the trial program is successful… “The chip, which is implanted in the neck, would have many uses,” explained British Col. M.W. Jones, “one of which would be to replace the current ID card. This would protect the identity of those in the armed forces and prevent lost ID cards falling into the wrong hands. A continual database would show the whereabouts of every serving member of the armed forces giving commanders much greater control on the battlefield” (LoBaido 2001, pp. 1-2). Please note that this researcher has been unable to independently confirm this story; little has been found to verify that a British Col. M.W. Jones exists or that the reports above are accurate. Yet, given that the information was reported by WorldNetDaily, a respectable Internet news source, and that nothing in the report is contrary to scientific fact, there is little reason to doubt it. See also Icke (2001).
 “The ultimate goal of mobile multimedia systems is to assist their users all the time and everywhere by providing the right information at the right place in the right manner” (Krikelis 1999, p. 13). See also The Sun-Herald, (2002 p. 27).
 For wireless personal and multimedia communications see Kato (1997, pp. 1-4). See also two companies specialising in this form of communication, RIM and Blackberry.
 Echoing many other writers, Pentland (2000, p. 107) says, “[s]mart environment, wearable computers, perceptual user interfaces, and ubiquitous computing generally are widely thought to be the coming of “fourth generation” computing and information technology. Because these embedded computing devices will be everywhere- clothes, home, car, and office- their economic impact and cultural significance are expected to dwarf previous generations of computing. At a minimum, they are among the most exciting and economically important research areas in information technology and computer science.”
 It is important to note at this point that 3G mobile systems have promised much and delivered little at the present time throughout the world. While 3G licences have been purchased in many countries for large sums, installing a full 3G network presently seems cost-prohibitive, with payback periods in excess of twenty-five years in some cases. A realistic article has been written by Zeng et al. (2000, pp. 94-104) on the harmonisation of 3G mobile systems, which is worthwhile reading to gain a balanced perspective. Zeng writes: “[t]he global 3G standard is expected to play an important role in the multimedia society of the next millennium and reshape the worldwide telecommunication infrastructure.” See also the importance of standards by Starner (2001b, p. 56).
 See Starner (2001a, p. 48) for other typical 4G mobile service scenarios.
 “The Global Positioning System is a 24-satellite constellation that can tell you where you are in three dimensions. GPS navigation and position determination is based on measuring the distance from the user position to four GPS satellites, it is possible to establish three coordinates of a user’s position (latitude, longitude and altitude) as well as GPS time… Originally developed by the Department of Defence (DOD) to meet military requirements, GPS was quickly adopted by the civilian world” (Pace et al. 1996, pp. 237f). See also The Times (1994).
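 The position-determination principle quoted above, solving for three position coordinates plus the receiver clock error from distance measurements to four satellites, can be illustrated with a minimal Gauss-Newton solver. The sketch below assumes exact pseudorange measurements; the satellite coordinates (in km) in the example are invented for illustration, not real ephemeris data:

```python
import math

def gauss_solve(A, b):
    """Solve a small square linear system A·x = b by Gaussian elimination
    with partial pivoting (kept dependency-free on purpose)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        piv = max(range(i, n), key=lambda k: abs(M[k][i]))
        M[i], M[piv] = M[piv], M[i]
        for k in range(i + 1, n):
            f = M[k][i] / M[i][i]
            for j in range(i, n + 1):
                M[k][j] -= f * M[i][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def solve_gps(sats, pseudoranges, iters=20):
    """Estimate receiver position (x, y, z) and clock bias b from four
    satellite positions and pseudoranges rho_i = ||p - s_i|| + b,
    via Gauss-Newton iteration."""
    x = [0.0, 0.0, 0.0, 0.0]  # initial guess: earth's centre, zero bias
    for _ in range(iters):
        A, r = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            dx, dy, dz = x[0] - sx, x[1] - sy, x[2] - sz
            dist = math.sqrt(dx * dx + dy * dy + dz * dz)
            # Jacobian row of rho w.r.t. (x, y, z, b), and the residual.
            A.append([dx / dist, dy / dist, dz / dist, 1.0])
            r.append(rho - (dist + x[3]))
        x = [xi + di for xi, di in zip(x, gauss_solve(A, r))]
    return x

# Invented satellite positions (km); receiver on the surface, 1 km clock bias.
sats = [(20000.0, 5000.0, 18000.0), (-18000.0, 12000.0, 15000.0),
        (5000.0, -22000.0, 14000.0), (2000.0, 3000.0, 26000.0)]
truth = (6371.0, 0.0, 0.0, 1.0)
ranges = [math.dist(truth[:3], s) + truth[3] for s in sats]
print(solve_gps(sats, ranges))  # should closely recover `truth`
```

In a real receiver the measurements are noisy and more than four satellites are usually visible, so the same linearised system is solved in a least-squares sense rather than exactly.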
 For a comprehensive overview of the origins and evolutionary developments of GPS see Pace et al. (1996). “GPS had a military origin, and its technology arose from projects designed to support strategic nuclear and tactical military missions” (p. 196). In 1983 GPS was opened up to civilians as well, whereas previously it was just for US military use (pp. 247-250). This change in 1983 had major implications as it meant that other countries could also use the capabilities of GPS.
 See the palm-sized GARMIN GPS. See also Fleming (2000, p. 6D) for the NatTalk integrated cellular phone, GPS and pager.
 Some of these companies, like Sky Eye, specialise in non-human tracking, tracking instead the location of cars, locomotives or containers throughout the globe. “Sky Eye supplies valuable information on logistics, car handling or car health. [They] can provide container diagnostics, as well as load and locomotive monitoring. By using this information to pinpoint bottlenecks, [one] can reduce out-of-service time, thereby improving operational efficiency and boosting productivity.” See http://www.sky-eye.com/en/index.html (2002).
 There are over one hundred thousand missing persons in the U.S. alone (M. Miller 2001).
 “Foreign executives and other individuals who are frequent kidnapping targets in Latin America will soon be able to use implantable ID chips and personal GPS devices in an attempt to thwart their abductors” (Scheeres 2002a, p. 1). “Cunha Lima, a veteran politician, has served in public office for more than 22 years… ‘I believe this technology will contribute to public safety and security of Brazilians… I believe this technology will act to deter the shocking rise of kidnapping of the children of businessmen’ (Scheeres 2002c, pp. 1f). See also Horn (2000, p. 1).
 For instance, “[T]he Digital Angel Corporation has had overwhelming response from the Alzheimer’s community. Since our original intention was to develop a technology that would help save lives, the logical conclusion was to develop products that would help protect persons who may wander- such as people afflicted with Alzheimer’s or dementia; autistic children or adults; or young children” (Digital 2002a, p. 1).
 “ADS’s Nov. 7 press release had stated: ‘Using Digital Angel’s advanced location and monitoring technologies, State authorities will be able to monitor the location of parolees on a real-time basis… The system is designed not only to monitor the location of parolees, but also provide the appropriate authorities with an advanced warning when violations occur’” (Gossett 2002, pp. 1f).
 We are informed that “[P]ro Tech is currently the only company having units actively tracking convicted offenders. Their products are used in 120 jurisdictions in 27 states” (Gossett 2002, p. 2).
 “In a few weeks, 14-year-old Derek Jacobs of Boca Raton, Fla., will have a computer chip implanted in his left arm. A doctor will inject it through a syringe, into an anesthetized area, probably near his shoulder. There, the radio frequency chip will act as a sort of “human bar code,” identifying Derek through his skin to anyone nearby equipped with an appropriate electronic reading device” (Murray 2002). See also Chen (1998, pp. 82-84) for wireless communications as applied to the field of biomedical applications.
 “According to Bolton, the company has already started experimenting with combining the VeriChip with another ADS product called the Digital Angel. That pager-sized device allows caregivers and parents to monitor the health and whereabouts of seniors and children through the use of space-based Global Positioning Systems (GPS) satellites. In the migration path, those two products can be bundled together…” (Eng. 2002).
 The only way to unlock the watch is with an external key fob. This has major implications for the end user. There is always the possibility that an overprotective parent will make their child wear the watch even into their teenage years, for reasons quite separate from safety concerns. In addition, a kidnapper could cause bodily injury in removing the watch from the user so that tracking ceases, whether the watch is locked in place or not.
 In a compelling manner, “[t]he company says the device, which is the size of a wristwatch-face and could eventually be made even smaller, could be used to find kidnapped children, locate young kids who wander away from parents and track teens who participate in risky behaviour” (Farrell 2002).
 “Most embedded chip designs, such as ADS’s VeriChip, are so-called passive chips which yield information only when scanned by a nearby reader. But active chips- such as the proposed Digital Angel of the future- will need to beam out information all the time. And that means designers will have to develop some sort of power source that can provide a continuous source of energy, yet be small enough to be embedded with the chips” (Eng. 2002).
 The technical hurdles GPS technology must overcome include the size of the GPS receiver chip, and legal, privacy and regulatory issues (e.g. the FDA in the U.S.).
 See Rungsarityotin and Starner (2000, pp. 61-68) for how to find a location using omnidirectional video on a wearable computing platform. See also Khan et al. (2001, pp. 331-336) for human tracking using multiple cameras, also Davis et al. (2000, pp. 171-178), Skelly and Chizeck (2001, pp. 59-68) and Bobick and Davis (2001, pp. 257-267). An excellent prototype which takes advantage of real-time face recognition is the PersonSpotter- “a fast and robust system for human detection, tracking and recognition. Real-time face recognition system which is able to capture, track and recognise a person walking toward a stereo CCD camera” (Steffens et al. 1998).
 For intelligent environments and active camera networks see Trivedi et al. (2000, pp. 804-809).
 It has frequently been maintained in recent times that “…a paradigm shift is underway to move us towards Universal Personal Communications [UPC], which is built upon a person-oriented communications infrastructure and promises personalised services” (Kripalani 1994, p. 25). Furthermore, “[P]ersonal Communications systems provide the capability to communicate anywhere, anytime with anybody by featuring personal, person-based, and personalised services to serve different customers’ different requests” (Kato 1997, p. 1).
 Bhattacharya (2000, p. 1578) judges well when he concludes, “[t]he convergence of wired/wireless telecommunication networks and IP-based data networks to form a seamless global personal communication system has set up the stage for a network independent universal location service.”
 As Kipreos correctly predicted in 1993, when excitement about UPT was initially stirred, “[i]t is a very successful business- one that can fulfil the dreams of millions and millions of people the world over for anywhere, anytime, direct communication based on dedicated phone numbers, whether for business or personal use, for everyday use as well as in emergencies” (p. 334).
 “There are essentially two representations [for location space positioning]: (i) geometric, where the location is specified as an n-dimensional coordinate, and (ii) symbolic, where the space is divided into named zones. For example, position tracked by the… GPS is geometric, whereas the cell-id broadcast received by a mobile phone is essentially symbolic information” (Bhattacharya 2000, p. 1578). The requirement in the future will be to bridge the gap between these two systems/networks so that advanced services can be offered without interruption, independent of the location of the user.
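 The distinction Bhattacharya draws can be illustrated with a small sketch. The zone names and bounding boxes below are hypothetical; the point is only that a symbolic representation divides space into named zones, so bridging the two representations amounts to looking up which zone contains a geometric coordinate:

```python
# Hypothetical floor plan: each symbolic zone is an axis-aligned box,
# ((x_min, y_min), (x_max, y_max)), in metres.
ZONES = {
    "Building A / Room 101": ((0, 0), (10, 8)),
    "Building A / Corridor": ((10, 0), (14, 30)),
    "Car Park": ((14, 0), (60, 30)),
}

def to_symbolic(x, y):
    """Bridge the two representations: map a geometric (x, y)
    coordinate to the named zone containing it, or None."""
    for name, ((x0, y0), (x1, y1)) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

print(to_symbolic(5, 4))    # inside Room 101
print(to_symbolic(30, 10))  # inside the car park
```

A GPS fix would supply the geometric coordinate, while a mobile phone’s cell-id is already symbolic; a universal location service would need mappings like this in both directions.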
 “For example, in a demonstration after Florida resident Leslie Jacobs was implanted with the VeriChip, Applied Digital Solutions CTO Keith Bolton ran a scanner over her arm which displayed not just an identification number, but her name, telephone number and a condition known as ‘mitral valve prolapse,’ a heart murmur. This information could be helpful to medical professionals,” said Bolton, “in the event she can’t speak, to save her life” (Gossett 2002, pp. 4f).
 It is often announced and with ever increasing confidence, “[m]icrochip implants may be the next best thing to a miracle cure for disabilities” (Brooks 2001, p. 1).
 It was with great surprise that I came across this source in early 2002, which is why it has been omitted from the literature review.
 Now compare Banbury with the following extract regarding the VeriChip: “Applied Digital Solutions originally planned to sell the chip to people with pacemakers or other internal medical devices as a way of transmitting health information- such as allergies- to hospital workers in emergency situations. The second product… combines a watch and a device the size of a pack of cigarettes that clips onto a waist band or a belt like a pager for Alzheimer’s patients or others who stray in mind” (Scheeres 2002a, pp. 1f).
 “Biochips are glass slides, or other substrates, upon which biomolecules are placed and immobilised. They perform the job of a miniaturised biochemistry. Biochips enable faster and more efficient gene detection of gene mutation” (Wales 2001). See also PC Magazine (1999), which describes how technology companies are getting involved in biochip developments. For a graphical demonstration of a smart pill see http://www.biomems.net/index.html (2003) and http://www.sun-sentinel.com/graphics/news/smartpill (2003).
 See Lee, senior technology advisor with the National Cancer Institute in Bethesda, Maryland, who predicts that biochips could be implanted in people to provide early warning cancer signs (ÓhAnluain 2001, p. 2).
 See Rodda (2000, p. 9) for computers that keep you healthy from the inside.
 See hormone-producing implants like the Norplant used for birth control. Recently there has been much press about women blaming doctors for implants that did not work, resulting in unplanned pregnancies. See Teutsch (2003).
 See polymerase chain reaction (PCR) amplification in silicon (Zhan et al. 2000, p. 25).
 “Researchers at… [MIT] in the US have created a prototype chip around the size of a 10-cent coin that contains tiny chemical reservoirs each sealed with a gold cap. At a preprogrammed time, a memory chip melts the cap by applying a small electrical voltage, releasing the chemical stored inside” (The Lab 1999).
 “The use of microscopic chips will take heart disease treatment to the next level,” said Dr Robert Michler, director of the research and chief of cardio-thoracic surgery at University Medical Centre at OSU (Swissler 2000, p. 1).
 “Engineers have developed a technique that might be used to glue cells or DNA to the surfaces of computer “biochips,” a technology aimed at making diagnostic devices to be implanted in the body or used to quickly analyse food and laboratory samples” (Venere n.d., p. 1). See also Moore (2001).
 See Thomson (1999, pp. 1-3) who features Professor Michael Cima, John Santini and Professor Robert Langer on non-conventional methods of drug delivery.
 See also NZ Herald (2001, p. B2) for an implant that is intended for women with orgasmic dysfunction. Medtronic is expected to start clinical trials of Meloy’s prototype. Also the unofficial Zorn Device, similar to Meloy’s invention, is supposedly available as an untested underground product built by a neurohacker (Branwyn 1993, p. 4).
 For a case study on a deaf patient, Mukunda Kantamneni, who received UIHC’s (University of Iowa Health Care) first auditory brain stem implant in 1996, see http://www.uihealthcare.com/news/pacemaker/pacemaker96/pmapr96.html (1996, pp. 1-2).
 “Sound is received by a microphone, processed by a minicomputer… and the electrical signals are transmitted to the implant by radio-frequency communication. After decoding of the signal, the multiple electrodes directly activate nerve cells that communicate information about sound to the brain” (Wells n.d., p. 4).
 For the company that developed Nucleus see http://www.cochlear.com (2001). For the actual nucleus device see http://www.Cochlear.com/euro/nucleussystems/ci24m.html (1999).
 Mieszkowski (2000, part 2, p. 2) states the argument plainly- “[p]icture a system that would constantly monitor a heart disease sufferer’s pulse rate or a diabetes patient’s sugar levels and notify medical help when things were looking dangerous. We accept pacemakers as a necessary and important technology to extend and enhance the quality of lives. How is this any different?”
 See Dagnelie and Massof (1996).
 However, as it has been admitted, “[t]here are very many complex engineering problems in this project… We had to consider biocompatibility of the device and how to provide a reliable power supply. We also had to design an electrical circuit that conforms to the biological specifications” (Frontiers 1997, p. 1). For biocompatibility issues see Sood (2000, p. 1).
 For an overview of how retinal implants work see McCabe (2001, pp.1-2).
 See Ganey (2001, pp. 1-5) and Butler (2000, p. 1).
 See Wyatt and Rizzo (1993, 1996, 1997) for published papers in IEEE Spectrum and The Neuroscientist.
 “[T]he researchers face many challenges. Researchers are uncertain whether the retina will tolerate the foreign device for a long period of time, as well as the fact that the microelectronics of the epiretinal chip will soak in the salty vitreous humor… Another concern is the constant motion of the eye… Other challenges for researchers involve the degree of the electric charge generated by electrodes. An electrical charge strong enough to stimulate the neurons sufficiently may generate so much heat that it burn[s] retinal tissue” (OE. 2001, pp. 1f).
 For a thorough overview of the top academic institutions working on retinal microchip implants and their progress see Nighswonger (1999, pp. 1-7).
 For an overview of the Johns Hopkins/NCSU retinal prosthesis project see http://www.ece.ncsu.edu/erl/faculty/wtl_data/boston98/boston98.html (2001) and http://www.ece.ncsu.edu/erl/erl_eye.html (1999).
 Dr. Mark Humayun, who with Dr. Eugene de Juan, Jr., leads a group at Johns Hopkins University’s Wilmer Eye Institute, reported the stimulation of patients’ retinas using hand-held electrodes, enabling them to “see a simple geometric shape, like a box or the letter H. It’s very encouraging” (Hurley 1999, p. 1).
 Other methods are being used by Richard Normann’s research group at the University of Utah, which is developing an implant for the brain that will interface with the visual cortex. The researchers believe this is the best way to stimulate the brain into seeing, possibly curing profound blindness (Brooks 2001, p. 4).
 For information on pacemakers and defibrillators including the history of pacemakers, the benefits of pacemaker technology and design considerations see http://www.bae.ncsu.edu/bae/courses/bae465/1995_projects/scho/ (1995).
 See Ryan et al. (1989, pp. 7.6.1-7.6.4), Starner (2001a, p. 50) and Young (2000, p. R7).
 “Hearts have long been regulated by electronic implants. Now it’s the brain’s turn” (Hall 2001, p. 1).
 See Greengard (1997, p. 1) who reported on neuroscientist Theodore Berger who was one of the first to begin researching brain implants. The article which appeared in Wired highlights the “simple problem” that has a “complex solution” stating that between one and ten billion neurons would require identification.
 See also Melody Moore’s work at Georgia State University. “The key technology involved is called a neurotrophic electrode, which is about the size of the head of a ballpoint pen, and is designed to be directly inserted into a brain so it can interface with neurons” (Morehead 2000, p. 1).
 “Stanford University neurosurgeon Gary Heits… patients suffer from debilitating tremors, similar to those caused by Parkinson’s disease. These tremors can prevent them from being capable of doing anything themselves… But when they receive a chip implant in the area of the brain known as the hypothalamus, everything changes. Flick a switch to turn the chip on and it emits signals that interfere with the brain signals causing the tremor…” (Brooks 2001, p. 2).
 See also Karen Davis at the University of Toronto who is researching Parkinson’s. Visit http://www.discovery.com/area/technology/virtualtech/issue1/manmachine.html (2000).
 “Current prosthetic devices for amputees can read electrical impulses from the remaining muscles and operate mechanical hands, arms and legs, but devices have not been developed yet for those paralysed from the neck down, said John W. Stephenson, clinical coordinator of the prosthetics department at the Texas Scottish Rite Hospital for Children in Dallas” (Dominguez 2000, p. 2).
 “The aims of this new project are to analyse cellular and molecular mechanisms after peripheral nerve injury and nerve repair; to analyse the tissue reactions after implantation of biomaterials in soft tissue; and to combine this knowledge in order to functionally integrate a nerve chip in a nerve trunk… Regenerating nerve fibres are capable of growing through an implanted sieve electrode made out of silicon (nerve chip)… If a nerve chip functionally can be integrated in a nerve trunk this device could be used to control an artificial limb” (Danielsen et al. 1999, p. 1). See also Irwin (1998, p. 2) who quotes Bakay: “Our hope is that soon we will be able to get to the point that we can connect the neural signals to a muscle stimulator in the patient’s paralysed limb and have them move that limb using the same principle that they use to move the cursor.”
 Mussa-Ivaldi said, “[T]he problem is when you think about your arm, it’s not just receiving commands. It also sends information back because you have touch information being sent” (Dominguez 2000, p. 2).
 Dr. Bakay also works for the Rush Presbyterian Hospital in Chicago. See Dominguez (2000, p. 1).
 See research at Northwestern University Medical School. The team led by Sandro Mussa-Ivaldi wired up a lamprey’s brain to a set of wheels. “The brain was able to move the wheels. It was the first demonstration of how an animal’s nervous system and a machine could work together in the future” (Brooks 2001, p. 3). See also Bohringer et al. (2001) regarding an experiment to implant a standalone microcomputer into the brain of a marine mollusc, to allow multi-site intracellular recording and stimulation in a live, freely behaving animal.
 “In March 1998, neuroscientists at Emory University in Atlanta implanted two glass-enclosed electrodes in the primary motor cortex of Ray’s brain. Nourished by neurotrophic growth factors, his cortical cells infiltrated one of the 1.5-mm glass cones to form artificial synapses. When the motor neurons fired, the impulses were transmitted to an external receiver hooked to a computer. Now, when Ray wants to converse with visitors, he has trained himself to manoeuvre the cursor around the computer screen, tap a virtual keyboard and “speak” in basic phrases by clicking on icons that activate a voice synthesiser- all simply by thinking about the movement he wants to accomplish” (Weber 2000, p. 1). See also Irwin (1998, p. 1).
 Jessica Bayliss at the University of Rochester is trialling a virtual model developed “to read a key brain signal that essentially gives an overwhelming “thumbs up” or “thumbs down” to a choice presented to it. Sensors on the subject’s head pick up the signal, called the P300; a computer connected to the sensors reads the intention- on or off- and then sends the appropriate command to the TV” (Keck 2000, p. 1).
 Some research towards neural interfacing has also been conducted at Stanford University.
 Bakay has said that it “will be a long time before artificial limbs can be made to perform more complex tasks such as throwing a ball” (Dominguez 2000, p. 1). Yet “NASA… has developed a robotic hand that is almost as dexterous as our own. Systems that rely on the power of thought to move artificial limbs have already been developed. Their potential was recently demonstrated by a monkey in Brooklyn, New York, whose brain signals were used to control a robotic arm located in North Carolina” (D. Smith 2002, pp. 1f).
 The military is especially interested in ‘control-by-thought’ applications (Dobson 2001, p. 2). According to Colombo (2000, p. 2), NASA published numerous articles in its Tech Briefs magazine about marrying up “solid-state technology with human biology. The reason for NASA’s interest hinges on the prospect of direct astronaut interface with on-board computer and communications systems.”
 See also LoBaido (2001, part 2, p. 1). “Researchers have wired the brains of monkeys to control robotic arms- a feat that could one day allow paralysed people to move artificial arms and legs merely by thinking” (Dominguez 2000, p. 1).
 There are many obstacles; nevertheless, most researchers are not discouraged and are poised for eventual success. “However, more than 1000 connections between the brain and a bionic device would be needed to communicate the data necessary to produce a complex action like walking. Professor Craelius [biomedical engineer at Rutgers University in New Jersey], predicts this will soon be achievable, even if most of the computer processing is done outside the person’s body at first” (D. Smith 2002, p. 2).
 See Rummler (2001, p. 1).
 With reference to Warwick’s experiments at the Department of Cybernetics, University of Reading, journalist McCarthy (1999, p. 1) writes: “So do we really have the audacity to question a leading academic? No, of course not- but we simply can’t bring ourselves to believe that what he says is possible. It’s one thing to make predictions of what the future holds but quite another to set up heavily publicised experiments.” See also BBC (2002a, p. 2): “But his [Warwick’s] pronouncement[s] have irritated some in the fields of robotics and artificial intelligence who feel his remarks are far too speculative.”
 See Harris (1998) and Rantala and Milgram (1999).
 “The signals from Warwick will be converted to radio waves and transmitted to a computer which will re-transmit them to the chip in Irena. Warwick believes that when he moves his own fingers, his brain will also be able to move Irena’s” (Dobson 2001, p. 1).
 “The main part of the silicon chip consists of a battery, radio transmitter, receiving and processing unit. Pins connected to the chip will pierce the membrane surrounding Warwick’s nerve fibres” (Kellan 2000, p. 1). See also the description given by BBC (2002a, p. 1): “[a] silicon square about three millimetres wide was inserted just above the professor’s left hand and its 100 electrodes, each as thin as a hair, connected to the median nerve… Wires from the square come out of the forearm and are being linked to a transmitter/receiver so nerve messages can be radioed to a computer.”
 See Pearson (2000, pp. 1-2) for the future of relationships.
 Scannell (1996) wrote that IBM “…demonstrated technology that uses the natural salinity in the human body as a conductor to send and receive data electronically… IBM showed a prototype transmitter called the Personal Area Network (PAN). About the size of a deck of cards, it has an embedded microchip along with a slightly larger receiving device… The present prototype allows data to be sent between as many as four bodies that touch each other… It could be used to exchange data between personal information and communications devices carried by users, such as cellular phones, pagers, personal digital assistants, and smart cards, which could constitute a sort of LAN that most users wear physically.”
 Kevin Warwick “…plans to have a chip implanted in his arm that will transmit signals from his nervous system to a computer. By relaying signals back to his nerves, he hopes to see whether the same movements, sensations or even emotions can be evoked… His latest plan was aimed at provoking debate about the risks and benefits of computer-enhancement of healthy humans. A more practical aim of the research is to help disabled people” (D. Smith 2002, p. 1).
 See also Kellan (2000, p. 1).
 See footnote 50 that pertains to the definition of “electrophorus” and “electrophoresis”.
 It is worthwhile consulting other predictive ideas or studies on the same theme: Mumford (1934, 1961); M. Shelley (Frankenstein); Asimov (1950); Turing (1950); Ellul (1964); McLuhan (1964); J.C.R. Licklider (1960, Man-Computer Symbiosis); Toffler (1980, 1981); Minsky (1985); Moravec (1988, 1999); Mark Weiser (1991, 1993); Dery (1994); Negroponte (1995); Gates (1995); Allenby (1996); Mazlish, Barlow, N. Stenger and M. Benedikt (cited in Slouka 1996); Gershenfeld (1999); Kurzweil (1999); Jonscher (1999); Cochrane (1999). See also Innis, Wiener, S. Hawking, Stelarc and Burroughs.
 See also Cochrane (1999, pp. 3-4). Cochrane believes that the death of people like Albert Einstein and Richard Feynman was a terrible loss to humanity that could have been avoided. With reference to these great minds he says: “[t]heir only echo is a book” (Pickering 1999, p. 2).
 “The key point is: we occupy that space between our ears- that is where we live. Our carcass serves only as a transport and life-support mechanism; just a vehicle needing maintenance and repair. Who cares if the spare parts are manufactured?” (Cochrane 1999, p. 2). See also Miriam English who told I. Walker (2001, p. 12) in an interview: “[n]ot so much to leave my body, I like my body, but it’s going to die, and it’s not a choice really I have. If I want to continue, and I want desperately to see what happens in another 100 years, and another 1000 years… I need to duplicate my brain in order to do that.”
 “There are some people though who are finding it difficult waiting for the day when we can trade in what they call our “meat wet bodies” for something more durable, more updatable, when we can upload our minds into a computer to be backed up for all eternity… The most passionate among them call themselves Extropians. Most of them live in California, their passions are life extension, body sculpting, cryonics, smart drinks, funny handshakes, and a new philosophy they call “transhumanism.” Mostly it calls for boundaryless optimism about the future” (Walker, I. 2001, p. 8). See the Extropy Institute web site http://www.extropy.org (2001). Among the most famous Extropians are Max More (1997, 2001), Nick Bostrom (1998a, 1998b, 2001), Alexander Chislenko (1997), Robin Hanson, Hans Moravec, Natasha Vita-More and Marvin Minsky. See also the World Transhumanist Association at http://www.transhumanism.com (2001).
http://www.xontek.com/Advanced_Technology/Bio-chips_Implants/The_End_of_Death (2001). Compare this estimate with the virtual-rat project, which faces numerous technical challenges in the collection of data coming in at 2.3 megabytes per second, per channel- “enough to max out a multigig hard disk in one afternoon. If you were using the same technology to record input from every neuron in a human brain you’d get 150 million Gbytes of data per minute- enough to fill a 194-mile-high stack of CD-ROMs in 60 seconds” (Stewart 2000, p. 2).
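Stewart’s figures can be roughly sanity-checked. The sketch below assumes a count of one billion neurons (a lower-end estimate; the source does not state the figure it used) and conventional CD-ROM capacity and thickness, all of which are this author’s assumptions rather than values from the source:

```python
# Rough sanity check of the data-rate claim in Stewart (2000).
CHANNEL_RATE = 2.3e6     # bytes per second per channel (figure from the source)
NEURONS = 1e9            # assumed: ~one billion neurons (lower-end estimate)

bytes_per_minute = CHANNEL_RATE * NEURONS * 60
gb_per_minute = bytes_per_minute / 1e9
print(f"{gb_per_minute:.2e} GB per minute")  # ~1.38e8, i.e. ~150 million GB

CD_CAPACITY_GB = 0.65    # assumed: 650 MB per CD-ROM
CD_THICKNESS_M = 0.0012  # assumed: 1.2 mm per disc
discs = gb_per_minute / CD_CAPACITY_GB
stack_miles = discs * CD_THICKNESS_M / 1609.34
print(f"stack height: ~{stack_miles:.0f} miles")  # ~158 miles under these assumptions
```

Under these assumptions the result lands on the same order as the quoted figures (roughly 150 million GB per minute and a stack well over 100 miles high), so Stewart’s arithmetic is at least internally plausible.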
 Couple the microchip with the biological and a whole new world is discovered (Brooks 2001, p. 6).
 Consider the notion of WPANs (Wireless Personal Area Network) and how they fit into the Soul Catcher concept (Gorlick 1999, p. 114). “The proliferation of mobile computing devices including laptops, personal digital assistants (PDAs), and wearable computers has created a demand for wireless personal area networks (WPANs)” (Zimmerman 1999, p. 1).
 “It is in the realm of reality. It is not science fiction any more,” said Duke University researcher Miguel Nicolelis (Dominguez 2000, p. 1). “It’s already science fact, not science fiction,” said Professor Albert Pisano at the University of California Berkeley (Swissler 2000, p. 2). In contrast see Hoffman et al. (2001, p. 76): “It’s scary to have a major film premised on your profession, your love… AI lets a new crowd squirm in the glory of misrepresentation. It’s not fun, especially when one’s field suffers from waves of innovation-hype-backlash.” Innovation-hype it may be, but that does not necessarily make it altogether impossible even if we disagree with the ultimate direction; rather it is one path of many that AI may follow.
 Contrast Kurzweil’s prediction with Kosko (2001, p. 1): “[t]he question is not whether robots will replace humans but whether chips will replace brains.”
 In describing how implants are currently being used, Hall (2001, p. 3) points out that it is mostly for patients for whom there is no other alternative, i.e., as a last resort. But this idea is evolving to cater for those who wish to also extend their functions. “I equate it to where heart pacemakers were in the 1950s. Back then, you would tell someone, ‘I’m having a pacemaker put in,’ and people would go, ‘What’s that?’ Now everyone knows what a heart pacemaker is. I think that it will be a similar situation for brain pacemakers in 10 or 20 years.”
 For the evolution of computer systems see Sadiku and Obiozor (1996, pp. 1472-1474).
 “Gould and Eldredge observed that the fossil record doesn’t support a belief in steady evolution. Instead, they saw long periods of equilibrium with little activity, separated by short bursts of evolution” (Singh 2001, p. 6).
 “Why do we see exponential progress occurring in biological life, technology and computing? It is the result of a fundamental attribute of any evolutionary process, a phenomenon I call the Law of Accelerating Returns. As order exponentially increases (which reflects the essence of evolution), the time between salient events grows shorter. Advancement speeds up. The returns- the valuable products of the process- accelerate at a nonlinear rate” (Kurzweil n.d., p. 3). See http://www.sciam.com/specialissues/0999bionic/0999kurzweil.html (2003).
 Cochrane uses the example of the electronic calculator to describe this phenomenon. “Once electronic calculators arrived, it was all over- logarithms and slide rules went the same way as the mechanical typewriter- only faster. What had lasted 350 years was wiped out in three” (Cochrane 1999, p. 1).
 Cochrane believes “the human brain is reaching its evolutionary peak. Computing power increases a thousandfold each decade. By comparison human brains have remained largely unchanged for 50,000 years…” (Pickering 1999, p. 2). See http://www.salon.com/tech/feature/1999/10/20/cyborg/index1.html (1999).
 http://www2.cyber.rdg.ac.uk/kevinwarwick/FAQ.html. This was the focus of an ethnographic study conducted by Sheridan et al. (2000, pp. 195-196): “Our goal in this investigation was to identify current themes or aspects of social interactions among non-cyborgs and cyborgs.”