The Auto-ID Trajectory - Chapter Two Literature Review



2.    Literature Review

The primary purpose of the literature review is to establish what relevant research has already been conducted in the field of auto-ID innovation. It is through this review of the broader research topic that a specific proposal can be accurately formulated. First, a critical response to the literature on technological innovation is required. Second, a thorough evaluation of research on auto-ID technologies is necessary. Third, an attempt to locate works that deal with both innovation and auto-ID will be made. If these works prove scant, the question of whether this constitutes a gap sufficient to warrant further research will be posed. Can this thesis act to fill the void in the literature by offering a first attempt at understanding the innovation of technologies in the auto-ID industry? Can a new contribution to knowledge be made specific to the notion of the auto-ID trajectory? Finally, some space will be dedicated to reviewing literature that is focused on technological forecasts in IT, with a view to adopting an acceptable narrative style with which to make predictions regarding future auto-ID trends and possibilities. In this manner, the chapter seeks to satisfy objective one identified in the Introduction (section 1.3.1). The literature review will also serve to:

1.      identify and understand widely accepted definitions, concepts and terms, born from past innovation research as a guide for further research;

2.      review theories, theoretical frameworks and methods adopted by other researchers doing similar innovation studies (preferably in the area of information technology) in order to choose an appropriate approach for this thesis; and

3.      understand what aspects of auto-ID technology have already been explored by researchers and what aspects have been neglected and to discover any similarities or differences in existing findings.

Previous research will be examined in this chapter using a two-tiered approach: topical at the surface layer and chronological within each topic. This analytical strategy is advantageous because similar patterns, trends or findings can be uncovered and organised into clusters over time. First, innovation studies will be reviewed, followed by auto-ID studies and finally advanced technology forecasts.[1] Diagram 2.1 shows how the literature will be dissected for review.

Diagram 2.1       Organisation of the Literature Review

Each study will be categorised according to the theory and research method used by the author(s). Additionally, the findings of each study will be briefly highlighted for comparison.[2] Landmark studies will be treated at greater length than minor studies and can be found in sections 2.3.1, 2.4.1, 2.5.2 and 2.6.7. The same emphasis will be placed on reporting accurate summaries and on responding critically to previous research. Overall, greater consideration will be given to reviewing contemporary innovation literature than to older research conducted without knowledge of information technologies.


2.1.   Fundamental Definitions in the Innovation Process

2.1.1.      Invention, Innovation and Diffusion

This section will be dedicated to defining the fundamental links between invention, innovation and diffusion as they apply to this thesis.[3] Sahal (1981, p. 41) makes the distinction that an invention is the creation of a “new device” and an innovation is the “commercial application” of that device. Similarly Braun (1984, p. 39) argues that “invention is merely an idea for a prototype of a new product or process and does not become an innovation until it reaches the market [diffusion].[4] Most inventions never become innovations, they fall by the wayside on the long road from idea to marketable product.”

Invention: Mutation, Recombination, Hybrid

As suggested by Jacob Schmookler, a patentable invention is a new product or process that shows a significant degree of originality and has some future use (1966, p. 6). A question often asked is: do all inventions fall into the same category? The answer, according to Farrell (1993) and Mokyr (1996), is no: inventions may differ depending on how their formation came about. Table 2.1 shows that invention can be classified into three types: mutation, recombination and hybrid (Mokyr 1996, p. 69). Without reference to Farrell or Mokyr, Edquist (1997, p. 1) states, “[i]nnovations are new creations of economic significance... [that] may be brand new but are more often new combinations of existing elements.”

Table 2.1      Types of Inventions

Type of Invention | Description

Mutation | “Of course, mutations are variations on existing material. Most of the genetic material in every mutant is not new...”

Recombination | “Because technological recombination is multiparental, the opportunities for innovation through novel combinations of existing knowledge are a function of the complexity and diversity of the economy.”

Hybrid | “The difference between a hybrid and a recombinant invention is that a hybrid is a combination of two (or more) artefacts, rather than the information embedded in them... In most cases, hybrid inventions require complementary types of invention that are necessary if the pieces are to work together and the new device is to be made operational.”

*The definitions for the different types of inventions have been taken from Mokyr (1996, pp. 70-73).

Innovation: Radical versus Incremental

Generally, an innovation can be described as “...a process or a product, a technical or an organisational change, an incremental improvement or a radical breakthrough” (Deideren 1990, p. 123 quoted in Lindley 1997, p. 20). Since the 1900s the term innovation has undergone many revisions with the emergence of new theories in the field of economics. Schumpeter’s (1939, p. 87f) well-known definition of innovation is directly linked to neoclassical economic theory by means of the production function:[6]

...we will simply define innovation as the setting up of a new production function. This covers the case of a new commodity as well as those of a new form of organisation... this function describes the way in which quantity of product varies if quantities of factors vary. If, instead of quantities of factors, we vary the form of the function, we have an innovation... we may express the same thing by saying that innovation combines factors in a new way, or that it consists in carrying out New Combinations.
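Schumpeter’s definition can be glossed in the standard notation of the production function (an illustrative sketch only; the symbols Q, K and L for output, capital and labour are assumed here, not drawn from the quotation itself):

```latex
% Output Q as a function of the quantities of the factors of production,
% e.g. capital K and labour L:
\[ Q = f(K, L) \]
% Varying the quantities K and L moves along the given function f
% (ordinary factor substitution). An innovation, in Schumpeter's sense,
% varies the form of the function itself:
\[ Q = f'(K, L), \qquad f' \neq f \]
% i.e. the same factors are combined in a new way -- a "New Combination".
```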

As Saviotti (1997, p. 184) comments, for Schumpeter new combinations gave rise to new products and processes that were qualitatively different from those preceding them. However, while the phrase ‘new combinations’ is still widely used today within the evolutionary theory of economics by researchers such as Lundvall (1992, p. 8) and Elam (1992, p. 3), innovation is no longer attributed to the setting up of a new production function. Rather, innovation is a natural process of ‘technical progress,’ a technology-specific process of ‘learning by experience’.[7] Nelson and Rosenberg (1993, pp. 4f) interpret innovation broadly, adding an optional geographic context, defining it to:[8]

...encompass the processes by which firms master and get into practice product designs and manufacturing processes that are new to them, whether or not they are new to the universe, or even to the nation.

In selecting a preferred definition of innovation for this thesis, the more contemporary and balanced definition given by Edquist (1997, p. 16) is appropriate: “[t]echnological innovation is a matter of producing new knowledge or combining existing knowledge in new ways- and of transforming this into economically significant products and processes.”

It is often important to classify the impact of a product or process innovation in a way that can be useful for comparing one or more technologies. Admittedly this is very difficult, since the extent of an innovation is dependent upon the perspective taken by the researcher. However, using Braun’s terse definitions (see table 2.2) one can distinguish one type of innovation from another. Similarly Landau (1982, p. 54) believes that there are “...fundamentally two kinds of innovation: 1. The ‘breakthrough’; 2. The ‘improvement’.”

Table 2.2      Types of Innovations

Type of Innovation | Description

Radical | “[a] cluster of related innovations which together form a technology which differs considerably from previous technologies...”

Incremental | “[o]ne which offers a relatively small technical improvement without changing the nature of the technology.”

The definitions for the two types of innovations have been taken from Braun (1984, p. 42).


2.1.2.      The Innovation Process: Product versus Process

Throughout the innovation literature there is some confusion over the terms innovation process, product innovation and process innovation. First, the innovation process can refer to either products or processes: it comprises the stages or phases involved in taking an idea for an invention through to a ‘finished product’ or ‘finished process’ in operation. Braun (1995, p. 61f) outlines these phases as: the idea or invention; development of the product; prototype; production; and marketing and diffusion (see exhibit 2.1). An example of a product innovation is the semiconductor microchip; an example of a process innovation is the automated assembly line set up at an automobile manufacturing plant. However, it is not always clear whether a given innovation should be categorised as a product or a process.[9] Regardless, today it is more relevant to be concerned with the actual system of innovation.

Now that the relationship between invention, innovation and diffusion has been presented, it is necessary to allocate space to the innovation studies themselves. By reviewing the various conceptual frameworks and methodologies applied by other researchers, an appropriate approach can be chosen for this thesis. This will assist in meeting objective two (see section 1.3.1). Significant findings in the way of emergent patterns or events in the innovation process will also be highlighted and explained.

Exhibit 2.1 The Innovation Process of the Smart Card


2.2.   Setting the Stage- Karl Marx on Technology

In his classic work Capital, Karl Marx[10] (1818-1883) writes about the importance of products in the labour process. He stated that the result of this process was “...a use-value, a piece of natural material adapted to human needs by means of a change in its form...” (Marx 1976, p. 287). Products were “...not only the results of labour, but also its essential conditions.” For instance, a finished product could assist in the innovation process of another new product. In this manner technological change was the force behind process changes within existing institutions or the force behind the establishment of new institutions. It seemed obvious to Marx that for economic growth to be achieved product and process innovations were required. In commenting on the capitalist system he wrote that:

...the capitalist has two objectives: in the first place, he wants to produce a use-value which has exchange-value, i.e. an article destined to be sold, a commodity; and secondly he wants to produce a commodity greater in value than the sum of the values of the commodities used to produce it, namely the means of production and the labour-power he purchased with his good money on the open market. His aim is to produce not only a use-value, but a commodity; not only use-value, but value; and not just value, but also surplus-value (Marx 1976, p. 293).

Marx’s long-lasting contribution was recognising that product and process innovations had a social impact. Technology could be used to oppress a class or to empower an individual.[11] Together with Friedrich Engels (1820-1895), Marx conducted historical research on process innovations in England. He used original factory documents to draw conclusions about the life of a worker who was driven by the capitalist to create surplus value. Modern interpretations of Marx’s ideology have sought to reassess parts of his labour theory of value as a means of revealing its limitations.[12]


2.3.   Neoclassical Economic Theory (1870 - 1960)

As defined by Cohendet and Llerena (1997, p. 226) neoclassical economics “...examines the way through which market mechanisms select new technologies and eliminate those that have become obsolete.”[13] One shortcoming of studies using neoclassical economic theory is that they focus primarily on business process innovations. Neoclassical studies are concerned with the manner in which technological innovations can enhance the productivity of a firm and decrease employment per unit of output (Edquist 1997, p. 22). A fundamental weakness of neoclassical economic theory is that “[e]xchange takes place without any specification of its institutional setting. Only prices and volumes matter” (Edquist & Johnson 1997, p. 48).


2.3.1.      Joseph Alois Schumpeter

In the year that Marx died, Schumpeter[14] (1883-1950) was born. While he eventually came to share many of Marx’s beliefs,[15] particularly in the self-destruction of capitalism, he focused his efforts on the statistical analysis of the capitalist process. He is perhaps best remembered for his studies on the production function and for his book Theory of Economic Development (1934). Schumpeter was a neoclassical economist who attributed higher amounts of capital per worker to technological change, which resulted in more for the profit receivers. Throughout his professional career he was preoccupied with innovation as the main agent of entrepreneurial profit. Like Marx, he too focussed on business process innovations. The production function that Schumpeter wrote about “...expresses the relationship between various technically feasible combinations of inputs, or factors of production, and output... it is a specification of all conceivable modes of production in the light of the existing technical knowledge about input-output relationships” (Sahal 1981, p. 16). However, discontent with the production function has led many economists to abandon the neoclassical approach. This thesis, too, does not lend itself to this type of rigid analysis, as the questions it asks are more exploratory than computational. Neoclassical economic theory allows only purely economic factors to be considered, neglecting other significant aspects of innovation.


2.4.   Evolutionary Economic Theory (1980 - 1990)

Evolutionary economic theory has not achieved the degree of articulation of neoclassical economic theory (Saviotti 1997, p. 181), for the simple reason that it is more recent. It offers an alternative understanding of technical change as something other than an attempt to maximise profits (Nelson & Winter 1982). Unlike neoclassical economics, evolutionary theory is suited to both process and product innovations. It is also more contemporary, having been developed with an understanding of modern technological innovations. It has been applied to product innovations, such as semiconductors and satellites in the high-technology (high-tech) industry, and is more suited to the investigation of auto-ID technologies. Numerous recent theoretical and empirical studies have been conducted on the premise that technical change is an evolutionary process. One of these landmark studies will be critically analysed in section 2.4.1.

Devendra Sahal in his book Patterns of Technological Innovation (1981, p. 64) commented that evolution was not just a matter of ‘chop and change’; it related to the “...very structure and function of the object.” He stated that innovation was “...inherently a continuous process that [did] not easily lend itself to description in terms of discrete events” (1981, p. 23). Sahal is best remembered for his quantitative diffusion analytical strategies. While he made excellent evolutionary theoretical discoveries, his methodology differs from that of contemporary researchers in innovation. In that same year Richard Nelson also suggested that, due to the randomness and the time-consuming nature of innovation processes, evolutionary models of technological change were more realistic in understanding innovations than the models provided by neoclassical economics (Nelson 1981, p. 1059f).

Perhaps one of the most significant publications, as suggested by Saviotti (1997, p. 181), was Nelson and Winter’s An Evolutionary Theory of Economic Change (1982). It was not until this time that a researcher had concretely stipulated that “technical change [was] clearly an evolutionary process” (Nelson 1987, p. 16). While many innovation studies had challenged neoclassical economic assumptions during the 1970s, none had been so game as to suggest that evolutionary economic theory was more appropriate. At the time, Nelson (1987, p. 16) believed that the innovation generator kept making technologies superior to those in earlier existence. However, as later clarified by Charles Edquist (1997, p. 6), they are “...only superior in a relative sense, not optimal in an absolute sense.” Edquist affirms that “...technological change is an open-ended and path-dependent process where no optimal solution to a technical problem can be identified” (Edquist 1997, p. 6).[16] This idea is quite different and challenging.[17] Brought together with recent observations by Jerome Swartz[18] (1999), it presents a whole new approach to understanding auto-ID technologies. He writes (p. 21):

[n]ot long ago, I recall the heated debates about which technology was best- which would bring the most benefits, prove the most reliable or the cheapest. The implied question was, “Which will emerge as the real winner at the end of the day?” I believe that “competitive” framework asked all the wrong questions and clouded a better understanding of how the technologies could exist side-by-side.

These insightful observations serve as a call for further research in the field of auto-ID innovation using evolutionary economic theory.


2.4.1.      Typical Research Style

A landmark study becomes obvious to the researcher who has read a plethora of literature in the field he or she is studying. Margaret Sharp’s Europe and the New Technologies: Six Case Studies in Innovation & Adjustment (1985) is one of these landmark studies. She later followed up with Strategies for New Technologies: Case Studies from France and Britain (1989), which is equally impressive. Space will be dedicated to the former because it was without doubt a paragon for future research in the field. Sharp uses evolutionary theory and a case study methodology to examine six new technologies in Europe. The methodology chosen for this study is advantageous in that it gives Sharp and her fellow contributors the flexibility to explore the many diverse issues surrounding the central thesis. New industrial activities are examined rather than individual industries or sectors. The research, which focussed on computer-aided design (CAD), advanced machine tools and robotics, telecommunications, videotext, biotechnology and offshore supplies, was very successful,[19] and conclusions were drawn from recurring themes identified in the case studies. In summary, Sharp (1985, p. 271) concludes that:

[t]he process of change is evolutionary- new industrial activities emerge from the body of old industrial activities, the decisions are incremental as firms adjust their product/process mix to opportunities which present themselves, and, as this happens, so firms progressively redefine the nature and boundaries of the industry itself.


More precisely, Sharp believes that the concept of a technological trajectory is useful in the context of her case studies. Her discovery is very significant and is quoted in full below.

A new technology very often, and certainly, in the cases we have been studying, is subject to continuous improvement over a number of years. Firms which develop the capability to make these continuous improvements, that is to move along the trajectory, are often the most successful. As well as continuities, there are discontinuities. Major new technical or marketing innovations present such discontinuities. A discontinuity halts progress along the existing trajectory but simultaneously opens up a new one. In assimilating major technological change, a firm in effect changes gear and shifts to a new trajectory. In this sense, the discontinuity may be regarded as a revolution. Whereas evolution, development along the trajectory, is an everyday occurrence, revolutions are quite rare (Sharp 1985, p. 272).

Sharp’s study is an excellent model for this thesis. It shows how a research project such as the one this researcher is undertaking is likely to lead to valuable results; results, moreover, that are applicable to more than just the group of technologies (in this case auto-ID) being investigated. The key terms that Sharp uses (evolutionary, continuous improvement, discontinuity, technological change, technological trajectory) will be used throughout the main body of this thesis.[20] They are particularly relevant to objectives three, four and five stated in section 1.3.1.


2.4.2.      Fundamental Concepts

As identified by Carlsson and Stankiewicz (1995, p. 23), the major strength of the evolutionary approach is its “...ability to bring within a single framework the institutional/organisational as well as cognitive/cultural aspects of social and economic change”. This corresponds with the second objective of this thesis, in that many issues, not just economic ones, will be analysed to identify the factors influencing auto-ID innovation. The key terms used in this framework are highlighted below in table 2.3.

Table 2.3      Important Terms and Concepts

* Terms sourced from Sahal (1981), Dosi (1982), von Hippel (1988), and Westrum (1991).

Technological Trajectories

The term technological trajectories,[21] also known as natural trajectories, can be attributed to Dosi (1982), who used it to describe a pattern of innovation.[22] While several definitions have been given for this term, in the context of this thesis von Hippel’s (1988) interpretation is appropriate:

[t]echnological trajectories consist in the continuous improvements of products in terms of performance and reliability and in the tailoring of products to specific users’ needs, within specific application contexts.

What should be highlighted here is the focus on products and their continuous improvement, tailored to specific users’ needs for specific applications. Each firm follows a technological trajectory in search of improvements to its existing products (Breschi & Malerba 1997, p. 146f). In this manner a firm’s technological understanding is enhanced and one dominant design can emerge. Each firm pursues “...a single technical option and, over time, become[s] increasingly committed to a single technological trajectory” (Saxenian 1994, p. 112). The case study that Saxenian examines is the regional economy of Silicon Valley. In this instance, learning and technological change are cumulative in nature.[23] Firms secure their knowledge base and then attempt to build upon it, seeking new opportunities. In contrast, the emerging auto-ID industry has a knowledge base that is still in its early development, and the future impact of auto-ID product innovations is still very much speculative. Thus the need has arisen to look ahead and propose a map, or attempt to understand the path dependency of these technologies. Foresight is necessary because it also allows us to see the potential long-range effects of technical change (Westrum 1991, p. 344).

Selection Environment and Other Terms

Having revealed the importance of technological trajectories, it is now appropriate to understand the concept of selection environment. As the term suggests it is the process that involves the interaction between the product and its environment. Lindley (1997, p. 25) phrases it well when she states that:

[t]he selection environment acts to influence the path of innovation and the rate of diffusion generated by any given innovation, and at the same time generate feedback to strongly influence the direction and type of R&D programs that firms might invest in.

Sahal can be credited with the popularisation of the term technological guidepost. He stated that the basic design of a technological innovation acts as a guidepost charting the course of future innovation activity. To prove this he used an arbitrary example, highlighting that one or two early models of a product or process usually stand out above all the others in the history of an industry and their design becomes the foundation for the evolution of many innovations. “In consequence, they leave a distinct mark on a whole series of observed advances in technology” (Sahal 1981, p. 33).[25] This led Sahal to the principle of creative symbiosis, the case where “...two or more technologies combine in an integrative fashion such that the outline of the overall system is greatly simplified... when it [happens], totally new possibilities for further evolution present themselves” (Sahal 1981, p. 75). This phenomenon has occurred in the auto-ID industry and will be discussed in the main body of the thesis.


2.5.   The Emergence of the Systems of Innovation Approach (1990 - )

The systems of innovation (SI) approach[26] is a conceptual framework that can be used to study technological innovations.[27] The approach, admittedly not an established theory, has been gaining prominence in the last decade.[28] SI defines innovation as an evolutionary process, not as a process for achieving optimality.[29] SI is described well in the ‘Innovation Systems and European Integration Policy Statement’ (Edquist et al. 1998, p. 3f):

The Systems of Innovation (SI) approach for understanding innovations in the economy stresses that firms do not normally innovate in isolation but in interaction with other organisational actors (other firms, universities, standard setting organisations, etc.) within the framework of existing institutional rules (laws, norms, technical standards, etc.). Institutions are not organisations. Rather, they constitute the rules of the game or framework conditions for interaction. In contrast, organisations are the entities (actors) that interact. From this perspective, innovation is a matter of interactive learning.

The origin of this approach is well documented as proceeding from theories of interactive learning and evolutionary theories of technical change. Among researchers like Carlsson and Stankiewicz, Nelson and Rosenberg, as well as Lundvall and his colleagues, there is support for this approach stemming from its close affinity with evolutionary theory (Edquist 1997, p. 7).[30] The “system” that Sahal once referred to has been defined in the SI approach (Edquist 1997, p. 14f):

One way of specifying ‘system’ is to include in it all important economic, social, political, organisational, institutional, and other factors that influence the development, diffusion, and use of innovations. Potentially important determinants cannot be excluded a priori if we are to be able to understand and explain innovation. Provided that the innovation concept has been specified, the crucial issue then becomes one of identifying all those important factors. This could- in principle- be done by identifying the determinants of (a certain group of) innovations. If, in this way, innovations could be causally explained, the explanatory factors would define the limits of the system. The problem of specifying the extent of the system studied would be solved- in principle.

Using these definitions it becomes a simpler task to understand auto-ID innovation. These factors could be causally explained, limiting the scope of the actual system and thus presenting a conceptual framework within which to perform the research. This would satisfy objective two discussed in section 1.3.1.


2.5.1.      The Value of the SI Approach

The attractiveness of the SI framework is that it is a “holistic and interdisciplinary” approach which “encompasses all or most determinants of innovation” (Edquist et al. 1998, p. 20). Fundamental to its doctrine is that “history matters”, since innovation processes take time to evolve.[31] By understanding the past one is better equipped to grasp current and future patterns of innovation.[32] In a paper titled ‘Unfaithful offspring? Technologies and their trajectories’, S. Hong (1998, p. 262) challenges the notion that a technology’s trajectory is autonomous or its development unpredictable and uncontrollable.[33] He concurs with an underlying philosophy of SI: that it is rather an “imperfect historical understanding of a technology [that] largely contributes to the idea that the technological trajectory is uncertain, and, therefore, autonomous.” In examining the high-tech industry in Sweden, McKelvey et al. (1998) traced the historical changes that occurred in the Swedish mobile telecommunication system from the 1970s to the 1990s to attain a better understanding of the industry dynamics. In like manner this thesis will explore auto-ID innovation.[34]

The SI approach successfully brings together the conventional teachings of various experts in the innovation field from all over the world. It has been adopted not only by the Europeans but also by researchers in Asia and North America. To understand its origin one must first look at the many case studies that have been conducted using a systems view of evolutionary theory. The term ‘national systems of innovation’ was first used by Chris Freeman in 1987. Lundvall then used it as a chapter heading in Dosi (1988). In 1989, Brown and Karagozoglu published a paper titled ‘A systems model of technological innovation’. Following in 1992, Lundvall titled his book National Systems of Innovation: Towards a Theory of Innovation and Interactive Learning. Perhaps the research that acted to launch the SI framework was National Systems of Innovation: A Comparative Study (1993) by Nelson and Rosenberg, which will be critically analysed in section 2.5.2. Lipsett and Smith (1995) then published a paper titled ‘Cybernetics and (real) National Innovation Systems’, challenging some of the ways that Nelson and Rosenberg treated their subject matter. Lipsett and Smith made some very good points that are worth noting, but were supportive of the approach overall. They called for researchers using the national systems of innovation to improve their information base and chosen metrics of analysis, and for more participants to get involved in discussions.[35] They also emphasised the need to understand that system dynamics are different when humans are involved and relationships are established.[36] In 1995, Carlsson and Stankiewicz finalised their definition of the technological systems (TS) approach. The TS program focused on both theoretical and empirical studies. At the fifth International Conference on FACTORY 2000, Keating, Stanford and Cope (1997) also contributed to the idea of systematic technology innovation.

In retrospect, most of these publications carried the word ‘national’ in their titles.[37] However, the focus of this thesis is not an inquiry at a national level but at the level of the specific auto-ID ‘technological system’ (TS). More recently, the validity of national innovation studies has in any case been questioned (Edquist 1997, p. 11).[38] What is so special about SI is that it “...allows for the inclusion not only of economic factors influencing innovation but also of institutional, organisational, social, and political factors. In this sense it is an interdisciplinary approach” (Edquist 1997, p. 17). SI should be looked at as a whole system because its elements are directly or indirectly related to each other. One advantageous aspect of SI is that each part of the system can be examined on its own or in relation to another: to study one subsystem, whether its elements are technological, educational, organisational, social, cultural, economic or institutional in nature, can also contribute to the understanding of the whole.


2.5.2.      Typical Research Style

National Systems of Innovation: A Comparative Study (1993), edited by Nelson and Rosenberg, was another landmark study propelling innovation thought forward. Like Sharp, Nelson and Rosenberg used a case study methodology, but instead of choosing specific new technologies, the national systems of innovation of fifteen countries were investigated. Most of the case studies were conducted by resident researchers in each country. This may have complicated matters a little because different authors held different interpretations of the same concept of ‘national system of innovation’ (Edquist 1997, p. 4). Nevertheless, the book was intended to emphasise empirical evidence first, then to confirm theory. This research spanning fifteen countries was a much larger undertaking than Sharp’s earlier projects in 1985 and 1989. The resources and effort required to collect the data and present it in a coherent way, with a set number of features for each country, made it a mammoth task, obviously outside the scope of this thesis.

Findings from the case studies suggested that it did make sense to think of national innovation systems, although there was some problem with identifying national borders (Nelson & Rosenberg 1993, pp. 20, 506). Other difficulties included comparing and contrasting countries with varying economic and political circumstances, and distinguishing cross-border operations such as those of transnational firms. At the conclusion, there was evidence to suggest that the technological capabilities of a nation’s firms do have an impact on its ability to remain competitive globally. However, not all research tasks can produce results at this macro-level, nor is it a requirement. In 1998, Choung authored an article titled, ‘Patterns of innovation in Korea and Taiwan’, examining thirty-four technical fields, among which were telecommunications; semiconductors; electrical devices and systems; and calculators, computers and other office systems. While Choung identifies Korea and Taiwan as the geographic settings for his study, the focus is on the thirty-four technical fields, not on the actual countries. Micro-level projects have their benefits also. They are likely to uncover more detailed results and recommendations that are easier to implement within a firm or industry. The issue with a national comparative study is, who are the results aimed at and what types of organisations are willing to react to the findings? Consequently, the term systems of innovation without the word ‘national’ has become more acceptable as a framework, giving the researcher the flexibility he or she needs to work at any level: national, regional, sectoral or industry-specific. Evidence of the SI framework is particularly apparent throughout Nelson and Rosenberg’s (1993, p. 15) introduction:

Technological advance proceeds through the interaction of many actors. Above we have considered some of the key interactions involved, between component and system producers, upstream and downstream firms, universities and industry, and government agencies and universities and industries. The important interactions, the networks, are not the same in all industries or technologies.

Contrary to the studies conducted by Sharp and by Nelson and Rosenberg, this thesis will focus on a single industry, i.e. auto-ID. Nelson and Rosenberg (1993, p. 13) themselves deem this to be positive, acknowledging that “...there are important interindustry differences in the nature of technical change, the sources, and how the involved actors are connected to each other...” Carlsson and Stankiewicz (1991, p. 111) are also in agreement with a technology- and industry-specific focus. As mentioned, they have termed this idea the “technology system” (TS), in which “...greater emphasis is placed on the way specific clusters of firms, technologies, and industries are related in the generation and diffusion of new technologies and on the knowledge flows that take place among them” (Breschi & Malerba, p. 130). For the various levels of SI, refer to diagram 2.2.

Diagram 2.2       Levels of Innovation Systems Frameworks


2.5.3.      SI Empirical Studies at the SIS or TS Level

In Henry (1991), some very interesting papers were published in Forecasting Technological Innovation, proceedings of the Eurocourse lectures delivered in Italy the previous year. Many of the case studies presented used a sector-level approach and investigated high technologies such as advanced materials (Malaman), machinery and automation (Parker), electronics and information technology (Drangeid), telecommunications (Bigi & Cariello), transport (Marchetti) and biotechnology (Roels). The study on telecommunications is the one with which this forthcoming auto-ID study will be most closely aligned stylistically. Rosenberg (1994) also uses a sectoral approach in his case studies of technological change in energy, chemical processing, telecommunications, and forest products. Banbury’s 1997 study, however, Surviving Technological Innovation in the Pacemaker Industry, 1959-1990, is the study that can be considered a precursor to this thesis. Banbury’s work shows the validity of a micro-level investigation of an individual industry and the excellent results that can be achieved.

As far as the method for this thesis is concerned, case studies will be more exploratory in the style of Sharp than of Nelson and Rosenberg. The results of the thesis are not meant to be quantifiable (e.g. how many patents, how many firms) but rather qualitative (e.g. what are the organisational and institutional dynamics). In terms of the structure of the literature review, this now concludes the space dedicated to the review of theories, theoretical frameworks and methods adopted by other researchers doing similar innovation and diffusion studies.


2.6.   Auto-ID Technology Studies

As was pointed out at the beginning of this chapter, the principal purpose of this review was to determine whether there was indeed a gap in the literature to warrant this study. From the detailed analysis of innovation studies above, only a few researchers were found to have touched upon the topics of advanced technologies and innovation. None, however, had studied auto-ID technologies in the context of the innovation field; the closest any researchers came to auto-ID was in the study of microchips and the experiences of Silicon Valley (Saxenian 1994), and of the pacemaker industry (Banbury 1997). It is now appropriate to present what research has been done on auto-ID and what has been neglected. Given the dynamic nature of complex technologies, it was to be expected that the majority of relevant publications were to be found not in books but in scattered articles.[39] However, due to space limitations, books will be given preference over articles in the review below. The focus below will be on understanding auto-ID from the perspective of a technology system as defined by the SI framework. It will be shown that apart from a few sporadic efforts and one book on the topic of smart card innovation (Lindley 1997), there is very little in the way of “pure” innovation research on auto-ID.


2.6.1.      Bar Code

Quite a number of books have been written on the bar code, mostly by practitioners who wish to share their experiences with others considering implementing automatic identification systems. Most of these present a brief history of the bar code and then delve straight into the notion of symbologies, standards and applications. A lot of space is dedicated to describing all the various components of the bar code system without concern for the innovation process or future trends. Behind Bars (Grieco et al. 1989) serves its original intent to be descriptive but is not critical. In The Bar Code Book, Palmer (1995) acknowledges that there are alternative methods of data entry. He also does well to mention some of the legal aspects surrounding bar code and a few trends. Collins and Whipple (1994), in Using Bar Code: Why it’s taking over, dedicate a chapter to ‘Sources and Resources’ which in effect sets the stage for understanding the bar code business. Various actors are classified into groups, and trade organisations are identified as well as journals, conferences and standards bodies. By far, however, the landmark study on the topic of bar code, as relevant to this thesis, is Revolution at the Checkout Counter: the explosion of the bar code (Brown 1997). The author is able to capture to some degree the dynamics between bar code organisations and institutions, and he does this by tracing the interactions of the two main actors over time. Similar to the SI approach, history matters to Brown as well. Quoting Hajo Holborn, Brown sets his method out clearly: “we study history, not only to be more clever today, but to be wiser forever” (Brown 1997, p. xi). This is also one of the only sources on bar code that alludes to innovation concepts. A whole section in the Introduction of Brown’s study refers to the importance of economic analysis and the future of bar code.


2.6.2.      Magnetic-stripe Card

Finding sources devoted solely to magnetic-stripe media was quite difficult.[40] Most references to magnetic-stripe cards appear in books which present changes in banking.[41] More often than not, works published after the 1980s allude to smart cards as a more viable long-term option than magnetic-stripe. In fact, most of the smart card-related books presented in 2.6.3 have dedicated at least a paragraph to the magnetic-stripe card. If the amount of specific literature published on the topic is any indication of its future prospects, then at present the issues surrounding the magnetic-stripe card have not been identified or addressed properly.[42] In The Virtual Banking Revolution, Essinger (1999) dedicates chapter 3 to tracking the changes in banking. Among the sections relevant to this thesis are ‘Why things changed’, ‘The origins of virtual banking’, and ‘The three evolutionary phases of virtual financial services’.[43] Chapters 4 and 9, ‘The cash machine comes of age’ and ‘The card that thinks for itself’, are also relevant to this thesis. Essinger does not shy away from asking questions regarding the future of banking and how it will be affected by various technologies. The layout of this book is to be commended: a case study example is given in each section to illustrate the main points.


2.6.3.      Smart Card

Jerome Svigals[44] (1985) can be credited with the first book on smart cards, titled Smart Cards: The New Bank Cards. Overall, his book can be considered a thorough introduction to smart cards. It served to highlight some very important issues relevant to this thesis as well. For instance, the chapter on ‘Magnetic stripe evolution or smart card migration’, although very brief, contained some important points. Yet, in terms of innovation and the auto-ID industry there was little to be captured. In 1988, Bright published a book similar to that of Svigals. Of interest to this thesis are chapters 8 and 9, where Bright presents data on suppliers and discusses the formation of international forums. He also mentions how related technologies may converge or compete. Bright placed importance on technological trends but did not spend much space discussing them. Thus, Svigals speaks of migration and Bright of convergence, but both without alluding to innovation literature. By 1989 the momentous proceedings from the International Conference on Smart Card 2000 were published (Chaum & Schaumuller-Bichl eds. 1989).[45] Accepted papers included those on technical issues, product developments, applications, and multiapplication cards. Although innovation issues were not discussed, the collaboration and sharing of ideas that took place at the conference was important. This theme will be given attention, particularly in chapter six.

The 1990s witnessed an influx of books on smart cards. Integrated Circuit Cards, Tags and Tokens (Hawkes et al. 1990) was a series of papers which introduced the idea of smart devices. For the first time RF/ID tags and biometrics were discussed in the same domain as smart cards. Smart Cards (McCrindle 1990) briefly examined contactless smart cards but was more concerned with applications. McCrindle also refers to the ‘Evolution of the smart card’ in chapter 2 but, as with the other authors, fails to explore the idea further. By the time Zoreda and Oton (1994) had published their book (also titled Smart Cards), many articles had been written about the potential of smart cards and associated trials all over the world. Zoreda and Oton’s book was the first of a technical kind, different in that it showed how smart cards actually worked and how they were designed and programmed. Additionally, the book does not fail to mention ‘new trends’ in the form of biometrics and potential integration opportunities with the smart card.[46]

Kaplan (1996) offers some important insights into smart card innovation using a marketing approach. The author writes of the ‘Smart card revolution’ (ch. 1) instead of evolution, which is of some significance. He notes that the term may seem “heavy-handed” to some but is brave enough to argue the point (p. 3). Kaplan writes of the rival card solutions as “competing technologies” that do impact on smart card diffusion.[47] This notion will be important when the auto-ID selection environment is presented in chapter seven. Kaplan is mixed in his approach to understanding innovation in the smart card business. In terms of innovation, the author highlights the importance of intellectual property and patents. The book is more useful in its presentation than previous texts as it deviates from the commonplace material. Another book worthy of note is that compiled by the Smart Card Forum, containing brief articles, Smart Cards: Seizing Strategic Business Opportunities (1997). It brought together all the different issues the Forum is concerned with into a single reference volume.[48] Part One (especially chapters 3 and 4) is relevant to the thesis in that the topic of smart cards and innovation is touched upon several times over a number of pages.

Hendry’s book (1997) could have been considered ordinary, if not for the chapters on system components (ch. 9), current trends and issues (ch. 17), and the way forward (ch. 19). The final chapter, though brief, is important as it considers the various actors in the smart card industry. The section in this chapter titled ‘Beyond smart cards’ is also significant to this thesis, mentioning the continuing development towards a biometric-based card that would store a person’s DNA (deoxyribonucleic acid) for identification purposes. This idea deserved more space than it received. Hendry’s book fulfilled its aims but, like many other publications, did not take that next leap forward to discuss future possibilities at length. Smart Card Handbook (Rankl & Effing 1997) is the most thorough text available on actual smart card architecture. It can be described as an engineer’s guide to smart cards and is relevant to this thesis in so far as it allows the researcher to understand some of the more complex issues facing developers.

Smart Cards: a case study (Ferrari et al. 1998) contains an important chapter titled ‘Card selection process’ (ch. 4). The intent of the chapter is to bring to the fore all the important questions one should ask before making a card technology selection for a given application. While this concept is a little different from that of the selection environment in evolutionary economic theory, it can nevertheless shed light on the state of play. To date, the most recent book written on smart cards is by Dreifus and Monk (1998). It is appealing in that it is a guide to building and managing smart-card applications. This book also refers to continuous improvements as the ‘Evolution of the smart card industry’ (ch. 1). Chapter three is dedicated to standards and specifications that are also relevant to this thesis. The book concludes with the chapter titled ‘The future of the industry’ and once again, like many other texts, does not go far enough in discussing what these futures are or may be. One section to take note of in that chapter is ‘Merging disciplines’, which can be considered to denote convergence. Publicly available material on the state of the smart card industry is plentiful but at the same time elementary and/or piecemeal. Basic fundamentals are repackaged over and over and marketed as new research.[49] At the high end of available resources, substantial market research reports with rich competitive information and forecasts are being sold for thousands of dollars.


2.6.4.      Biometrics

One criticism of the literature published on the topic of biometrics is that it is mostly of a technical nature. Improvements in algorithms, novel recognition techniques[51] and experimental results typify such publications.[52] This is probably because biometric systems are not yet as commercially prevalent as other auto-ID techniques. Biometrics: Personal identification in networked society (A. Jain et al. 1999), however, while including relevant technical literature, also touches upon a diverse number of issues. The ‘Introduction to biometrics’ chapter is perhaps the most complete available on the topic.[53] Chapter 18 on ‘Smartcard based authentication’ is also important in that it presents the integration of biometrics onto a smart card. The final chapter, titled ‘Biometrics: identifying law and policy concerns’, is pioneering not only in relation to biometrics but also to the wider auto-ID field. Woodward makes a unique contribution in that he raises issues related to privacy concerns,[54] laws, cultural issues,[55] religious[56] objections and philosophical questions.[57] In the past these questions have been largely ignored. It is hoped that this thesis will serve not only to raise similar questions but to address them in more detail as well.


2.6.5.      RF/ID Tags and Transponders

Ron Ames[58] can be credited with the compilation of short articles in Perspectives on Radio Frequency Identification (1990), which focused on RF/ID technology. Contributing writers briefly touched upon standardisation, the RF/ID manufacturing process, and technical changes. Towards the end of the book there is an invitation by Ames for firms to interact and collaborate more, if rapid industry-wide progress is desired. Ames offers insights about auto-ID at large that will be explored in more depth in this thesis. The next volume of work on RF/ID was written by Gerdeman (1995). While the majority of the book presents case studies on RF/ID applications, chapters five to seven explore specific standards and chapter eighteen is dedicated to vendors. Subsequently, Finkenzeller (1999) wrote what is considered by many in the field to be the complete RF/ID handbook. Interestingly enough, the Introduction begins with an overview of automatic identification systems. Chapter five looks at radio licensing regulations, chapter nine and the appendix at standardisation, and chapter fourteen presents a market overview. All this is very important when analysing the institutional dynamics of RF/ID. Geers et al. (1997) differs from the other RF/ID books in that it is concerned with one main application, the electronic identification, monitoring and tracking of animals. The book is technical but informative in so far as future trends are concerned. The last chapter, on existing and future devices, is also applicable to human monitoring and tracking, which is one of the application case studies being researched in this thesis. In understanding the auto-ID trajectory such developments as those documented above are paramount.


2.6.6.      Auto-ID and Other Technologies

The first complete book to be written specifically on auto-ID technologies is Jonathan Cohen’s (1994) Automatic Identification and Data Collection Systems. While Cohen does not seem to use any type of theoretical framework within which to discuss auto-ID, he does indirectly allude to technical change throughout as an evolutionary process. Commenting on the purely technical changes in auto-ID devices he writes:

[a]utomatic identification technology has progressed with leaps and bounds during recent years. Existing technologies are still being advanced and refined, while extensive research and development is being invested in new methods of automatic identification. Clear trends can be defined in the furthering of these technologies: increased data capacity; increased data storage efficiency; lengthening optical reading distances; greater resilience of non-optical reading; increased reliability; greater data recovery; reduced cost; greater diversity of identification possibilities; greater integration of automatic identification technologies (Cohen 1994, pp. 222-223).

Cohen introduces numerous auto-ID technologies and identifies various types of applications in a loose case study format, before exploring future trends and developments in the industry. Like LaMoreaux (1998),[60] Cohen spends the greater part of his book presenting information on bar code technology and does not grant equal space to other auto-ID technologies.

Two more contributions worthy of special mention are Brodsky (1995), Wireless: The revolution in personal telecommunications, and Gellersen (1999), Handheld and Ubiquitous Computing. They are important in that they deviate from the commonplace content found in auto-ID literature. Brodsky especially is challenging in her perspective. She does not merely focus on a particular technology but on a technology system, in this instance wireless personal telecommunications. In the same manner, this thesis hopes to be all-encompassing in its investigation of auto-ID. In her concluding chapter Brodsky claims that “although technology often succeeds in unexpected ways, there are some interesting signposts” (p. 240). It is these signposts that must be identified in auto-ID as well, to shed light on the future direction of the industry. Following on from personal telecommunications is the broader idea of ubiquitous computing, on which the first international symposium was held in 1999. Papers presented covered issues such as “everywhere messaging”, location-based services, wearable computing and context-aware applications. Auto-ID will most likely be affected by these converging peripheral systems, which will be explored in the auto-ID trajectory (see ch. 8).


2.6.7.      Landmark Studies on Auto-ID and Innovation Studies

Only one author has written an extensive work directly on the topic of innovation as related to one auto-ID technology, i.e. smart cards. R. A. Lindley uses socio-technical theory and a case study method to reach her overall conclusions in her book Smart Card Innovation (1997). It is an exploratory study that thoroughly examines the interaction between smart card users, the technology and the organisation. Lindley is not the only researcher who believes that there is a growing need to develop our understanding of new and complex technologies within the scope of the field of innovation. However, she is the first to put forward a concise volume on innovation and any type of auto-ID technology. This investigation takes the next step forward in exploring the cluster of innovations known as auto-ID. The need for this study is ever-increasing as many researchers begin to compare one auto-ID technology to another. This can be seen within the context of magnetic-stripe cards and smart cards. Lindley (1997, p. 18) writes:

There is also now little doubt among leaders in the banking industry that smart card will take over from magnetic-stripe card technology because of its ability to reduce fraud. The main advantage of smart card compared to other technologies is that it does provide a large range of design and service options with a high degree of security which is required when monetary or secret information exchanges are to occur. The old card technologies are rapidly being made obsolescent as the rate and level of sophistication of fraudulent use are rapidly approaching unacceptable levels. It is therefore now seen by many as only a matter of when, and how the services will be differentiated. 

Thus, examining auto-ID innovations is beneficial in understanding industry trends. Hewkin (1989)[61] and Swartz (1999) realised this need earlier than most and were compelled to write about it. Swartz in particular provided some helpful insights (1999, p. 21):

Today, many of us see Auto ID technologies as “complementary,” with each filling a space in the market defined by the fit between its strengths and weaknesses, and the requirements of target applications. And looking forward, I believe we’ll evolve from a “coexistence” model to one that leverages the many converging opportunities around the intersections and in the gaps between those technologies.

This thesis is timely, in that the converging model[62] that Swartz is referring to is happening now.


2.7.   Forecasting

Since the purpose of this thesis is to “characterise” as well as to “predict” the path of auto-ID (see objective six in section 1.3.1), some space must be given to the body of literature encompassing the prediction of future events.[63] In this thesis, to ‘predict’ means to look at ‘past’ and ‘present’ trends and use these to provide a road map of future possibilities.[64] It is not the intention of the researcher to predict extraordinary things;[65] it is to predict with the use of reliable evidence that is accessible today.[66] Auto-ID forecasts may not eventuate[67] but are still more likely to happen than ‘predictions in their pure form’, and for this reason they are more valuable.[68] Thus the term “forecaster”, rather than such loaded terms as “futurist”, “visionary” or “secular prophet”, is to be preferred.[69] It can be said that there are many forecasters and very few prophets. Forecasters, as will be seen in the review of works below, usually use trends, patterns or present-day findings to make projections. Forecasters are more likely to make predictions about new innovations than new inventions. For the greater part, they raise challenging, thought-provoking issues about how existing inventions or innovations will impact society. They give scenarios for a technology’s projected pervasiveness,[70] how it may affect other technologies, what potential benefits or drawbacks it may introduce, how it will affect the economy and so on.[71]

Forecasters have diverse backgrounds. Contemporary forecasters include a long list of scientists,[72] engineers, physicists, biologists,[73] mathematicians, entrepreneurs,[74] lawyers,[75] economists,[76] geographers, sociologists, historians, philosophers, religious thinkers,[77] science fiction writers,[78] culture critics,[79] ethicists[80] and others. While not all of them can be mentioned here, some of the more prominent ‘technology-focused’[81] forecasters and their important works include: Ellul (1964), McLuhan (1964),[82] A. C. Clarke (1968, 1973), Toffler (1981), Minsky (1987), Moravec (1988, 1999), Gates (1995), Negroponte (1995), Kaku (1998), and Cochrane (1999).[83] In making predictions these forecasters are required to draw upon their expertise, and occasionally to utilise other sources available to them (i.e. outside their area of expertise),[84] in order to offer a more complete picture of the future.[85] This is also important in the context of the SI approach. Forecasters often need to use an interdisciplinary approach to successfully bring together related projections.


2.7.1.      From “Electronic Banks” to “Digital Money”

When Jacques Ellul predicted the use of “electronic banks” in his well-researched book, Technological Society (1964, p. 432), he was not referring to the computerisation of financial institutions, ATMs or EC (electronic commerce). Rather, it was in the context of the possible dawn of a new entity: “the coupling of man and machine”. Ellul was predicting that one day knowledge would be accumulated in electronic banks and “transmitted directly to the human nervous system by means of coded electronic messages… What is needed will pass directly from the machine to the brain without going through consciousness…” As unbelievable as this “man-machine” complex[87] may have sounded at the time, thirty years later forecasters are still predicting such scenarios will be possible by the turn of the 22nd century. Today, of course, they have a better understanding of the issues at hand and write with a clearer road map of how to get there.[88] One can trace the predictions of these forecasters over time and see that they are evolving as new discoveries are made to defend or attack a given prediction (see table 2.4). In like manner, this is how this researcher wishes to make predictions about auto-ID: by using existing findings as a ‘launch-pad’ for building likely future scenarios.

Table 2.4      Evolving Thought in Artificial Intelligence

The whole point of this example is to show that some very credible persons have made what many may believe (or used to believe) to be some very incredible predictions about the future.[89] In terms of auto-ID, several forecasters have made relatively high-level predictions about technologies and applications. Gates, Negroponte and Kaku all agree that auto-ID technologies, especially smart cards and biometrics, will have a great impact on society in the next twenty years. Gates places much emphasis on the wallet PC, Negroponte on wearable devices and Kaku on ubiquitous computing (see table 2.5). Just by analysing these three positions one can trace an evolution of ideas: from devices that one carries, to those that one wears, to those that are everywhere. Additionally, in terms of auto-ID-related EC applications, all three forecasters are in agreement that these are going to become increasingly interconnected over the Internet.[90] Thus, there is an underlying synergy between the predictions that makes what the forecasters are saying quite likely events.

Table 2.5      Synergy of Forecasters

It is the intention of this researcher, however, to offer more detailed evidence for the predictions regarding the auto-ID trajectory. For instance, Gates (1995, p. 77) stated that “[t]he smart card of the future will identify its owner and store digital money, tickets, and medical information…” Kaku (1998, p. 37f) agrees that “[t]here will be enormous economic pressure for people to convert to smart cards and digital money… In the future, smart cards will replace ATM cards, telephone cards, train and transit passes, credit cards, as well as cards for parking meters, petty cash transactions, and vending machines. They will also store your medical history, insurance records, passport information, and your entire family photo album. They will even connect to the Internet.” But the evidence they offer in their books for these predictions is quite scarce. Of course, the scope of their books does not allow for such inquiry, but this presents an adequate gap in the literature to be filled by this thesis, offering a unique contribution to knowledge.


2.8.   Conclusion

The fundamental purpose of the literature review was to affirm the need for a study on auto-ID innovation. A sufficient gap has been identified in the literature and this thesis will act as a first attempt to present new findings. It will use key terms and concepts defined by researchers in evolutionary theory to describe trends that are occurring in the auto-ID industry. In approach, it will incorporate the interdisciplinary SI framework to identify the factors affecting innovation and diffusion in the auto-ID industry. As shown in the literature review, the SI framework is very flexible. It allows the researcher to include and exclude data depending on the scope of the problem, whether macro or micro in level. By collecting the data using a case study methodology, this thesis will also be following the example of many other researchers (e.g. Sharp, Nelson and Rosenberg) in the field of innovation. Case studies are also common in auto-ID texts. As shown above, many researchers have used this method for presenting data on different types of auto-ID applications (e.g. Cohen, Hendry, Lindley). Case studies are compatible with the kind of research that requires exploration.[91] A purely computational method would impede the results of a thesis such as this. Indicative of the widespread applicability of the SI framework is the support it has gained, especially from researchers all over Europe. Researchers in business, economics, sociology, industrial management and information technology have contributed to its development (see Edquist 1997). While some may regard this as a disadvantage, interdisciplinary research boomed in the 1990s and has even been adopted by researchers in Asia and North America.

Until recently, publications in auto-ID have been ordinary in that the same elementary issues are discussed over and over again, and in a piecemeal fashion. The reader is left with a great deal of valuable data linked to a given aspect of auto-ID but with a limited ability to interpret the findings. In most instances, inadequate guidelines or evidence are provided for the reader to understand or be enlightened by what is being explored. In addition, researchers in auto-ID have refrained from using valid conceptual frameworks or structured methodologies within which to discuss their narratives. Not only have they failed to adopt major principles of evolutionary theory, such as trajectories and the selection environment, but very few empirical studies have considered forecasting as a logical next step to their research. If, for instance, one is analysing the trajectory or path-dependency of a technology, then issues surrounding forecasting should also be considered.

Finally, this extensive literature review can be considered a contribution in itself. Its comprehensive bibliography, and the bringing together of innovation thought and auto-ID technology, have not been previously presented. It is hoped that other researchers may be able to use these resources to conduct further study in the field. The next chapter will describe the case study approach and the applied SI framework in which the five auto-ID technologies will be studied. A research design is always an essential part of a methodology. All too often researchers fall into the misconception that case studies are not scientific and therefore do not have to be carefully planned. The opposite is true, but this is a key point to be dealt with in chapter three. There, among other issues, the following questions will be answered. What is the proposed research strategy for this study? What is the architecture of the research design? What is the methodology being adopted? What types of data will be gathered and how will the data be managed? And how will data analysis be performed?



[1] The number of innovation studies being conducted rose after the seventeenth century, as a number of countries in Europe were slowly adapting to mechanisation for the purposes of industrial expansion. Studies on innovation became especially prominent during the period of dramatic change known as the Industrial Revolution, between the mid eighteenth century and World War I (Britannica 1972, Vol. 6, p. 229). See also Mumford (1934, ch. 5), ‘The Neotechnic Phase’ and appendix on inventions after the eighteenth century (pp. 441-446).

[2] From this, it may be concluded whether some results are contingent upon the type of theory or method used by the researcher.

[3] The three terms invention, innovation and diffusion are distinct; however, as Lindley (1997, p. 19) observes, they are also closely allied.

[4] For an excellent introduction to the diffusion of innovations see Rogers (1995).

[5] See Westrum (1991, p. 150).

[6] “The production function indicates the maximum amount of product that can be obtained from any specific combination of inputs, given the current state of knowledge. That is, it shows the largest quantity of goods that any particular collection of inputs is capable of producing” (Baumol et al. 1992, pp. 507-510).
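To make the quoted definition concrete, the production function is conventionally written as a mapping from input quantities to the maximum attainable output. The notation below is a standard textbook form offered for illustration; the symbols are not drawn from Baumol et al. themselves.

```latex
% The production function: the maximum output Q attainable from any
% particular combination of inputs x_1, ..., x_n (e.g. labour, capital),
% given the current state of knowledge.
\[
  Q = f(x_1, x_2, \ldots, x_n)
\]
```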

[7] See also Sahal (1981, p. 37).

[8] There is some value in this geographic perspective either at the local, regional, or national level. In National Innovation Systems, Nelson and Rosenberg (1993, p. 3) state that “[t]here is a new spirit of what might be called “technonationalism” in the air, combining a strong belief that the technological capabilities of a nation’s firms are a key source of their competitive prowess, with a belief that these capabilities are in a sense national, and can be built by national action.”

[9] This is the case especially in the IT sector (Edquist et al. 1998, p. 12). Some innovations can be classified as either product or process, dependent on how they are used.

[10] Marx was a philosopher, sociologist and economist considered by many to be one of the most influential persons of all time. One of his most famous works is The Communist Manifesto, which he co-authored with Friedrich Engels. See Westrum (1991, ch. 2) on ‘Marx’s Theory of Technology’.

[11] This is an important observation that Marx has made about the power of technology. When considering auto-ID today this question is still relevant. In the Introduction, the growing dependence of humans on auto-ID devices was highlighted for this very reason. In discussing the evolution of digital technologies, Covell (2000, p. 5) notes a fundamental shift in the nature of the evolution. “The difference is that digital technology is now being applied to enhance and extend human interaction.”

[12] See Habermas (1981, p. 159).

[13] This approach cannot assist in the exploration of the auto-ID selection environment. It is too limiting in its analysis, using the market mechanism as its fundamental guide.

[14] Schumpeter was an exceptional economist and sociologist who influenced economic theory through his many publications. He is best known for his book Capitalism, Socialism and Democracy.

[15] Commentators are divided on how to classify Marx’s contributions to economic theory. Nevertheless, he has been placed alongside Schumpeter because of his writings on the labour process.

[16] This perspective is embraced throughout this thesis. It has very important implications as it shapes the context in which auto-ID is to be understood. Rather than concentrating on the progression from one auto-ID technology to the next in terms of ‘superiority’, the question is more about the actual path of development taken by firms, government and consumers. Who drives this path, and the dynamic interaction between the stakeholders, then become questions of interest.

[17] In contrast see Darwin’s (1960, p. 53) writings (ch. 4) on ‘Natural Selection; or the Survival of the Fittest’. His fundamental argument is “...[t]hat as new species in the course of time are formed through natural selection, others will become rarer and rarer, and finally extinct. The forms which stand in closest competition with those undergoing modification and improvement will naturally suffer most.” Refer also to the sections in ch. 4 on “divergence of character” (p. 53) and “convergence of character” (p. 62). When this Darwinist approach is applied to economic affairs, Allaby (1996, pp. 130-132) calls it “social Darwinism”. He believes that this theory is deeply flawed. “Its first error lies in its equation of evolution with progress, the idea that later forms are better than earlier ones. This is a value judgement, for what do we mean by ‘better’? If we mean ‘better at surviving’ we are being tautologous.”

[18] Dr Jerome Swartz is the founder and CEO of Symbol Technologies. A veteran of the auto-ID industry with 20 years’ experience, he has first-hand knowledge of past and present trends.

[19] This is a landmark study also in its investigation of complex technologies. Considering that the desktop computer was launched in 1984 (and the research was conducted and published by 1985), dealing with topics like CAD, videotext and the like is quite impressive in itself. In analysing individual auto-ID devices and applications, it is hoped that the same level of granularity can be achieved as in Sharp’s studies.

[20] Friedman (1994), like Sharp (1982), applies evolutionary concepts to his study of the IT field. Of particular interest is his emphasis on the technological trajectory of IT, which he breaks down into four phases of historical change: 1) hardware capacity constraints, 2) software productivity constraints, 3) user relations constraints, and 4) the future. Friedman also makes the useful distinction between a technology field and a technological paradigm. “First, the focus of the technology field is on people, institutions, and organisations. The focus of the technological paradigm is on designs or patterns of solutions. The technology field encourages a much wider set of people to be analysed. The technological paradigm contains practitioners working in organisations supplying the technology, and possibly scientists and technologists working in associated research institutes and universities (Clark, 1987).”

[21] Hirooka (1998) makes the distinction between the technological trajectory of a product and its diffusion trajectory. The paper focuses on three cases of innovation paradigms: synthetic dyestuffs, electronics and biotechnology. Through these examples Hirooka provides evidence that the technological trajectory of an innovation spans about 20-30 years, at which point it joins the diffusion period. See also Banbury (1997, pp. 14-15).

[22] For a thorough explanation of the term technological trajectory see Durand (1991). He makes the important link between the terms technological trajectory and technological forecasting which is extremely important to this thesis. In quoting Dosi (1982), Durand makes the distinction between continuous changes along the same paradigm versus discontinuities which are associated with a new emerging paradigm.

[23] Banbury (1997, p. 13) writes “[t]he concept of a technological paradigm enables us to delineate the boundaries of technological change cycles (paradigms) and to delineate the direction of change (trajectories)”.

[24] The question of why it is important to forecast, and what is to be gained by it, is central to this thesis. This researcher believes that forecasting is essential, even if the predictions do not eventuate or unanticipated events occur. It is better to make logical predictions based on the available evidence and be prepared for what lies ahead than to find oneself completely unprepared. In this sense, forecasting is also linked to planning (see Mintzberg 1981, 1994).

[25] See Anderson and Tushman (1990).

[26] See Systems of Innovation: Technologies, Institutions and Organisations, edited by Edquist (1997) and Systems of Innovation: Growth, Competitiveness and Employment, edited by Edquist and McKelvey (2000).

[27] See especially the Department of Technology and Social Change web site (1999) at Linköping University, TEMA, in Sweden.

[28] For a comparison of other contemporary frameworks and approaches used to study innovation, see chapter three in Williams (2000). Here Russell and Williams make some excellent observations on how frameworks for investigating innovation studies are being used by researchers. In comparing the social shaping of technology (SST) framework with evolutionary economic theory the authors denote: “[i]n many respects the boundary… has become less clear: their preoccupations overlap strongly and their findings are often consistent.” See also MacKenzie and Wajcman (1999), Pool (1997), Fox (1996), MacKenzie (1996), Bijker (1995), Bijker and Law (1992), Molina (1989) and Elliot (1988).

[29] Edquist et al. (1998, p. 21) explain that “the notion of optimality is absent from the SI approach. The notion of optimality stems from static equilibria and therefore is not applicable to processes of technological change… [this] is a major contribution of evolutionary theory, which the SI approach has adopted.”

[30] Well-known proponents of SI in Europe include Charles Edquist, Maureen McKelvey, Leif Hommen, Bjorn Johnson, Franco Malerba, Keith Smith, Tarmo Lemola, Lena Tsipouri, Thomas Reiss, and Pier Paolo Saviotti.

[31] Of particular interest may be Queisser’s (1985) historical account of The Conquest of the Microchip. In the Foreword of this book, Robert Noyce (p. viii) writes regarding the microchip: “[t]hose of us who have been involved in the development of this technology recognise that the terms technological revolution and breakthroughs are used to attract public attention to the progress being made, in reality progress is almost seamless, with pieces of the puzzle continually being put in place until a coherent picture emerges.” Noyce is suggesting that the microchip revolution happened through evolutionary steps, i.e. the notion of revolution through evolution. These two words are often used at different levels by technologists in auto-ID. One expert may refer to the smart card revolution and another may refer to the evolution of card technology from magnetic-stripe to smart card. What this actually shows is congruence with Noyce’s argument.

[32] This approach is also used by some futurists like Adrian Berry. He writes “[s]o that my subsequent chapters will be intelligible, I must explain what has been happening in the past five hundred years and how it relates to the present” (Berry 1996, p. 19).

[33] Hong (1998) chooses to examine three technologies including the Triode, the numerically controlled (NC) machine tool and the Internet.

[34] SI approaches embody nine characteristics, among which is that they may “employ historical perspectives” (Edquist 1998, p. 8).

[35] For an excellent guide to methods applied in cross-national research, see Hantrais and Mangen (1996).

[36] This is particularly relevant to this thesis. Some of the questions posed in the Introduction relate to the manner in which consumers will react to auto-ID technologies if the trajectory does not change in the future. In terms of innovator insights, see the paper titled, ‘Human Factors and the Innovation Process’ by Livesay et al. (1996, pp. 173-185).

[37] See also the ISE policy statement (Edquist et al. 1998, p. 20): “[i]nitially the SI approach was dominated by the national level. However, other systems of innovation than those defined by a country criterion, should be, and are being, identified and studied… Leaving the geographical dimension, we can also talk about ‘sectoral’ systems of innovation (i.e. ‘technological’ systems that include only a part of a regional, national or international system).”

[38] The terms sectoral innovation system (SIS) and technological system (TS) are often used interchangeably by researchers because both focus on the firm as the central actor. There is, however, a subtle difference: TS is in fact a subset of SIS. For instance, auto-ID is the technological system being analysed in this thesis, and the industry belongs to the larger sector of electronics and IT. See examples of SIS case studies in section two of Henry (1991, pp. 129-239) and part three of Rosenberg (1994, pp. 159-250). It is the opinion of this researcher that TS studies can influence organisations more directly than national studies, which are often aimed at government bodies. It is much easier to shape an industry than a whole nation.

[39] While this makes gathering data more difficult for the researcher it introduces a level of objectivity also. Details and the accuracy of information can be cross-examined. Magazines specific to the cause of auto-ID technologies include: ID Systems Magazine, Card Technology Today, Card International, EFT Today, Report on Smart Cards, Smart Card and Systems Weekly, The Biometrics Report, Biometrics Technology Today, RFID News, Frontline Solutions and Automatic ID News. Testament to the significance of auto-ID is the fact that the IEEE even dedicated a whole issue to biometrics. See Proceedings of the IEEE, 85(9), September 1997.

[40] Two articles dedicated completely to magnetic-stripe card technology include: Smith et al. (1996) and Chu et al. (1995).

[41] On the changing face of banking see also Banking Technology Handbook (Keyes 1999), Electronic Payment Systems (O’Mahony et al. 1997), and Presenting Digital Cash (Godin 1995).

[42] See de Bruyne (1990) ‘New technologies in credit card authentication’, Higgins (1996) ‘Electronic cash in a global world’, and Wahab (1999) ‘Biometrics electronic purse’.

[43] See Yan et al. (1997) ‘Banking on the Internet and its applications’.

[44] Jerome Svigals was an excellent ambassador for the smart card, with foresight regarding its vast application potential. He gained first-hand experience, especially during the 1980s, when he worked as a consultant to major banks on the future of electronic banking.

[45] The proceedings are significant because for the first time company representatives and university researchers were brought together to discuss their findings and the future direction of smart cards.

[46] Unfortunately, that is all most of these books do: offer a phrase or brief section on various devices or future trends. See Sanchez-Reillo (2001) ‘Smart card information and operations using biometrics’, Noore (2000) ‘Highly robust biometric smart card design’, and Sanchez-Reillo and Gonzalez-Marcos (2000) on hand geometry verification and smart cards.

[47] This idea can be traced back to the neo-classical economic approach described in section 2.3. Nevertheless, Kaplan’s reasoning remains relevant to the question posed in the Introduction of this thesis.

[48] It did well to boost the image of the Forum and emphasised the Forum’s importance to the success of the smart card.

[49] Still, it is the opinion of the researcher that new research can take the form of new compilations of existing information. For the greater part, however, most of these books touch on exactly the same things, using exactly the same source material.

[50] Perhaps it is a case of paying to be empowered with scarce knowledge. Four smart card reports have shed some light on the link between innovation and diffusion as discussed earlier in this chapter. The first is published by Ovum (1997), titled The Balance of Power: Uncertainty and Opportunity in the New Smart Card Market. The second is published by The Insight Research Corporation (1998), titled Beyond Payphones. The third is published by Datamonitor (1996), titled Global Smart Cards. The fourth, Prepaid Service in Asia Pacific, is published by BIS Shrapnel (1999). The latter two are the most worthwhile in terms of industry insights. Similarly, the four most prominent auto-ID journals are so costly that they are hardly affordable outside an enterprise environment.

[51] Some of these novel techniques include: thermographic imaging of the subcutaneous vascular network of the back of the hand (Cross & Smith 1995), dermatoglyphics (Rodriguez 1996), palmprint (Shu & Zhang 1998), human gait model (Cunado et al. 1998), locating facial features with colour information (Zhang et al. 1998), automatic ear recognition by force field transformations (Hurley et al. 2000) and an integrated biometric database (Carter & Nixon 1990).

[52] Three such examples include Automatic Systems for the Identification and Inspection of Humans (Mammone & Murley 1994), Audio- and Video-based Biometric Person Authentication (Bigun et al. 1997), and Intelligent Biometric Techniques in Fingerprint and Face Recognition (L. C. Jain et al. 1999). They fulfil their original intent but do not offer much beyond a technical contribution.

[53] See also Liu and Silverman (2001) for a practical guide to biometric security. Lawton (1998) gives a brief overview of different types of biometric systems and a discussion on the lack of standardisation. Miller (1994) goes into more detail than Lawton in a special report on Biometrics in the IEEE Spectrum.

[54] See Whitaker (1999), Cuddy (1999), Brin (1998), Branscomb (1994), Davies (1992, 1996).

[55] See Tapscott (1998), Ermann et al. (1997) and Sims (1994).

[56] See Stahl (1999), Noble (1999), Hensley (1998), E. Lucas (1996), Fischer (1990).

[57] See Ihde (1991, pp. 3-10), ‘Introduction: philosophers and technology’.

[58] Ames, who had worked as a consultant and practitioner in the field of computing for twenty-five years, understood the need for collaboration to help make RF/ID successful. His 1990 insights were very accurate, though they were probably too premature for the industry to react accordingly.

[59] The researcher must be aware of all the current technical capabilities of a given product innovation because that may influence the manner in which it may be applied in the future.

[60] LaMoreaux also discusses bar codes at length, covering a broad range of issues from application case studies to symbologies to management and personnel considerations. Again, very informative but the book should just be titled “Bar Codes”, not Bar Codes and Other Automatic Identification Systems.

[61] Hewkin saw the industry-wide need for an understanding of the auto-ID innovation process but presented scattered thoughts and did not follow up with other complementary publications in the field. He also used a neo-classical model of interpretation based on the price mechanism. He did, however, allude to the future evolution of new auto-ID technologies.

[62] Three excellent books that should be consulted regarding the notion of “convergence” in technology studies include: Digital Convergence (Covell 2000, ch. 7), Competing in the Age of Digital Convergence (Yoffie 1997, ch. 5), and Convergence: Integrating media, information and communication (Baldwin et al. 1996, ch. 5). The term “convergence” means different things to different people and can be used at different levels. In some instances, the term has been used too loosely in high-tech with reference to digital or technological convergence, e.g. the “combination of computing, communications, and digital media technologies” (Covell 2000, p. 161). It is not that Covell’s definition is wrong, but it is perhaps a little too all-encompassing for the study of auto-ID innovation. Convergence in the context of this thesis is not anything and everything coming together. A preferred definition for convergence is that given by Greenstein and Khanna (1997, pp. 203-205). They suggest there are two kinds: convergence in substitutes and convergence in complements. “Two products converge in substitutes when users consider either product interchangeable with the other. Convergence in substitutes occurs when different firms develop products with features that become increasingly similar to the features of certain other products… Two products converge in complements when the products work better together than separately or when they work better together now than they worked together formerly. Convergence in complements occurs when different firms develop products or subsystems within a standard bundle that can increasingly work together to form a larger system… [D]epending on the level at which a computing system or communications system is analysed, a particular instance of convergence may be construed as being of either kind. It may be interpreted as a convergence in substitutes at one level of analysis and, equally appropriately, as a convergence in complements at a different level.”

[63] Braun (1995, p. 133) writes that “[f]orecasts do not state what the future will be... they attempt to glean what it might be.” See also Schot and Rip (1996), Grin and van de Graaf (1996), Henry (1991), Porter (1980), Chen (1980), Barron and Curnow (1979), Bright (1968).

[64] Such a projection can take the form of an extensive technology assessment (TA) or technology forecast. TA, as defined by Braun (1995, p. 129), is “the activity of describing, analysing and forecasting the likely effects of a technological change on all spheres of society, be it social, economic, environmental or any other.” TAs are usually equipped with their own methodologies and are conducted over a short period involving a group of experts, government policy officials, and other interested parties from the field. Unfortunately, the scope of this thesis does not allow for genuine TAs to be conducted. However, it is within the bounds of the thesis to perform auto-ID technology forecasting. “Here the emphasis is on predicting the development of the technology and assessing its potential for adoption, including an analysis of the technology’s market” (Westrum 1991, p. 328).

[65] Extraordinary in the sense that professionals working in the area of auto-ID would be astonished by some of the predictions being made. The predictions will not be incredible as such, but rather more explicit, more coherent, and supported by detailed evidence, unlike the majority of previous studies.

[66] Kaku (1998, p. 14) advises, “[i]n making predictions about the future, it is crucial to understand the time frame being discussed, for, obviously, different technologies will mature at different times… These are not absolute time frames; they represent only the general period in which certain technologies and sciences will reach fruition”.

[67] Even Gates (1995, p. 274) realises that his predictions may not come true. But his insights in The Road Ahead are to be commended, even though they are broad. “The information highway will lead to many destinations. I’ve enjoyed speculating about some of these. Doubtless I’ve made some foolish predictions, but I hope not too many.” Perhaps without realising it, Gates is alluding to the fact that predictions are integral to the innovation process. In the quest to prove or disprove forecasts or predictions, “[s]cientific understanding can lead to practical uses. With the first such application, the quest for further understanding intensifies, leading to even more advanced applications” (Queisser 1985, pp. viif).

[68] Allaby (1995, p. 206) writes “[f]orecasts deal in possibilities, not inevitabilities, and this allows forecasters to explore opportunities.” In speculating about the next 500 years Berry (1996, p. 1) writes, “[p]rovided the events being predicted are not physically impossible, then the longer the time scale being considered, the more likely they are to come true… if one waits long enough everything that can happen will happen.”

[69] Someone who predicts (in the pure sense) is being prophetic. In the traditional meaning, The Seer of Patmos, the author of the Book of Revelation, can be considered a prophet; a predictor of events, but a modern day forecaster like Nicholas Negroponte is not being prophetic. See also Sawyer (1993, pp. 16-18).

[70] See Twiss (1992, pp. 30-36). The technology life cycle has five stages: incubation, diversity, segmentation and growth, maturity, and decline. Compare with the five adopter categories in the diffusion of an innovation (Stanton et al. 1994, pp. 195-197): innovators, early adopters, early majority, late majority and laggards.

[71] It is here that a robust framework like the systems of innovation approach, described in section 2.5, can assist a researcher in making predictions, as it looks at the whole system.

[72] Kaku (1998, p. 5) argues “that predictions about the future made by professional scientists tend to be based much more substantially on the realities of scientific knowledge than those made by social critics, or even those by scientists of the past whose predictions were made before the fundamental scientific laws were completely known”. He believes that among the scientific body today there is a growing concern regarding predictions that for the greater part come from consumers of technology (writers, sociologists etc.) rather than from those who shape and create it. Kaku is correct insofar as scientists should be consulted as well, since they are the ones actually making things possible after discoveries have occurred. But to this researcher, a balanced view encompassing the perspectives of different disciplines is extremely important. In the 1950s, for instance, when technical experts forecast improvements in computer technology they envisaged even larger machines, but science fiction writers predicted microminiaturisation. They “[p]redicted marvels such as wrist radios and pocket-sized computers, not because they foresaw the invention of the transistor, but because they instinctively felt that some kind of improvement would come along to shrink the bulky computers and radios of that day” (Bova 1988, quoted in Berry 1996, p. 18). The methodologies used to predict in each discipline should be respected. The question of who is more correct in terms of predicting the future is perhaps the wrong question. For example, some of Kaku’s own predictions in Visions can be found in science fiction movies dating back to the 1960s.

[73] See Wade (2001), Rantala and Milgram (1999).

[74] See Knoke (1996) Bold New World.

[75] See Davies (1992, 1996) Big Brother & Monitor.

[76] See Barnet and Cavanagh (1994), Global Dreams.

[77] See Relfe (1980, 1981) The New Money System and Cook (1999) The Mark of the New World Order.

[78] The predictions of science fiction writers have often been promoted through the use of print, sound and visual mediums, especially novels and movies. Some of the more notable predictions and social critiques are contained within the following novels: Frankenstein (Shelley 1818), Paris in the 20th Century (Verne 1863), Looking Backward (Bellamy 1888), The Time Machine (Wells 1895), R.U.R. (Čapek 1920), Brave New World (Huxley 1932), 1984 (Orwell 1949), I, Robot (Asimov 1950), Foundation (Asimov 1951-53, 1982), 2001: A Space Odyssey (Clarke 1968), Blade Runner (Dick 1968), Neuromancer (Gibson 1984), The Marked Man (Ingrid 1989), The Silicon Man (Platt 1991), Silicon Karma (Easton 1997). The effects of film have been even more substantial on the individual as they have given some form to the predictions. These include: Metropolis (Fritz Lang 1927), Forbidden Planet (Fred Wilcox 1956), Fail Safe (Sidney Lumet 1964), 2001: A Space Odyssey (Stanley Kubrick 1968), THX-1138 (George Lucas 1971), The Terminal Man (Mike Hodges 1974), Zardoz (John Boorman 1974), Star Wars (George Lucas 1977), Moonraker (Lewis Gilbert 1979), Star Trek (Robert Wise 1979), For Your Eyes Only (John Glen 1981), Blade Runner (Ridley Scott 1982), War Games (John Badham 1983), 2010: The Year We Make Contact (Peter Hyams 1984), RoboCop (Paul Verhoeven 1987), Total Recall (Paul Verhoeven 1990), The Terminator Series, Sneakers (Phil Alden Robinson 1992), Patriot Games (Phillip Noyce 1992), The Lawnmower Man (Brett Leonard 1992), Demolition Man (Marco Brambilla 1993), Jurassic Park (Steven Spielberg 1993), Hackers (Iain Softley 1995), Johnny Mnemonic (Robert Longo 1995), The NET (Irwin Winkler 1995), Gattaca (Andrew Niccol 1997), Enemy of the State (Tony Scott 1998), Fortress 2 (Geoff Murphy 1999), The Matrix (L. Wachowski & A. Wachowski 1999), Mission: Impossible 2 (John Woo 2000), The 6th Day (Roger Spottiswoode 2000).
Other notable television series include: Dr Who, Lost in Space, Dick Tracy, The Jetsons, Star Trek, Batman, Get Smart, Farscape. See also Schirato and Yell (1996, pp. 145, 214f), Communication and Cultural Literacy. The authors speak of cultural trajectories (p. 143). See also the album by Kraftwerk titled “The Man Machine” (Capitol 1978).

[79] See Out of Control (Kelly 1994), Facing the Future (Allaby 1995), Silicon Snake Oil (Stoll 1995), WAR of the WORLDS (Slouka 1995), Escape Velocity (Dery 1996), and high tech|high touch (Naisbitt et al. 1999).

[80] See Kass and Wilson (1998), Harris (1992).

[81] By technology-focused is meant those who are concerned with changes in computers, networks, digital media technologies, artificial intelligence and their impact on consumer and business processes.

[82] Other texts that may be useful in understanding McLuhan include: E. McLuhan and Zingrove (1995), M. McLuhan and Powers (1986).

[83] A spate of publications predicting future technical breakthroughs appeared prior to the onset of the new millennium. Most touched upon topics to do with advancements in computer technology, cybernetics, economic change, cloning, and space exploration. The following works are quite challenging in terms of the predictions they present: Knoke (1996), Paul and Cox (1996), Berry (1996), Stork (1997), Robertson (1998), Cetron and Davies (1998), Gershenfeld (1999), Jonscher (1999), Canton (1999), Kurzweil (1999), Rantala and Milgram (1999).

[84] In providing evidence for the likelihood of their predictions, these forecasters often draw on the work of other forecasters to support their stance. Their works also track the changes that have occurred over time, setting findings in the context of larger historical events before making predictions. See especially the exceptional timeline compiled and presented by Ray Kurzweil (1999, pp. 261-280) in The Age of Spiritual Machines.

[85] Within the field of science Kaku states, “the heyday of reductionism has probably passed. Seemingly impenetrable obstacles have been encountered which cannot be solved by the simple reductionist approach.”

[86] For instance the founding members of the Media Lab were made up “of a filmmaker, a graphic designer, a composer, a physicist, two mathematicians, and a group of research staff who, among other things, had invented multimedia in preceding years. We came together… [t]he common bond was not a discipline, but a belief that computers would dramatically alter and affect the quality of life through their ubiquity, not just in science, but in every aspect of living” (Negroponte 1995, p. 225).

[87] See Ellul (1964, pp. 395, 414, 430).

[88] Kaku (1998, p. 112) observes that “[s]cientists are proceeding to explore this possibility with remarkable speed. The first step in attempting to exploit the human brain is to show that individual neurons can grow and thrive on silicon chips. Then the next step would be to connect silicon chips directly to a living neuron inside an animal, such as a worm. One then has to show that human neurons can be connected to a silicon chip. Last… in order to interface directly with the brain, scientists would have to decode millions of neurons which make up our spinal cord”. The main obstacle at present is the complexity of the brain. “The brain’s wiring is so complex and delicate that a bionic connection with a computer or neural net is something that is, at present, seemingly impossible without causing permanent damage…. Nonetheless, this has not prevented some individuals from making certain conjectures about mind/machine links, which properly belong in the far future” (p. 115).

[89] Yet they can make such predictions with authority because their claims are supported by work being conducted in universities and commercial research laboratories around the world. Berry (1996, p. 5) is quite right when he comments that “[e]vents only seem extraordinary at the time when they are predicted, never after they have happened.”

[90] See also Turban and McElroy (1998) who discuss the Internet and smart cards in their paper titled ‘Using smart cards in electronic commerce’.

[91] While economic forecasts have been largely statistical in nature, this chapter has also sought to highlight studies that are qualitative in approach. Numerous critics believe that qualitative forecasting rests on experience, understanding or unconscious personal bias, and is therefore subjective and unreliable. To this, the researcher responds that the quantitative approach is not exempt from the same potential flaws: any modeller must choose the parameters and variables to include in a model, as well as its assumptions and sensitivity factors. Moreover, not all events can be interpreted in the limited context of numbers. See also chapter three for further discussion of this point.