
1 Introduction

1.1 Introduction

Mention Project Management and, for most people, the image conjured up is of a large-scale construction project. On a personal level, we all have a number of projects ongoing – pursuing a course of study, buying a house or organizing a holiday. The level of complexity differs, but the underlying principle of delivering a result to a defined customer at a given point in time remains the same. At a commercial level, the effectiveness of the project management process determines whether or not those projects provide a source of competitive advantage for an organization. Businesses regularly use project management to accomplish unique outcomes with limited resources under critical time constraints.

In parallel, the emergence of Information Technology has taken the world by storm, as if the world had never existed without it. Countless companies have been founded that offer Information Technology products and services.

Strategies formulated by organizations to remain competitive in today's era have Information Technology as an integral part, with its role clearly defined. These three topics – Project Management, Information Technology and Strategic Management – are so closely knit that they form an organization's pillars. This chapter briefly attempts to understand each topic and the present challenges highlighted in research in the area of Project Management in the IT industry from a strategic perspective.


1.2 Brief History of Project Management

1.2.1 Introduction

The past several decades have been marked by rapid growth in the use of project management by organizations as a means to align and achieve their objectives. Project management provides organizations with powerful tools that improve their ability to plan, implement and control their activities, as well as the ways in which they utilize their people and resources. Of the many forces driving this growth, three are paramount:

(1) The exponential expansion of human knowledge: The expansion of knowledge

allows an increasing number of academic disciplines to be used in solving problems

associated with the development, production, and distribution of goods and services.

(2) The growing demand for a broad range of complex, sophisticated, customized

goods and services: Satisfying the continuing demands for more complex and

customized products & services depends on our ability to make product design an

integrated and inherent part of our production and distribution systems.

(3) The evolution of worldwide competitive markets for the production and

consumption of goods and services: Worldwide market forces require that cultural and environmental differences be reflected in our managerial decisions about what, where, when and how to produce and distribute output.

All three forces combine to mandate the use of teams to solve problems that individuals working alone could once have resolved. They also greatly increase the complexity of the goods and services produced, as well as the complexity of the processes used to produce them. This, in turn, leads to a need for more sophisticated systems to control both outcomes and processes.

Project management has been practiced for thousands of years, since the Egyptian era; however, it is only in roughly the last half century that organizations have started applying


systematic project management tools and techniques to complex projects. In the 1950s, the U.S. Navy employed modern project management methodologies in its Polaris project. During the 1960s and 1970s, the Department of Defense, NASA, and large engineering and construction companies utilized project management principles and tools to manage large-budget, schedule-driven projects. In the 1980s, the manufacturing and software development sectors started to adopt and implement sophisticated project management practices. By the 1990s, project management theories, tools, and techniques were widely accepted by different industries and organizations.

1.2.2 Four Periods of Project Management

Snyder and Kline (1987) noted that the modern project management era started in 1958 with the development of CPM/PERT. Morris and Hough (1987) argue that the origin of project management lies in the chemical industry just prior to World War II, and further note that project management became clearly defined as a separate discipline in the Atlas missile program and especially in the Polaris project. Some literature traces the origin of project management to Henri Fayol's (1916) five functions of a manager: (1) to plan, (2) to organize, (3) to coordinate, (4) to control, and (5) to direct or command. Kerzner (1998) observes that project management is an "outgrowth of systems management."

Four periods have been identified to better capture the history of modern project management: (1) prior to 1958, (2) 1958–1979, (3) 1980–1994, and (4) 1995 to the present. Table 1 summarizes these four distinctive periods. For each period, the history of (1) project management tools and practices and (2) representative actual projects is discussed.


Table 1: Four periods of project management

Periods            Theme
Prior to 1958      Craft system to Human Relations Administration
1958 – 1979        Application of Management Science
1980 – 1994        Production Center: Human Resources
1995 to present    Creating a new environment

For each period, two sub-contexts are discussed: Project Management (tools and practices) and Actual Projects.

PRIOR TO 1958: CRAFT SYSTEM TO HUMAN RELATIONS ADMINISTRATION

Project Management: The origin of the modern project management concept dates to between the 1900s and the 1950s. During this time, technology advancements shortened project schedules. Automobiles allowed effective resource allocation and mobility, and telecommunication systems increased the speed of communication. The job specification came into wide use, and Henry Gantt invented the Gantt chart. The job specification later became the basis for developing the Work Breakdown Structure (WBS).

Actual Representative Projects

T.D. Judah's Project Plan for Building the Pacific Railroad

In T.D. Judah's (1857) "A Practical Plan for Building the Pacific Railroad," engineers and clerks at the project office prepared a formal report upon arrival of survey information from the field managers. Once the data had been updated and analyzed, the project office forwarded orders to resident engineers, and field managers initiated the project. The project office also


dealt with relationships with investors, field surveys, cost estimation, feasibility studies, and other matters. The project office simply functioned as an administrative office.

Hoover Dam (1931 – 1936)

In 1928, the U.S. Congress passed the Boulder Canyon Act, assigning $175 million to the Hoover Dam. The "Big Six" – Utah Construction, Pacific Bridge, H.J. Kaiser, W.A. MacDonald and Kahn, Morrison-Knudsen, and J.H. Shea – formed a consortium to act as general contractor. Detailed project planning, controlling, and coordinating was crucial because the project involved six independent companies. The construction site was located in the middle of the desert with no infrastructure, so Boulder City was created to accommodate workers near the construction site.

The project required both physical and human resources. It employed approximately 5,200 workers, and large amounts of construction resources, including concrete, structural steel components, and steel pipe, were required (Bureau of Reclamation 1985). The project was successfully completed under budget and ahead of schedule (Moore 1999). The Hoover Dam is still one of the highest gravity dams in the U.S. and generates more than four billion kilowatt-hours a year.

Manhattan Project (1942 – 1945)

The Manhattan project was the pioneering research and development (R&D) project that designed and built the atomic bomb. The project was initially proposed in 1939 to counter possible threats from Germany. In 1941, the Office of Scientific Research and Development (OSRD) was established to coordinate government-sponsored projects, and the Manhattan project was initiated in 1942. The OSRD coordinated universities and resources for the research and development of the atomic bomb. The project was successfully


tested in July 1945, a month before the bomb was dropped on Hiroshima, Japan. The project involved about 125,000 workers and cost nearly $2 billion.

1958-1979: APPLICATION OF MANAGEMENT SCIENCE

Project Management: There were significant technology advancements between 1958 and 1979. In 1959, Xerox introduced the first automatic plain-paper copier. In the 1960s, many industries were influenced by the development of silicon chips and minicomputers. In 1969, Bell Laboratories developed the UNIX operating system, and the computer industry started to develop rapidly. NASA's successful Apollo project marked a historic event for mankind. In 1971, Intel introduced the 4004, a 4-bit microprocessor, which was the foundation of the evolution of Intel's 80386, 80486, and Pentium processors in the 1990s. While many dedicated scientists were developing ARPANET, Ray Tomlinson introduced the first e-mail software in 1972. In 1975, Bill Gates and Paul Allen founded Microsoft. Several project management software companies were founded during the 1970s, including Artemis (1977), Oracle (1977), and Scitor Corporation (1979).

Between 1950 and 1979, several core project management tools, including CPM/PERT and Material Requirements Planning (MRP), were introduced. CPM/PERT calculations were run on large computer systems, and specialized programmers operated CPM/PERT mainly for government-sector projects. Typical organizations used the project office as a "broker of information" staffed with a small number of skilled schedulers and estimators (Vandersluis 1998).
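To make the idea behind CPM concrete, the short sketch below computes earliest finish times and the resulting project duration for a tiny task network; the task names and durations are invented purely for illustration and do not refer to any project discussed in this chapter.

    # Minimal CPM forward-pass sketch (Python); tasks and durations are hypothetical.
    tasks = {
        "design":  {"dur": 4, "pred": []},
        "build":   {"dur": 6, "pred": ["design"]},
        "test":    {"dur": 3, "pred": ["build"]},
        "docs":    {"dur": 2, "pred": ["design"]},
        "release": {"dur": 1, "pred": ["test", "docs"]},
    }

    earliest_finish = {}

    def finish(name):
        # Earliest finish = task duration + latest earliest finish among predecessors.
        if name not in earliest_finish:
            start = max((finish(p) for p in tasks[name]["pred"]), default=0)
            earliest_finish[name] = start + tasks[name]["dur"]
        return earliest_finish[name]

    project_duration = max(finish(t) for t in tasks)
    print(project_duration)  # 14; the critical path is design -> build -> test -> release

On a real schedule the same forward pass (plus a backward pass to find slack) is what scheduling software performs over thousands of activities.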


Actual Representative Projects

Polaris project (1956 – 1961)

The Polaris project refined the project management concepts as they are known today (Sapolsky 1972). The $11 billion Polaris project was undertaken by the U.S. government to deliver nuclear missiles carried by submarines, known as the Fleet Ballistic Missile. The project was initiated by the U.S. Navy in late 1956, and its first Polaris missile was successfully launched in 1961. The Navy created a new unit called the Special Project Office (SPO) to avoid giving the Polaris project to the Bureau of Ordnance or the Bureau of Aeronautics (Sapolsky 1972).

Apollo project

In 1958, the National Aeronautics and Space Administration (NASA) was created. Between 1969 and 1972, NASA successfully led six missions to explore the moon. In 1960, NASA set up the Apollo program office (NASA 1968) to provide the following functions:

- Maintain and schedule Apollo missions using PERT.
- Handle procurement and contracting with suppliers such as GE.
- Develop a management system to measure performance.
- Serve as a focal point of the Apollo program.

ARPANET

The Internet is as much a collection of communities as a collection of

technologies, and its success is largely attributable to both satisfying basic

community needs as well as utilizing the community in an effective way to

push the infrastructure forward. This community spirit has a long history

beginning with the early ARPANET. The early ARPANET researchers worked

as a close-knit community to accomplish the initial demonstrations of packet

switching technology described earlier. Likewise, the Packet Satellite, Packet

Radio and several other DARPA computer science research programs were

multi-contractor collaborative activities that heavily used whatever available


mechanisms there were to coordinate their efforts, starting with electronic

mail and adding file sharing, remote access, and eventually World Wide Web

capabilities. Each of these programs formed a working group, starting with

the ARPANET Network Working Group. Because of the unique role that

ARPANET played as an infrastructure supporting the various research

programs, as the Internet started to evolve, the Network Working Group

evolved into the Internet Working Group (Leiner et al. 2000).

The Internet project began its journey in 1962. It started with a series of memos by J.C.R. Licklider of MIT discussing the concept of a "Galactic Network" (Leiner et al. 2000). The U.S. Department of Defense initially funded the project, and the Advanced Research Projects Agency (ARPA) coordinated it. ARPA's objective was to schedule and coordinate the activities of a heterogeneous set of contractors (Hughes 1998). ARPA started to develop ARPANET, the origin of the Internet.

The ARPANET project was a research and development project that was initially developed by ARPA and then managed by several organizations. In the 1970s, the Federal Networking Council was formed to support international organizations and coordinate federal agencies such as NASA, the Department of Energy and others (Leiner et al. 2000). Unlike single-organization-driven projects, the initial ARPANET was driven by a number of researchers and organizations. Currently, the Internet is coordinated by several organizations including the Internet Engineering Task Force (IETF), the Internet Engineering Steering Group (IESG), the Internet Architecture Board (IAB), and the Internet Society (ISOC).


1980-1994: PRODUCTION CENTER: HUMAN RESOURCES

Project Management: During the 1980s and early 1990s, the revolution in the IT/IS sector shifted people from mainframe computers to multitasking personal computers, which were highly efficient in managing and controlling complex project schedules. In the mid-1980s, the Internet served researchers and developers, and local area networks and Ethernet technology started to dominate networking (Leiner et al. 2000).

From the 1950s through the 1970s, computer engineers were mostly responsible for operating project management systems because the mainframe systems were not easy to use; Morris (1985) acknowledged the unfriendliness of the mainframe software. During the late 1970s and the 1980s, project management software for the PC became widely available from a number of companies, which made project management techniques far more accessible.

Actual Project Cases

Three projects were selected to portray the era of the 1980s and early 1990s: the England–France Channel project (1989–1991), the Space Shuttle Challenger project (1983–1986), and the XV Calgary Olympic Winter Games (1988). These projects illustrate the application of high technology and of project management tools and practices. The England–France Channel project was an international project that involved two governments (British and French), several financial institutions, engineering construction companies, and various other organizations from the two countries. The project goal, cost, schedule, and other factors had to be adjusted across the two countries, and language, use of standard metrics, and other communication differences needed to be coordinated. The disaster of the Space Shuttle Challenger instantly brought a great deal of attention to the project management community. The incident increased interest in risk


management, group dynamics, and quality management. In 1988, the Calgary Winter Olympic Games applied project management to event management, and its successful adoption of project management practices spread to various other event management practices.

1995-PRESENT: CREATING A NEW ENVIRONMENT

"We are on the verge of a revolution that is just as profound as the change in the economy that came with the industrial revolution. Soon electronic networks will allow people to transcend the barriers of time and distance and take advantage of global markets and business opportunities not even imaginable today, opening up a new world of economic possibility and progress." – Albert Gore Jr., Vice President (1997)

Project Management: The Internet started to change virtually every business practice in the mid-1990s (Turban et al. 2000). It provided a fast, interactive, and customized new medium that allowed people to browse, purchase, and track products and services online instantly. As a result, the Internet permitted organizations to be more productive, more efficient, and more customer-oriented. Between 1995 and 2000, the project management community adopted Internet technology to become more efficient in controlling and managing various aspects of projects. While information technology revolutionized traditional business practices, various industries started to adopt and apply project management practices.

Actual Project Cases

Year 2000 (Y2K) Project:

The Year 2000 (Y2K) problem, known as the millennium bug, referred to the possibility that computers might not function correctly on January 1st, 2000 at 12:00 AM. It was a man-made problem whose roots went back to the 1950s, when programs began storing years as two digits to save memory. President


Clinton issued Executive Order 13073, "Year 2000 Conversion," in February 1998, which required all federal agencies to fix the Y2K problem in their systems (DOD 2001). Several government agencies and state governments had initiated Year 2000 awareness programs back in 1996. The order established a centralized focal point for monitoring all Y2K activities within the US government. The Y2K project integrated several aspects of project management. First, it had a specific objective (to fix Y2K problems) and a sharp deadline (January 1st, 2000 at 12:00 AM). Second, the project was conducted globally yet independently, and virtually every organization using computers was at stake: each organization focused on correcting Y2K problems within its own systems, but the problems were interrelated due to the dependency of various computer systems on one another via computer networks. Third, there were various methodologies and tools to remedy the problem. Fourth, from initiation to completion, detailed progress reports were widely available. The Y2K project became one of the most documented projects in project management history because virtually identical projects were conducted by millions of organizations around the world.
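The underlying defect is easy to illustrate; the sketch below is a hypothetical example of the two-digit year arithmetic found in many legacy programs, not code from any system mentioned here.

    # Hypothetical sketch of the Y2K defect: storing only two year digits wraps at the century.
    def age_from_two_digit_years(birth_yy, current_yy):
        # Naive age calculation as written in many legacy systems.
        return current_yy - birth_yy

    print(age_from_two_digit_years(50, 99))  # 49  (computed in 1999: correct)
    print(age_from_two_digit_years(50, 0))   # -50 (computed in 2000: the millennium bug)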

The Y2K problem pushed many organizations to adopt project management practices, tools, and techniques to conduct their own Y2K projects. Many organizations set up a project office to control the effort and demonstrate compliance to their stakeholders regarding the Y2K issue. Furthermore, use of the Internet was common practice in Y2K projects, which led to the setting up of virtual project offices. The goal of the Y2K project office was to deliver an uninterrupted turn of the century, monitor Y2K project efforts, provide coordination, develop risk management plans, and communicate Y2K compliance efforts to various stakeholders. The Y2K office was a focal point for all the project work, and its functions were so highly visible that it boosted the awareness and perceived importance of the project office. In addition, it increased the awareness and importance of risk management practices in numerous organizations.


Iridium Project

Motorola's $5 billion Iridium project aimed to provide global communication service virtually anywhere, anytime (Barboza 2000). In November 1998, the Iridium network was established and started to provide global network services. In March 2000, Iridium filed for bankruptcy and terminated its services. The project was once viewed as a technological breakthrough; however, it ended quickly and unexpectedly. A program office had been established with full-time project control managers, and software engineers and analysts were also relocated to it. In addition, the project control managers utilized sophisticated project management software, Primavera Project Planner, to handle complex and inter-related project scheduling (Fabris 1996).

1.3 Brief IT History

1.3.1 Algorithmic Logics and IT

The simplest definition of IT is one word: algorithm. Simply defined, an algorithm is a procedural description of how something should be accomplished. Bureaucracies are socially institutionalized forms of this, as are the patterns that govern basic biological processes. What has developed over the last three hundred years is the creation of tools to describe and replicate both these naturally occurring and constructed patterns mathematically. The most widely applied example of this is within the IT field today, in both hardware and software processes.
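As a concrete illustration of an algorithm in this sense, Euclid's classic procedure for finding the greatest common divisor of two integers is a short sequence of repeatable steps that can be coded directly; the snippet below is a minimal sketch for illustration only.

    # Euclid's algorithm: a procedural description of how to find the
    # greatest common divisor (GCD) of two positive integers.
    def gcd(a, b):
        while b != 0:
            a, b = b, a % b   # replace (a, b) with (b, a mod b) until the remainder is zero
        return a

    print(gcd(48, 36))  # 12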

Algorithms arose from the study of logic, tracing back to Aristotle. The word itself is derived from the surname of Ja'far Mohammad Ben Musa al-Khowarazmi, the ninth-century Muslim mathematician whose "The Science of Restoration and Reduction" transformed algebra and brought Arabic numerals to Europe (Leonard 2000). Algorithmic logic was revived as a defined and respectable field of knowledge by Leibniz, the co-creator of the calculus, at the end of the 17th century. Leibniz offered a very simple but powerful idea: the entire universe can be described as comprised of god and nothingness (Berlinski 2000). In other words, all of life


follows a discrete binary system that can be modeled, or coded, within a logical

algebraic framework. As such, real-world processes could be mapped using

mathematical symbols, if the underlying algorithms could be identified. This opened

the theoretical possibility of modeling both the social processes of bureaucracies and

the basic sequence of DNA, among others, as mathematical abstractions.

Leibniz's insight into algorithmic logic and binary systems is at the heart of how IT is conceived (Berlinski 2000). IT essentially consists of defining a logical algebraic function that produces consistent outcomes for a specific process and then codifying it in either software or hardware. However, the actual application of this conceptualization in practical working systems took almost three hundred years. The following history is a rough outline of how this occurred, from a mixed technical and industry viewpoint, both of which have influenced the form that Leibniz's conception has taken within information technology. The history begins with the desire for a "machine computer", then the application of computing to information processing under business machine manufacturers, then the actual building of binary-coded mainframe machines in the 1960's, and finally the extension of such systems through the rise of personal computing and internets.

1.3.2 The Development of Machine Computers

The first real attempts to apply algorithmic logic to automate problem solving began in the industrial revolution. The industrial logic of the time led to numerous attempts to create "smart machines". As complex and precise mathematical calculations increasingly came to be essential to a modern industrial economy, machine calculation came to be seen as a rational solution. Information, like any other industrial activity, needed to be processed in ever-greater quantities and with ever-greater efficiency. At the time, the mathematical computations for such key publications as the Nautical Almanac, showing tide charts, were done by teams of human "computers". The quantity and diversity of information that needed to be processed, along with the high incidence of human error, created a general push for the creation of mechanical computers (Campbell-Kelly and Aspray 1996).


The preeminent example of this is Charles Babbage's attempt to develop the Difference Engine in England in the 1830's in order to produce solutions for tide tables mechanically, accurately, and efficiently. The conception, like the underlying mathematics, was clearly understood, with a defined sequence of mathematical steps (an algorithm) that would produce the needed tables accurately and consistently. However, the actual construction of the "Engine" (the name itself demonstrating the logic of the era) proved almost beyond the technical capabilities of the time. A small working model was produced, but it was unable to fulfill the table-making functions it had been funded to solve.
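What such an "Engine" actually mechanized can be sketched in a few lines: the method of finite differences produces a polynomial table using repeated addition alone, exactly the kind of step a geared machine can perform. The polynomial below is a hypothetical example, not one of Babbage's actual tables.

    # Method of finite differences: tabulate a quadratic using only additions.
    def f(x):
        return x * x + x + 41   # hypothetical example polynomial

    value  = f(0)                      # starting value
    delta1 = f(1) - f(0)               # first difference
    delta2 = f(2) - 2 * f(1) + f(0)    # second difference (constant for a quadratic)

    table = []
    for x in range(10):
        table.append(value)
        value  += delta1               # each new entry needs only additions
        delta1 += delta2

    print(table == [f(x) for x in range(10)])  # True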

Importantly, Babbage also conceived, though never completed, an "Analytical Engine". The Difference Engine was inherently limited by its dedicated

mechanical design to solving one specific mathematical problem. The Analytical

Engine was conceived as a programmable machine that could perform multiple

mathematical functions rather than merely pre-defined calculations (Moschovitis et al

1999). Conceptually, this was the first attempt to build what would now be

recognized as a computer. It would be another hundred years before a fully

functioning programmable calculator as envisioned by Babbage would be built.

The increasing demand for information processing came to dominate the

development of mechanical computing, overriding Babbage’s general idea of a multi-

purpose calculating machine. Business equipment applications and manufacturers

came to dominate the search for mechanical means to process and tabulate

information. Herman Hollerith developed a mechanical system for processing census

data that was implemented in the US in 1890. The system tabulated the census data

in six weeks, as compared to seven years for the previous census. Hollerith went on to form the Tabulating Machine Company, which became the foundation of IBM.

The computing industry would continue to be dominated by business equipment

firms developing machines to handle specific information processing needs, as


opposed to generating or manipulating general information, until the advent of the

Second World War (Campbell-Kelly and Aspray 1996).

The search for a mechanical means to manipulate and generate information did

continue simultaneously, but was limited to extremely complex problems involving a

few key users concerned with large industrial projects or government-related

programs. Still facing the limits of human computers in a complex industrial world,

mechanical computing increasingly focused on analog machines. Processes that

were too complex to mathematically define, or be handled by existing mechanical

computation, could be modeled through mechanical analogues of the process itself.

These were machines that could create mechanical analogies of complex systems,

such as dams, electrical networks or tides, that could then be replicated and scaled

efficiently (Campbell-Kelly and Aspray 1996). Though analog computing was

inherently limited by the need to construct a new model for each process, it did serve

the specific needs of large-scale users. Such computing reached its peak impact

between the two World Wars, with increasingly large and dedicated models helping

analyze increasingly complex engineering and scientific problems.

A key development in analog computing occurred in 1931. Vannevar Bush of MIT

was able to develop a "differential analyzer" that could perform a whole series of

engineering and science problems based on differential equations. While diverse in

application, the machine was not a real computer in the sense of performing and

generating calculations. It was still an analog machine, modeling natural processes

rather than manipulating numbers themselves, even if it could model multiple analog

machines (Edwards 1996). As such, the application of the machine outside of

predefined engineering or science problems or to problems that could not be

physically modeled (astronomy, physics, weather, and code breaking) was

impossible.


Only at the very end of WWII would the binary logic of Leibniz, Babbage’s general

purpose computing machine concept, Bush’s practical model of a multi-purpose

machine, the experience of business machine firms in robust information processing

equipment, and the need to generate pure mathematical information come together

to form the modern computer architecture. Even then, it would be another two

decades before the use of computers would be integrated with the business

equipment industry to find a widespread base for the use of computing power, and

another four decades before computers were spread to the population as a whole.

1.3.3 Modern Computing

In August of 1944, Howard Aiken and a team at Harvard completed the Mark I, the

first fully programmable computer to come into being. The five-ton machine worked

through the inputting of operational codes on lengths of paper tape, and was

designed to produce ballistics computations and code breaking for the US Navy. The

design by Aiken was implemented by IBM and essentially consisted of a row of

electro-mechanical punch-card machines. Being both programmable and automatic

in performing general mathematical calculations, it was in many ways the fulfillment

of Babbage’s original design for the Analytical Engine.

Importantly, the development of the Harvard Mark I represents a convergence of

technologies, ideas and institutions that would dominate many of the basic features

of information technology for the next thirty years. On the technical side, the Mark I

represented the first use of digital architectures and data, though decimal and not

binary. In other words, information was transformed into discrete mathematical

symbols that became both an input and output of the computing process for the first

time. It also marked the full engagement of IBM into the field of computing, creating

a blending of computing and business machines for the first time. The practical skills

developed by IBM engineers in designing information-processing equipment for

industrial uses blended extremely well with the experimental and theoretical

environment of leading research universities. This convergence was further


stimulated by the interests and very substantial economic support of the US military, which viewed computing as a means to an end, initially in support of World War II and then of the Cold War (Edwards 1996).

The final technical architecture for the computer as currently defined was formulated at the Moore School of Electrical Engineering at the University of Pennsylvania in 1944 and 1945. Operating under the auspices of the National Defense Research Committee (NDRC), a team led by John Mauchly and J. Presper Eckert had been commissioned to develop a computer to calculate ballistic tables for the US Army. Completed in 1945, ENIAC was the world's first electronic computer, comprising 18,000 vacuum tubes, 10,000 capacitors, 6,000 switches, and 1,500 relays. Because it performed calculations at 5,000 operations per second, faster than programs could be fed by paper tape, it had to be programmed via hardwiring, which called for physically wiring the machine to determine the circuitry of the programming logic.

The Moore school’s second version, EDVAC, combined advances in electronics with

the programmable flexibility of the Mark I to finally overcome many of the basic

challenges facing computing. Spurred on by John Von Neumann, a member of the

Institute for Advanced Study at Princeton and a consultant to the Manhattan Project,

the EDVAC sought to solve the problems of limited memory (limiting the ability to

store programs), too many vacuum tubes (creating tremendous heat, instability and

high-energy consumption), manual and wired programming (making running new

programs time consuming and tedious), and the use of decimal numbers (limiting the

amount of information that could be processed and stored). Though never

completed, the EDVAC design outlined by von Neumann, Eckert and Mauchly defined the base architecture that all computers follow to this day: stored programs, binary logic for programs and computation, basic input and output units, a control unit and an arithmetic unit (Edwards 1996; Campbell-Kelly and Aspray 1996). This structure provided the road map from which all computers would develop in the

following decades. The government would continue to sponsor widespread research


and development of computer technology, but the early EDVAC pioneers also began

the first real efforts to commercialize digital, electronic, programmable computer

technology after the end of the war.
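To make the stored-program idea concrete, the toy sketch below runs a fetch-decode-execute loop over a program held in the same memory as its data. The three-instruction machine and its opcodes are invented purely for illustration and do not correspond to the EDVAC's actual instruction set.

    # Toy stored-program machine (hypothetical opcodes, for illustration only).
    # Memory holds both the program (addresses 0-3) and the data it operates on (9-11).
    memory = [
        ("LOAD", 9),     # 0: load the value at address 9 into the accumulator
        ("ADD", 10),     # 1: add the value at address 10
        ("STORE", 11),   # 2: write the accumulator back to address 11
        ("HALT", None),  # 3: stop
        None, None, None, None, None,
        5,               # 9: data
        7,               # 10: data
        0,               # 11: result goes here
    ]

    pc, acc = 0, 0                # program counter and accumulator
    while True:
        op, addr = memory[pc]     # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            break                 # execute until the stored program halts

    print(memory[11])  # 12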

1.3.4 The Development of the Computer Industry

Throughout the 1950’s and 1960’s the computer industry developed within the broad

framework of implementing the vision of the EDVAC in reality and expanding the

efficiency and robustness of the components themselves. Once again the

commercial market came to be dominated by the demand for large scale mainframe

data-processing machines, as related to solving business or information processing

related problems. While the invention of the stored-program computer created a

potential split of hardware and software, the development of the industry focused on

complete hardware and software solutions developed for specific end-users. In other

words, while the basic innovations of the EDVAC design created the potential for a

multi-purpose computer as exists today, the industry standard of packaged

hardware/software solutions created machines that were far more similar to their dedicated analog predecessors than to the flexible machines in existence currently. The vast majority

of software was custom built for specific mainframe systems, almost exclusively by

the producers of the hardware themselves, focusing on specific information

processing needs. The use of electronic circuits gave amazing speed, but a flexible,

multi-purpose "information machine" was still much more theory than reality.

During these decades, computing development was like building a new cathedral each time a new technology emerged or systems were modified as a user's needs changed (Raymond 1998). Complete systems of hardware and machine-specific software would be developed to address each end user's demand. Computing was only for large-scale, established projects with deep budgets. End-users were locked into both hardware and software that were not transferable to other systems or uses, even within the same family of computers. Naturally, government, in particular the


military, and large corporations were the central users of computing power (Edwards 1996; Guice 1997). Of the business-computing companies, IBM dominated the private market for combined solutions of hardware and software through the 1960's, with a roughly 70% market share. It is important to note that IBM, though dominant in the market, did not have a clear lead in computing technology over other mainframe companies. IBM provided trust and stability for firms buying its computer systems, with greater resources to bring to bear on marketing and maintenance, but no real sustained advantage in the technology itself (Campbell-Kelly and Aspray 1996). Even more important, innovations in the industry tended to come from privately supported research, as opposed to government- or university-supported research.

1.3.5 From Vacuum Tube to Microprocessor

The most significant events outside the development of computing as a business machine revolved around the transformation of the basic architecture and structure of computer processing. All computers from the ENIAC forward suffered from the size, heat and energy constraints imposed by using vacuum tubes for computation, that is, for the basic circuits that manipulate binary language. In 1947, William Shockley, John Bardeen and Walter Brattain of Bell Labs created the first transistor, a solid-state semiconductor that acted as a reliable and efficient amplifying and switching circuit. By 1958, IBM was producing transistor-based business machines. In 1959, Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments independently created the first integrated circuit (IC) (Moschovitis et al. 1999; Braun and Macdonald 1982). The IC was a seminal moment in computing history because it allowed the integration of multiple transistors on a single piece of silicon.

A single silicon chip was able to contain a room full of computing power. In other words, the basic hardwired programming structures that had taken up a room in the early ENIAC could now fit in a desktop calculator.


The ability to miniaturize computing opened the door to wide-scale application of computing power outside of the traditional government and private sector applications. Even more important, Moore's Law – named after Gordon Moore, one of the founders of Intel – states that computing power, that is, the number of transistors that can be placed on a single semiconductor, doubles roughly every two years (Moschovitis et al. 1999). This means that both miniaturization and computing power have had constant and predictable development trajectories over the last thirty years, with projects previously unfeasible because of size or cost concerns becoming possible within a short time frame.
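The doubling rule is simple arithmetic but its effect is exponential, not linear; the sketch below projects transistor counts from a rough, illustrative baseline (about the scale of the early Intel 4004) purely to show the shape of the trend.

    # Moore's Law as arithmetic: one doubling roughly every two years.
    base_year, base_transistors = 1971, 2300   # illustrative baseline only

    for year in (1971, 1981, 1991, 2001):
        doublings = (year - base_year) / 2
        projected = base_transistors * 2 ** doublings
        print(year, int(projected))
    # 1971      2300
    # 1981     73600   (32x)
    # 1991   2355200   (1024x)
    # 2001  75366400   (32768x)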

The ultimate development in this process was Intel's creation of the microprocessor in 1971. The microprocessor combined all the elements for computation on a single chip. Rather than having multiple chips each dedicated to a "hard-coded" process, the microprocessor enabled one chip to serve multiple functions (Braun and Macdonald 1982). As but one example of the implications of this transformation in size and computing power, calculators moved from a low-volume, high-end business product composed of dozens of chips (many customized) to a mass-market educational and home product composed of a single IC (Campbell-Kelly and Aspray 1996). After 150 years, the development of the IC finally brought Babbage's basic conception of a multipurpose programmable machine computer to society for the first time.

1.3.6 The Development of the “Information Machine”

The technology transformation provided by the IC pushed the creation of a multi-

purpose information machine that could process, manipulate and most importantly

create information in multiple formats. This trend would progress rapidly with the

widespread development of personal computers and the increasing dominance of

software as a distinct driver of technological change within the computing industry.

Again, IBM played a key role in this development. Like many dominant firms

throughout IT history, IBM’s corporate strategic decisions have shaped the


development of the computer industry even more than its technological advances.

IBM was at the center of two such strategic decisions during the 1960's (Campbell-Kelly and Aspray 1996).

First, the IBM System/360, introduced in 1964, was the first computer that

enabled software programs written for one computer to be used with all other

computers within the system family. For the first time, software was scalable as an

organization’s computer needs expanded, with software becoming transferable to

higher levels of computing power. Changing hardware no longer entailed the

creation of entirely new software products or the complete transformation of existing

systems. This dramatically reduced the cost and maintenance of software

development, and let software developers begin to focus on the expansion and

reuse of existing software programs for the first time, even if still limited to specific

hardware architecture.

More importantly, it opened the possibility of separating hardware and software purchase decisions, if hardware and software development ever became unbundled. This is exactly what happened, and it signaled IBM's second industry-changing strategic decision. In 1969/1970, IBM decided to unbundle its software and hardware

sales, opening the market for independent software producers to compete to supply

the nearly 70% of the market IBM controlled. The combination of these two events

led to the development of a software industry focused on providing previously

bundled services and products, especially the initial development of a packaged

software sector.

The simultaneous miniaturization of ICs and the formation of an independent

software sector converged to fundamentally reorganize the computer industry in

1977 with the release of the Apple II, the world’s first true personal computer. The

basic form of the PC, as it came to be structured in the Apple II and the industry

overall, was created at Xerox PARC in the early 1970’s, and then slowly brought to


market throughout the next decade (Moschovitis et al. 1999). The graphical user

interface (GUI), the "mouse", WYSIWYG visualization (what you see is what you get), and Ethernet were all developed within the Palo Alto lab, but later commercialized by other companies throughout Silicon Valley. The pattern established by the introduction of the PC became characteristic of the industry overall. Entire new industries would develop virtually overnight, spurred by innovations in private or university labs that were commercialized by new firms that quickly challenged existing dominant players and business models (Campbell-Kelly and Aspray 1996).

The PC also signaled the emergence of software as a key industry driver (Hoch et al. 2000). The market for software, and prepackaged solutions in particular, has slowly come to shape the development of the computer industry overall. From the introduction of the personal computer in the mid-1970's forward, the market for packaged software has steadily expanded. Even when hardware came with a proprietary operating system, such as the Apple OS, IBM's OS/2 or MS-DOS, independent software vendors provided additional programs to add functionality to these systems. Increasingly through the early 1980's, these packaged software products became essential determinants of hardware acceptance in the market. Programs such as Microsoft Word, VisiCalc, Lotus 1-2-3, and PageMaker established whole new industries and led to the promotion of the specific hardware systems they ran on.

By the early 1990's, the commodification of computing power and memory beyond basic user needs meant that software determined the range of uses and possibilities of computing overall. The trend has continued, with surveys indicating that PC owners have on average over 137 software titles on their home PCs (WSJ, 10-Aug-00). The computer has truly become an "information machine", able to perform multiple functions of processing and generating information.


Additionally, while customized software solutions are still prevalent for large systems, packaged software has slowly continued to expand throughout the industry. By the early 1990's, the expansion of computing in general, and the internet in particular, gave rise for the first time to packaged software solutions based on Enterprise Resource Planning, databases and internet communications for large-scale, as opposed to individual, users. Many of these programs took the shape of packaged systems that were sold broadly but could be customized to individual user needs. The full direction of the industry since IBM's initial decision to unbundle software and hardware is demonstrated by the transformation of business machine or computer makers into predominantly software companies. Firms like Oracle, Sun Microsystems and IBM are fully or in large part business software companies (Hoch et al. 2000:27). The recent rise of firms specializing in B2B Internet commerce software solutions merely continues this overall trend.

The essential aspect of the rise of an independent software industry was the

movement away from fixed, preprogrammed solutions of either systems or ICs. The

flexibility and versatility that software enabled became the key aspect of all computer

based systems. As ICs have been embedded in an ever wider range of products,

this trend has continuously moved into the most miniature of products, with

everything from wristwatches to cell phones containing basic software code in place

of hardwired IC solutions. Software has increasingly come to dominate the direction

of computer technology, and essentially signifies the transformation of the computer

industry to what is now called information technology. The other significant

computing technology trend from the 1960’s, the internet, has only further expanded

on this basic pattern of embedded computing power, given flexibility and dynamism

by software, to create ever greater abilities to process and generate information in

multiple forms.


1.3.7 Information Technology Fully Formed: The Internet

In the late 1960's, prior to the development of a versatile and affordable "information machine" in the form of the PC, computing systems were complex, dedicated, customized and expensive. Leading research institutes throughout the United States all had proprietary systems focused on specific computational problems. The inability to share information or to inexpensively recreate such systems pushed the Defense Advanced Research Projects Agency (DARPA) to fund research into creating a system to link the computing centers and allow researchers to share information (Guice 1997). The project came to be known as the ARPAnet, a dedicated communications system linking key research institutes throughout the US. Built by a combined team of DARPA officials and Bolt, Beranek and Newman (BBN) of Cambridge, Massachusetts, the original ARPAnet went online in December 1969, connecting UCLA, UCSB, SRI International and the University of Utah (Hafner and Lyon 1996).

The basic architecture was simple in design, but complex in reality. Because each

computer system had its own unique software and hardware systems, a generalized

interface needed to be designed to allow each system to connect to the network.

The solution was a dedicated system of Interface Message Processors (IMPs or the

equivalent of today’s servers) that had the dedicated role of handling connections

and managing information flows within the network. In essence, the net was a

dedicated communications loop (like a ring road around a city) with each computing

center having a dedicated link (on ramp). This basic architecture continues to

structure all network designs, no matter how large or small. The other basic but crucial design choice was to design the network as a packet-switching architecture (Hafner and Lyon 1996). The basic concept was to break information down into pieces (or packets) that could be individually transmitted across the network. To continue the metaphor above, packets are like cars on the ring road: traffic broken up into multiple small units rather than consolidated into one vehicle and route, as on a train.


The network faced two crucial problems that packet switching solved. First, without

packets, transmission between two points on the network would have to be on

dedicated lines. In this way, if UCLA and UCSB were using the network, any other users would be locked out until they finished. This is much the same as when two people

are using a telephone. Even if no information is being transmitted, the circuit is

dedicated to that specific connection. This was contradictory to the very notion of

creating a flexible, inexpensive and efficient means of connecting multiple systems.

Packets allowed for multiple users to transmit information simultaneously, with

packets from different users flowing through the network to be reassembled at their

destination. "Dead" or empty spaces in transmission from one system could be filled

by packets from another, creating a constant flow of information in the network.

The second problem focused on the reliability of the network. Loss of data or the

cancellation of connections presented a huge problem for a network seeking

reliability and robustness. The solution, again made possible by the concept of

packets, was the "store and forward" concept. The IMPs would receive packets of

information, store them, then forward them only when the connection was open and

previous packets in the same sequence had been received correctly. This enabled

the network to be incredibly flexible and robust. Each individual packet could be

routed on the most efficient path to its destination, and if packets could not be

delivered, the packets would be held until the receiving system was ready.
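A rough sketch of the store-and-forward behaviour described above is given below; the packet fields and queueing policy are simplified assumptions for illustration, not the actual IMP protocol.

    from collections import deque

    # Illustrative store-and-forward node: hold packets until the outbound link is
    # open, and forward them only in sequence order.
    class Node:
        def __init__(self):
            self.buffer = deque()      # packets stored while waiting
            self.next_seq = 0          # next sequence number allowed to go out

        def receive(self, seq, payload):
            self.buffer.append((seq, payload))            # store

        def forward(self, link_open):
            sent = []
            while link_open and self.buffer and self.buffer[0][0] == self.next_seq:
                sent.append(self.buffer.popleft())        # forward when the link is open
                self.next_seq += 1
            return sent

    node = Node()
    node.receive(0, "HEL")
    node.receive(1, "LO")
    print(node.forward(link_open=False))  # []  -- link closed, packets held
    print(node.forward(link_open=True))   # [(0, 'HEL'), (1, 'LO')]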

This basic structure, and its initial success, established the overall framework in

which all networks have been developed. The lessons and innovations in this basic

design have merged with the software and PC trends outlined above to move

networking to an ever-greater number of users and applications.

Two of the most significant breakthroughs for the expansion of networking were the creation of Ethernet and the TCP/IP protocol (Moschovitis et al. 1999), both algorithms for managing information flows within and between networks. The


Ethernet was invented by Bob Metcalfe at Xerox PARC in 1973. The revolutionary aspect of Ethernet, much like the ARPAnet and ALOHAnet on which it was modeled, was the ability to create local area networks (LANs) in which multiple users could simultaneously access resources from different locations. Ethernet signified the end of time-share

computing as a model for organizing computer resources, and assured that the

nodes in the ARPAnet could be extended via intranets to multiple users at each

University. In other words, individuals could for the first time access a network of

computing power, both locally (via the Ethernet) and nationally (via a specific node's

connection to the ARPAnet).

The second revolutionary transformation was the concept of connecting networks themselves, that is, creating internets that linked individual networks into workable wholes. The basic features of the transmission control protocol (TCP) were outlined by Vint Cerf and Bob Kahn in 1974. TCP was, in essence, the architecture and language crucial to building a "network of networks": it establishes a common language at "gateways" between networks, through which information can be exchanged and forwarded to its ultimate destination. In 1977, the power of TCP was demonstrated by the successful transmission of a message across three technically distinct networks: a packet-radio network, a satellite network and the ARPAnet. The successful completion of the transmission demonstrated the feasibility of internetworks, and the design was made even more robust by the addition of an internet protocol (the IP in TCP/IP) in 1978 to handle the routing of messages over multiple networks. This opened, for the first time, the possibility that any user could access the computing power of any network in the world.
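The gateway idea can be illustrated with a toy sketch (this is not the real TCP/IP machinery): each network keeps its own internal frame format, and the gateway re-wraps traffic in a common envelope and picks the next network on the path to the destination. All host names, formats and the routing table below are hypothetical.

```python
def to_common_envelope(native_frame, source_network):
    """Re-wrap a network's native frame in a shared, network-neutral envelope."""
    return {"dst": native_frame["dst"],
            "payload": native_frame["data"],
            "via": [source_network]}

def gateway_forward(envelope, routing_table):
    """At a gateway, pick the next network on the path to the destination."""
    next_network = routing_table[envelope["dst"]]
    envelope["via"].append(next_network)
    return next_network, envelope

# Hypothetical hosts and routing table, for illustration only.
routing_table = {"host-on-satellite-net": "SATELLITE-NET",
                 "host-on-packet-radio-net": "PACKET-RADIO-NET"}

frame = {"dst": "host-on-satellite-net", "data": "hello across networks"}
print(gateway_forward(to_common_envelope(frame, "ARPAnet"), routing_table))
```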

These basic developments created the environment in which, as PCs became ever more common, individuals increasingly began to share information and communicate with ever-wider audiences of users. By the end of the 1980s, the vast majority of scientists and researchers, as well as many corporate users in the US and around the world, were active users of both intranets and internets. The ARPAnet spawned numerous other networks for specific military or scientific use, as well as the creation of private commercial networks (Guice, 1997; Hafner and Lyon, 1996).

Contrary to the original intention of the network, the main push behind interest in the internet was the use of email programs, not the exchange of research or the sharing of computer resources. The transformation of the internet into a generalized, global, mass phenomenon occurred with the creation of HTTP and the internet "browser" in the early 1990s.

Until the creation of hypertext and internet addressing, sharing information through the networks was complicated by the lack of a common language for information exchange. Although the variety of computer systems had a common language in TCP for exchanging information or commands, the receiving computer could not actually read the information if some incompatibility existed between the two systems, just as happens when someone sends a WordPerfect document to a computer that only has Microsoft Word. In early 1991, Tim Berners-Lee and a group of CERN programmers created three essential tools to overcome these limitations: the Hypertext Transfer Protocol (HTTP), which provides a common platform for the exchange of digital information between computer systems; the Hypertext Markup Language (HTML), the basic language for creating documents for HTTP exchange; and the Uniform Resource Locator (URL), which allows universal internet addresses to be assigned to specific information locations. They called their system the World Wide Web (Moschovitis et al., 1999). This established the framework in which an individual could connect to any computer network in the world and communicate in an identical language that allowed for the exchange of information.
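A present-day illustration of how the three pieces fit together, using only the Python standard library: the URL names the resource, HTTP carries the request and response, and the body returned is HTML. This is a minimal sketch of the general mechanism, not a description of the original CERN software.

```python
from urllib.parse import urlparse
import socket

url = urlparse("http://example.com/")              # URL: where the resource lives

request = (f"GET {url.path or '/'} HTTP/1.1\r\n"   # HTTP: the exchange protocol
           f"Host: {url.hostname}\r\n"
           "Connection: close\r\n\r\n")

with socket.create_connection((url.hostname, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):                # read until the server closes
        response += chunk

# The response carries HTTP headers followed by an HTML document.
print(response.decode("utf-8", errors="replace")[:300])
```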

The architecture existed for an expanded use of the internet, but the actual opening up of the network depended on creating a more accessible interface for individual computer users than the command-driven text model that dominated computing at the time. The creation of the Mosaic browser by Marc Andreessen and Eric Bina at the University of Illinois in 1993 gave networks and internets a graphical user interface similar to what people had come to expect from their PCs. This was the final piece of the puzzle, and it triggered the hyper-expansion of the "Internet" as it exists today.

Finally, 24 years after the launch of the ARPAnet, individuals had the ability to connect to a network of computers, create and exchange text and graphical information, all through a simple graphical interface. And just like the early networks, email (in addition to religion and sex) helped propel the network from a relatively closed network of scientists and researchers into a mass phenomenon and an industry in its own right.

1.3.8 Post-PC Era

The completion of the "net" puzzle marked the end of computing as it had been structured up to that point. From a longer historical perspective, the basic pattern of ever-decreasing computer size with increasing connectivity and ease of use has been a constant in the industry's development from Babbage's time to the present. However, the computer, the basic box through which information is manipulated, is no longer a stand-alone machine. In line with the original goal behind the ARPAnet, the network is now the computer. This is not to argue that PCs will quickly fade from existence, but rather to recognize that both long-term and short-term trends keep moving technology away from stand-alone computing towards information handling and manipulation within a networked environment.

This movement has followed the embedding of ICs in ever-greater quantities and in an ever-wider range of products. Wireless computer-based networking is already a reality, but wireless appliance-driven connectivity is just unfolding. The increasing blurring of computation and the natural sciences presents the possibility of universal connectivity of both biological and engineered systems. There is extensive research into embedding binary code in DNA, creating biological computational structures.

This has been paralleled by attempts to use basic atomic structures as binary circuits. Both of these efforts are part of an increasing move towards nanotechnology, the cell- or atomic-level creation of "information machines". These moves signify the increasing blurring of once-distinct disciplines and definitions of IT. TCP/IP, from its very first test, was designed to connect multiple and dissimilar networks. Over the past decade, TCP/IP has been steadily extended to ever more devices, ranging from cell phones to kitchen appliances (Nijhawan, 2000). The new bio-computing models will come with such connectivity built in. This means that IT will increasingly be embedded, networked computing, with an ever-greater range and complexity of access and connection linking natural and built environments.

Software is IT: the Bricklayers of the Networked Society

The basic patterns outlined in the historic development of information technology still structure the industry today. Beginning with the insight into binary systems and algorithmic logic, combined with the desire to have machines perform tasks too complex or too critical for humans, IT has developed into a basic medium for the production, manipulation and dissemination of information in all forms. This has been paralleled by a tremendous push to digitalize human knowledge, that is, to place it in binary and algorithmic form, in order to adapt it to this basic IT structure. The desire for flexibility that has characterized the development of IT from the beginning is most easily achieved through software tools and methods. The evolution of IT towards the embedded networked computing structure developing now is based on the layering of level upon level of algorithms and digital information, some hardwired and some soft. What is essential to signal, however, is that the process of both building algorithms and digitalizing information is increasingly dominated by software rather than hardware structures.

This is clearly seen in the history above, but it is perhaps little understood because of the relatively recent development of software as a stand-alone industry, product and process. Not only are computers increasingly dependent on software, but the Internet itself, with algorithms embedded in software form for TCP/IP, web browsers, network routing and management, Java scripts, databases and commerce, to name only a few, is a reflection of the dominance of software as the central method for digitalizing algorithms.

1.3.9 IT History Summary

In short, the history of IT development can be divided into three eras: the mainframe era from the 1950s to the 1970s, the microcomputer era from the 1980s to the early 1990s, and the Internet era from the 1990s to the present. The mainframe era was characterized by centralized computing, where all computing needs were serviced by powerful computers at the computer center. The proliferation of microcomputers led to decentralized computing, and computing resources became readily accessible to more users; this period witnessed improved user performance and decision-making quality. When computer networks became pervasive in the Internet era, decentralized computing evolved into distributed computing, where computing resources are located at multiple sites, as in decentralized systems, but all of them are connected through computer networks. People in the Internet era are far more empowered than in previous eras, because they have access not only to technology tools, as before, but also to shared knowledge from others (adapted from Applegate, Austin and McFarlan, 2003).

Table 2: IT evolution and strategic management relevance

Mainframe Era (1950s to 1970s)
- Dominant technology: mainframes, stand-alone applications, centralized databases
- IS motivation: efficiency
- Information systems: transaction processing systems (TPS), management information systems (MIS), limited decision support systems (DSS)
- Strategic management relevance: provide information for monitoring and control of operations

Microcomputer Era (1980s to early 1990s)
- Dominant technology: microcomputers, workstations, stand-alone and client-server applications
- IS motivation: effectiveness
- Information systems: comprehensive decision support systems (DSS), executive support systems (ESS), enterprise resource planning (ERP), business intelligence (BI), human resource management (HRM), expert systems (ES)
- Strategic management relevance: provide information and decision support for problem solving

Internet Era (1990s to present)
- Dominant technology: networked microcomputers, client-server applications, Internet technology, web browsers, hypertext and hypermedia
- IS motivation: business value
- Information systems: supply chain management (SCM), customer relationship management (CRM), knowledge management (KM), strategic information systems (SIS), multi-agent systems (MAS), mobile information systems
- Strategic management relevance: support strategic initiatives to transform the business

Table 2 summarizes the IS and their motivations during these three IT evolution eras. Although IS are listed separately under the three eras, the lists are not mutually exclusive. In particular, in the Internet era, businesses still depend heavily on systems conceptualized and developed in earlier eras, such as TPS, MIS and DSS. Clearly, the role of business IS has evolved and expanded over the last five decades. Early systems in the 1950s and 1960s were used primarily for handling business transactions, with associated data collection, processing and storage. Management information systems (MIS) were developed in the 1960s to provide information for managerial support; typical MIS are report based, with little or no decision-making support capability. Decision support systems (DSS) first appeared in the 1970s, offering analytical tools, models and flexible user interfaces to support decision making on difficult problems such as planning, forecasting and scheduling. Executive support systems (ESS) are specialized DSS designed to support top-level management in strategic decision making (O'Brien, 2005).

The 1990s saw an increased emphasis on strategic information systems as a result of the changing competitive environment. Competitive advantage became a central strategic management topic, and IT and IS were developed to support business strategic initiatives. The commercialization of the Internet in the mid-1990s created explosive growth in the Internet and Internet-based business applications. Using Internet standards, corporations have converted their old, incompatible internal networks into intranets; also based on Internet standards, extranets link companies with their customers, suppliers and other business partners (Chen, 2005).

What kind of information systems would be considered strategic information systems? Although strategic support systems are used almost exclusively by top executives dealing with strategic problems, a strategic information system can be any type of IS that plays a key role in supporting business strategies. McFarlan's strategic grid defines four categories of IT impact: Support, Factory, Turnaround and Strategic (Applegate, Austin and McFarlan, 2003). When IT has a significant impact on core business strategy, core operations or both, the corresponding IS are considered strategic information systems. Thus, various types of information systems may be dealt with in strategic management.
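As a compact restatement, the grid can be read as two questions: does IT significantly affect core strategy, and does it significantly affect core operations? The small sketch below encodes that reading; the quadrant assignments follow the usual presentation of the grid and should be taken as an interpretation rather than a reproduction of the source.

```python
def strategic_grid(core_strategy_impact: bool, core_operations_impact: bool) -> str:
    """Map the two impact questions onto the four quadrants named above."""
    if core_strategy_impact and core_operations_impact:
        return "Strategic"
    if core_operations_impact:
        return "Factory"       # day-to-day operations depend heavily on IT
    if core_strategy_impact:
        return "Turnaround"    # IT is becoming central to future strategy
    return "Support"

def is_strategic_information_system(core_strategy_impact, core_operations_impact):
    # Per the discussion above: significant impact on strategy, operations or both.
    return core_strategy_impact or core_operations_impact

print(strategic_grid(False, True), is_strategic_information_system(False, True))
```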

Many researchers have written on the strategic importance of information and knowledge in the networked economy. Naisbitt (1982) observed that the world was transforming from an industrial to an information society, and that IT would dominate the economic growth of developed nations. Quinn (1992) argued that knowledge- and service-based systems are revolutionizing the economy. Shapiro and Varian (1999) discussed information-based products and services, and how to use information to maximize economic gain.

IT and IS have made it possible to access vast amounts of information easily and quickly. Systems such as enterprise resource planning (ERP) give managers the ability to monitor the operation of the entire organization in real time. Executive information portals have allowed senior managers to take a much more comprehensive view of strategic management than ever before. Tools such as the balanced scorecard (Kaplan and Norton, 1992) give a holistic view of business performance by integrating factors across multiple business functions.

In the last few years, business process management (BPM) software has been designed with the intent of closing gaps in existing ERP deployments. As companies increasingly face problems associated with incompatible functional systems from different vendors, enterprise application integration (EAI) has become an important research area.

BPM systems have been deployed to lower the cost and complexity of application and data integration. Another recent development is Web services, enabled by standards-based protocols such as XML, SOAP, UDDI and WSDL. The wide acceptance of Internet protocols has also led to the emergence of service-oriented architectures (SOA). SOA focuses on building robust and flexible systems that provide services as they are requested in a dynamic business process environment: instead of being programmed in advance, services are generated, brokered and delivered on the fly.
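As a rough illustration of such a standards-based call, the sketch below builds a SOAP envelope in XML and prepares it for dispatch over HTTP using only the Python standard library. The operation name, namespace and endpoint are hypothetical, and a real service would normally be described by a WSDL document and discovered through a registry such as UDDI.

```python
import urllib.request
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

# Build a minimal SOAP envelope; the operation and namespace are hypothetical.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
operation = ET.SubElement(body, "GetOrderStatus",
                          {"xmlns": "http://example.com/orders"})
ET.SubElement(operation, "OrderId").text = "12345"

payload = ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

# Prepare (but do not send) an HTTP POST to a fictional endpoint.
request = urllib.request.Request(
    "http://example.com/orders/service",
    data=payload,
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "GetOrderStatus"},
)
# urllib.request.urlopen(request)  # sending is omitted: the endpoint is fictional

print(ET.tostring(envelope, encoding="unicode"))
```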

Figure 1: Chronology of strategic management and IT development

Figure 1 presents a timeline of major developments in strategic management and IT/IS. Although the two fields have progressed along separate paths, there are many instances where their paths have crossed. As shown in Table 2 and the discussion following it, the motivation of IS has shifted from efficiency to effectiveness and, in the Internet era, to value creation. On the one hand, IT is playing a more active and important role in strategic management; on the other hand, strategic management concerns have influenced the development of IS. In many cases, the theories and principles of strategic management have led the way for IS development, and IT and IS, in turn, have made it more feasible for those theories and principles to be practiced in businesses.