DESCRIPTION

This is my report in MIS at PNU MA class. All the materials (e.g. text, graphics, images) I used were downloaded from the net. I just came up with some important details and proceeded with the presentation.

TRANSCRIPT

THE EVOLUTION OF COMPUTER HARDWARE

Reporter: LOLITA D. DE LEON

GENERATIONS OF COMPUTER HARDWARE

The first and second generations of computer hardware were based on vacuum tube and transistor technology. The third and fourth generations were based on semiconductor technology.

First Generation: Vacuum Tube Technology, 1946-1956

Relied on vacuum tubes to store and process information.

Consumed a great deal of power, were short-lived, and generated a great deal of heat.

Had extremely limited memory and processing capability and were used for very limited scientific and engineering work.

Maximum main memory size was approximately 2,000 bytes (2 kilobytes), with a speed of 10 kilo instructions per second.

Rotating magnetic drums were used for internal storage and punched cards for external storage.

Second Generation: Transistors, 1957-1963

Transistors replaced vacuum tubes as the devices for storing and processing information.

Much more stable and reliable than vacuum tubes, transistors generated less heat and consumed less power.

• Each transistor had to be individually made and wired into a printed circuit board – a slow, tedious process.

• Magnetic core memory was the primary storage technology of this period.

• This system had to be assembled by hand and, therefore, was very expensive.

• 32 kilobytes of RAM and speeds reaching 200,000 to 300,000 instructions per second.

Third Generation: Integrated Circuit, 1964-1979

Relied on integrated circuits, which were made by printing hundreds and later thousands of tiny transistors on small silicon chips – semiconductors.

Memories expanded to 2 megabytes of RAM, and speeds accelerated to 5 MIPS.

FOURTH GENERATION: VERY LARGE-SCALE INTEGRATED CIRCUITS, 1980-PRESENT

Use very large-scale integrated circuits (VLSIC), which are packed with hundreds of thousands and even millions of circuits per chip.

Costs have fallen to the point where desktop computers are inexpensive and widely available for use in business and everyday life.

Computer memory sizes have mushroomed to over 2 gigabytes in large commercial machines; processing speeds have exceeded 200 MIPS.

VLSIC technology has fueled a growing movement toward microminiaturization – the proliferation of computers that are so small, fast, and cheap that they have become ubiquitous.

What is a Microprocessor? What is a Chip?

Very large-scale integrated circuit technology, with hundreds of thousands (or even millions) of transistors on a single chip, integrates the computer's memory, logic, and control on a single chip; hence the name microprocessor, or computer on a chip.

Chips are measured in several ways (a rough worked sketch in Python follows below):

1. Word length – the number of bits that can be processed at one time by a computer. An 8-bit chip can process 8 bits, or 1 byte, of information in a single machine cycle. A 32-bit chip can process 32 bits, or 4 bytes, in a single cycle. The larger the word length, the greater the speed of the computer.

2. A second factor affecting chip speed is cycle speed. Megahertz is a measure of cycle speed, or the pacing of events in a computer; one megahertz (MHz) equals one million cycles per second.

3. A third factor affecting speed is the data bus width – the number of bits that can be moved at one time between the CPU, primary storage, and the other devices of a computer.

Obviously, to get a computer to execute more instructions per second and work through programs or handle users expeditiously, it is necessary to increase the word length of the processor, the data bus width, or the cycle speed – or all three.
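To make these three measures concrete, here is a rough, back-of-the-envelope sketch in Python. The chip parameters (a 32-bit word length, a 200 MHz clock, a 32-bit data bus) are illustrative assumptions, not figures from the slides, and the calculation ignores memory waits and instruction mix.

# Illustrative sketch only: hypothetical chip parameters, not real benchmarks.
word_length_bits = 32          # bits processed in one machine cycle
clock_speed_mhz = 200          # cycle speed; one MHz equals one million cycles per second
data_bus_width_bits = 32       # bits moved per transfer between CPU, memory, and devices

bytes_per_cycle = word_length_bits // 8            # a 32-bit chip handles 4 bytes per cycle
cycles_per_second = clock_speed_mhz * 1_000_000    # 200 MHz = 200 million cycles per second

# Peak processing rate: bytes handled per second if every cycle does useful work.
print(bytes_per_cycle * cycles_per_second, "bytes processed per second (peak)")

# Peak transfer rate: bytes the data bus can move per second between CPU and storage.
print((data_bus_width_bits // 8) * cycles_per_second, "bytes moved per second (peak)")

Widening the word length or the data bus, or raising the cycle speed, raises these peak figures proportionally – which is exactly the point made above.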

Reduced Instruction Set Computing (RISC) – technology used to enhance the speed of microprocessors by embedding only the most frequently used instructions on a chip.

Reduced instruction set (RISC) computers have only the most frequently used instructions embedded in them. A RISC CPU can execute most instructions in a single machine cycle and sometimes multiple instructions at the same time.

RISC is most appropriate for scientific and workstation computing that involves repetitive arithmetic and logical operations on data, or for applications calling for three-dimensional image rendering.

Programs written for conventional processors cannot automatically be transferred to RISC machines; new software is required.
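As a loose, hypothetical illustration of the single-cycle idea above (this toy register machine is invented for the sketch and does not correspond to any real RISC instruction set), a "complex" operation such as multiply-and-accumulate can be expressed as a short sequence of simple instructions, each completing in one machine cycle:

# Toy sketch of a reduced instruction set: every simple instruction takes one cycle.
# Hypothetical machine for illustration only; real RISC designs differ in detail.
registers = {"r1": 6, "r2": 7, "r3": 100}

def run(program):
    cycles = 0
    for op, dst, a, b in program:
        if op == "MUL":
            registers[dst] = registers[a] * registers[b]
        elif op == "ADD":
            registers[dst] = registers[a] + registers[b]
        cycles += 1                      # one instruction retires per machine cycle
    return cycles

# "Multiply r1 by r2 and add the product to r3", built from two simple instructions:
program = [("MUL", "r4", "r1", "r2"),
           ("ADD", "r3", "r3", "r4")]
print(run(program), "cycles; r3 =", registers["r3"])   # 2 cycles; r3 = 142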

CATEGORIES OF COMPUTERS

We can use size and processing speed to categorize contemporary computers as mainframes, minicomputers, PCs, workstations, and supercomputers.

MAINFRAME – is the largest computer, a powerhouse with massive memory and extremely rapid processing power. It is used for very large business, scientific, or military applications in which a computer must handle massive amounts of data or many complicated processes.

MINICOMPUTER – is a mid-range computer, about the size of an office desk, often used in universities, factories, or research laboratories.

PERSONAL COMPUTER (PC) – sometimes referred to as a microcomputer, is one that can be placed on a desktop or carried from room to room. PCs are used for personal and business applications.

WORKSTATION – also fits on a desktop but has more powerful mathematical and graphics processing capability than a PC, and can perform more complicated tasks at the same time than a PC can. Workstations are used for scientific, engineering, and design work that requires powerful graphics or computational capabilities.

SUPERCOMPUTER – is a highly sophisticated and powerful machine used for tasks requiring extremely rapid and complex calculations with hundreds of thousands of variable factors.

Supercomputers have traditionally been used in scientific and military work, but they are also starting to be used in business.

Problem with this classification scheme: A PC today has the computing power of a mainframe from the 1980s or the minicomputer of a few years ago.

Powerful PCs have sophisticated graphics and processing capabilities similar to workstations.

PCs still cannot perform as many tasks at once as mainframes, minicomputers, or workstations; nor can they be used by as many people simultaneously as these larger machines.

In another decade, some PCs might have the power and processing speed of today's supercomputers.

SERVER COMPUTERS – are specifically optimized for network use, with large memory and disk storage capacity, high-speed communications capabilities, and powerful CPUs.

Distributed Processing – the distribution of computer processing work among multiple computers linked by a communication network.

Centralized Processing – processing that is accomplished by one large central computer.

DOWNSIZING – the process of transferring applications from large computers to smaller ones.

Cooperative Processing – a type of processing that divides the processing work for transaction-based applications among mainframes and PCs.

COOPERATIVE PROCESSING (figure): the work for a transaction-based application is divided between mainframe tasks and PC tasks. In the figure, the mainframe handles file input/output while the PC handles the user interface/screen presentation; the remaining tasks shown – help screens, editing data fields, cross-field editing, error processing, and calculations – are divided between the two machines.
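As a hedged sketch of how such a division of labor might look (the transaction fields, validation rules, and function names below are invented for illustration, not taken from the slides), the PC side edits data fields and handles errors near the user, and only clean transactions reach the mainframe side for file input/output:

# Illustrative sketch of cooperative processing; all names and rules are hypothetical.

def pc_edit_fields(transaction):
    """PC tasks: editing data fields, cross-field editing, error processing."""
    errors = []
    if not transaction.get("customer_id"):
        errors.append("customer_id is required")
    if transaction.get("quantity", 0) <= 0:
        errors.append("quantity must be positive")
    return errors

def mainframe_file_io(transaction, master_file):
    """Mainframe task: file input/output for the validated transaction."""
    master_file.append(transaction)

master_file = []
transaction = {"customer_id": "C-1001", "quantity": 3}

errors = pc_edit_fields(transaction)             # done locally on the PC
if errors:
    print("Rejected at the PC:", errors)         # errors never burden the mainframe
else:
    mainframe_file_io(transaction, master_file)  # only clean data reaches the mainframe
    print("Written by the mainframe:", master_file)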

PARALLEL PROCESSING – a type of processing in which more than one instruction can be processed at a time by breaking down problems into smaller parts and processing them simultaneously with multiple processors.

SEQUENTIAL PROCESSING – each task is assigned to one CPU that processes one instruction at a time.

(Figure) SEQUENTIAL PROCESSING: the program sends Task 1 and Task 2 to a single CPU one after the other, and each task produces its result in turn. PARALLEL PROCESSING: the program is broken into Tasks 1 through 5, each handled simultaneously by its own CPU, and the results are combined into a single result.
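A minimal sketch of the two approaches in Python, assuming a toy workload (summing five blocks of numbers, standing in for the five tasks in the figure): sequential processing works through the tasks one at a time on a single CPU, while parallel processing spreads the same tasks across several processors at once.

# Minimal sketch: sequential vs. parallel processing of five independent tasks.
# The workload (summing blocks of numbers) is an arbitrary stand-in for real work.
from concurrent.futures import ProcessPoolExecutor

def process_task(block):
    return sum(block)                    # one "task": process one part of the problem

if __name__ == "__main__":
    # Break the problem down into five smaller parts (Tasks 1-5 in the figure).
    blocks = [list(range(i * 1000, (i + 1) * 1000)) for i in range(5)]

    # Sequential processing: one CPU handles one task at a time, in order.
    sequential_results = [process_task(b) for b in blocks]

    # Parallel processing: the parts are processed simultaneously, each on its
    # own processor, and the results are gathered back into a single answer.
    with ProcessPoolExecutor() as pool:
        parallel_results = list(pool.map(process_task, blocks))

    print(sequential_results == parallel_results)   # True: same result either way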
