IS 139 Lecture 1 – UDSM – 2014 – Transcript
IS 139
Introduction
Overview - 1
How does a computer work? How components fit together; control signals, signaling methods, memory types
How do I design a computer? Structure and behavior of computer systems; ISA – instruction sets and formats, opcodes, data types, number and type of registers, addressing modes, memory access methods, I/O mechanisms
Overview - 2
It’s all about function & structure
Structure => the way in which components are interrelated
Function => the operation of each individual component
Overview - 2
Architecture vs organization
Architecture = attributes visible to the programmer, e.g. instruction set, I/O mechanisms, addressing modes
Organization = the units & their interconnections that realize the architectural specs, e.g. control signals, I/O interfaces, memory technology used
Manufacturers offer a family of models => same architecture but different organizations
Why study computer architecture?
Program optimization – understanding why a program/algorithm runs slowly
Understanding floating-point arithmetic is crucial for large, real-world systems
Design of peripheral systems, e.g. device drivers
Design tradeoffs of embedded systems
Why study computer architecture?
Benchmarking
Building better compilers, OS’s
Writing software that takes advantage of hardware features - parallelism
You should be able to
Understand how programs written in high level languages get translated & executed by the H/W
Determine the performance of programs and what affects them
Understand techniques used by hardware designers to improve performance
Evaluate and compare the performance of different computing systems
General purpose computers
Software and hardware are interrelated
“Anything that can be done with software can also be done with hardware, and anything that can be done with hardware can be done with software” – the Principle of Equivalence of H/W and S/W
This observation allows us to construct general purpose computing systems with simple instructions
Functions of general purpose computer
Data processing – data can take various forms
Data storage – temporary or long term
Data movement – between itself & the outside world
Control – orchestrates the different parts
Computer Organization & Architecture – Stallings
What makes a computer?
Processor – data path & control
Memory
Mechanisms to communicate with the outside world – input & output
Organization of a computer
Computer Organization & Design – Patterson
Classes of computers
Desktop computers – familiar to most people. Features: good performance, single user, execution of third-party software
Servers – hidden from most users, accessible via a network (cloud computing, data centers). Features: handling large workloads, dependability, expandability. Range from cheap low-end machines to extreme supercomputers with thousands of processors, used for forecasting and oil exploration
Embedded computers – the largest class, with a wide range of applications & performance. Found in cars, cell phones, video game consoles, planes. Focus is on limiting power and cost
Growth of embedded computers in cell phones – tons of features
The Essentials of Computer Organization & Architecture – Null
A look inside
The Essentials of Computer Organization & Architecture – Null
How did we get here?
A lot has happened in the computer’s 60+ year life span. E.g. if the transportation industry had developed at the same pace, you could travel from here to London in 1 second for a few cents
Different generations in the evolution of computers
Each generation defined by a distinct technology used to build a computer at that time
Why? – gives perspective & context into design decisions, understand why things are as they are
Generation 0: Mechanical Calculating Machines - 1
Defining characteristic - mechanical
1500s
There was a need to make decimal calculations faster
Mechanical calculator (the Pascaline) – Blaise Pascal. No memory, not programmable. Used well into the 20th century
Difference Engine – Charles Babbage, the “Father of Computing”. Used the method of differences to evaluate polynomial functions. Was still a calculator
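The method of differences can be sketched in a few lines of Python: a degree-n polynomial has constant nth differences, so once a difference table is seeded, every further value comes from additions alone – exactly what a geared machine can do. The function names and the demonstration polynomial x² + x + 41 (often cited in accounts of Babbage's engine) are illustrative choices, not from the lecture.

```python
# Sketch of Babbage's method of differences: tabulating a polynomial
# using only additions, as the Difference Engine did mechanically.

def tabulate(poly, start, step, count):
    """Tabulate poly (coefficients, lowest degree first) at `count`
    equally spaced points, using repeated addition only."""
    degree = len(poly) - 1

    def p(x):  # direct evaluation, used only to seed the table
        return sum(c * x**i for i, c in enumerate(poly))

    # Seed degree+1 values, then build the columns of differences.
    seed = [p(start + i * step) for i in range(degree + 1)]
    diffs = [seed[:]]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([row[i + 1] - row[i] for i in range(len(row) - 1)])
    # Leading entry of each column; the last column is constant.
    cols = [d[0] for d in diffs]

    values = []
    for _ in range(count):
        values.append(cols[0])
        # "Turn the crank": each column absorbs the column below it.
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
    return values

# x^2 + x + 41 at x = 0, 1, 2, 3, 4 – additions only after seeding.
print(tabulate([41, 1, 1], 0, 1, 5))  # prints [41, 43, 47, 53, 61]
```

Note that after seeding, the inner loop never multiplies – which is why the method suited a machine built from gears.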
Generation 0: Mechanical Calculating Machines - 2
Analytical Engine – an improvement over the Difference Engine; a significant development
More versatile – capable of performing any mathematical operation
Similar to modern computers – mill (processor), store (memory) & input/output devices
Conditional branch operation – the next instruction could depend on the result of the previous one
Ada, Countess of Lovelace, suggested a plan for how the machine should calculate numbers – the first programmer
Used punch cards for input & programming – this method survived for a long time
Generation 0
Drawbacks: slow – limited by the inertia of moving parts (gears & pulleys); cumbersome, unreliable & expensive
1st Generation: Vacuum Tube Computers (1945 -1953)
Defining characteristic: use of vacuum tubes as switching technology
The previous generation was mechanical, not electrical
Konrad Zuse – in the 1930s added electrical technology & other improvements to Babbage’s design
The Z1 used electromechanical relays – was programmable, had memory, an arithmetic unit & a control unit
Used discarded film for input
1st Generation
John Atanasoff, John Mauchly & J. Presper Eckert – credited with the invention of digital computers; however, many others contributed
Their work resulted in ENIAC (Electronic Numerical Integrator and Computer) in 1946 – the first all-electronic, general-purpose digital computer
Was conceived to facilitate weather prediction but ended up being financed by the US Army for ballistic trajectory calculations
2nd Generation: Transistorized Computers (1954 – 1965)
Defining characteristic: Use of transistor as switching technology
Vacuum tube tech was not very dependable – they tended to burn out
In 1948 at Bell Laboratories – John Bardeen, Walter Brattain & William Shockley invented the TRANSISTOR
Was a revolution – Transistors are smaller, more reliable, consume less power
Caused circuitry to become smaller & more reliable
Emergence of companies such as IBM, DEC & Unisys
3rd Generation: Integrated Circuit Computers (1965 – 1980)
Defining characteristic: Integration of dozens of transistors on a single silicon/germanium piece – “microchip” or “chip”
Jack Kilby started with germanium; Robert Noyce eventually used silicon
Led to the silicon chip => “Silicon Valley”
Allowed dozens of transistors to exist on a single chip smaller than a single discrete transistor
Effect: Computers became faster, smaller & cheaper
E.g. IBM System/360, DEC’s PDP-8 and PDP-11
4th Generation: VLSI (1980 - )
Defining characteristic: Integration of very large numbers of transistors on a single chip
The 3rd generation had only dozens of transistors on a chip
No. increased as manufacturing techniques improved: SSI (<100) => MSI (<1000) => LSI (<10,000) => VLSI (>10,000)
Led to the development of first microprocessor (4004) by Intel in 1971
Effect: allowed computers to be cheaper, smaller & faster – led to development of microprocessors
Computers for consumers: Altair 8800 => Apple I & II => IBM’s PC
5th Generation?
Quantum computing
Artificial Intelligence
Massively parallel machines
Non-von Neumann architectures
Common themes in evolution of computers
The same underlying fundamental concepts
Obsession with Increase in performance Decrease in size Decrease in cost
Moore’s Law
How small can we make transistors? How densely can we pack them?
In 1965, Intel co-founder Gordon Moore stated that “the density of transistors in an integrated circuit will double every year” – Moore’s Law
Ended up being every 18 months
Has held up for almost 40 years
However, it cannot hold forever – physical & financial limits
Rock’s Law: “The cost of capital equipment to build semiconductors will double every four years”
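Moore's Law is just exponential growth, so its implications are easy to compute. A minimal sketch, comparing the slide's original "every year" doubling against the later "every 18 months" figure; the starting count of 2,300 transistors is an illustrative assumption (it roughly matches the Intel 4004 mentioned later):

```python
# A rough sketch of Moore's Law as exponential growth.
# Starting count and time span are illustrative, not historical data.

def transistors(start_count, years, doubling_period_years):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

# Doubling every 18 months vs every 12 months, over one decade:
print(transistors(2_300, 10, 1.5))  # ~2.3e5 – about a 100x increase
print(transistors(2_300, 10, 1.0))  # 2,355,200.0 – about a 1000x increase
```

The gap between the two projections shows why the exact doubling period matters so much over decades of compounding.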
Implications:
Cost of a chip has remained almost unchanged
Cost of components decreased
Higher packing density, shorter electrical paths => higher speeds
Smaller size => increased flexibility
Reduced power & cooling requirements
Fewer interconnections => increased reliability
Uniprocessor to Multiprocessor
Because of physical limits – power limits
Clock rates cannot increase forever: power = capacitive load × voltage² × frequency
Multiple processors per chip – “cores”
Programmers have to take advantage of the multiple processors
Parallelism – similar to instruction-level parallelism
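The power equation above can be turned into a small sketch showing why clock rates hit a wall: doubling frequency doubles dynamic power, while lowering voltage helps quadratically. The component values here are illustrative assumptions, chosen only to show the scaling behaviour:

```python
# Sketch of the dynamic power equation from the slide:
# P = capacitive load x V^2 x frequency. Values are illustrative.

def dynamic_power(cap_load_farads, voltage, frequency_hz):
    """Dynamic power dissipation of switching CMOS logic, in watts."""
    return cap_load_farads * voltage ** 2 * frequency_hz

base = dynamic_power(1e-9, 1.2, 3e9)          # 1 nF load, 1.2 V, 3 GHz
doubled_freq = dynamic_power(1e-9, 1.2, 6e9)  # same chip, 6 GHz clock
lower_v = dynamic_power(1e-9, 0.9, 3e9)       # same chip, 0.9 V supply

print(doubled_freq / base)  # 2.0 – doubling frequency doubles power
print(lower_v / base)       # ~0.5625 (= 0.9^2 / 1.2^2) – quadratic saving
```

This is the tradeoff behind the move to multicore: two cores at the original frequency can offer more throughput per watt than one core clocked twice as fast.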