Revision Tutorial ACS 2010

Revision Questions for ACS – 2010 (mainly compiled from existing tutorials)

This pack covers Server Technology/ Computer Security/ Benchmarking/ Processor Technology/ PC Technology. See the separate tutorial packs (including answers) for Embedded Systems and Bus Technology

Topic - Server Technology

1. Review the role of servers in modern IT configurations such as those used by our department

2. RAS is a common term used in server deployment – Explain the term

3. x86 servers are very popular

Explain what is meant by an x86 server. The answer should make reference to:

Typical processors e.g. Xeon
Chipsets – difference from desktop PCs and laptops
The core hardware
Integrated peripherals
Features not found on a typical desktop PC

Increasingly the term x86-64 or x64 is used – why ?

Desktop PCs can be used as servers – what are the issues ?

4. Name the major manufacturers of x86 server products. Justify their interest in the x86 server market. Discuss the size of this market and the market share of the major companies

5. How does the price of an entry level x86 server compare with a well specified desktop PC ?

6. Draw a block diagram showing the architecture of a classic PC and compare this with the block diagram of an x86 server

7. A growing area of the server market is that of blade servers – what is a blade server ?

Review the role of blade servers based on the following architectures: x86-64, EPIC (Intel Itanium), and RISC (IBM Power) blades

8. Virtualisation is becoming an issue for data centres – explain the nature of server virtualisation and storage virtualisation


9. Review the role of an administrator of a small PC network with one or more entry level servers (e.g. level of skill needed/ difference in managing a Windows PC compared with an x64 Windows or Linux OS)

10. Review the impact of ‘the cloud’ on server provision

Topic - Computer Security

Background Questions

1. Define the term IP address

Internet Protocol – classically a 32-bit address, normally given in a dotted format

2. What is meant by the internet community ?

The internet was designed with the intention of cooperative computing – defence against hackers was not considered a critical feature

3. How do hackers fit into the internet community ?

They don't – security was for banking and the military; TCP/IP was for academics

4. Define the terms UDP and TCP

UDP – User Datagram Protocol. UDP basically allows a TCP/IP frame to get to the destination IP address – just like a letter.

TCP – Transmission Control Protocol. TCP ensures that there is a reliable connection.
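For illustration, a minimal sketch of the "letter" analogy using POSIX sockets (the address 192.0.2.1 is a reserved documentation address and port 1234 is an arbitrary example): a UDP datagram is simply posted – nothing in the API confirms delivery – whereas TCP would require a connection to be established first.

```c
/* UDP send sketch - fire-and-forget, like a letter.
   Address and port below are placeholders, not real services. */
#include <stdio.h>
#include <string.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int s = socket(AF_INET, SOCK_DGRAM, 0);   /* SOCK_DGRAM = UDP */
    if (s < 0) { perror("socket"); return 1; }

    struct sockaddr_in dest = {0};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(1234);              /* example port only */
    inet_pton(AF_INET, "192.0.2.1", &dest.sin_addr);

    const char *msg = "hello";
    /* One datagram leaves the machine; nothing confirms it arrived. */
    sendto(s, msg, strlen(msg), 0, (struct sockaddr *)&dest, sizeof dest);
    close(s);
    return 0;
}
```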

5. What are UDP and TCP ports ?

Computers that use TCP/IP obtain services from one another via “handles” known as ports – also called end points

Many ports are pre-assigned to specific network services, such as HTTP (port 80) and FTP (port 21); these are called well-known ports
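A hedged sketch of how a port can be probed, again with POSIX sockets and a placeholder documentation address: if connect() succeeds on port 80, some service is listening on that well-known port. Only probe machines you administer.

```c
/* TCP port probe sketch: a successful connect() to a well-known
   port means a service answered the handshake. */
#include <stdio.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int s = socket(AF_INET, SOCK_STREAM, 0);  /* SOCK_STREAM = TCP */
    if (s < 0) { perror("socket"); return 1; }

    struct sockaddr_in dest = {0};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(80);                /* well-known HTTP port */
    inet_pton(AF_INET, "192.0.2.1", &dest.sin_addr); /* placeholder host */

    if (connect(s, (struct sockaddr *)&dest, sizeof dest) == 0)
        printf("port 80 open - a service answered\n");
    else
        printf("port 80 closed or filtered\n");
    close(s);
    return 0;
}
```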

6. UDP and TCP ports - usage

A remote computer requests information on your use of a particular service with a corresponding port address

e.g. have you some task which will respond to a UDP or TCP frame sent to port 1234 ?

In the case of say a web server there is a well defined protocol attached to the application - HTTP

7. Why do open ports pose a security risk ?

The example of an open FTP port is a good one – a hacker might try to upload a Trojan Horse which might be activated by a subsequent attack


A more sinister example is where a Trojan horse is executing on your PC and listening on port 666 for a UDP frame which will activate it

8. What can we do to protect a PC from port probes ?

Disable unwanted applications and stealth any ports that are not needed – this may prevent applications such as Windows update from working

A home PC may be better operating in full stealth mode – you always initiate activities and reject any port requests not originating from your PC

General Questions

9. List the major characteristics of:

A virus
A Trojan
A worm
Spam
Spyware
Adware
Virus hoaxes
A blended attack
Zero day attacks
Buffer overflow
A Denial of Service attack on a server
A hacker

How can the following be protected from these ?

a home PC
an office PC
a departmental server

10. How can a firewall be tested against attack (so called leak testing)? Give examples of such tests

11. Explain how a virtual browser environment operates and the benefits of virtual browsing including protection from zero day attacks

12. Why, with all the free antivirus software and free firewall products, is computer security such a problem ?

Why are the same viruses around for years ?


Topic - Benchmarking

1. Define computer benchmarking

2. Why do we need to benchmark PCs and Servers ?

3. List the major components in a modern PC/ Server which can be benchmarked

Hence design benchmark suites for:

An office PC
A multimedia workstation
An enthusiast's PC
A management database server

4. What are the differences when benchmarking a server rather than a PC ?

5. How can a single figure of merit (benchmark) be achieved for typical classes of PC (e.g. home, games, office, ultra portable) ?

6. What are the obvious pitfalls in benchmarking PCs ?

7. Is it worthwhile to write your own benchmarking suite ?

8. Why is benchmarking so difficult ?

9. You are benchmarking a PC with a processor which has multiple cores – what are the issues compared with traditional single core systems ?

10. What is the role of a product such as SANDRA in managing a typical network of PCs ?

11. Certain features of hardware and compilers can confuse benchmarking programs – suggest how:

caching could distort results
compiler optimization could cause problems

Suggest how these can be worked round


Topic - Processor Technology

1. How important is it to have the fastest processor available in a home PC ? a business PC ? a departmental server ?

2. At what point will x86 processors become obsolete ?

3. List the trends which have dominated the x86 processor development

4. Intel and AMD have upgraded the x86 architecture to support 64-bit operations. Explain the technology of 64-bit processing.

Review the markets at which these products are aimed.

Discuss the different approaches to processor development adopted by AMD and Intel.

5. Review recent developments in the x86 product range by Intel – Justify each development e.g. Core i7-9xx, Atom, Larrabee (general purpose x86 array), Core i7-8xx, Core i5, Core i3 (on-chip GPU etc)

6. Intel has updated the Core 2 architecture with a range of processors called Core i3, Core i5 and Core i7 (sometimes marketed as 'good, better, best'). List the major features and application areas for these products. Justify the replacement of the FSB by QuickPath or DMI. Why use this product branding ?

7. Intel has a series of x86 products aimed at the ultra portable and embedded market called the Atom. List the features and application areas of the Atom product. Compare the Atom with the Core i3 / i5 / i7 products.

8. High performance graphics are normally implemented using GPUs – however Intel has an x86 product which is targeted at the high performance graphics area – Larrabee (an x86 processor array)

Review the features and future of this product.

9. AMD has products such as Phenom, Athlon, Turion, Sempron, Geode and Opteron. Discuss how these map onto equivalent Intel products

10. The x86/ x64 processor market is big business – review the role of Intel and AMD in providing processors for this market

11. Where do RISC products (e.g. ARM) fit into all this ?

Topic - PC Technology

1. Many companies developed innovative microcomputer products in the late 70s and early 80s which had 'office packages', but most failed

discuss the reasons for these failures
what can we learn from this ?

2. What was the background to the original IBM-PC and why has it been so successful ?

3. Review the history of the PC under the following headings:

Generations/ Processors / Buses/ Software/ Standards

4. History may be interesting but what relevance do Q1-Q3 have for us ?

5. Who needs to be informed about the future of the PC ?

6. Discuss the trends in the PC product:

Total sales
Products (desktops to MIDs – Mobile Internet Devices)
Microsoft's role in PC development
Automated maintenance (vPro)
Hardware assisted security (Trusted Platform)

7. Discuss the issues involved in using PCs in organizations from education to health care. Are PCs appropriate for these tasks ?

There is much debate about PC lifecycles – compare the 3/4/5 year lifecycle models (2 years for notebooks)

Review the issues relating to the TCO (Total Cost of Ownership) of PCs

The latest PC motherboards contain the vPro hardware – should the Dept of Eng & Tech purchase PCs using this technology ?

8. Discuss the limitations of the PC platform and explain how Cloud Computing provides a potential solution for selected users.

Hence justify why companies such as IBM, Microsoft, Amazon, Google etc are very keen to promote the concept of Cloud Computing and discuss who these products are targeted at.

9. In spite of problems most office/ home users still use PCs. Justify

10. Explain why many users have preferred to use Windows XP rather than Vista. Discuss whether Windows 7 will be more widely accepted than Vista and whether users will migrate to the 64-bit version


Outline Answers

Topic - Servers

1. Review the role of servers in modern IT configurations such as those used by our department

Peer-to-peer

Networking PCs in a peer-to-peer configuration is simple to implement and brings advantages e.g. sharing resources and common access to departmental documents

Client-server

However introducing a server into a departmental network brings many advantages:

The configuration is more structured
It is obvious who is in charge – the server administrator
Security is easier to establish – individual PC firewalls and anti-virus software can be updated via the server
Backup is centralised and can be automated
Users can be given appropriate access to system resources
Resources can be shared according to the individual PC user's privilege
Maintenance is simplified e.g. application inventories can be centralised and user logs can be monitored

Servers become essential to the efficient running of the organisation as mission critical files are stored on the departmental server.

The server becomes a key feature in the network and RAS kicks into action – Reliability, Availability and Serviceability

A server outage may mean that users cannot logon, cannot access their data

An extreme client-server configuration is called thin client. The server holds and executes all programmes, stores all data whereas the client is merely a networked graphical workstation. There is no need to upgrade the client or provide virus protection since it is only a display system controlled by network packets from the Server.

In the Department of Eng & Tech a different client-server model is used. The clients are standard PCs and are loaded with the various application packages, from MS Office to CAD packages. The programmes execute on the client PC but most user files are stored in the user areas on the server. When a user logs off, only the files held on the server are retained.

2. RAS is a common term used in server deployment – Explain the term

Reliability, Availability and Serviceability

The failure of an individual PC in a network will reduce the productivity of the individual PC user – this reduction in productivity may be important


The failure of a server in a single-server environment may be critical: a small organisation may face serious financial problems if no users can access central account documents – server reliability is thus critical

The reliability of servers can be improved compared to a desktop PC by better mechanical construction, better electrical layout, better thermal design, redundant components such as power supplies, disks etc etc

Availability is reflected in the up-time of a server – '5 9s' (99.999% availability) is often quoted; this relates to features such as the ability to recover from an error such as a serious application crash, or the communication features which deal with high network traffic
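The 'five nines' figure is easy to turn into permitted downtime; a small back-of-envelope sketch in C, using only the availability figures quoted above:

```c
/* Permitted downtime per year for a given availability figure. */
#include <stdio.h>

int main(void) {
    double levels[] = {0.99, 0.999, 0.9999, 0.99999};  /* up to "5 9s" */
    double mins_per_year = 365.0 * 24.0 * 60.0;

    for (int i = 0; i < 4; i++)
        printf("%7.3f%% availability -> %8.1f minutes downtime/year\n",
               levels[i] * 100.0, (1.0 - levels[i]) * mins_per_year);
    return 0;
}
```

Five nines works out at just over five minutes of downtime a year.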

Serviceability relates to the ability to swap a failed component such as a disk – increasingly this is supported by hot swap components – a disk can be powered down and replaced – a feature of SATA and SAS disk sub-systems

3. x86 servers are very popular

Explain what is meant by an x86 server. The answer should make reference to:

Typical processors
Chipsets – difference from desktop PCs and laptops
The core hardware
Integrated peripherals
Features not found on a typical desktop PC
Increasingly the term x64 is used – why ?

Traditional servers used 'heavy metal' technology derived from mainframe technology. The systems were expensive and used operating systems such as UNIX, and they required an expert team to support server operation. For simple applications such as file and print servers these systems were overkill, and it was obvious that a low cost PC with an upgraded Windows operating system could provide most facilities at lower cost and could be maintained by an administrator with enhanced PC maintenance skills

This was the start of the x86 server market which now is the first choice for many organisations – the x86 server market accounts for 50% of server sales by value and the vast majority of servers numerically (98%)

Processors – Intel have a range of processors designated for use in servers – the Xeon range. For example, there are Xeon chips which are similar to, say, Core i7-9xx processors except that two of them can be installed on a server motherboard using a server chipset. Server applications/ OSs generally make better use of a multicore/ HyperThreaded/ multiprocessor environment. Hardware features supporting virtualisation are provided and are increasingly deployed

Server chipsets also support buffered ECC memory, SATA and SAS hard disks with RAID and hot swap, and usually have enhanced facilities for monitoring operation


x64 implies support for 64-bit server OSs

4. Name the major manufacturers of x86 server products

Justify their interest in the x86 server market. Discuss the size of this market and the market share of the major companies

The big three in the x86 (or x64) server market are HP, IBM and Dell – HP sell the most servers numerically but HP and IBM have similar server revenues

This is a large market with the attraction that it is less cut throat than the desktop/ laptop market

Entry level prices are quite cheap (similar to well specified PCs) but nearly all users will add significant extra memory/ disks/ backup facilities/ etc

5. How does the price of an entry level x86 server compare with a well specified desktop PC ?

As indicated above, entry level servers often appear to be low cost (as an incentive to buy into a product line) but for most applications more memory, additional disk drives, backup support, OS licences will add significantly to the baseline cost

The major selling feature of a server is RAS whereas for desktops it is performance/ cost ratio

Don’t be tempted to use desktop hardware in a mission critical server environment – it is all about RAS

6. Draw a block diagram showing the architecture of a classic PC and compare this with the block diagram of an x86/x64 server

See website for PowerPoint – server audio/ graphics requirements are basic

7. A growing area of the server market is that of blade servers – what is a blade server ?

A blade server is really a computer on a card but lacking central facilities such as power supply, cooling, communications, monitoring support

The card plugs into a backplane of a support console which supplies central services to a large number of blade cards – this is very cost effective and simplifies maintenance

If more computing performance is required additional blade cards can be added – the OS must be able to share the software tasks between the blade processors – whereas desktop OSs (e.g. Windows) are poor at this, server software such as Windows Server/ Linux can scale to support many processors efficiently


Review the role of blade servers based on the following architectures: x86, EPIC, and RISC blades

x86 servers traditionally run Windows/ Linux operating systems

EPIC servers are based on Intel’s 64-bit Itanium architecture which is aimed at high integrity applications (banking/ finance etc)

RISC systems (e.g. Power from IBM) compete with mainframes

8. Virtualisation is becoming an issue for data centres – explain the nature of server virtualisation and storage virtualisation

The classical approach to running two separate applications which require different OSs is to have two servers e.g. a Windows server to run IIS and a Linux server to run Apache

√ applications can be optimised for the individual hardware

! two lots of hardware to support
! applications may not require all the computing performance of the platform

Solution – use virtualisation which effectively allows two operating systems to co-exist on the same hardware

The OSs are effectively partitioned from each other – ideally if Windows crashes Linux will be unaffected

There will be a performance overhead but if each server is only 30% loaded this may be marginal

Virtualisation can be entirely a software based function but with modern processors support for virtualisation is built into the hardware of the processor

Although most people think about using virtualisation to run different operating systems on the same hardware, features such as networking can be virtualised – the network facilities as seen by an OS or application can be entirely virtual – the mapping to the real networking hardware is achieved through a virtualisation layer – reconfiguring the network interface for an application is managed by the virtualisation layer

9. Review the role of an administrator of a small PC network with one or more entry level servers

The suppliers of servers and server software are usually keen to de-skill many areas of server administration by offering preconfigured systems and remote assistance – as a result users are usually more loyal to suppliers than in the desktop market

Microsoft imply that a proficient Windows PC administrator can easily acquire the skills to maintain a Windows based server – Linux provides more of a challenge and requires more training etc but the financial benefits are significant


10. The cloud and servers

Internet based server provision can take many forms – a remote server which you physically rent and upload files to/ a remote storage environment which is secure and expands to meet archive requirements/ virtual server provision where you rent e.g. a virtual Windows server located in a remote data centre

The selling point is often elastic computing – you rent what you currently need

Computer Security – General Questions

A9. Home PC – install a security suite from a well regarded source – free products are acceptable but may require more effort if the products are not integrated (firewall/ virus scanner/ antispyware tool etc) – visit websites offering security guidelines (e.g. Microsoft) and establish basic rules for safe operation – ensure security products are correctly installed and set to the highest level of security, and check updates are occurring as expected (including Automatic Updates) – don't visit 'dodgy' websites, don't open attachments unless you are really sure where they come from, don't download antispyware tools etc when a popup window suggests you are infected – do run full security scans on a regular basis – look out for unexpected behaviour by the PC (slowing down/ unexpected levels of activity) – backup important files – have strong passwords etc etc

Office PC – many of the same rules as above, but it is worth purchasing professional security tools and management tools such as vPro – also ensure the backup strategy can survive a severe cyber attack – ideally the data should be partitioned from the OS and applications – rebuilding an infected PC may then mainly consist of reimaging the OS and applications – vPro has all the tools to do this from a central monitoring console

Departmental server – a server can suffer from any classic PC cyber attack – however it has an additional problem: you can't run a server in stealth mode – the most serious attack will be a distributed Denial of Service where many PCs try to overload the server, causing it to crash – there needs to be a strategy, and appropriate testing, to ensure that if the number of connections becomes very large the server will disconnect users which are not completing the appropriate handshaking elements of the protocol – server logs need to be inspected if activity is high to identify attack mechanisms

A10. There are many packages on the Internet to test firewalls – they try to exploit basic weaknesses in firewalls e.g. the TooLeaky test tries to open your default browser and contact the hacker's website on port 80, potentially sending your details as part of the GET command – firewalls must allow port 80 traffic, but the firewall should check that the browser is under user control rather than being driven by a rogue application – see the ZoneAlarm case study for other examples

A11. Using material from the ZoneAlarm case study:


This works by using a technique called Virtual Browsing - Zone Alarm’s ForceField product creates a ‘clone’ of your computer – a virtual browser environment. As you surf, any malicious unsolicited downloads are safely contained in that clone so they never reach your actual computer

By creating a virtual browser ForceField can shield you from these Zero Day attacks because unlike traditional anti-spyware and antivirus software, ForceField does not need to know the threat in order to stop it. Instead, it automatically catches and neutralizes stealth Web browser downloads in a safe, virtual data space that acts as your computer's clone

Features include:

Idle port blocking – opens in-use ports only during transmission and immediately shuts them post-transmission to seal them from exploit. Some firewalls still lack this protection

Stateful inspection – protects you from spoofing-style invasions by examining port and packet information. Some firewalls still lack this protection

Full stealth mode – cloaks every single port to make a computer undetectable to port scans

More features:

• Self-protection combats attempts to hijack or sabotage ZoneAlarm and defaults to a protected state if attacked. Other firewalls vary in their ability to fend off such attacks

• OSFirewall monitors for suspicious and dangerous behaviour at the operating system level – a ZoneAlarm exclusive

• MD5 spoof prevention prevents hackers from spoofing applications through program validation

• Behavioural rootkit detection blocks rootkit installation and activity based upon behaviour rather than signatures or heuristics

A12. It is clear that many computers are poorly protected and infect surrounding systems

Benchmarking

Outline Answers

A1. A standard by which a computer system can be measured or judged

A test used to compare performance of hardware and/or software.

Many trade magazines have developed their own benchmark tests, which they use when reviewing a class of products. When comparing benchmark results, it is important to know exactly what the benchmarks are designed to test. A benchmark that tests graphics speed, for example, may be irrelevant to you if the type of graphical applications you use are different from those used in the test.


http://www.webopedia.com/TERM/B/benchmark.html

A2. Checking whether a PC will run specific OSs and applications. For example, when Vista was introduced Microsoft had a benchmarking package which allowed users to benchmark their PC and get information as to whether their PC hardware was suitable to run Vista. It potentially avoided users spending time and money loading Vista on a PC which was unsuitable.

Microsoft had hardware specifications for Vista Premium edition (1 GHz processor, 1 GB RAM, 128 MB graphics memory, along with many other graphics requirements including a DirectX 9-capable graphics processor), but the benchmarking tool was easier to use.

Benchmark Suites allow a range of PCs which appear to meet our requirements to be compared and allow us to get the best value for money – the problem is always that the benchmarks may not be relevant to our target applications.

Server benchmarking is more standardised, with organisations such as the Standard Performance Evaluation Corporation (SPEC) providing benchmarks such as SPECmail2009 – this currently has only four entries.

Using the SPEC information on the server family of your choice, you can compare the performance of servers from the major manufacturers

A3. You can benchmark:

Processor – e.g. integer, floating point and multimedia performance, now also encryption performance, plus cache performance
Memory performance
Graphics subsystem
Disk subsystem
Network subsystem

These need to be combined into some kind of figure of merit which usually depends on the application area

e.g. Office PC benchmark – the PC will be used for Office applications (Word, Excel, Access, OneNote), email, internet, financial packages, antivirus software etc. All relatively undemanding tasks which will execute on any modern PC; however it is good to have a benchmark suite since this will aid selection of an appropriate PC – best buy etc

e.g. multimedia workstation -

A4. The same areas apply to the hardware of a server, but purchasers are usually more interested in the number of transactions per second that a server can handle – hence the SPEC results are useful. Whereas a PC is not going to operate without user assistance, a server is usually expected to run independently

A5. One way, used by magazines, is to establish a test machine against which other systems are compared.

The individual benchmarks are recorded for each PC and then a weighting is placed on each benchmark result to give a single score – a games PC would have a heavy weighting placed on the graphics
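A sketch of that weighting scheme in C – all scores and weights below are invented for illustration; each result is first normalised against the reference machine and the normalised results are then combined using application-specific weights:

```c
/* Weighted single figure of merit from individual benchmark results. */
#include <stdio.h>

int main(void) {
    const char *test[] = {"CPU", "Memory", "Graphics", "Disk"};
    double reference[] = {100, 100, 100, 100};   /* baseline machine   */
    double candidate[] = {120, 110, 180, 105};   /* machine under test */
    double weight[]    = {0.2, 0.1, 0.6, 0.1};   /* games-PC weights, sum 1 */

    double score = 0.0;
    for (int i = 0; i < 4; i++) {
        double relative = candidate[i] / reference[i];  /* >1 = faster */
        score += weight[i] * relative;
        printf("%-8s relative score %.2f\n", test[i], relative);
    }
    printf("weighted figure of merit: %.2f (reference machine = 1.00)\n", score);
    return 0;
}
```

An office-PC suite would simply use a different weight vector, e.g. one that downplays graphics.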

A6. Games are often used to benchmark multimedia PCs, however often the processor performance/ cache size/ disk performance are not that important – the graphics subsystem is actually the critical component

A7. Writing benchmarking software is very demanding, especially if you consider that the company which designed, say, a desktop PC may sue you if your benchmark software unfairly compares their PC with a rival product. The operation of a benchmark suite needs to be as transparent as possible so that we can be sure what it measures.

It probably isn’t worth developing a custom benchmark suite unless there is a definite demand e.g. SANDRA doesn’t have a module to test a feature which is vital to your project area.

A8. Some factors:

PCs and servers are complex systems with many component parts
The tasks which PCs perform are so varied
Compilers may optimize certain code so loops etc are bypassed
Each individual has a different way of working
There are many background activities on a PC which may interfere with the test
A PC with a fresh install of OS and applications may behave differently after two years of use – patching and local setups etc

A9. Multi-core systems don't work that well for classic OSs and traditional application software

Things may get better with .NET framework 4.0 but there is still a long way to go before the software really uses multi-core systems to full advantage

A10. SANDRA can be used as an automated inventory tool allowing system administrators to record what they have connected to a network


If, in the future, a PC appears to be performing slowly then the current setup can be compared with the delivery setup e.g. page file size, clock speed etc

A11. Suppose we have a benchmark program which resides completely in cache memory. The performance of this program will be much better than one which requires frequent cache updates, and far better again than one which forces paging and the swapping of fragments to the page file on the hard disk
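A rough C sketch of the distortion: the same number of memory reads is timed over a small array (which fits in cache) and a large one (which does not). The array sizes are guesses and should be adjusted to the caches of the machine under test.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Sum the array `passes` times; returning the sum stops the compiler
   discarding the loops as dead code. */
static long touch(const int *a, size_t n, long passes, double *seconds) {
    clock_t t0 = clock();
    long sum = 0;
    for (long p = 0; p < passes; p++)
        for (size_t i = 0; i < n; i++)
            sum += a[i];                 /* sequential reads */
    *seconds = (double)(clock() - t0) / CLOCKS_PER_SEC;
    return sum;
}

int main(void) {
    size_t small = 4 * 1024;             /* 16 KB of ints - fits in L1 */
    size_t big = 16 * 1024 * 1024;       /* 64 MB - far exceeds any cache */
    long reps = 16;
    int *a = calloc(big, sizeof *a);
    if (!a) return 1;

    double t_small, t_big;               /* same total accesses each time */
    long s1 = touch(a, small, reps * (long)(big / small), &t_small);
    long s2 = touch(a, big, reps, &t_big);
    printf("in-cache: %.2f s, off-cache: %.2f s (sums %ld %ld)\n",
           t_small, t_big, s1, s2);
    free(a);
    return 0;
}
```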

Compiler optimization can eliminate code which produces results which are not used

Say I put 10 operations in a loop – the compiler spots that they are all the same and removes the code thus destroying my benchmark
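A C sketch of this failure mode and a common workaround: because `x` would otherwise never be used, an optimising compiler is entitled to delete the whole loop, so the "benchmark" measures nothing; storing the result into a volatile sink forces the work to be kept.

```c
#include <stdio.h>
#include <time.h>

volatile double sink;   /* volatile: the compiler must perform the store */

int main(void) {
    clock_t t0 = clock();
    double x = 0.0;
    for (long i = 0; i < 100000000L; i++)
        x += (double)i * 0.5;   /* without the store below, removable */
    sink = x;                   /* forces the loop's work to be kept */
    printf("%.2f s (result %.0f)\n",
           (double)(clock() - t0) / CLOCKS_PER_SEC, sink);
    return 0;
}
```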

Processor Technology – 32-bit/ 64-bit Processors

1. It is human nature to want the fastest and latest processor - the processor performance always appears on the benchmark for a PC - features such as disk drive performance are not memorable.

Note: State of the art processor performance may only be 10-15% above the middle of the road product but the cost may be an additional 30-50% or more

Money spent on extra memory or a bigger disk drive normally has a better pay back in terms of performance

home PC - might make the difference with a game but a better graphics card is a sounder investment – latest chips support Intel Viiv technology for media center and incorporate virtualization technology

office PC - tasks are typically very mixed and rarely cpu intensive, a bigger disk will be a better investment

dept server - processor performance is often less important than disk/ network performance and size of RAM – however dual-core processors and quad-core etc will give significant benefit for processor intensive tasks

2. When nobody wants to buy them - there is a market for the foreseeable future

Intel traditionally had independent road maps for: the x86 range of processors (traditionally called IA32), now with significant 64-bit features using so-called Intel 64 technology, and the 64-bit Itanium processor (IA64 – Intel Architecture 64-bit)

It is hard to see the advantages of using an Itanium in a PC used for office/ business activities – this product is intended for enterprise applications


It is easier to see why 64-bit processing would be useful in multimedia production and 3D games etc, but the first is very specialist and the second relates as much to the graphics card as to the conventional CPU

The extended x86 architecture (e.g. Intel 64) with 64-bit support is an attractive option – old code can be mixed with new code – future-proof

3. This is not a detailed VLSI design course - the simple facts are -

All manufacturers have the same basic technologies for improving processor performance including:

reduce the die size - allows faster operation and lower power dissipation (requires capital investment in new plant) – latest technology 45 nm (e.g. the Core 2 Duo series) – next generation is 32 nm e.g. new Xeon etc

increase the transistor count to give a dual or quad-core system but reduce the clock speed to keep the chip cool – overall result more processing than a faster single core but at significantly lower power

reduce the clock and processor voltage when system is lightly loaded – excellent for mobile systems – significant power saving for all users

increase cache size - allows faster access to code and data (increases complexity and reduces yield of chip) – initial Core i7 processors have a versatile 8 MB L3 cache shared between the cores

increase front side bus speed - allows faster processor data transfers (increases interface logic complexity) – Core 2 Duo E6300 FSB = 1066 MHz; FSB technology is now being replaced by the modern QuickPath (Core i7) or DMI (Core i5), and for AMD by HyperTransport technology

increase parallelism in instruction execution - allows more instructions per clock (increases transistor count lowering yield) – Core architecture allows four simultaneous instructions

widen internal buses - more data transferred per clock (increases transistor count making chips more expensive)

add extensions to the 32-bit registers to give 64-bit features including 64-bit addressing (Intel call this Intel 64)

add additional interfaces to each core to mimic a dual core (Hyper-Threading)

support virtualization in hardware – allowing tasks to be isolated (e.g. in a media center PC – two users supported with independent tasks – crash-proof!)

on-chip memory controllers with fast bus to memory – AMD do this with HyperTransport – Intel do this with the Core i7 using their QuickPath technology

smart decoding of instructions into micro-operations plus combining certain micro-operations into a single operation (micro-op fusion) – Core Architecture

Execute Disable technology which detects an attempt to execute code from an area designated as data (e.g. by a virus or a malicious buffer overflow)

on-chip graphics controller

4. Upgrading the x86 product from 32-bits to 64-bits is something that Intel could have done at any time in the Pentium era.

In the 80s Intel had performed a similar process when they upgraded the 16-bit 80286 to the 32-bit 80386.

The traditional 64-bit market has involved specialist server and workstation products from IBM, HP and Sun (and others). However many of these processors have reached the end of their lifecycles and required large investment. HP had worked with Intel on the Itanium and publicly declared the Itanium to be the successor for HP’s 64-bit products

So Intel was originally keen not to disturb the 64-bit market by introducing a 64-bit Pentium. They put their efforts into producing Pentium 4s with higher clock rates. However AMD extended the AMD x86 products (AMD64) to allow native 64-bit processing (particularly for use in servers) and Intel eventually followed suit (Intel 64).

So there are 64-bit enhancements to most x86 processor products – this means the basic x86 registers (EAX, EBX etc) have been extended to 64-bit (RAX, RBX etc) and the processors can perform 64-bit operations very efficiently. However, even with the 8087 numeric coprocessor of the original 8086 era, the x86 product could perform 80-bit real arithmetic, and with the Pentium range this was extended through SSE instructions to 128-bit

So 64-bit processing isn't so much about 64-bit arithmetic, it is about 64-bit addressing (Intel 64 – Extended Memory 64-bit Technology). The 32-bit x86 processors could address memory using a so-called flat memory space with a 32-bit address (the classic 4 GByte limit). A 64-bit processor could in theory address memory using a 64-bit address pointer – this is huge! In practice modern x86 64-bit processors can address a physical address space using a 40-bit pointer (1 TByte of memory, i.e. 1024 GB).
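The addressing point can be demonstrated in a few lines of C: on a 32-bit build sizeof(void *) is 4, giving the classic 2^32-byte (4 GB) flat limit, while a 64-bit build reports 8.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    size_t bits = sizeof(void *) * 8;
    printf("pointer size: %zu bytes (%zu-bit addresses)\n",
           sizeof(void *), bits);
    printf("flat address space: 2^%zu bytes\n", bits);
    /* UINTPTR_MAX is the largest address a pointer can represent:
       0xffffffff on a 32-bit build - the classic 4 GB limit. */
    printf("largest address: %#jx\n", (uintmax_t)UINTPTR_MAX);
    return 0;
}
```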

64-bit x86 market

So complex enterprise databases, sophisticated CAD applications, complex financial models, state of the art multimedia packages etc, which require more than 32-bit addressing, can run on Intel 64 enhanced x86 processors. This requires an OS, BIOS and chipset which support 64-bit addressing. Many ordinary PC applications will gain no benefit from using a 64-bit processor.

Note: 32-bit and 64-bit applications can both be run under a new 64-bit OS. The 64-bit applications are new; however the 32-bit applications will run provided 64-bit device drivers are available – Windows 7 may do this better than Windows Vista

5. Core i7-9xx – uses the Intel Nehalem micro-architecture – new socket (LGA 1366) – quad-core on a single chip – 2.66 GHz to 3.3 GHz – on-chip memory controller with three memory channels (DDR3, with up to two memory modules per channel) – FSB replaced by QuickPath – HyperThreading for all cores – large caches: L1 code = 32k, L1 data = 32k, L2 = 256k, L3 = 8 MB shared by all four cores – 731 million transistors – SSE4 instructions

Atom – x86 – low power – 800 MHz to 1.866 GHz – performance enhancing features such as instruction reordering, speculative execution and register renaming are omitted to reduce transistor count/ provide a low cost chip/ keep the power requirements low (0.6 W to 2 W, so no fan is required; down to 0.01 W when idle) – targeted at the Mobile Internet Device market and Ultra Mobile PCs/ Netbooks as well as embedded products – the Atom product has been well received in the Netbook market but finding new areas in the embedded market is more of a challenge !

Larrabee – as of Q4 2009, not to be released as a mainstream product. A new and unique graphics architecture aimed at both the enthusiast 3D market and stream computing (taking in a single task, breaking it up into smaller parts for individual cores to process and then reassembling them to give the overall result) – traditionally Intel has offered integrated graphics products which are easily outperformed by the GPUs from AMD (ATI) and NVIDIA

The idea of Larrabee is simple – use a number of simple x86 cores (plus vector processing units) to perform all the aspects of 3-D graphics which are normally performed by a number of dedicated hardware units – there are dedicated texture units on Larrabee but these are assisted by the x86 cores which are linked on a ring-bus

So to change the functionality of Larrabee only requires a change in the software

Larrabee may provide a new approach to graphics – each core could be given part of a screen to process – hopefully leading to an enhanced benchmark in terms of frames /second

Core i7-8xx, Core i5, Core i3

These processors include:

the Nehalem micro-architecture
the new LGA 1156 socket
Intel Turbo Boost Technology
Hyper-Threading Technology
at least 2 cores, 4 threads
4 MB Intel SmartCache
an integrated memory controller supporting two DDR3 channels
an integrated graphics (iGFX) core
discrete graphics support for a single x16 PCI Express 2.0 slot
the Intel Flexible Display Interface (FDI)

The Intel Flexible Display Interface allows the integrated graphics core in these processors to channel its graphics data to the display controller in the Ibex Peak chipset for output to the monitor.

6. Core i7-9xx – based on the Nehalem micro-architecture – i7-9xx processors are aimed at the high end market

new LGA 1366 socket

on-chip memory controller: the memory is directly connected to the processor, with three channel memory: each channel can support one or two DDR3 DIMMs. Motherboards for Core i7 have four (3+1) or six DIMM slots instead of two or four, and DIMMs can be installed in sets of three or two. DDR3 only, and no ECC support

QuickPath interface to the motherboard

caches: 32 KB L1 instruction and 32 KB L1 data cache per core; 256 KB L2 cache (combined instruction and data) per core; 8 MB L3 (shared by all cores)

Single-die device: all four cores, the memory controller, and all cache are on a single die.

Turbo Boost technology allows all active cores to intelligently clock themselves up in steps of 133 MHz over the design clock rate as long as the CPU's predetermined thermal and electrical requirements are still met

HT - each of the four cores can process up to two threads simultaneously, so the processor appears to the OS as eight CPUs

QuickPath interface

45 nm technology; 731 million transistors in the quad core version

Sophisticated power management can set an unused core into a zero-power mode

Support for SSE4

The FSB is a parallel bus and could no longer be scaled up – QuickPath is a modern serial point-to-point technology from Intel

Overall, a high performance product

7. Atom

The Atom processor is Intel's smallest processor, manufactured using 45 nm technology. The Intel Atom processor has been designed for simple, affordable netbooks and nettops. Atom-based systems are suitable for education, photo and video viewing, social networking, voice over IP, e-mail, messaging, browsing etc.

In addition the Atom is targeted at embedded applications (with extended lifecycle support, since embedded systems often have a very long lifetime). Market segments include digital signage, interactive clients (kiosks, point-of-sale terminals), thin clients, digital security, residential gateways, print imaging, and commercial and industrial control.

The Atom processor is software compatible with previous x86 software.

Features include a conventional FSB, SpeedStep technology, a TDP (Thermal Design Power) below 2.5 W, SSE2 and SSE3 instruction support, the Execute Disable Bit (to protect against buffer overflows), and an L2 cache which is dynamically sized to conserve power

Intel supply the drivers for embedded XP but other OSs are supported


Windows 7 should run satisfactorily on the Atom hardware

8. Larrabee – a multi-core x86 system – the status of this product is somewhat up in the air

Using multi-processing graphics hardware is not new; however the Larrabee architecture is based on the Pentium architecture to simplify the design of each core

The Larrabee chip is targeted at accelerating applications such as the compute and memory intensive demands of the latest PC games and high-performance computing applications, such as image processing, physical simulation, and medical and financial analytics.

Intel initially plans to use Larrabee chips in discrete graphics cards and support both OpenGL and DirectX (Microsoft)

Larrabee uses multiple in-order x86 CPU cores that are assisted by a wide vector processor unit, as well as some fixed function logic blocks. This provides dramatically higher performance per watt than out-of-order CPUs on highly parallel workloads.

A Vector Processing Unit processes a one dimensional array in a single operation – e.g. averaging 8 pixels in one operation (1,3,5,7,2,4,8,9) – very much related to x86 SSE instructions (single instruction multiple data extensions)
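A scalar C sketch of the tutorial's example – averaging the 8 pixel values (1,3,5,7,2,4,8,9): the loop below touches one element per iteration, where a vector unit can hold all 8 bytes in a single register and process them together, the kind of work SSE's packed-integer instructions do.

```c
#include <stdio.h>

int main(void) {
    unsigned char px[8] = {1, 3, 5, 7, 2, 4, 8, 9};   /* example pixels */
    int sum = 0;

    for (int i = 0; i < 8; i++)   /* scalar: one element per iteration */
        sum += px[i];

    /* A 64/128-bit vector register holds all 8 bytes at once, so SIMD
       hardware can combine them in far fewer operations. */
    printf("average pixel value = %d\n", sum / 8);    /* 39/8 = 4 */
    return 0;
}
```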

Larrabee is more flexible than current GPUs. Its CPU-like x86-based architecture supports subroutines etc. Some operations that GPUs traditionally perform with fixed function logic, such as rasterization and post-shader blending, are performed entirely in software in Larrabee. Like GPUs, Larrabee uses fixed function logic for texture filtering – this is the process of adding texture information (colour) to a primitive graphics object.

PC Product

Answers

Q1.

Generally these were start-up companies which were not very well funded
The architecture of each product was different, so 'bespoke' software (i.e. OS and applications) had to be developed for each product
The limited processing power of these systems required software to be developed in assembly language, which was expensive
The software developers didn't know which platform (hardware/ operating system) to target their efforts at
Cash flow was a major problem – in a fast moving technology it was difficult to get the cash from suppliers to pay for the development/ manufacture/ marketing etc

As a result many companies failed, leaving customers with computers with minor faults (e.g. requiring a replacement disk drive) but no support


In purchasing computer equipment price and performance are important but so is the track record of the supplier – no point in having a 5 year guarantee/ warranty if the supplier is bankrupt

Q2.

IBM was a very large organisation dominating the mainframe computing world
It had not embraced the new microprocessor based technologies, which were regarded as being more relevant to the world of embedded systems
IBM was however in a period of technical exchange with Intel
The PC was designed by a small team of about 12 and used off the shelf components based on the Intel 8088 processor and peripheral chips (address bus buffers/ data bus transceivers/ interrupt controller/ DMA controller etc plus lots of discrete logic) – the development was rapid because the design team didn't have to get senior management approval at every stage as they would with a million dollar server product

The operating system was provided by Microsoft/ Bill Gates who actually purchased it from another developer

A BASIC interpreter was available in ROM which probably helped BASIC to become such a widely used language

IBM was happy for third party developers to write applications for the PC and develop adapter cards (e.g. a network card)

The PC was successful because many organisations wanted to use microcomputers but needed a developer which inspired confidence for a long term product life – IBM had sufficient cash to act as underwriter for the PC product

Software developers soon saw that the PC offered a very large market with a single hardware/ software platform

Hardware developers (e.g. a network card ) could see a large market using the simplistic PC bus and supporting software

The IBM PC was also seen as a status symbol and many PCs were just on a manager’s desk

Adding an 8087 chip (100 pounds) meant that the PC was a powerful number cruncher and was much cheaper than a mini computer – 8087 socket was unpopulated for simple word processing etc

Q3.

Generations

Original PC – 8088 plus floppy disks – 8-bit motherboard
XT – 8088 plus hard disk plus floppy disks – 8-bit motherboard
AT – 80286 plus hard disk – 16-bit motherboard
PS/2 (e.g. 80386) – IBM upgrade which failed to make an impact outside the corporate environment
ISA architecture PCs – 80386 – ISA bus – graphics and sound cards
80486/ Pentium + PCI bus – superior graphics – dial-up internet
Early multimedia PC – Pentium + PCI + AGP + broadband
Modern PC – Core processors + PCI + PCI Express (x16, x1 etc)


Processors

Mainstream = 8088 – 80286 – 80386 – 80486 – Pentium to Pentium 4 – Atom to Core i7

Adapter Buses

PC-bus (8-bit)/ AT-bus (16-bit)/ PCI bus/ AGP bus/ PCI Express (x1 to x16)

Standards

The original PC products had architectural features (IO map for peripherals) specified by IBM – many features are still maintained for legacy support
Intel and Microsoft then managed the PC specification, although they were frequently accused of unfairly using their market position (so-called Wintel)
Now PC architecture seems to be determined by Microsoft and its Windows Hardware Engineering Conferences (WinHEC) – is this still happening ?

Q4. Most of us will get involved in the purchasing of PCs, PC peripherals, software and related products. By knowing about the history (or track record) of the PC we can make informed decisions about procuring PC based systems

When specifying, purchasing or deciding when to upgrade PCs it is very useful to be knowledgeable about previous/ current/ future PC Technology

PC specifications are often vague and it is desirable to be able to ask well informed questions such as:

How is the RAM configured – as dual/ triple channel etc ?
How much will it cost to upgrade to a quad core processor and what are the performance benefits ?
Are you being sold old technology which cannot be upgraded ?
Is it worth waiting for the next generation PC ?

Q5. Anybody who has PC products or PC based projects with lifetimes of more than two years, e.g. informed home users/ system admins who advise a purchasing dept/ specialist users

If you are going to develop a data acquisition unit today and want it to work with the majority of PCs in the future, then USB 2.0 at 480 Mbps is a good choice but a Firewire interface is a poor one, since all new PCs will have USB 2.0 ports and fewer will have Firewire – and what about USB 3.0 ? Is it too early to develop/ purchase a USB 3.0 product ?

Q6. PC total sales are increasing at roughly 10% per year:

2010: 321 million (desktop 126 million, mobile 195 million)
2012: 403 million (desktop 131 million, mobile 272 million)

Hence Intel's interest in getting into the mobile phone market, which is growing much faster


Products

Desktops – mainly business customers
Laptops – increasing sales
Netbooks – surprisingly successful
MIDs (Mobile Internet Devices) – a developing market – products are currently under development by Apple and partners of Microsoft, e.g. iSlate/ iPad – the future depends on consumer response – the netbook has been successful, but is there a market for a touch screen product without a keyboard, and what will the network connection be:

3G mobile phone
WiFi
WiMAX – backed by Intel (but opposed by mobile phone operators ?)
LTE (Long Term Evolution) – 4G mobile phone technology – probably the long term winner

Microsoft (market capitalisation Dec 2009 $271 billion)

Microsoft's role in 2000 was very dominant – there was little competition
Microsoft's current role is less dominant
Many companies see that low cost systems can use an open source OS such as Ubuntu Linux, Moblin Linux or Google Chrome OS – obviously such systems can't execute Microsoft Office etc but must use OpenOffice

Another aspect of this is virtual desktops – the technology is simple:

The OS is a distribution of Linux, e.g. Moblin, Android, Chrome etc
The GUI/ application is a virtual desktop incorporating a browser
This facilitates easy access to, say, Google Docs, social networking etc
Easy to protect from virus attack ?
No need for Windows/ Office

Microsoft's competition:

Google (market capitalisation Dec 2009 $196 billion !)
Apple (market capitalisation Dec 2009 $190 billion !)

Automated Maintenance

o It is clear that PC ownership is expensive – most users don’t quantify the cost of PC ownership:

a simple task such as ensuring a PC has the latest security updates can be time consuming

recovering data after a zero-day attack by a hacker can take many hours (zero-day implies that the anti-virus companies haven’t issued a signature for the virus)

o Products such as Intel’s vPro can automate isolation, recovery, rollback, update, reinstallation of OS and packages etc and provide appropriate documentation – this is achieved by dedicated hardware which should be more secure than software solutions

Hardware Assisted Security


The data on many PCs, especially notebooks, is valuable but very vulnerable to hacking (e.g. when using a WLAN, theft etc). Software security such as password protecting a Word document is often very unsatisfactory – users will often not use such systems because they are inconvenient

What is the alternative ? – hardware based encryption – this uses the TPM Trusted Platform Module

The TPM is used to generate the cryptographic keys based say on scans of the system files etc – if a file is changed or replaced then the system won’t boot

Supported by features in high end versions of Windows 7 operating system

Q7 Needs checking !!!

The original PC was relatively simple and the software was quite basic. However over 25 plus years, developers used the PC as a platform for developing every kind of package - financial, educational, technical, scientific, media etc

The PC has been attractive because it is relatively cheap and the same platform can be used for all these different activities – the very large market has allowed economies of scale in virtually every area

By the addition of specialist peripherals – graphics cards, sound cards, broadband modems, USB peripherals – the PC has been adapted to meet all these requirements. It is however very complex.

Is the PC appropriate for applications which are primarily office tasks ??

The PC is more complex than is needed for typical office processing – Word, email, Excel, simple databases – however staff are familiar with PC hardware and software, so any alternative requires significant retraining, and staff probably could not continue to work after hours on their home PCs

Lifecycle issues

The classic five year lifecycle leads to excessive maintenance costs in years 4 and 5 as the hardware becomes unreliable and the OS and applications need updating – new software may be sluggish or unreliable on old hardware. A three year cycle is recommended by Intel and Microsoft – the hardware is well matched to the software and if bought with a new OS release then the PC can be used without hardware or software upgrades over its full lifecycle.

Regardless maintaining PCs with different versions of the OS can be extremely expensive in terms of IT support.

A simple calculation relating to a PC worker without a functioning PC for 1 day a year (plus lost cash flow for the company) confirms that PC hardware costs are often a minor part of the IT budget

Page 25: Generations/ Processors / Buses / Software / Standards

Revision Tutorial ACS 2010 - 25

The TCO (Total Cost of Ownership) is a worrying statistic for large organizations – many workers, say in a call centre, cannot function without their PC. The PC has however got a very complex operating system which can perform complex tasks well outside the range of simple accesses to a central database. If we look at the usage log of a typical PC, the CPU idle time is usually >99%

The TCO is not just the purchase price of the PC – to keep a PC functional it is vital to keep anti virus software updated, the OS patched to latest standards and applications updated. If a PC has a projected life of 5 years the cost of support will be greater than the capital cost of the hardware.

vPro Case Study .. update !!!

2 IT technicians earning £25k each plus overheads = £80k/year
Hardware cost of refreshing 120 PCs at £600 each = £72k
PCs refreshed every 5 years, so PC cost per year = £15k including spare parts (72/5)

vPro adds £10 per PC, plus a PC acting as a maintenance station – say £2k
The system can now be managed by one technician, saving £40k/year
PCs can be refreshed every three years at a cost of £24k per year

2 IT technicians with a 5 year PC refresh over 5 years costs £72k + £400k = £472k

vPro + 1 IT technician with a 3 year refresh over 5 years costs £120k + £200k = £320k
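The case-study arithmetic, reproduced as a small C program (figures taken directly from the numbers above; the £10-per-PC vPro premium and £2k maintenance station are omitted, as in the totals quoted):

```c
#include <stdio.h>

int main(void) {
    const int years = 5;
    const int pcs = 120, pc_price = 600;   /* pounds */
    const int tech_per_year = 40000;       /* one technician incl. overheads */

    /* Option A: 2 technicians, 5-year refresh (one fleet purchase). */
    int opt_a = pcs * pc_price             /* 72k hardware */
              + 2 * tech_per_year * years; /* 400k staff   */

    /* Option B: vPro + 1 technician, 3-year refresh costed at
       hardware/3 per year (= 24k/year as in the case study). */
    int opt_b = (pcs * pc_price / 3) * years  /* 120k hardware */
              + 1 * tech_per_year * years;    /* 200k staff    */

    printf("2 techs, 5-year refresh:       GBP %dk over 5 years\n", opt_a / 1000);
    printf("vPro + 1 tech, 3-year refresh: GBP %dk over 5 years\n", opt_b / 1000);
    return 0;
}
```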

8. The PC as a general purpose computing platform has become increasingly complex – as a result it is vulnerable to viruses and other malware, making daily updating and subsequent patching necessary – network based utilities such as Internet Explorer become easy targets for malicious activities such as downloading Trojan horses – and for the average office task the PC is over-specified

The thin client solution is frequently re-invented – basically a very simple graphics workstation connects to a powerful 64-bit server which hosts all the applications and stores (and backs-up) all the data – the user can access their data from anywhere and groups of individuals can access common data – virus detection at the client is very easy to implement since the only application is the graphical interface and network interface

This concept has been further developed under the buzzword Cloud Computing – companies such as Amazon have significant hardware and software investment in internet based computing – they are able to sell spare capacity as the Simple Storage Service (S3) – they use the term elastic computing – users can adjust their requirements on a very flexible basis – pay for what they need – backup etc is managed by the service provider


The next stage is to sell virtual server capacity – part of a powerful server facility can be remotely accessed as, say, a Windows Server 2008 configuration – again the facility is very scalable

IBM and Microsoft see this as a big business opportunity for exposing their file server/ web server/ database server/ management system packages to users – since users don't have to purchase any server hardware, this is attractive for trying new products etc

A limitation of such products is that an outage at the computing centre will be critical and affect many users

Such products are targeted at companies with limited IT support

9. Historically users have insisted on PCs for office type tasks – this is what users understand, and Office applications are more or less stable, so the cost of retraining is small.

Windows Vista was designed:

To be secure, with most activities occurring at user rather than admin privilege – this can be annoying when installing software packages, since users are required to enter passwords many times

With a smart user interface (Aero)

To allow 64-bit upgrading of the OS and applications to take advantage of 64-bit processors (Intel 64 etc), allowing RAM greater than 4 GByte

Although many users are happy with Vista there has been criticism at corporate level because:

Many organisations have limited IT support
Old PC hardware couldn't execute Vista, so organisations would have to support two OSs (XP and Vista)
Many of the drivers were immature, especially the 64-bit drivers
There were no applications requiring more than 4 GBytes of RAM

Hence organisations (such as MMU) with a mix of PCs (some 5 years old) stayed with XP – a safe option requiring no new IT skills – note: some PC suppliers even offer a Windows Vista downgrade to XP

Windows 7 is probably better regarded as a minor update even though Microsoft declare it to be a major OS release – the Vista drivers are now hopefully stable so device driver problems are history and the same drivers can be used for Windows 7.

Windows 7 is designed to operate on any modern PC hardware from the Core i7 to the Atom, so it is less demanding on specific hardware than Vista.

Windows 7 will be available in 32-bit and 64-bit versions – it should be stable with existing 32-bit applications and 32-bit applications designed for Windows 7, as well as new 64-bit applications which can take advantage of the 64-bit architecture e.g. > 4 GByte RAM