

Classification: Restricted
Deliverable Reference: WP 3, Task 3000, No. D301
Status (Draft/Issued/Revised): Issued
Rev.: 1.0
Date: 14 May 1997
Release to CEC: -


ESPRIT 20876 D301 - Task 3000: The Supporting Environment Restricted

Rev. 1.0 ii


Document Control Sheet

Rev. 1.0

WP3: Trends in PDIT Support

Task 3000: The Supporting Environment

ESPRIT 20876 - ELSEWISE

Editor:

Alastair Watson University of Leeds

Other Main Contributors:

Gerard Sauce, CSTB
Johan Neuteboom, TNO
Jacques Rossillol, Bouygues

Distribution Draft Issued Revised

Date 25-4-1997 14-5-1997

TW *

BOU *

HBG * *

SKA *

BICC * *

CAP * *

CSTB * *

DSBC *

ECI *

ULeeds * *

RAM *

RAKLI *

TNO *

TR *

VTT * *

CEC


Amendment Record

Revision  Status  Page Nos.  Amendment                      Date     By
1         Issued             Additions & final corrections  14/5/97  AW



Table of Contents

1. INTRODUCTION

2. PRODUCT DATA TECHNOLOGY STANDARDS

2.1 What exists today
2.1.1 Electronic Data Interchange (EDI)
2.1.2 Standard EDI messages
2.1.3 EDIFACT message structure
2.1.4 ISO 10303 (STEP)
2.1.5 The IAI Industry Foundation Classes
2.1.6 OLE for Design & Modelling

2.2 Barriers to LSE uptake
2.2.1 Electronic Data Interchange
2.2.2 Product Model Based Standards

2.3 Trends and expectations
2.3.1 Electronic Trading
2.3.2 Open EDI and STEP
2.3.3 Product Model Based Standards

2.4 Relevance to LSE
2.4.1 Electronic Trading
2.4.2 Product Model Based Standards

3. COMMUNICATIONS INFRASTRUCTURES

3.1 What exists today
3.1.1 Networking Technologies
3.1.2 Communications Technologies
3.1.3 The Internet and World Wide Web
3.1.4 Other Sources of Digital Information

3.2 Barriers to LSE uptake
3.2.1 Networking and Communications
3.2.2 The Internet and World Wide Web
3.2.3 Other Sources of Digital Information

3.3 Trends and expectations
3.3.1 Networking and Communications
3.3.2 The Internet
3.3.3 intranets
3.3.4 Other Sources of Digital Information

3.4 Relevance to LSE

4. DELIVERY PLATFORMS

4.1 What exists today
4.1.1 Desktop Hardware
4.1.2 Desktop Operating Systems
4.1.3 Servers
4.1.4 Portable Computers
4.1.5 Distributed Objects
4.1.6 Other Emerging Technologies

4.2 Barriers to LSE uptake

4.3 Trends and expectations
4.3.1 Microchip Technology
4.3.2 Disk Technology
4.3.3 Desktop Hardware
4.3.4 Desktop Operating Systems
4.3.5 The Impact of Java and the NC
4.3.6 Portable Computers
4.3.7 Servers

4.4 Relevance to LSE

5. SUMMARY

6. INDEX



1. Introduction

This document (D301) is one of a series of outputs from Work Package 3 of the eLSEwise project. The overall objective of the work package is to identify, and to critically evaluate, the key technologies in Product Data Information Technology (PDIT) which are likely to provide the tools to support and shape the LSE Industry in the future. The formal outputs from the work package are as follows:

· D301, Appendix to D305, “The Supporting Environment”: Product Data Technology Standards, Communications Infrastructures, Delivery Platforms.

· D302, Appendix to D305, “Systems and Technologies”: Human Interface, Data Management, Knowledge Management, Virtual Enterprises.

· D303, Appendix to D305, “Application Software”: What exists today, Barriers and Opportunities, Trends and expectations, Relevance to LSE.

· D304, Appendix to D305, “Applied Research Futures”: European IT Research projects and their future relevance to LSE.

· D305, Main Report, “Trends in PDIT Support”: Summary Work Package Report.

Document D305 provides a concise overview of the overall results from the Work Package – particularly the opportunities for the LSE industry and the barriers to LSE uptake. More comprehensive results, with greater technical detail, are presented in the supporting Appendices. The Work Package has adopted a layered approach, with D301 providing the technology to support D302, and with both of these supporting D303. By adopting the viewpoint of the research community, D304 may be regarded as an orthogonal check on the findings of the other Appendices. Although the coverage of the Work Package is wide, it cannot (and does not) claim to be complete and comprehensive – the domain is too wide! Judgement has been applied in selecting which areas of technology should be addressed, and to what depth. In all cases the criterion has been the expected relevance to the LSE industries.

The major output from eLSEwise Task 3000 (The Supporting Environment), document D301 is concerned with those technologies that provide support to higher-level systems and applications, and thus ultimately determine the form of the future IT solutions that can be offered to the LSE industries. This “supporting environment” has two distinct parts. Firstly, the universal IT infrastructure: networking, communications, and digital information sources, together with the IT delivery platforms (hardware and operating system). Secondly, those Product Data Technology Standards which are directed specifically at, or are particularly relevant to, LSE. The objective of Task 3000 was to determine the probable character of this supporting environment by the year 2005 [1], and to identify the implications for LSE. The methodology adopted has been to review current sources on what is expected in the near future, and to project from this to 2005. Given the pace of development in IT, and the strong market drivers, it is doubtful that meaningful predictions could be made beyond that date.

The scope of this document can be outlined in terms of the three main technical chapters:

[1] This has been the nominal future horizon adopted by eLSEwise.


· Product Data Technology Standards: Those standards concerned with the structuring of information for digital exchange and sharing, including: Electronic Data Interchange (EDI), ISO 10303 (STEP), the International Alliance for Interoperability’s Industry Foundation Classes (IFCs), and the Microsoft-approved OLE for Design and Modelling.

· Networking and Communications: That part of the universal IT infrastructure which lies behind the delivery platform – Networking and Communications technologies, the Internet and the World Wide Web, intranets, and other means of communicating digital information.

· Delivery Platforms: That part of the universal IT infrastructure with which the end-user interacts – the hardware and operating system of desktop and portable platforms (including some consideration of servers). This chapter includes sections on distributed objects and other emerging technologies - such as voice input, Java and the Network Computer.

In each case the following issues are addressed:

· What exists today?

· What are the current barriers to LSE uptake?

· What are the trends and expectations by 2005?

· What is the relevance of this to LSE?

The report assumes the reader has some knowledge of computing and IT terminology, but efforts have been made to keep the language accessible to the non-specialist.


2. Product Data Technology Standards

2.1 What exists today

2.1.1 Electronic Data Interchange (EDI)

Electronic trading, by means of exchanging standardised EDI messages, is generally accepted and practised today in particular industry sectors such as transport, retail and health care. This is reflected in the set of internationally agreed standard messages which mainly support information exchange between the processes in these three sectors. The standard paper documents that were used in the past were simply replaced by their electronic counterparts. For that reason EDI is still strongly document oriented. Success depends on the presence of standards for the definition of the electronic messages. These messages must be computer interpretable in order to trigger consequential actions, which may be independently executed by the computer.

The standardisation effort was conducted under the auspices of the United Nations and resulted in a standard format for electronic documents called EDIFACT and in directories with standard messages, segments and elements.

EDI can bring many advantages to companies:

· EDI service available 24 hours a day;

· reduction of errors;

· no human intervention;

· message travels at the speed of light.

EDI is generally defined as “The electronic exchange of structured and normalised data between computer applications of parties involved in a (trade) transaction”. The initials EDI formally stand for Electronic Data Interchange, which (incorrectly) suggests a much wider meaning involving any form of data interchange in electronic form. The particular objective of EDI is to exchange electronic documents. It is thus not surprising that the definition of electronic messages only started after the completion of a standard layout for paper documents, the United Nations Layout Key for Trade Documents (ISO 6422), and the definition of an international trade dictionary, the Trade Data Elements Directory (ISO 7372). The same working party (the Working Party on Facilitation of International Trade Procedures) of the United Nations Economic Commission for Europe (UN/ECE) developed the electronic variants for document exchange. This resulted, in co-operation with US standardisation bodies, in the world standard for the exchange of electronic trade messages: the EDIFACT standard.

EDI involves the following characteristics:

· Transactions: This means that one party requests a service and the other party offers the service and receives payment for the service. It also means that both parties have a relationship supported by a business agreement.

· Protocol: A transaction is not a single message sent from one party to another. In general a transaction will consist of a series of messages (order, change order, accept, reject). This means that from the start until the end, a transaction can be in different intermediate states.

· Automated: As EDI messages are structured, a receiving application is able to interpret the message and perform the required “sequel actions” without human intervention.


· Documents: The EDIFACT syntax was designed to describe the content of documents. For this reason the building blocks of EDIFACT are data segments and data elements. A message design describes what segments are relevant for a certain type of document and in what order and how many times they appear. It also reveals if the relationship between the different document segments is a one to many or a many-to-many relationship.

· Structured data: To allow computers to interpret incoming data it is necessary to structure it according to strictly defined rules. The building blocks for EDIFACT are called data-segments. Each data-segment starts with a three-letter tag to identify the segment and ends with an apostrophe (') segment terminator. Between these are data-elements, which are numbers and/or strings, separated by “+” and “:” characters. The data structure is flat: it is not possible to have segments within segments.

· Automatic “sequel actions”: A computer that is able to interpret the incoming messages must also be able to take the required consequential actions. For example, if an order comes in for a certain item, then at least the receiving computer should send a confirmation, but it could also register the order and generate an invoice or a shipping order for transportation. Although this capability makes EDI particularly attractive, there is still a majority of companies that do not exploit it.

Within an EDI exchange between two parties, the underlying objective is to send data produced by application A to the other party where it is used as input to application B. Thus, if the first party wishes to send an order, the data produced by application A first has to be converted to the standard neutral EDIFACT format. This standard electronic order is then sent to the other party. The receiving system uses its own conversion programme to decode the incoming file from EDIFACT to the in-house format. What automatic “sequel actions”, if any, follow depends entirely on how the receiving company has programmed its systems.
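The flat segment/element structure described above can be sketched in a few lines of code. This is an illustrative fragment only: the sample string is fabricated, and a real EDIFACT translator would also handle the optional UNA service string advice, release (escape) characters, and validation against the standard message directories.

```python
# Minimal sketch of EDIFACT-style segment parsing (illustrative only).
# Assumes the default service characters: "'" terminates a segment,
# "+" separates data elements, ":" separates component elements.

def parse_segments(message: str):
    """Split a raw EDIFACT string into (tag, elements) pairs."""
    segments = []
    for raw in message.split("'"):
        raw = raw.strip()
        if not raw:
            continue
        elements = raw.split("+")
        tag = elements[0]                            # three-letter segment tag
        data = [e.split(":") for e in elements[1:]]  # composite data elements
        segments.append((tag, data))
    return segments

# A fabricated two-segment fragment (not a valid standard message):
sample = "BGM+380+INV-001'DTM+137:19970514:102'"
for tag, data in parse_segments(sample):
    print(tag, data)
```

A receiving application would dispatch on the segment tag to drive its “sequel actions”; the hard part in practice is not the syntax but agreeing the message design and code lists with the trading partner.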

2.1.2 Standard EDI messages

The UN/EDIFACT Standards comprise a set of internationally agreed standards, directories and guidelines for the electronic interchange of structured data, in particular that related to trade in goods and services between independent, computerised information systems. A distinction is made between batch EDI and interactive EDI.

To date over 125 standard EDIFACT messages have been defined (of which about half are still in the standardisation procedure). Eleven of the messages, which start with the prefix “CON”, are specifically designed for the building and construction industry:

· CONDRO Drawing organisation message: Describes the general (project) organisation and structure, valid for a complete project or environment. Its aim is to acquaint the participant of a project with the existing organisation, the computer environment used, the agreed conventions for structuring/naming the transferred data and, in subsequent use, to announce any changes to the above-mentioned agreement.

· CONDRA Drawing administration message: Describes a parallel non-EDI exchange of a set of engineering/CAD files. CONDRA gives additional administrative information about these files; for example, their nature, a list of their contents and technical information necessary to interpret them. The EDI message itself does not carry any engineering or graphical information. Such information is transferred within files written in existing standard graphical exchange formats or native formats, referred to within the CONDRA message as external file reference to identify each of these files.

· CONAPW Advice on pending works message: Used during the design, building and maintenance stages to communicate with organisations about existing and planned services in the vicinity of the works. Enables a contractor who intends to start works to advise public authorities and water, gas, telephone and electricity distributors of his intention, and to request them to send back plans or information in any form concerning existing networks. The first two messages may be used in conjunction with CONAPW to allow graphical and other information to be conveyed.

· CONRPW Response of pending works message: This message is a reply to a CONAPW message and enables service providers to respond to a contractor giving details of any services and networks at the location where construction work is to be undertaken. The first two messages may be used in conjunction with CONRPW to allow graphical and other information to be conveyed.

· CONITT Invitation to tender message: A structured description of a Bill of Quantities used during the pre-contract tendering process. Initially used by the client’s representative to send the competing main contractors the applicable Bill of Quantities. May also be used subsequently to issue any amendments, or be used in turn by the main contractor to issue subsets of the Bill of Quantities to competing subcontractors. The receiving software must interpret the message, which bears no relation to the traditional paper document. It addresses both the size and indexing of such documents by effective grouping and indexing of work items and the use of standard items to avoid repetition. The message is based on universal practice and is not dependent on the type of business or industry.

· CONTEN Tender message: Used by a main contractor or subcontractor, after the receipt of an invitation to tender (CONITT) message, to submit a tender - a commercial offer to execute the project work defined by the Bill of Quantities included within the invitation to tender.

· CONEST Establishment of contract message: A structured description of a Bill of Quantities used after the completion of the tender process. Initially used by the client’s representative when the contract is agreed to issue the applicable Bill of Quantities to the main contractor, and subsequently to issue any amendments.

· CONQVA Quantity valuation message: Advises another party about the progress of work performed (against groups of work items) during a specific time period, or since the start of the project. Typically used between the main contractor and the client's representative to submit progress details. The message can also be used between other parties. The message allows for the inclusion of “new” items of work.

· CONPVA Payment valuation message: Advises another party about the value of work performed (against groups of work items) during a specific time period, or since the start of the project. Typically used between the main contractor and the client's representative during the process of approving the value and payment for work completed. The message can also be used between other parties. The message allows for the inclusion of “new” items of work.

· CONWQD Work item quantity determination message: A justification of the work item quantities given in other messages that describe the work items or their progress valuations. Typically used between a contractor and the client's representative, or other partners within the construction process, to substantiate the quantities associated with specific sections of work.

· CONDPV Direct payment valuation message: The instruction by the contractor to the party responsible for payments, to pay the subcontractors for work completed. This message is designed to support the business process of communicating the value of progress against groups of work items which make up a construction project.


Other, more general EDIFACT messages, like the electronic purchase order “ORDERS” and the electronic invoice “INVOIC”, can be (and are) used in the building and construction area. The UN issues a new version of the EDIFACT message directory twice a year.

2.1.3 EDIFACT message structure

EDI messages are very generically defined. To make them computer interpretable, codes must be used to identify parties, articles and locations. Currently the EANCOM implementation guidelines are widely used to achieve this.

In EANCOM messages, an EAN standard article number identifies each product and an EAN location number identifies each party. The EAN identification numbers are unique and recognised worldwide. Their use means trading partners do not have to maintain complex cross-references for each trading partner's internal codes.
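The reliability of these identification numbers rests on a self-checking property: the final digit of an EAN-13 article number is a check digit computed from the first twelve digits with alternating weights of 1 and 3. A minimal sketch (the helper name is ours; real systems would also validate number ranges assigned by EAN):

```python
def ean13_check_digit(first12: str) -> int:
    """Check digit for an EAN-13 article number, given digits 1-12 as a string.
    Digits in odd positions (1st, 3rd, ...) are weighted 1, even positions 3."""
    assert len(first12) == 12 and first12.isdigit()
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

# Example: article number 4006381333931 = twelve digits plus check digit 1
print(ean13_check_digit("400638133393"))  # → 1
```

A scanning or EDI system recomputes this digit on receipt, so a mistyped or corrupted article number is rejected before it can trigger any automatic “sequel actions”.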

EANCOM also provides a logical sequence of messages used in business. Trading companies jointly agree on messages adapted to their needs. Standard messages available in EANCOM can be divided into the following categories:

· Master Data Messages: Containing data which rarely changes such as product measurements, names and addresses.

· Commercial Transactions Messages: Covering the general trading cycle from quotation request to remittance advice including purchase order, transport and logistics messages.

· Report and Planning Messages: Allowing the exchange of general trading reports, including forecasts on delivery, sales and stocks, to allow partners to plan for the future.

· General Messages: May be used to send data for which no specific standard message exists.

EAN has established an international committee of EDI experts: the Communication Systems Committee (CSC). Its main objective is to monitor the development and maintenance of the EANCOM standard in accordance with user needs and requirements identified in specific project teams. In construction the EAN coding system is mainly used for standard building components between supplier and wholesaler. The connection in the supply chain between wholesaler and contractor has not yet been established, although some initiatives are exploring this field. It is more complex because of differing product specifications, which cannot be translated into standard products with unique identifying codes.

EAN closely collaborates with national and international user groups around the world representing key companies in a wide range of sectors (Chemicals, Electronics, Publishers and Libraries, Healthcare).

2.1.4 ISO 10303 (STEP)

Recognising the strategic importance of product data, the International Organization for Standardization (ISO) instigated the development of a new standard in 1984 with the mission statement:

To create a single ISO standard that enables the capture of information to represent a computerised product model in a neutral form throughout the life cycle of the product without loss of completeness and integrity.

Originally called “STandard for the Exchange of Product model data” (STEP), ISO 10303 was initially a European initiative. In 1990 STEP was unified with a parallel US initiative. ISO 10303 now has the formal title “Industrial Automation Systems - Product Data Representation and Exchange”. With participants in over 25 countries, and directed at the needs of all sectors of engineering, ISO 10303 is the world’s largest standardisation effort. It inevitably has a complex committee structure, which is located under ISO TC184/SC4.

ISO 10303 is being developed and published as a series of complementary but essentially independent Parts. The first batch of these Parts has been published, and STEP is now deployed in sectors like the aircraft and automotive industries. Many more Parts are currently at different stages of development.

The most important (fundamental) Parts are:

· Part 1: Overview and fundamental principles

· Part 11: EXPRESS language reference manual

· Part 21: Clear text encoding of the exchange structure

· Part 22: Standard data access interface specification

Those Parts that are directly applicable to the engineering industries are the 200 series Application Protocols (APs), a selection of which are tabulated:

Part  Name of Application Protocol                                    Status [2]

201   Explicit Drafting                                               IS
202   Associative Drafting                                            IS
203   Configuration Controlled Design                                 IS
208   Life Cycle Management – Change Process                          Working
212   Electrotechnical Design and Installation                        Working
221   Functional Data and Schematic Representation of Process Plants  Working
225   Building Elements using Explicit Shape Representation           DIS ballot
227   Plant Spatial Configuration                                     DIS
228   Building Services: Heating, Ventilating and Air Conditioning    Working
230   Building Structural Frame: Steelwork                            Working
231   Process Design and Specification of Major Equipment             Working

Each AP is concerned with specifying how data relating to a particular type of “product” should be exchanged in well defined industrial contexts (the “Application”). They each include a formal Product Model, defined in EXPRESS, which specifies how data should be structured for exchange. STEP data exchange is implemented in accordance with an AP by writing compliant export and import translators. The role of an export translator is to map data from the native data structures of one application into the neutral data structures of the Product Model, and to write out a corresponding Part 21 format STEP data exchange file. Similarly, an import translator reads data from a Part 21 file that complies with that AP and maps the data into the native data structures of the receiving application.
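The flavour of a Part 21 exchange file, and of the import side of a translator, can be sketched as follows. The HEADER/DATA section layout and the `#id=TYPE(args);` instance syntax follow ISO 10303-21, but the `POINT` entity, its values and the schema name below are invented for illustration; a real import translator would parse arguments fully, resolve inter-instance references, and map instances into the application's native structures rather than a dictionary.

```python
import re

# A skeletal ISO 10303-21 ("Part 21") exchange file. The section layout is
# standard; the POINT entity and EXAMPLE_SCHEMA are fabricated placeholders.
part21 = """ISO-10303-21;
HEADER;
FILE_DESCRIPTION((''),'2;1');
FILE_NAME('example.stp','1997-05-14',(''),(''),'','','');
FILE_SCHEMA(('EXAMPLE_SCHEMA'));
ENDSEC;
DATA;
#1=POINT(0.0,0.0,0.0);
#2=POINT(1.0,2.0,3.0);
ENDSEC;
END-ISO-10303-21;
"""

# Map each instance id in the DATA section to (entity type, raw arguments).
instances = {
    int(m.group(1)): (m.group(2), m.group(3))
    for m in re.finditer(r"#(\d+)=(\w+)\((.*)\);", part21)
}
print(instances[2])  # → ('POINT', '1.0,2.0,3.0')
```

The neutrality of this format is the point: any export translator that writes instances conforming to the AP's EXPRESS model can feed any import translator for the same AP.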

Each AP includes two Product Models:

· Application Reference Model (ARM): A data model that captures the required product information and uses the language of the Application.

[2] PDTAG Newsletter No 15 (January 1997)


· Application Interpreted Model (AIM): A data model which is a re-interpretation of the ARM using standard constructs drawn from the 40 and 100 series “Integrated Resources”

It is the AIM that should be implemented, although many translators have been implemented at the ARM level.

[Figure: “Interpretation and a STEP Application Protocol” - a diagram relating the Application Activity Model, Application Reference Model, Application Interpreted Model, Application Interpreted Constructs and Integrated Resources models, spanning the Application World and the STEP World.]

Interpretation is a complex process that offers advantages, including interoperability between APs when commonality exists between them. Each AP has a sister 300 series Part that specifies corresponding Abstract Test Suites, which are used to test that a particular translator implementation complies with that AP. These are linked to the Conformance Classes that each AP defines, which are effectively implementable sub-sets of the full AP.

As can be seen from the previous table, ISO 10303 does not yet provide much for the LSE industries. The construction sector contributed to the early work on product models through initiatives such as the RATAS project and the General Application Reference Model (GARM). However, active industrial participation in STEP initially tended to be focused on applications involving complex geometric forms. In recent years the focus has shifted towards supporting the wider engineering process, fuelled by a significant number of major STEP-related research and development projects (many of which have been in Europe). In the LSE sector industrial awareness of, and active participation in, STEP is most evident in the Process Plant sector - particularly in relation to oil and gas. As an outcome of initiatives such as PROCESSBASE and PIPPIN, the process plant sector has two well developed and complementary Application Protocols. These are AP221, which is concerned with the conceptual design of a plant, and AP227, which is concerned with its spatial layout.

In the construction sector there is only one well developed Application Protocol, AP225. This is concerned with the representation of the shape of building elements and was largely funded by the German Government. Although AP230 is still in the early stages of ISO standardisation, it is based on the CIS (CIMsteel Integration Standards) - STEP-like specifications that are already entering industrial use. The CIS/AP230 are important in that they support the (structural) engineering process (for steel framed structures) and thus provide a pointer to the form of future STEP standards for LSE.


2.1.5 The IAI Industry Foundation Classes

The International Alliance for Interoperability (IAI) is an international consortium with the goal of enabling software interoperability in the building design, construction and facilities management sector. The origins of the IAI lie in an initial grouping of twelve US companies who demonstrated prototype interoperable applications in June 1995. IAI now has over 300 member companies grouped into seven international chapters. The mission statement of IAI is:

To define, promote and publish a specification for sharing data throughout the project life cycle, globally, across disciplines and across technical applications.

The Industry Foundation Classes (IFC) are common specifications for the objects that occur in a building. They define how each object will be digitally represented, and collectively these data structures will support a project model that is useful in sharing data across applications.

The ultimate goal is that the specialists involved throughout the life cycle will interact with a shared project model. However, the current focus is on file-based data exchange (like STEP) rather than on the longer-term goal of interoperability between objects within different applications.

IAI was launched as an open initiative in 1995 with very ambitious plans for the rate at which the IFC could be specified. Since then considerable progress has been made in establishing the organisational and technical infrastructure and in defining the development methodology. This has been assisted by significant cross-fertilisation from the STEP community; one beneficial outcome has been the adoption of EXPRESS and the Part 21 data exchange file format from STEP. Less progress has been made in establishing implementable standards (as the sheer scale of the task became apparent). IFC Release 1.0 has now been published and provides limited data exchange capabilities in the areas of Architectural Design, HVAC Engineering Design, Construction Management, and Facilities Management.

2.1.6 OLE for Design & Modelling

OLE for Design & Modelling (OLE for D&M) is a specification enabling OLE (see 4.1.2) to work with data representing 2D and 3D geometry, to handle the spatial arrangements of graphical objects.

Intergraph was the original designer of the OLE for D&M specification, and led the effort to create OLE D&M applications extensions for 2D and 3D geometry. It then transferred the specification to the Design and Modelling Applications Council (DMAC).

The DMAC was formed at a meeting at Microsoft in January 1995 to develop the specification further and to advance OLE technology for design and modelling applications. The OLE for D&M specification is in the public domain, in draft state, and is the responsibility of DMAC. OLE for D&M is now a Microsoft OLE industry solution, an extension of OLE for a specific industry. Over 40 companies are members of DMAC, including many of the major CAD/CAM/CAE vendors.

OLE for D&M allows CAD/CAM/CAE software to be integrated into an application alongside standard OLE-compliant desktop software, such as spreadsheets: that is, the integration of engineering and business software.

OLE for D&M enables three dimensional compound document functionality within applications that model 3D entities. It provides three types of functionality:

· Three dimensional objects: an object has an interface that can communicate certain 3D information. A container may need to know the spatial relation of the object to other 3D objects it contains, or the extents of the object in three dimensions. In addition, 3D objects must be able to provide two dimensional information (for example, allowing the insertion of a 3D object in a 2D container).

· In-place activation: the user interface must allow transitions from one active object to another. These transitions must be smooth, with the user continuing to work in the complete view of the model, rather than in a separate window for the active object. This allows editing of the object in the context of the container data, with the effects on the rest of the model visible.

· Location: an object may make use of information regarding the location of another object in the container. For example, the user may manipulate one object relative to the geometry of some other object.
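The first of the three capabilities above can be sketched as a hypothetical, COM-flavoured interface in Python. The class and method names (`IDM3DObject`, `extents`, `project_2d`) are invented for this illustration and are not taken from the OLE for D&M specification itself.

```python
from abc import ABC, abstractmethod

class IDM3DObject(ABC):
    """Hypothetical sketch of the kind of interface a 3D embedded object
    might expose to its container (names are illustrative, not from the spec)."""

    @abstractmethod
    def extents(self):
        """Axis-aligned bounding box ((xmin, ymin, zmin), (xmax, ymax, zmax))."""

    @abstractmethod
    def project_2d(self):
        """2D footprint, so a 3D object can be placed in a 2D container."""

class Box(IDM3DObject):
    def __init__(self, origin, size):
        self.origin, self.size = origin, size

    def extents(self):
        lo = self.origin
        hi = tuple(o + s for o, s in zip(self.origin, self.size))
        return (lo, hi)

    def project_2d(self):
        # Project onto the xy plane by dropping the z coordinate.
        (x0, y0, _), (x1, y1, _) = self.extents()
        return ((x0, y0), (x1, y1))

b = Box((0.0, 0.0, 0.0), (2.0, 1.0, 3.0))
print(b.project_2d())  # ((0.0, 0.0), (2.0, 1.0))
```

The point of the abstract interface is that a container can query any embedded object for its 3D extents or a 2D projection without knowing how the server application models the geometry internally.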

To date OLE for D&M is understood only to have been commercially implemented within two of Intergraph’s next generation Jupiter products: Imagineer, a 2D drafting tool, and Solid Edge, a mechanical engineering solid modeller. As a consequence these applications can interwork with existing OLE applications such as spreadsheets and word processors.

DMAC agreed the final draft of its Geometry and Topology interfaces in November 1996. These custom COM interfaces allow client applications to access the geometry or the topology of surface entities and curve entities that reside in a server application’s Document object. These entities, which may be a single entity or topologically connected composites, are described as being a Surface Body (such as a B-Rep Solid) and a Curve Body (such as an extruded feature). Access via these interfaces is by means of procedural calls that deliberately isolate the client from the higher-level data structures, which exist on the Server side. Several advantages are claimed, not least the restriction of the geometry traversal and computation engine to the Server side. Intergraph and SolidWorks have committed to demonstrate server implementations by early 1997 and ANSYS, MSC, Delcam, and Pathtrace to demonstrate client implementations. This suggests that OLE for D&M will become more widespread, but within the mechanical engineering sector in the first instance. DMAC has also instigated work defining an “add-in mechanism” which would extend the capability of COM to allow client applications to create or edit data on the server side, and the IDMSurface2 proposal concerned with improved support for free-form parametric surfaces.

2.2 Barriers to LSE uptake

2.2.1 Electronic Data Interchange

The extent to which a sector will apply EDI is determined by a set of success factors. Several factors for successful uptake are outlined below; they help to explain why uptake in the building and construction sector is so low.

· Size of information flow: It is more profitable to introduce EDI when the number of products or trade partners is high. The volume of information in a structured form is also an important factor. The more documents are exchanged between companies the more these companies can profit from the use of EDI.

· Permanent relations in the supply chain: Permanent relations among a reduced group of business partners will promote a stable communication pattern. This will support the use of EDI.

· Dominant party: Dominant partners can support or enforce the use of EDI in the supply chain. When such a strong partner is present in a sector, the sector is more likely to take up EDI.


· Degree of automation: It is no use for a company to participate in EDI when there are no computer programs used to support the internal business activities of the company.

The success factors mentioned above implicitly explain why the phenomenon of electronic trading is still at a very low level in the building and construction area. Many construction companies do not use extensive computer support for their activities. The sector is highly fragmented without a dominant party. Relationships are established around projects. Thus when a project comes to an end, trading relationships are broken. The potential for generating large numbers of messages exists within the supply chains, and at a higher level, the size of the information exchanged is large (for example Bills of Quantities and CAD-files).

The fact that the EDIFACT standard messages are very generally defined leaves scope for selection and interpretation. Thus each electronic trading initiative may first need to agree their own version of a standard message. A fragmented industry like the construction industry also needs a communication infrastructure through which all parties can be reached. Until recently commercial EDI services were provided by competing Value Added Network (VAN) companies, with only limited intercommunication between the networks. The combination of these various start-up costs makes the realisation of EDI links for a one-off project difficult to justify. By providing an open infrastructure, the Internet is helping to erode such barriers.

Another factor that is holding back the standardisation process is the struggle for domination of the new electronic trading environment. In the current supply chain there are three major players: supplier, trader and buyer (contractor). The trader wishes to play a central role in electronic trading, and needs product price information in order to select the cheapest components. On the other hand, suppliers do not want to compete on price alone - but attributes like quality and aesthetics are not expressed in current EDI messages. Contractors would prefer to deal directly with different suppliers, bypassing the wholesaler. This highly competitive situation is not ideal for the standardisation of electronic messages, and may need to be resolved before substantial progress can be made.

A specific problem for the construction sector is the nature of the building products. In the retail and transport sector the products are called “articles”. Articles are standard products that can easily be identified using an identification code (generally the EAN article code). Building products have variable properties with many possible combinations. It can be shown that there are not enough EAN codes available to identify each variant. An alternative is to use a classifying code. This requires agreement among parties in construction on how to define and characterise building products, components and parts. This comes close to product modelling as practised within the STEP standardisation effort.
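The combinatorial explosion of building-product variants can be illustrated with some simple arithmetic. The property names and counts below are invented purely for the example, but they show how a single configurable product line can generate far more variants than any per-supplier article numbering scheme can reasonably enumerate.

```python
from math import prod

# Invented example: a configurable window product line.
properties = {
    "width_mm":   200,   # e.g. 400-2400 mm in 10 mm steps
    "height_mm":  200,
    "glazing":     12,
    "frame":       30,
    "colour":     200,
    "hardware":    50,
}

variants_per_product = prod(properties.values())
print(f"{variants_per_product:.2e}")  # prints 1.44e+11
```

So one product line alone yields on the order of 10^11 nominal variants, which is why identification by a flat article code breaks down and a classifying, property-based description (as in product modelling) becomes necessary.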

2.2.2 Product Model Based Standards

The title Product Model Based Standards is used loosely as a collective reference to those standards other than EDI. These include STEP, IFC and OLE for Design & Modelling (although the latter is not really based on a product model). In these cases the primary barrier to deployment within LSE is the lack of maturity of the standards. Industrial deployment within LSE is only starting to become a practical proposition as commercial implementations are now becoming available.

Considering each standard in turn:

· STEP: No relevant Application Protocols have yet reached the status of an International Standard, but AP221, AP225 and AP227 are well advanced and AP230 has made significant progress. In the Process Plant sector there are implementations of AP221 and AP227 which have been used on a trial basis. Similarly, several CAD vendors have implemented AP225, allowing it to be used on a trial basis. Industrial deployment of commercial implementations of these APs is imminent.

· Significantly, these STEP APs have all been implemented at an Application Reference Model (ARM) level, not at the Application Interpreted Model (AIM) level. Although AP230 itself is not yet ready for implementation, the CIS (which is an ARM level precursor to AP230) has been extensively implemented and tested within the CIMsteel project. The industrial deployment of applications software with commercial CIS translators has now commenced (but is not yet occurring on a significant scale). A testing programme to ensure that commercial translators conform to the CIS is also underway.

· IFC: The IAI recently completed the IFC Release 1.0 specifications, for which a number of CAD vendors had previously demonstrated prototype implementations. Commercial implementations will not, however, be available until after IFC Release 1.5, which is scheduled for mid 1997. Given the major vendors who are involved in IAI, a high profile launch of these IFC implementations is anticipated.

· OLE for Design & Modelling: This has been implemented commercially, but (as yet) only in a small number of applications. None of these are directed at the construction sector.

Thus, the barrier imposed by the lack of suitable commercial implementations of these standards is currently being removed. However, there is another significant barrier to widespread LSE deployment: currently, each standard addresses only a small part of the LSE need for appropriate standards. For example, the available STEP Application Protocols address only four distinct areas of LSE, when a suite of perhaps fifteen Application Protocols would be needed to give good coverage. Similarly, the scope of IFC Release 1.0/1.5 is very limited. OLE for Design & Modelling can probably be disregarded in this respect. Although it is being extended, this standard has much more limited objectives than STEP or the IAI. It is unlikely that it could be expanded to provide the full range of semantics needed by LSE3.

This debate also highlights another barrier to the uptake of such standards by LSE, the confusion created by the different standards (including EDI, this chapter has identified four). Although each has a different focus, there is no overall framework to ensure conceptual and technical fit between them. Additionally, a degree of duplication and the lack of direction are diluting the limited resources that the LSE industries are willing to invest in developing such standards.

2.3 Trends and expectations

2.3.1 Electronic Trading

As a medium for electronic commerce, the Internet offers several advantages:

· is widely accessible globally;

· offers a flat monthly access rate;

· has a volume-independent and time-of-day independent pricing structure for data transmission;

· is robust because it encompasses multiple alternative pathways, gateways, and interconnections;

3 Gann, D., K.L. Hansen, D. Bloomfield, D. Blundell, R. Crotty, S. Groák and N. Jarrett (1996). Information Technology Decision Support in the Construction Industry: Current Developments and Use in the United States. Department of Trade and Industry (UK), Overseas Science and Technology Expert Mission, Visit Report. Science Policy Research Unit, University of Sussex: Brighton, UK. September 1996.

· has high bandwidth for data throughput;

· is platform independent.

With its low cost, high speed, open access and robust architecture, the Internet (see 3.1.3) would seem at first glance to be an ideal medium for electronic commerce in the construction area. There are still significant concerns, however, regarding its use for commercial transactions: lack of security4, inability to confirm message integrity, vulnerability of messages to interception and fabrication, lack of user support, and difficulties in obtaining reliable assurance of authenticity or receipt.

Standards and software products are being developed to solve the problems of Internet security, and to address the particular needs of electronic trading. One approach may be the creation of closed secure environments for electronic trading based on the emerging intranets (see 3.3.3).

The growth of the World Wide Web (WWW) for electronic trading will not replace EDI because electronic trading becomes most beneficial for companies when (secure) electronic messages can trigger “sequel actions” without the need for human involvement. This requires structured, unambiguous messages, which are “understood” by the receiving computer system. HTML and SGML do not offer this functionality, however it is possible to send structured EDIFACT messages from within an HTML page. The WWW (see 3.1.3) thus provides an inexpensive entry-point to electronic trading for small and medium enterprises. Once in use, it can be further developed to support EDI.

2.3.2 Open EDI and STEP

The EDI standardisation effort can be characterised as a bottom-up approach that has resulted in a significant number of standard messages. To solve the problem of ambiguous messages, an initiative called Open EDI was started in one of the working groups of the ISO/IEC JTC1/SC30 N127 technical committee. Within the Open EDI initiative a conceptual business model will be developed. This business model will provide a mathematical description of business transactions, which can be used to automatically generate the required messages. It is expected that this trend will grow and converge towards the modelling activities that have already taken place within the ISO/STEP standardisation effort. It is not yet clear whether Open EDI will define its own language for defining the business models, or whether it will use the languages, methods and techniques already defined within STEP. The optimistic view is that where an EDI message contains product information, a synthesis of STEP techniques and EDI practice will be used.

EDI standard messages and the EDIFACT approach have been in use for many years. Thus many companies have invested heavily, and wish to see an EDI compatible solution to the remaining limitations. One example of how to deal with product specifications is the PRODEX message, which contains product information using EDIFACT to describe it.
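The flavour of an EDIFACT interchange can be sketched in a few lines. The snippet below assembles a heavily simplified, non-conformant ORDERS-like message: segments are terminated by an apostrophe and data elements are separated by "+", but the segment contents here are illustrative only (the document number, article number and date are invented, and a real message requires interchange envelopes and agreed qualifier codes).

```python
def segment(tag, *elements):
    """Join one EDIFACT-style segment: elements separated by '+',
    the segment terminated by an apostrophe."""
    return tag + "+" + "+".join(elements) + "'"

# A heavily simplified, non-conformant sketch of an ORDERS-like message.
message = "".join([
    segment("UNH", "1", "ORDERS:D:96A:UN"),
    segment("BGM", "220", "PO-4711"),             # document type + invented number
    segment("DTM", "137:19970514:102"),           # message date
    segment("LIN", "1", "", "5410738123456:EN"),  # invented EAN article number
    segment("QTY", "21:500"),                     # ordered quantity
    segment("UNT", "6", "1"),
])
print(message)
```

The rigid, qualifier-coded structure is precisely what lets the receiving computer system "understand" the message and trigger sequel actions without human involvement, which free-text or HTML pages cannot offer.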

2.3.3 Product Model Based Standards

In the immediate future the first implementations of STEP and IAI standards will enter deployment within the LSE industries. For example, the CIS are now being implemented beyond the CIMsteel project, and it is expected that twelve applications will have commercial CIS translators by the end of 1997. Much will depend upon the reaction of industry. The resourcing of standardisation activities in STEP and IAI is a major issue, and a constraint on future developments.

4 As an example of the degree of the security problems, as a matter of policy the French LSE contractor Bouygues does not allow any networked PC to be connected to the Internet through direct modem connections.

It is helpful to consider the differences between the two organisations:

· Part of ISO, STEP is a well-established organisation with industrial participation much wider than LSE. It is based on well-established and widely supported technologies (which are under pressure for evolutionary change), and has a slow decision-making structure. It is mainly resourced through unpaid participation, the participants in turn being funded by companies, industry initiatives, industry associations, and research and development projects.

· The IAI is a recently created independent body which was spawned by a group of major CAD and application software vendors, but which is now constituted as an international association with a federal structure. It has a less bureaucratic, and potentially more flexible, structure and a significantly more commercial outlook than STEP. Currently, IAI is only concerned with the Architecture, Engineering and Construction (AEC) sector. It is based on newer but less well-established technologies, although much has been borrowed from STEP. It is resourced partly by its membership, and partly through unpaid participation. To date IAI has been very successful in recruiting members from industry and software houses in many parts of the world.

Both STEP and IAI are based on the concept of an underlying product model, and both aim to support data exchange and data sharing. There is a major difference in emphasis in that the IAI intends to realise dynamic interoperability between objects in different applications at an early date. Also, STEP employs a bottom-up modelling approach in which each Application Protocol is developed independently, the resulting ARM then being “interpreted” to create a consistent AIM for implementation. In contrast, IAI employs more of a top-down approach in which the focus is initially on high level cross-disciplinary information transfers.

As this implies, there is complementarity of scope and function in that STEP is producing detailed domain-specific product models while IAI is currently producing shallower cross-domain models. This view is reinforced by the fact that both are using EXPRESS to define their models, and by the development of a Building Construction Core model (Part 106 in STEP) which is intended to also be used within the IFC. However, there are doubts about the ability to define a core model that is common to STEP Building Construction and the IAI, and which can address (different) technical and practical problems in both camps.

Although the contexts differ, a major problem that IAI and STEP both face is the evolution of standards. STEP is currently facing this problem directly, as the need to revise earlier Parts of ISO 10303 becomes pressing, and the knock-on effect on other Parts is significant. Similarly, IAI have a stated policy of issuing regular revisions to the IFCs, but have not (openly) faced the inevitable problem posed by the need for backward compatibility. Other key issues that will affect future developments of these standards include:

· The current major review of the STEP architecture. Prompted at least in part by pressure from LSE (which questioned the cost-benefit of the current interpretation procedures), a major re-engineering of the STEP approach is expected to emerge from WG10 in 1998.

· STEP has substantial industrial support, including the Process Plant sector, but there is only limited industrial involvement in the Building and Construction sub-group (where several new initiatives have been delayed following the launch of IAI). LSE thus has a presence both in STEP and IAI, but is supported by common applications software.

· Following its initial launch publicity, IAI have been slowly reducing industrial expectations to more attainable levels. IFC Release 2.0 is currently scheduled for release by the end of 1997 with a scope that may be much greater than that of Release 1.0/1.5, but which is still only vaguely defined. If it is delayed and/or has lesser scope than expected, there is a danger of IAI losing industrial support.

The reality is that there is significant overlap between STEP and IAI (many technical experts are involved in both organisations), and a number of tensions – both technical and political/commercial. Most observers agree that convergence between the two is essential, and that this will probably be the end result. As an illustration, IAI has applied for formal “liaison status” within ISO 10303.

This situation makes it very difficult to predict what Product Model based standards will exist by the year 2005.

Technically the issues are the different implementation forms, and how to integrate increasingly large and diverse Product Models. The latter problem is most likely to be addressed by using computer power to map information dynamically between different Product Models, rather than investing large amounts of human effort in pre-integrating those Product Models (to give an integrated schema). This implies that mapping technology, based on the likes of the new EXPRESS-X mapping definition language, will prove to be very important. An alternative, but complementary, vision is that STEP Application Protocol interoperability will be realised by mapping each Product Model back onto a common Ontological Model. The issue of implementation forms is less critical once agreed models are defined in a standard way (and this might be the long awaited and fully object oriented EXPRESS 2). The reason is that it is then quite easy to transform information into (say) the distributed object stores that are expected to underpin future IT platforms (see section 4.1.5).
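The idea of dynamic mapping between Product Models can be sketched as a small set of declarative rules, in the spirit of (but not the syntax of) EXPRESS-X. Every schema and attribute name below is invented for the illustration: each rule names a source attribute and an optional unit conversion for a target attribute.

```python
# Toy declarative mapping between two invented product-model schemas:
#   target attribute      (source attribute, optional conversion)
RULES = {
    "beam_length_m": ("member_length_mm", lambda v: v / 1000.0),
    "beam_grade":    ("steel_grade",      None),
}

def map_instance(src, rules=RULES):
    """Apply the mapping rules to one source instance (a dict of attributes)."""
    out = {}
    for target, (source, convert) in rules.items():
        value = src[source]
        out[target] = convert(value) if convert else value
    return out

src = {"member_length_mm": 6400, "steel_grade": "S355"}
print(map_instance(src))  # {'beam_length_m': 6.4, 'beam_grade': 'S355'}
```

The attraction of this style is that the mapping is data, not hand-written translator code: the same rule set can drive translation in bulk, be validated independently, and be revised when either schema evolves.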

The key question for LSE is how quickly such technical issues, and the associated questions of STEP and IAI, can be resolved to the point that the LSE industries are confident to invest the considerable resources required to define open global5 standards to fully support the engineering process. It is assumed that by 2005 the technical basis will be clear, and many of the necessary standards will either have been established or will be well advanced.

2.4 Relevance to LSE

2.4.1 Electronic Trading

Electronic trading is a means to significantly improve the logistics of construction. When the required procedures are well organised and standardised, and well embedded in an LSE project, considerable time can be saved by exchanging the right messages at the correct point in the process.

Project partners in LSE projects are often globally spread. Linking them together through a network and allowing information to be exchanged electronically will significantly reduce lead-times. It is essential to identify which information flows are most relevant to trading and to supporting logistics at the construction site. EDI can play an essential role in the preparations for the construction phase, when procurement takes place. During the construction phase EDI can be used to support logistics by allowing goods to arrive just-in-time. The trading and transport messages can be used for this (although a message to call-off materials has not yet been developed).

5 Because LSE projects are transient virtual enterprises that frequently involve participants in many countries, the required standards must be open, globally applicable, and (hopefully) universally adopted.


While some standard messages have been developed for use in the design and detailing phases, it is not yet evident that these are the appropriate phases for the introduction of EDI. These phases are characterised by use of design and engineering programs with large complex data files which lie more in the domain of STEP. EDI as electronic trading can most readily be applied during procurement and construction. However, there is a need for EDI messages which will (a) allow a technical specification to be issued, and having selected appropriate products, will (b) allow details of these products to be returned.

2.4.2 Product Model Based Standards

Section 2.3.3 concluded with a prediction that by 2005 the basic technical issues will have been addressed, and that establishment of the open global standards that LSE needs will be well advanced. This section outlines the new opportunities that this will present to LSE:

· The primary opportunity is for LSE to become an industry which is centred on (coherent) digital information, allowing the efficiency benefits of information technology to be fully realised. This would, for example, enable companies to treat previous LSE projects in which they were involved as a resource which can be mined, perhaps using pattern matching, to find solutions to similar problems.

· The existence of common information standards across all the participants in an LSE Project offers the prospect of plug-compatibility between their respective information management systems. Thus project wide information systems will become the norm, leading to the realisation of the concept of a shared project database.

· By shifting the emphasis away from data held in applications to managed project information, the very nature of the applications software that the LSE industry uses will be transformed. Applications will become more modular and thus more widely applicable. This will reduce software costs, give the industry more choice of applications, and empower engineering companies to combine modules to gain greater added value.

· As the previous points imply, the most significant opportunity for LSE will be to substantially shift the culture of the industry. Inevitably, the existing culture can also be viewed as a barrier to the introduction of such standards.

Clearly the degree to which these opportunities can be exploited will depend on the coverage and completeness of the product model based standards (and here the timing may be elastic). Additionally, full exploitation will probably require the existence of complementary (i.e. compatible) EDI standards.


3. Communications Infrastructures

3.1 What exists today

3.1.1 Networking Technologies

Some form of local area network (LAN) interconnects a growing majority of PCs in an office environment, and most Unix workstations. This enables the sharing of files, applications software, and printers via peer-to-peer services or (more typically) via one or more “servers”. The first LAN technologies to become widely used were Ethernet and Token Ring:

· Ethernet. Ethernet employs a Carrier Sense, Multiple Access/Collision Detection (CSMA/CD) protocol to arbitrate between the connected devices. These share a bus-based cabling topology with a shared bandwidth of 10 Mb/s. Earlier installations used thick or thin coaxial cabling. The more recently introduced twisted-pair cable, with devices connected radially to hubs, is now favoured because of the easier upgrade path to Fast Ethernet.

· Token Ring. As the name suggests, Token Ring uses a transmit token rather than CSMA/CD to ensure more predictable performance. First introduced in 1987 with a 4 Mb/s bandwidth, its effective throughput is similar to standard Ethernet (but is maintained under high load). 16 Mb/s variants are now available.

Today Ethernet remains by far the most common installed LAN technology.

PCs are usually connected to the LAN by a network adapter card, although network adapters are increasingly being integrated into the motherboard. Depending upon the technology and topology of the network, the cabling infrastructure may include electronic hubs, switches, routers, and bridges. The services that run across a LAN are most commonly provided by a dedicated server running a network operating system, such as Novell NetWare.

With increases in the number and the speed of devices on a LAN, and the introduction of more demanding applications, older technologies like Ethernet no longer provide sufficient bandwidth. When any network based on a CSMA/CD type protocol approaches saturation, its effective performance falls off very rapidly. Such technology is also inherently unsuitable for many emerging applications, such as video, where a predictable level of throughput is required. A number of alternative technologies, which are both faster and more expensive, are available, including:

· Switched Ethernet. A variant of standard Ethernet in which switches are introduced to avoid sharing the standard 10 Mb/s bandwidth between different users.

· Fast Ethernet. A modification of standard Ethernet which offers an increased bandwidth of 100 Mb/s (over suitable twisted-pair cabling). Fast Ethernet comes in two (incompatible) variants: 100 BASE T, which retains the original CSMA/CD protocol, and 100 BASE VG, which uses a new protocol6 with Token Ring like characteristics.

· FDDI. Fibre Distributed Data Interface was initially developed for fibre-optic cable, but is now also available in copper cable form. FDDI uses a counter-rotating (ring) network topology to provide fault tolerance, and provides predictable 100 Mb/s data transfer rates.

· ATM. Asynchronous Transfer Mode is an emerging connection-based technology in which switches establish a virtual circuit (VC) between devices, with a defined priority and bandwidth.

6 Demand Priority Access Method (DPAM)

Of these technologies, FDDI is now a mature but expensive product, while both Fast and Switched Ethernet are currently seen as pragmatic incremental solutions.
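The rapid fall-off of contention-based networks near saturation, noted above, can be illustrated with a toy slotted-contention model. This is not a faithful model of Ethernet's CSMA/CD (there is no carrier sensing or backoff here): N stations each transmit in a slot with probability p, and a slot carries useful data only if exactly one station transmits.

```python
# Toy slotted-contention model (an illustration, not an exact Ethernet model):
# useful throughput = P(exactly one of N stations transmits in a slot).
def throughput(n_stations, p):
    return n_stations * p * (1 - p) ** (n_stations - 1)

# As the offered load rises past the optimum, useful throughput collapses.
for p in (0.01, 0.05, 0.10, 0.30, 0.60):
    print(f"offered load p={p:.2f}  useful throughput={throughput(20, p):.4f}")
```

With 20 stations the useful throughput peaks around p = 1/20 and then collapses: almost every slot becomes a collision. Real CSMA/CD degrades less abruptly, but the qualitative behaviour (throughput falling, not merely flattening, under overload) is the same.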

Although ATM is still not a fully agreed standard, it is the subject of much speculation and an increasing number of commercial products. ATM was originally designed as a carrier-independent communications transport, which maximises the efficiency with which the medium is used, and which is able to scale up to high-speed trunk lines between (telephone) exchanges. Predictable point-to-point 155 Mb/s VC connections are now possible via ATM.

ATM is now also being deployed to create high-performance LANs. Because current network operating systems rely on broadcasts to function, ATM vendors must provide a LAN emulation solution. A more effective and efficient solution is offered by the draft MPOA (Multiple Protocol Over ATM) standard but, in the longer term, ATM will be directly supported by operating systems (pc-lan, October 1996). Following the introduction of the first single-chip implementations (Rainer Mauth, Byte, November 1996), the cost of ATM network adapters is currently falling below 440 ECU at 155 Mb/s, and below 175 ECU for the lower-cost 25 Mb/s variant.

Most LANs are now inter-linked, either on an informal when-needed basis via the telecommunications system, or more formally to create Wide Area Networks (WANs). Networking protocols such as FDDI can be used to create a WAN over dedicated cabling or leased lines. Communications can also be established automatically on an as-needed basis, either using analogue modems or newer, much faster, technologies such as ISDN or ATM.

3.1.2 Communications Technologies

In recent years global telecommunications infrastructures have gone digital; only the line from the telephone to the local exchange remains analogue. Satellite communications are also digital, and terrestrial broadcasting will follow (many portable cellular phones are already digital). This trend is blurring the distinction between a computer network and the global communications infrastructure. Today the key bridging technologies include:

· Analogue Modems. By converting signals from digital to analogue, modems allow a dial-up connection to be made via any telephone. The achievable data transfer rate is influenced by several factors, but the typical maximum rate is 28.8 Kb/s. The leading manufacturers are currently introducing the next generation of modems, which can down-load from compatible Internet sites at up to 56 Kb/s (and up-load at up to 37 Kb/s), but standards have not yet been agreed.

· ISDN. Integrated Services Digital Network is a set of standards that allow a wire or fibre to carry voice and digital signals. ISDN uses the existing public switched telephone wiring and provides one or more 64 Kb/s channels for data communications. Basic Rate ISDN lines provide two such channels, while the more expensive Primary Rate lines can provide 24 channels.

· ATM. Based on a fixed 53-byte cell (with a 48-byte payload) and hardware switches, ATM was initially developed as a communications standard. However, its wider telecommunications use may be delayed by a lack of bearer services. For example, British Telecom has announced Cellstream, but this will not be available until 1997, and then only with support for permanent VCs.

Datapro claims (Computing, 17 October 1996) that ATM will come into its own as the telecommunications operators begin to offer WAN and long-distance backbone services, and provide outsourcing services. ATM is not a traditional telecommunications service that stops in the basement; it will go all the way to the desktop. This will fundamentally change the way telecommunications companies operate (David Steel, Mercury's ISDN product manager).
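Comparing the bridging technologies above by raw transfer time for a typical file illustrates the gulf between them. The figures below use nominal line rates only; protocol overhead and real-world conditions are ignored.

```python
# Nominal time to move a 1 MB file over the bridging technologies
# discussed above (raw line rates; protocol overhead ignored).
FILE_BITS = 1 * 1024 * 1024 * 8  # 1 MB expressed in bits

links_kbps = {
    "28.8 Kb/s analogue modem": 28.8,
    "single 64 Kb/s ISDN channel": 64.0,
    "2 x 64 Kb/s Basic Rate ISDN": 128.0,
    "155 Mb/s ATM virtual circuit": 155_000.0,
}

for name, kbps in links_kbps.items():
    seconds = FILE_BITS / (kbps * 1000)
    print(f"{name:30s} {seconds:10.1f} s")
```

A 1 MB file that ties up an analogue modem for nearly five minutes moves over a 155 Mb/s ATM circuit in a small fraction of a second.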

3.1.3 The Internet and World Wide Web

Today, the most widely known communications infrastructure is the Internet. Best regarded as the network of networks, the Internet provides an infrastructure that spans the globe, allowing information to be passed between any Internet-connected computers. Three distinct periods in the history of the Internet can be identified:

· 1969-1983: ARPANET, an experimental project of the US Department of Defence Advanced Research Projects Agency, had the mission of exploring networking technologies. The main outcomes were the TCP/IP protocol, published in 1982, and the first TCP/IP applications: email, TELNET and FTP.

· 1983-1990: The Internet was developed within the research community - universities and governmental research centres. In 1989 RIPE was created to co-ordinate the development of the Internet in Europe. The Internet provided a means of exchanging information even when individual network servers broke down.

· 1990-today: This period marked the beginning of World Wide Web services across the Internet. Information access was facilitated by the introduction of new document-oriented services using browser software. Businesses recognised the commercial potential of the Internet, and growth in both the network and network traffic increased exponentially.

The Internet now connects millions of sites all over the world - universities, research institutes, government services, private and public enterprises, and individuals. In this “cyberspace”, people in geographically distant lands communicate across time zones without ever seeing each other, and information is available 24 hours a day from any location.

Two main reasons may explain the exponential growth and onward development of the Internet:

1. The specifications, or rules, that computers need to communicate are publicly and freely available: these standards are known as the TCP/IP protocol suite. They allow connection by every kind of platform: PCs, Macintoshes, Unix workstations, and mainframes.

2. No one has a monopoly on access to, or use of, the Internet. The Internet's nervous system does not have a central brain, such as a powerful supercomputer; rather, all the networks and computers act as peers in the exchange of information and communication.

The main Internet services are:

· TELNET: Remote login is an interactive tool that allows access to programs and applications located on another computer.

· FTP: File Transfer Protocol is a tool that allows files to be transferred from one computer to another. The file can be a document, graphics, software, images, video, sounds etc.

· EMAIL: Electronic mail is the most commonly available and most frequently used service on the Internet. Email allows a text message to be sent to another person, or to a whole group of people. It is possible to attach a file to a message.

· NEWS: In this context news refers to discussion, and usually means an interest group or conference. News tools allow messages relating to a specific topic to be sent or read; news servers manage these messages. Volunteers often propose a synthesis of the discussion, sometimes in the form of a FAQ (Frequently Asked Questions) list.

· WAIS: Wide Area Information Servers are commercial software packages that allow the indexing of huge quantities of information, and make the resulting indices searchable across networks. A prominent feature of WAIS is that search results are ranked (scored) according to how relevant the “hits” are. Subsequent searches can be refined.

· GOPHER: A method of making menus of material available over the Internet. Gopher is a client-and-server style program, which requires that the user have a Gopher client. Each Gopher server presents its own information, using tree menus, and offers links to other Gopher servers, making it easier to find information on the Internet.

· WWW: The term World Wide Web is used loosely to refer to the whole constellation of resources that can be accessed using Gopher, FTP, HTTP, TELNET, USENET, WAIS and some other tools. More correctly, WWW refers only to the universe of HTTP (HyperText Transfer Protocol) servers, which are accessed via browser client software, and give the user seamless access to text, graphics, sound files, etc.
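The ranked "hits" that distinguish WAIS from the other services above can be sketched with a naive scoring function. This is only an illustration of ranked retrieval, not the actual WAIS algorithm; the documents and query are invented.

```python
from collections import Counter

def score(query, document):
    """Naive relevance score: count query-term occurrences in the
    document, normalised by document length. A sketch of WAIS-style
    ranked retrieval, not the real WAIS scoring method."""
    words = document.lower().split()
    if not words:
        return 0.0
    counts = Counter(words)
    return sum(counts[term] for term in query.lower().split()) / len(words)

# Hypothetical construction-sector documents.
docs = {
    "doc-a": "concrete mix design for marine concrete structures",
    "doc-b": "steel frame erection on remote sites",
    "doc-c": "concrete curing times and site conditions",
}

# Results are ordered most-relevant first, as WAIS presents them.
ranked = sorted(docs, key=lambda d: score("concrete site", docs[d]),
                reverse=True)
print(ranked)  # → ['doc-c', 'doc-a', 'doc-b']
```

A subsequent, refined query would simply re-score the collection, mirroring the iterative searching described above.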

Tim Berners-Lee invented the WWW in 1989 in an attempt to efficiently store research data at CERN. He based the system on the concept of hypertext (or text with links) that can be followed electronically - even across different computer systems - to other related documents, files, sounds, images, or programs. The key to the WWW is HTML (HyperText Mark-up Language), a relatively simple set of codes (an application of SGML) that turns ordinary text into hypertext when viewed by a WWW browser. The most popular browsers today, Netscape Navigator and Microsoft Internet Explorer, are very widely used and support a wide range of protocols.

The types of resources accessible via the Internet are growing at an astounding rate. The term resource describes anything that can be accessed on the Internet, no matter where it is physically located. Examples of Internet resources include databases, free and shareware software, on-line libraries, up-to-date weather information, on-line magazines, archives of daily newspaper articles, and mailing lists. Many companies now have a “WWW site”, and business is now being conducted over the Internet.

3.1.4 Other Sources of Digital Information

Although the Internet has become the most widely known means of accessing digital information, other routes are available and some have been in use for many years:

· file transfers using modems: Several software utilities, such as LAPLINK, allow a PC user to see a list of the files stored on a remote PC and copy selected files to his own computer. These utilities were initially designed to allow the transfer of files between computers linked by a short cable (connecting either their RS232 ports or their parallel ports). By linking the RS232 ports via modems and the telephone system, files can be transferred between two computers irrespective of their location.

· on-line servers (BBS): A computer running a server program and equipped with modems to provide dial-up services to remote users. Bulletin board systems are still used by software and hardware vendors to make upgrades available on-line. Access may be restricted by a login procedure, and users may be allowed to send messages to the administrator of the server.

· on-line service providers: Companies such as CompuServe and AOL provide registered users with added-value services such as sending email messages and binary files to other users, discussion groups, and areas where organisations can store files available for download by others. Such companies are now rapidly switching to become Internet access providers.


· diskettes: Floppy discs are the simplest means of transferring software and data between PCs, and the current generation of 3.5” diskettes are compact and relatively robust. Their major limitation is low capacity, thus file compression software (such as PKZIP) is widely employed. For example, in the construction sector, DXF drawing files are frequently transferred between partners in zipped form on diskettes.

· optical disks: For reliably transferring larger files, such as those generated using CAD, an alternative to diskettes is needed. A variety of media and formats have been used, spanning magnetic tapes, removable magnetic disks, and optical disks; the latter is currently one of the more attractive. Initially introduced some five years ago with a 128 MB capacity, optical disks have since increased to 230 MB, with 650 MB devices now available. Media costs are low, with a 230 MB disk costing under 18 ECU.

· CD-ROM: The IT equivalent of the audio CD, the CD-ROM is now very widely used for distributing software and data, and for digital publishing. CD-ROM drives are inexpensive, and are now a standard fitting on most new PCs. CD-ROMs have a 650 MB capacity; the data cannot be modified, is reliable, and is insensitive to magnetic fields, dust, and moisture. CD-ROMs can be pressed very cheaply in large volumes. Alternatively, they can be written directly using CD-R recorders that now cost as little as 700 ECU. These are widely used for archiving data, particularly in sensitive fields like banking where the write-once characteristic is a major advantage.

The fact that CD-ROMs have a high capacity, and can be produced economically both in small and large batches, makes them ideal as a distribution medium. For example, CSTB in France used to publish some 21 books for the CAE industry, a total of more than 15,000 pages. These contained the complete set of the French standards, regulations and rules governing construction. Subscribers now receive a single CD-ROM four times a year containing the latest version of this information, which now includes 8,000 pictures. Access is via an interactive program that provides an index, keyword search, and lists of related items.
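The zipped-diskette workflow mentioned above can be sketched with Python's standard zipfile module. The file name and contents below are hypothetical stand-ins for a DXF drawing; real CAD files compress less dramatically than this repetitive example.

```python
import io
import zipfile

# Stand-in for a repetitive text-based DXF drawing file.
drawing = b"0\nSECTION\n2\nENTITIES\n" * 500

# Sender: pack the drawing into a zip archive (as with PKZIP),
# small enough to fit on a diskette.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("site-plan.dxf", drawing)

packed = len(buffer.getvalue())
print(f"raw {len(drawing)} bytes -> zipped {packed} bytes")

# Recipient: unpack and recover the drawing bit-for-bit.
with zipfile.ZipFile(io.BytesIO(buffer.getvalue())) as archive:
    assert archive.read("site-plan.dxf") == drawing
```

The round-trip check at the end matters in practice: a lossless container is what makes the zipped-diskette exchange of drawings between project partners safe.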

3.2 Barriers to LSE uptake

3.2.1 Networking and Communications

Cost is the only underlying barrier to widespread uptake of networking and communications, both within, and between, permanent office environments. However, LSE offices are frequently located in areas of the globe where the availability of equipment and specialist support staff is also a problem. More critically, the local communications infrastructure may be inadequate. Provided time and resources are available, such issues can be overcome.

LSE is characterised by the business need to communicate with an ever changing mix of partners. Compatibility between systems, and the ability to rapidly establish effective new telecommunications links, are thus major concerns.

Major LSE companies, such as John Brown Engineering, have already invested heavily to establish global communications infrastructures. The initial goal is to support in-house information systems that will enable their engineering offices to be viewed as a unified global resource, enabling work to be transferred to offices with spare capacity or lower costs, and to be shared more efficiently between offices.

Such digital infrastructures are typically based on a mix of leased lines and public packet-switching systems. The majority of LSE companies' infrastructures are incomplete; much use is made of modems, with an increasing proportion of faster technologies such as ISDN. The speed with which the LSE industry adopted fax illustrates the potential for rapid growth.


Significantly, it was the universality of fax that was the key driver - only a simple phone line is needed and compatibility is (usually) not a problem.

LSE faces more substantial communications barriers concerning:

· temporary offices;

· mobile workers on sites.

In both cases, short timescales and remote locations are often an added complication. In many cases mobile phones can provide a quick and invaluable contribution, but even these may not be available: a service may simply not exist in that region, or poor reception may be a major problem. Current solutions include DECT (Digital Enhanced Cordless Telephone), which provides excellent voice services plus “Cordless LAN” data services, and the emerging GSM/DCS-based services, for which coverage is limited and costs remain high. Wireless LAN technology, based on IEEE 802.11, is still relatively immature and does not yet provide voice services.

Conventional telephone lines and data communications lines may take too long, or be impractical, to install. In such cases the only option is a radio link or, increasingly commonly, a satellite link. The latter can provide high data rates, although the cost may be high. In areas that are politically or militarily sensitive, it may also be difficult to obtain approval for installing sophisticated communications links.

The mobile site worker remains the biggest challenge. Wireless networks, both terrestrial and satellite, are available. However, real portability and reliable rapid data transfer speeds are not yet a practical combination.

A related technology, the satellite-based Global Positioning System (GPS), is already having a growing impact on LSE practice. Very compact receiving equipment is now available and, by including a ground station, a high degree of accuracy can be achieved.

3.2.2 The Internet and World Wide Web

The Internet as a whole continues to move forward and support (access to) an increasing proportion of business activities. However, significant problems remain and these are not peculiar to LSE. Two underlying reasons can be identified:

1. The initial objective of the Internet was to connect university research laboratories and governmental research centres; these needs are relatively far from the needs of business activities.

2. Despite very rapid evolution, the Internet is still a young technology, especially for an industry where project duration typically extends over several years.

Initial experience in applying the Internet has identified a number of disadvantages. Of these the most significant from the LSE perspective are:

· Coverage: Coverage is not spread uniformly, and does not extend over all parts of the world. For example, coverage in Africa and Asia is limited. In some areas only email services are available.

· Reliability: Each Internet message may pass through many intermediaries, and is thus susceptible to failure at each stage. This can result in disconnection, or the loss of messages or files during transfer. This is unacceptable for many industrial applications - the placing of orders and making of payments, for example. The bulk of current Internet usage is still based on file transfer (FTP), hypertext document access (HTTP), electronic mail (SMTP) and remote login (TELNET). These older protocols do not include provision to control the transfer - to resend in the event of failure, for example - thus reliability is a problem. New protocols address these problems but are not yet widely used.

· Performance: The goal of the developers of the early ARPANET was to bring expensive hardware resources closer to researchers; response times were not the key requirement. Since then the success and expansion of the Internet has hugely increased both its scale and the volume of usage. Traffic volumes and file sizes are now so great that growth in usage continues to outstrip efforts to improve the infrastructure, with the result that access times are variable and frequently slow. This overloading is illustrated by the slow downloading of WWW pages, and the fact that email messages may take four hours to be delivered. Such considerations may have been of less importance for academic use, but they remain a substantial barrier to widespread business use.

· Security: Computer security is a major issue for enterprises, which must protect their software, data, and transactions. The Internet comprises a massive co-operative network involving tens of thousands of individual interconnected networks. Because it supports large numbers of researchers, it is still regarded as an open network. The security implications are major: sensitive information in transit on the Internet, or stored in computers linked to the Internet, is vulnerable to attack from intruders who may be located anywhere. Every enterprise connected to the Internet must adopt procedures to minimise the risk, and existing solutions such as “firewalls” need specialists to be effective.

· Character sets and Languages: Being a global industry, LSE requires good support for different character sets and languages. Similarly, Internet-based information sources such as the WWW should present information in all relevant languages. Existing infrastructures are deficient in both respects.
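The resend-on-failure control that the reliability point above says is missing from the older protocols can be sketched as a simple stop-and-wait loop. The link and its acknowledgement behaviour are simulated; the function names and retry limit are illustrative, not drawn from any actual protocol.

```python
def send_with_resend(chunks, transmit, max_retries=10):
    """Sketch of transfer control: each chunk is retransmitted until
    the far end acknowledges it, up to max_retries attempts.
    `transmit` is a hypothetical callable returning True once an
    acknowledgement arrives."""
    for seq, chunk in enumerate(chunks):
        for _ in range(max_retries):
            if transmit(seq, chunk):
                break  # acknowledged; move on to the next chunk
        else:
            raise IOError(f"chunk {seq} lost after {max_retries} tries")

# A deterministic stand-in for a lossy link: every chunk is dropped
# on its first transmission, then delivered on the second attempt.
calls = []
def flaky_link(seq, chunk):
    calls.append(seq)
    return calls.count(seq) >= 2

send_with_resend([b"part-1", b"part-2", b"part-3"], flaky_link)
print(f"3 chunks delivered in {len(calls)} transmissions")  # → 6
```

Without such a loop, the first drop would silently lose part of the file, which is exactly the failure mode described above for FTP-era transfers.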

3.2.3 Other Sources of Digital Information

Diskettes are still widely used within LSE for off-line information transfer, but their limited capacity is now a major barrier. CD-ROMs present no barriers to publishing large volumes of information. However, few LSE companies are yet using CD-R technology for transferring one-off data: the equipment to write CD-ROMs has only recently fallen sufficiently in price, and the process is not yet completely reliable.

The main barrier to using all other off-line formats is that none is sufficiently widely used to be known to be readable by the recipient. Telecommunications and wide area networks are increasingly being used, the cost being offset by the immediacy of the data transfer. The Internet and the WWW offer a cheaper and more flexible means of transferring and accessing information, the practical barriers to uptake being performance and security problems.

3.3 Trends and expectations

3.3.1 Networking and Communications

Networking and communications will expand considerably, both in bandwidth and coverage. By 2005 all IT devices will be expected to have some form of digital connection, while the growth in capacity will result in new applications and new patterns of usage.

This has considerable implications for the strategic planning and provision of networking and communications infrastructures. Indeed, the current distinction between (local) networks, longer distance communications, and telecommunications will become increasingly blurred.


Digital infrastructures will need to be seen in a more holistic sense, embracing the cabling and other carriers plus the many servers and other devices that collectively deliver digital services to users.

Technologies that will enable these changes include:

· ATM and Gigabit Ethernet. ATM, which is already being deployed, is expected to be a key technology. Companies such as Sun have announced 622 Mb/s ATM network adapters, which will allow 1 GB to be transferred in under 20 seconds, and speeds of up to 2 Gb/s over fibre have already been predicted. Competing Ethernet technologies will continue to evolve, with the introduction of Gigabit Ethernet pushing the bandwidth to 1 Gb/s. Products are expected to ship in 1997, but the standard is not likely to be ratified until 1998. Gigabit Ethernet may prove to be cheaper than ATM, but ATM will be more reliable and more scalable.

· Computer-Telephony (CT). The convergence of telecommunications and computers is already leading to the emergence of multifunctional CT servers to replace, or to work co-operatively with, existing PBXs. CT servers may contain PC boards to support interactive voice response, speaker verification, voice mail, fax on demand, and fax broadcast. New CT boards with switching-matrix chips make it possible to eliminate the PBX, while de facto hardware and software standards are substantially reducing costs - for example, moves to standardise on Windows NT plus Microsoft's Telephony API (TAPI), which allows users to control CT telephony resources from TAPI-enabled clients. Other standards, such as the Multi-Vendor Integration Protocol (MVIP), will allow PC-based CT servers to replace the PBX in small and medium companies. The future adoption of ATM will remove scaling limitations and provide an integrated voice and data-switching infrastructure. The Enterprise Computer Telephony Forum, a consortium of key CT players, is developing a unified set of hardware standards (H.100) and CT APIs (S.300) (Bob Emmerson, Byte, November 1996).

· Broadband Modems. Existing narrow-band analogue modems will soon start to be displaced by broadband modems. Because these are able to utilise a much wider spectral bandwidth, they raise the theoretical digital capacity of existing carriers by two to three orders of magnitude. Cable modems, which operate over cable TV networks, have a theoretical capacity of 30 Mb/s per channel (shared between the users), while Asymmetric Digital Subscriber Line (ADSL) modems provide a theoretical 9 Mb/s over a phone line. The real performance will be substantially less. Currently available broadband modems provide an Ethernet port, so the computer appears to be connected to an Ethernet LAN. The modem may, however, send Ethernet packets, ATM cells, or proprietary-format packets as a modulated signal over the phone or cable network. As this implies, standards are currently an issue, but this is being addressed - a prerequisite for the broadband modem becoming a commodity product. It is probable that ATM will be standardised upon, allowing the Ethernet interface to be eliminated. Eventually a broadband modem may be integrated onto the motherboard (Tom Halfhill, Byte, September 1996).

The major attraction of broadband modems is that they allow wide area networks to be constructed using existing (cable TV or phone) infrastructures. Because these infrastructures embrace homes and small businesses, the commercial opportunities for the cable owners are considerable. Large-scale pilots have already been conducted and the concept proved. However, there are technical problems linked to the topology and the original role of the cable TV and phone infrastructures, and considerable investment is required to overcome these limitations in the existing cabling. One characteristic that may remain is that the actual performance tends to be asymmetric, with upstream paths being much slower than downstream. However, with an effective performance of about 1.5 Mb/s, the commercial, social and business implications are considerable.

For corporate customers, data-networking providers are anticipating demand for a broad range of services including voice, video, and data services via wireline telephony, personal communications services, video conferencing, cable TV networks, digital near video on demand, and high-speed access to the Internet (Larry Yokell, Byte, September 1996). The longer-term vision appears to be a complete end-to-end ATM solution.
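The bandwidths quoted in this section can be put on a common footing with a simple worked calculation. The figures below use nominal line rates only; protocol overhead, contention on shared channels, and real-world asymmetry are ignored, and the 100 MB file size is illustrative.

```python
# Nominal time to move a 100 MB engineering model over the links
# discussed in this section (raw line rates only).
FILE_MBITS = 100 * 8  # 100 MB expressed in megabits (decimal)

rates_mbps = {
    "ADSL (9 Mb/s theoretical)": 9,
    "cable modem channel (30 Mb/s, shared)": 30,
    "ATM adapter (155 Mb/s)": 155,
    "ATM adapter (622 Mb/s)": 622,
    "Gigabit Ethernet (1000 Mb/s)": 1000,
}

for name, mbps in rates_mbps.items():
    print(f"{name:40s} {FILE_MBITS / mbps:8.1f} s")
```

At 622 Mb/s the 100 MB model takes about 1.3 s, and a full 1 GB about 13 s, consistent with the "1 GB in under 20 seconds" figure quoted above.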

Parallel progress will be made in wireless technologies, although a major limiting factor will be the future allocation of radio frequency bands. Key technologies include:

· Mobile Phones. An emerging generation of advanced digital mobile phones, which use the Global System for Mobile Communications (GSM), is expected to become a mass-market product by 2000. These integrate additional functions such as email, communications with a fax machine, and Internet access.

· Satellite Broadcasting. PC users in France who subscribe to the Canal+ digital pay-TV channel on the Astra satellite can already directly download software and other material at 400 Kb/s. The set-top box is linked to the PC, and requests for the required material are sent via a 2.4 Kb/s telephone link. The service is due to be expanded to include Internet access and the option of faster telephone links. Similar competing services are planned. Uptake will be boosted after 1998, when a new generation of satellites will allow bi-directional services via the satellite (Adele Hars, Byte, November 1996). It is confidently predicted that by 2005 very fast two-way communications will be possible via large constellations of satellites spanning the globe, although doubts have been raised about the growing threat of damage from space debris. For example, two billionaires (Bill Gates and a pioneer of cellular phones) recently signed a $9 billion contract with Boeing to co-ordinate the building of Teledesic, a network of 288 low-orbit communications satellites. Their primary purpose will be to provide high-speed (2 Mb/s) two-way Internet connections where land-based fibre optics are not available, although they could also be used to carry telephone calls. The projected starting date for the service is 2002. Although Teledesic is the most ambitious, a substantial number of other companies are planning similar systems based on large numbers of communications satellites.

· Terrestrial broadband broadcasting. Other wireless options include two-way ground-based digital broadcasting, such as local multipoint distribution services (LMDS), which could deliver bandwidths of 10 Mb/s or more.

Similar progress will be made with wireless LAN technology. A credit-card-sized device offering 1.6 Mb/s Ethernet at a range of up to 300 m has already been announced. However, it is expected that the cost/performance will remain less attractive than that of cable-based networks.
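The asymmetry noted above for both broadband modems and the Canal+/Astra satellite service dominates interactive use, as a rough calculation with the quoted nominal rates shows. The file sizes below are hypothetical.

```python
# Effect of link asymmetry, using the nominal rates of the satellite
# service described above: 400 Kb/s down, 2.4 Kb/s up via telephone.
DOWN_KBPS, UP_KBPS = 400.0, 2.4

def transfer_seconds(kilobytes, kbps):
    """Seconds to move `kilobytes` of data at a nominal rate of
    `kbps` kilobits per second."""
    return kilobytes * 8 / kbps

up = transfer_seconds(5, UP_KBPS)             # a 5 KB request upstream
down = transfer_seconds(2 * 1024, DOWN_KBPS)  # a 2 MB file downstream
print(f"upload {up:.1f} s, download {down:.1f} s")
```

Even a small upstream request ties up the slow return path for many seconds, which is why two-way services on the same carrier are awaited with such interest.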

3.3.2 The Internet

Since no one really controls the web, predicting Internet trends is a difficult exercise. Development and improvement of the TCP/IP protocols is sanctioned by the Internet Society (ISOC), a non-profit professional organisation run by its members (both individuals and organisations in various communities, including academic, scientific, and engineering). ISOC is dedicated to encouraging co-operation among computer networks to enable a global research communications infrastructure. The society sponsors several groups that determine the needs of the Internet and propose solutions to meet them. One of these groups is the Internet Architecture Board (IAB), which provides direction to two principal task forces: the Internet Engineering Task Force (IETF) and the Internet Research Task Force (IRTF). The IETF is concerned with the operational and technical issues of the Internet, and the IRTF is involved in research and development matters.

Several projects focus on the commercial use of the Internet, with an emphasis on reliability, security and ease of use:

· CommerceNet: ftp://ftp.commerce.net/; http://www.commerce.net

· The Commercial Internet Exchange (CIX): [email protected]: ftp|gopher://cix.org/

· Enterprise Integration Networking (EINet): [email protected]

The World Wide Web Consortium (W3C) exists to realise the full potential of the Web. The W3C is an industry consortium that seeks to promote standards for the evolution of the Web and interoperability between WWW products by producing specifications and reference software. Although industrial members fund W3C, it is vendor-neutral, and its products are freely available to all.

The Consortium attempts to find common specifications for the Web so that, through dramatic and rapid evolution, many organisations can work in their own fields to exploit and build on top of the global information space that is the web. The technologies involved in the web are changing very rapidly, so the Consortium must have both efficiency and flexibility in its process in order to respond to the needs of the community in a timely manner.

While the intrinsic value of the Internet, and of the WWW, is founded on their compliance with global standards, commercial products are increasingly influencing industrial usage. The current “browser war” between Netscape and Microsoft illustrates this.

Until recently, Netscape’s Navigator was the undisputed market leader, with an estimated 80% of the browser market. However, as a consequence of aggressive promotion by Microsoft, Internet Explorer 3.0 has rapidly increased its market share to approaching 30%. It has been predicted that the market share of Navigator will fall below 40% by the end of 1997. These striking figures are partly explained by the fact that Microsoft is giving its software away free. What they illustrate, however, is the critical importance to Microsoft of controlling the browser market. This is further reflected in Microsoft’s plans to merge the Windows Explorer with the Internet Explorer, thus providing a unified Internet-centric view of the desktop.

The Netscape and Microsoft browsers both currently provide innovative features, including their own (incompatible) extensions to the current agreed version of HTML. One consequence is that W3C may find itself having to ratify a de facto standard.

Underlying the browser war is the current struggle between Microsoft’s ActiveX (see 4.1.5) and Sun’s Java (see 4.1.6) to be the dominant technology to support the downloading of software components over the Internet. Both browsers include:

· a Java Virtual Machine, within which to execute downloaded Java “applets”;

· support for running downloaded ActiveX components.

Sun’s strategy is to encourage Java developers to write pure (i.e. Windows independent) applets, while that of Microsoft is to encourage the use of ActiveX - both in conjunction with, and independent of, Java. The industrial exploitation of such down-loadable software architectures is still in its infancy. Legitimate concerns remain about the security implications of downloading executable code, with Microsoft’s security model currently looking the more vulnerable. There are also concerns that web browsers may become increasingly partisan, and not support both technologies in full.
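
The download-and-execute model that both applets and ActiveX components implement, and the security question it raises, can be sketched as follows. This is an illustrative sketch only, written in Python rather than in either of the technologies named above; the fetch step and all names are invented, with a string standing in for code transferred over the network.

```python
# Sketch of the download-and-execute model behind applets and ActiveX
# components. A string stands in for code fetched over the network;
# all names are invented for illustration.

def fetch_component(url):
    """Stand-in for downloading a software component over the network."""
    # A real browser would issue an HTTP request to the given URL here.
    return 'def run():\n    return "component executed on " + host'

def execute_component(source, host):
    """Execute downloaded code in a deliberately minimal namespace.

    The security concerns discussed above come down to exactly what
    this namespace exposes to the foreign code."""
    sandbox = {"host": host, "__builtins__": {}}
    exec(source, sandbox)
    return sandbox["run"]()

result = execute_component(fetch_component("http://example.org/applet"),
                           "client-pc")
print(result)
```

The Java security model constrains the applet in a comparable (though far more thorough) way through the virtual machine, whereas ActiveX components run as native code and rely instead on code signing.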

Irrespective of which technologies prove to be the most successful, the important outcome will be an increasingly mature capability for developers, and for end-users, to create applications that are networked via intranets (see 3.3.3) and the Internet, and which are accessed via web browsers.

The extensible browser concept is already a reality in the form of current “plug-in” viewers. From an LSE perspective, one of the more interesting is based on the Virtual Reality Modelling Language (VRML). This viewer allows 3D models that are defined in VRML to be down-loaded from the web, and viewed as a non-immersed virtual reality model under the control of the user. VRML 2.0 provides sensors and allows 3D animation. Such plug-in viewers will continue to increase in the utility they provide.

There is also a parallel to the browser war on the server side of the web, both in terms of tools for the authoring of the material and the provision of efficient server software. Rapid improvements in the quality and functionality of such software are the outcome. As an illustration, it is already commonplace to be able to link the content of web pages to a live database - such that the pages reflect the current state of the database. Similarly, it is also increasingly easy to provide web pages that provide users with an active querying capability onto a database via a standard browser. Importantly, such developments will continue to reduce the effort needed to establish, and to maintain, high quality information on web servers located both on the Internet and on intranets.
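
The database-linked pages described above can be sketched as a server-side script that re-runs a query and emits HTML on each request, so the page always reflects the current state of the database. This is a minimal illustration only: an in-memory SQLite database and invented table names stand in for a live project or product database.

```python
# Sketch of a web page generated from a live database: each request
# re-runs the query, so the page reflects the current database state.
import sqlite3

def make_demo_db():
    """Invented stand-in for a live suppliers' product database."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE products (name TEXT, price REAL)")
    db.executemany("INSERT INTO products VALUES (?, ?)",
                   [("Steel beam", 120.0), ("Cladding panel", 45.5)])
    return db

def render_page(db):
    """Build the HTML a server-side script would return for one request."""
    rows = db.execute("SELECT name, price FROM products ORDER BY name")
    items = "".join("<li>%s: %.2f ECU</li>" % row for row in rows)
    return "<html><body><ul>%s</ul></body></html>" % items

page = render_page(make_demo_db())
print(page)
```

An active querying capability is the same mechanism in reverse: the user’s form input is substituted into the query before it is run.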

It has been estimated that 500 million web pages will exist by the end of 1997. Thus a key technology in the practical use of the web to locate particular information is the so-called “web-search” server. These servers each maintain a large index of web pages, which is updated by “web-crawlers”. This database is queried by a user via a browser (a particular current example of the technology described above), and the search engine returns a list of relevant “hits”. Such servers may be publicly accessible (frequently funded by on-screen advertising), a commercial service, or increasingly in the future, part of a company’s intranet infrastructure.

Clearly the quality of the search engine is critical to the successful and early location of the required information. Based on research by Rank Xerox, the first generation of engines that incorporate linguistic capabilities is starting to emerge. These will provide:

· the ability to analyse a user’s query sentence;

· presentation of the hits as a manipulatable two dimensional map;

· computer generated summaries of these documents;

· agents which can email the user if relevant new pages subsequently appear.

With the prospect of machine language translation to follow, it is clear that search engines will be considerably more sophisticated by 2005. Other technologies are also being developed that can automatically “push” appropriate information to users, either from the Internet or from intranets. Additionally, browsers themselves will progressively acquire new capabilities to assist the user in locating information and subsequently managing that information.
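
At the core of such a web-search server is an inverted index mapping each word to the pages that contain it; the crawler builds the index, and each user query is answered from it. The sketch below illustrates only this core (the page texts are invented, and crawling, ranking, and the linguistic capabilities listed above are omitted):

```python
# Sketch of the inverted index at the core of a web-search server:
# a word -> set-of-pages map built by the crawler, queried per request.

pages = {  # invented stand-ins for crawled web pages
    "p1": "virtual reality modelling language for 3d models",
    "p2": "java applets and activex components on the web",
    "p3": "searching the web with linguistic query analysis",
}

def build_index(pages):
    """Build the word -> pages map (the crawler's job)."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

def search(index, query):
    """Return the pages containing every query word (an AND query)."""
    hits = [index.get(word, set()) for word in query.lower().split()]
    return sorted(set.intersection(*hits)) if hits else []

index = build_index(pages)
print(search(index, "the web"))
```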

Several ESPRIT projects specifically aim to explore or develop Internet resources, for example:

· GENIAL: GEN (Global Engineering Network) Intelligent Access Libraries: GEN is seen as an information network opening up world-wide markets for users and suppliers of European engineering products and services. The vision fosters a working culture for large-scale collaborative engineering enabling enterprises to become "agile". The objective of the GENIAL project is to provide a substantial contribution to this vision by establishing a Common Semantic Infrastructure. This enables enterprises from different engineering sectors to combine internal knowledge with engineering knowledge accessed on-line and world-wide via GEN.

· RISETEP: Enterprise Wide Standard Access To STEP Distributed Databases. This project aims to test the feasibility of the sharing of distributed digital mock up units in a heterogeneous environment.

In conclusion, the exponential growth trend of the Internet - with many new users and networks being added each day - makes its future success and utility inevitable.

3.3.3 Intranets

The term intranet has only been in general use for a short time. It is used to refer to an internal company information system which is based on Internet technologies, and is accessed via standard WWW browsers, but is not accessible to people outside that company. Fuelled by the increasing maturity of Internet technology, many companies are turning to intranets as a convenient and cost effective means to improve internal access to information, and to implement group working.

Security considerations are eased by the fact that the intranet is not linked to the Internet, thus allowing more sensitive information to be handled. Intranets can be created which link offices on a global scale, and it seems likely that controlled access to intranets by trusted trading partners would follow.

3.3.4 Other Sources of Digital Information

With the rapid growth of networking and communication technologies, and the expected absence of a universal high capacity replacement for the 1.4 MB diskette, off-line means of distributing digital information are expected to reduce in importance.

The various tape, magnetic and optical removable read-write formats will continue to be used, particularly where secure high volume data transfer is needed. However, unless one of these formats becomes sufficiently widely deployed, the need to ensure the receiver has compatible equipment will remain a practical barrier (except for regular information transfers).

The exception is the CD-ROM, which is already almost a standard fitting on new PCs. CD-ROMs are already very widely used for distributing software, multi-media and similar material. It is expected that the technology to write CD-ROMs directly will soon reach the point that the advantages of an inexpensive, high capacity medium are fully extended to one-off and short-run CD-ROMs. This will further increase the usage of CD-ROMs for transferring, publishing, and archiving information. Looking beyond 2000, it appears likely that the CD-ROM will be progressively replaced in this role by the higher capacity Digital Video Disk (DVD) formats (see 4.3.2).

3.4 Relevance to LSE

By 2005 it is clear that the concept of a universal digital network which spans the globe will be a practical reality, even for mobile users. Cost-performance may still be a restraint on LSE exploitation, more particularly for mobile users and remote sites (where land-based connectivity may not be available). Because of the many transient inter-company relationships implied by multiple project-centred virtual enterprises, and their implied long supply chains, the “open” Internet is expected to feature strongly in the future LSE usage of networks. It is assumed that current concerns about reliability, confidentiality, and security (plus performance) will be addressed.

The opportunities for LSE will include:

· The availability of an integrated global infrastructure which will play a critical role in enabling the efficient internal operation of (international) companies, and their collaborative involvement in virtual enterprises.

· The real-time transfer and sharing of work between offices, allowing advantage to be taken of global time differences to achieve 24 hour working on projects and access to low cost manpower16.

· The use of this infrastructure to gain rapid access to information such as suppliers’ product databases and to support the progressive exchange and sharing of project information leading to the establishment of shared project databases. Similarly, its use to enable improved efficiency both within companies and virtual enterprises through video conferencing and workflow management. Increasingly, solutions will be based on Internet and World Wide Web standards.

· The complementary exploitation of future satellite based Global Positioning Systems (GPS) and either satellite or terrestrial based surveillance, control and data acquisition systems. For example, in the more effective control and management of earthmoving and other heavy plant, and the logistic support of the construction process in general. The same technologies will also offer new opportunities in the remote monitoring of the finished LSE facilities such as transport infrastructures, and major earthworks in vulnerable situations.

· The use of the World Wide Web as a marketing and sales tool, both for LSE suppliers and for major LSE companies.

· The high capacity of DVD disc technology will continue to find a role for publishing and distributing material where networks are inappropriate or unreliable.

16 There are already signs of this within LSE, while the use of globally floating software licences has been reported as a means of achieving maximum utilisation of expensive software licences between offices in different time zones. In the software industry, global access to cheap programming capabilities is already a reality.


4. Delivery Platforms

This section focuses on the nature of the systems, i.e. the hardware and operating system, on which IT solutions are delivered to the LSE industries.

Every commercially viable IT system can be classified as being an example of a particular delivery platform. High volumes are needed to gain economies of scale in the design, manufacturing, marketing, support, and most particularly the programming, of each platform. Delivery platforms must necessarily be directed at the needs of the widest possible range of users.

As this implies, only a small number of different delivery platforms can be viable. These platforms thus generate intense commercial competition, both short-term and strategic, within the IT industry. Which delivery platform(s) become the most widely used depends more on marketing, and other factors, than on technical merit17. It can be inferred that the IT industry will continue to be marked by superior technology which does not succeed commercially. That said, the competitive pressures are such that once a new functionality or capability exists, this is likely to be quickly replicated - at least to some degree - in the leading systems.

Today a single IT delivery platform dominates: Microsoft Windows running on a PC with an Intel type microprocessor. The global influence of Microsoft and Intel is illustrated by the size of their revenues, approximately $9 billion and $20 billion respectively.

In addressing the subject of future delivery platforms, only those aspects which appear to be most significant for the LSE industries will be considered in detail. These inevitably include the operating system and underlying hardware, but also include a broader consideration of distributed objects. Selected emerging technologies that are likely to become part of the delivery platform, such as voice recognition and video conferencing, are also considered. Greater emphasis will be placed on the desktop platform than on the supporting servers.

Because of the influence of commercial factors, this study is necessarily based on forward industry projections rather than a detailed investigation of “blue-skies” research. The former should be reasonably accurate in predicting what will be available by the year 2000, and this gives a baseline for projecting what may be commonplace by the year 2005.

4.1 What exists today

4.1.1 Desktop Hardware

The dominant desktop hardware platform is the PC (Personal Computer) which is based on the Intel x86 microprocessor family. First introduced by IBM in 1981, the PC initially had an 8/16-bit architecture. Since then the PC has evolved very considerably, becoming a full 32-bit system with the introduction of the third generation x86 processor. The sixth generation processor was recently introduced:

x86 Generation   Microprocessor   Introduced

first            8088/86          1978
second           80286            1982
third            80386            1985
fourth           80486            1989
fifth            Pentium          1993
sixth            Pentium Pro      1995

17 This is best illustrated by the success of the IBM PC which, when it was first launched in the early 1980s, was far from being the most technically innovative delivery platform.

The lifetime of an average PC is approximately five years. Thus, although all new systems are now based on Pentium or later processors, many of the installed base of PCs are still 486 processor systems. Currently the proportion of new systems based on the Pentium Pro is increasing rapidly as Intel reduces the price of the Pentium Pro family (to shift the market onto the latest generation of processor). Both the Pentium and the Pentium Pro currently have a maximum clock rate of 200 MHz.

Today18 a typical new PC - with a UK price of about 2,250 ECU - comprises: 166 MHz Pentium processor, PCI bus, 16 MB of EDO memory, 1 GB of EIDE hard disk, 1.4 MB floppy disk, six speed CD-ROM drive, and a 15” screen with a 2 MB video card.

A growing proportion of PCs is described as “multimedia” and includes a sound card and speakers. Microphones are also becoming more common, but very few PCs yet include a video camera. PCs at the top end of the performance spectrum are increasingly using multiple processors (typically two, but increasing) to provide greater processing power via Symmetrical Multi-Processing (SMP).

With faster microprocessors and more hardware integration, the supporting chipsets have become increasingly important in the design and cost effective manufacture of PCs. Intel has established an increasingly dominant position in supplying processor-related components, such as these chipsets, and is also now a major supplier of complete motherboards to many PC manufacturers.

Relatively few PC manufacturers have substantial in-house design capabilities19; most simply buy in and assemble standardised components and sub-assemblies. A major factor in the success of the PC has been the use of standardised sub-systems such as disk drives, power supplies, graphics cards, and monitors. Cost-effective global supply chains for these sub-systems have been a major factor in driving down costs, and making even leading-edge PCs a commodity item. Intense competition has also ensured substantial innovation in the design and performance of these sub-systems.

Although Intel is clearly the dominant supplier of microchips to the industry, there are many other manufacturers - particularly of the less demanding microchips such as random access memory (RAM). In order to ensure more stable supply chains, many manufacturers “second source” (i.e. manufacture under licence) microchips.

The dominance of Intel is best illustrated by considering how few other companies have the in-house capability to design leading-edge x86 compatible processors:

· Cyrix, who produce a competitive and innovative range of x86 compatible processors. For business applications, its latest 6x86 processor running at 150 MHz has the performance of a 200 MHz Pentium.

18 This box should be regarded as containing no more than an indicative snapshot; the content is very volatile.
19 And very few have an in-house capability to manufacture microchips.

· AMD, whose newest but delayed K5 processor is currently only comparable in performance with a 100MHz Pentium, but is considerably cheaper.

· IMS, a small company which claims to have an innovative Pentium Pro class processor under development for release in 1997.

Processors belonging to the x86 family have a traditional Complex Instruction Set (CISC). The only other CISC processor family which is still of significance is the Motorola 680x0 (as originally used in the Apple Macintosh), but its importance is declining steeply.

The subsequently developed Reduced Instruction Set (RISC) processors provide an alternative. A number of competing families exist. Of these, the most important for the “PC type” market are the PowerPC (Motorola, IBM and Apple) and the Alpha (Digital) processor - both of which run Windows NT and other operating systems.

The philosophy behind RISC processors is that they are simpler, but much faster, than the equivalent CISC processor. To date Intel has succeeded in preventing a substantial price-performance gap emerging. The latest PowerPC processors are currently faster than the best Intel x86 processors, and the fastest 64-bit Alpha processors substantially so (but at a significant price premium). The problem for these alternative processors is that none have achieved the sales volumes necessary to be a fully competitive desktop platform.

It is significant that Unix workstations are now based exclusively on RISC microprocessors, and these are now available as 64-bit processors. The most widely used are:

· Sparc (Sun)

· Precision Architecture (HP)

· PowerPC (IBM)

· Alpha (Digital)

· MIPS (Silicon Graphics)

Although the total Unix workstation market is considerably smaller than that for PCs, it is divided among a number of manufacturers and different processors. Characteristics of these workstations are innovative architectures and large high resolution screens with good graphics performance. The spectrum of available processing power is very wide. Workstations at the bottom-end overlap in performance with the PCs, and frequently employ many PC components to keep down costs. At the top-end, a combination of SMP and leading edge microprocessors (typically four, but can be many more), large amounts of memory, and exotic graphics sub-systems deliver many times the performance and capacity of a PC. Historically, workstation vendors have enjoyed wider margins on advertised prices but a recent round of substantial price cuts (probably a response to the growing challenge from PC-based Windows NT platforms) has eroded these.

Today20 a typical new 64-bit Unix workstation with a UK street price of about 16,750 ECU would comprise: 64-bit Ultra Sparc processor, 128 MB of memory (512 MB maximum), 2 GB of hard disk, 20” screen with fast 24-bit graphics and multimedia support.

Multimedia and video capability is the norm, frequently with high-performance support from specialised microchips. Video cameras are quite commonly included with workstations to enable video conferencing.

20 This box should be regarded as containing no more than an indicative snapshot, the content is very volatile


4.1.2 Desktop Operating Systems

The dominant environment encountered on today’s IT delivery platforms is Microsoft’s Windows (which comes in several variants). The number of PCs still running DOS is declining fast, and the proportion of Macintosh users21 is also declining. A minority of LSE users use Unix, both on PC hardware and (more particularly) on Unix workstations.

· Windows 3.1 was launched in 1992 and, together with the 1993 update Windows 3.11 (also known as Windows for Workgroups), is still the most widely used variant of Windows. It is not a complete operating system, but a graphical user interface (GUI) grafted onto the underlying DOS architecture. The first incarnation, Windows 1.0, was launched in 1985 but was not a success. In 1989 Microsoft launched Windows 3.0 which was a limited success. Some seven years after the original launch, Windows 3.1 was finally the right product at the right time, and rapidly attracted substantial support from developers of application software22. It is the resulting depth and breadth of Windows applications software that remains the key to the success of the Windows delivery platform. Although Windows 3.11 is only a 16-bit environment, some 32-bit applications have become available based on the WIN32S subset of the Windows NT API.

The earlier releases of Windows had not been a success, and Microsoft’s DOS user base was under threat from the GUI provided by the Macintosh (and, to a lesser extent, by Unix plus X-windows). Additionally, Windows 3.1 was constrained by the fact that while contemporary PC hardware was 32-bit, the underlying DOS was only a 16-bit environment. To address this limitation, IBM and Microsoft had been collaborating to develop an advanced and completely new 32-bit PC operating system. This partnership collapsed23, the eventual outcome being the launch of two internally quite similar operating systems: OS/224 from IBM and Windows NT (New Technology) from Microsoft.

· Windows NT was launched in 1992 with three key attributes; it provided a familiar GUI like that of Windows 3.1, it ran existing 16-bit Windows software, and it provided a new API for implementing full 32-bit Windows applications. In fact Windows NT is a very different operating system from the original variant of Windows, having more in common with Unix. Windows NT requires a more powerful PC, it is a stable and secure threaded multi-tasking operating system, it has a hardware abstraction layer25, and it supports SMP. Uptake has been steady, but has increased significantly since the introduction of version 3.51 (particularly the server version).

· Windows 95 was launched in 1995 for users of less powerful PCs. It only runs on single processor x86 hardware and is more of a successor to Windows 3.11. Windows 95 is a 32-bit environment, but with some 16-bit code, and is completely independent of DOS. It is a considerable technical improvement over Windows 3.11, provides a new improved style of Windows user interface, and supports plug-and-play. Windows 95 runs DOS software, existing 16-bit Windows software, and 32-bit Windows software26. Uptake has been rapid, particularly in the home and portable markets, but many corporate users are still using Windows 3.11 or are moving direct to Windows NT.

21 Macintoshes are most widely used in particular niche markets (such as desktop publishing).
22 Borland now bundle the Microsoft Foundation Classes with Borland C++, while Borland, Powersoft and Symantec are all adopting the Microsoft-developed Java reference implementation for Windows. This means that Microsoft now either provides, or has a direct input into, all the environments used for developing Windows applications.
23 The commercial success of Windows 3.1 was a significant factor.
24 OS/2 is also a 32-bit threaded multi-tasking PC operating system, but its desktop market share is very small.
25 This means that Windows NT can easily be transferred to another hardware platform. NT is currently available on the following processors: Pentium/Pentium Pro, Alpha, and PowerPC. It was recently announced that no new releases would be issued for the latter processor.

Version 4.0 of Windows NT was launched in September 96 with several important improvements, including the addition of the enhanced Windows 95 user interface. Although NT still lacks plug-and-play, version 4.0 will substantially increase the desktop presence of Windows NT.

A key feature of Microsoft’s Windows environment has been OLE. Originally an acronym for Object Linking and Embedding, OLE enables the creation of compound documents with contents from a variety of sources (text, graphics, video, database queries, and so on). It is now an umbrella term for a large collection of technologies, including that of compound documents.

· OLE provides infrastructure to support application integration, in terms of the document and other components, based on an object based desktop architecture. An OLE software component is reusable code adhering to an external binary standard, but whose implementation is not constrained. In particular, it does not need to be implemented using object technology. Communication with the component takes place through OLE objects, with function calls to the object interfaces. OLE itself provides API functions for access to OLE-implemented components, and for customisation of these components. The interface is complicated and difficult to program. OLE is built on the foundation of the Component Object Model (COM).
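
The interface-based access that OLE prescribes can be sketched as follows: a client never holds the component’s implementation, only interfaces it has asked the component for (QueryInterface in COM terms). The sketch is in Python with invented names purely to show the shape of the interaction; real OLE components are programmed in C or C++ against the binary COM standard.

```python
# Sketch of COM-style component access: the client sees only interfaces,
# obtained by name from the component, never the implementation itself.

class SpreadsheetComponent:
    """Invented component, e.g. a chart embedded in a compound document."""

    class _Render:          # interface implementation, private detail
        def draw(self):
            return "chart drawn in host document"

    class _Persist:         # interface implementation, private detail
        def save(self):
            return "chart state written to compound file"

    def query_interface(self, name):
        """COM-style entry point: hand out an interface object, or fail."""
        table = {"IRender": self._Render, "IPersist": self._Persist}
        if name not in table:
            raise LookupError("E_NOINTERFACE: %s" % name)
        return table[name]()

component = SpreadsheetComponent()
renderer = component.query_interface("IRender")
print(renderer.draw())
```

Because the client depends only on the named interfaces, the implementation behind them can be replaced freely, and need not use object technology at all, exactly as noted above.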

As a standard, Windows has been defined, and is implemented, by one company. In marked contrast, Unix is an operating system that is an “open standard” and has a long history.

· Unix is now formally under the control of The Open Group (formed by merging the Open Software Federation (OSF) and X/Open). The problem for Unix, which is a robust and mature operating system, is that a proliferation of competing dialects remains (such as Solaris, AIX, and SCO). This fragments the Unix market, and applications software must be ported between different implementations. Unix employs the X-windows protocols to drive the graphical user interface, and most versions provide some capability to run Windows applications via emulation software.

For desktop systems, Windows NT now presents a significant threat to Unix. For example, Compaq has created an NT workstation division specifically to sell Pentium Pro based workstations into the Unix workstation market. Also, of particular significance to the LSE CAD sector, AutoDesk recently announced that new versions of AutoCAD would only be available on Windows NT platforms.

4.1.3 Servers

Today the majority of IT delivery platforms are linked to servers; either local servers (via a local area network), and/or remote servers (via a wide area network or the Internet). These servers range from simple PCs to large main-frame computers, while the service(s) provided can include; printing, file storage, application software, computation, databases, communications, World Wide Web etc.

File, application, and print services are most commonly provided to PCs by servers running Novell NetWare, an operating system that was specifically developed for this task. The server hardware is typically PC-based, frequently with large amounts of memory to improve performance, and large capacity disk drives. SMP hardware, with x86 or other microprocessors, is commonly employed.

26 In fact the Windows 95 32-bit API is a (large) sub-set of the Windows NT API, but the majority of 32-bit application software is written to run in both environments.

Running the Unix, Windows NT, or OS/2 operating systems on a similarly wide spectrum of server hardware can also provide equivalent services to PCs (or workstations). Although these operating systems are less efficient than NetWare at file serving, their greater flexibility makes them more suitable for providing other services. For example, World Wide Web servers have traditionally run Unix.

To improve reliability and performance, servers frequently employ specially configured hardware platforms27 that include features such as uninterruptible power supplies, redundancy, in-built diagnostics, hot-swap of faulty components, Redundant Array of Inexpensive Discs (RAID), and internal architectures to maximise throughput.

With a cost now below 400 ECU, 4 GB magnetic hard discs are now commonly used in both desktop and server PCs, with much larger capacity 3.5” discs available. IBM is probably the leading manufacturer of disk drives, and of high capacity disk subsystems which employ RAID technology. The latter have capacities up to multiple terabytes28 and, as well as IBM mainframes, are aimed at servers running Windows NT, NetWare and Unix.

4.1.4 Portable Computers

Low voltage microprocessors, power management, battery technology, flat screens, and small low-power disk drives have improved to the point that portable PCs are not far behind desktop computers in performance or price. However, portables still attract a price premium. They represent a substantial and growing proportion of hardware platforms, and offer the dual advantage of compactness and portability.

Today29 a typical new portable PC with a UK price of about 3,200 ECU would comprise: 133MHz Pentium processor, PCI bus, 8 MB of memory, 800 MB hard disk, 1.4 MB floppy disk, keyboard with integral mouse, 11.3” 800x600 TFT screen, and Windows 95.

Upmarket portables may include up to a 13” screen30 with 1024 by 768 pixels, a 2 GB disk drive, CD-ROM and multimedia capabilities, fax and modem. They may run up to 3.5 hours on Lithium Ion batteries, and have the capacity to easily run demanding operating systems (like Windows NT or Unix).

Apple was an early entrant into the portable market and their Powerbook range has a significant market share. Both these and PC compatible portable computers tend to be of A4 format, with thinner lighter models being more expensive. Smaller “sub-notebook” formats exist, but have not sold well.

Smaller still are the true hand-held computers, with keyboards, as manufactured by companies such as Psion, Sharp, and Hewlett Packard. These are widely used, and can accommodate peripherals such as bar-code readers, but suffer from limited PC compatibility. Computers like the Newton, which have a stylus rather than a keyboard, tend only to be used in niche markets.

Portable digital phones are a complementary and developing technology. They already allow portable computer users to send faxes etc., but are also acquiring screens and keyboards in their own right, enabling them to edit, send, and receive email and faxes.

27 Such platforms are usually built from standard (PC) components.
28 Computing, 20 June 1996.
29 This box should be regarded as containing no more than an indicative snapshot; the content is very volatile.
30 Which has a larger viewing area than a 15” CRT screen.

The personal digital assistant (PDA) market has been significantly altered by Microsoft’s recent release of a new variant of Windows:

· Windows CE is a 32-bit multi-tasking, multi-threaded operating system, which is held in ROM and supports a subset of the 32-bit Windows API. The operating system is portable to a range of microprocessors and includes a Windows 95-style interface, with slimmed-down versions of Explorer, Word, Excel, and Exchange. A key feature is the use of an infra-red link to automatically synchronise files with the user’s desktop PC.

Companies such as Compaq, Casio and NEC have just launched new Windows CE computers with a keyboard, a 240x480 grey-scale touch screen, and 2 MB of RAM, priced from 440 ECU.

4.1.5 Distributed Objects

The importance of the object metaphor at all levels in future IT systems has been recognised for many years. So too has the increasingly distributed nature of those systems. It follows that distributed object technologies will be central to the internal plumbing of future systems. Their role is to provide a standardised means of interworking between discrete objects, or “components”, irrespective of their location on a network.

Unlike traditional client-server architectures, which divide applications rigidly into two parts, distributed objects provide a much more flexible form of “middleware”. They allow systems to be assembled through interactions between many much smaller independent software components. Data and behavioural logic are encapsulated within discrete objects, each of which can act as client and server. These modular software components can have any granularity and distribution, and their independence facilitates the easy modification of applications. Characteristic benefits of distributed objects include:

· plug-and-play between components allowing the rapid assembly of new applications;

· interoperability between components and applications which cross platforms;

· portability of objects between alternative delivery platforms;

· co-existence with “legacy applications” through object wrappers;

· inherent self-management of objects.
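The plug-and-play substitution described above can be sketched in plain Java (all names here are hypothetical, and no real ORB or network is involved): data and behaviour are encapsulated behind an interface, so implementations can be swapped without changing their clients.

```java
// Hypothetical sketch: a component publishes only an interface; clients
// never see its internal state, so any implementation (local object or
// remote proxy) can be plugged in without the client changing.
interface CostEstimator {
    double estimate(double quantity);
}

// One interchangeable implementation; its rate field is encapsulated.
class FlatRateEstimator implements CostEstimator {
    private final double rate;
    FlatRateEstimator(double rate) { this.rate = rate; }
    public double estimate(double quantity) { return rate * quantity; }
}

public class ComponentDemo {
    // The client is written against the interface alone.
    static double runClient(CostEstimator estimator) {
        return estimator.estimate(10.0);
    }
    public static void main(String[] args) {
        System.out.println(runClient(new FlatRateEstimator(2.5))); // 25.0
    }
}
```

Substituting a remote proxy that implements the same interface is exactly the step that technologies such as CORBA and ActiveX standardise across machine and network boundaries.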

There are two main contenders for providing the standards that will govern future distributed object environments, CORBA and ActiveX:

· CORBA, or Common Object Request Broker Architecture, which is governed by the Object Management Group (OMG). Since 1989 this consortium (which now has 500 members) has been specifying the architecture for an open software bus on which object components written by different vendors can interoperate across networks and operating systems. This object bus acts as an Object Request Broker (ORB) which allows clients to invoke methods on remote objects. Bindings to the ORB can be static or dynamic (in which case the interface is defined by an Interface Repository). In late 1994 the OMG released specifications for CORBA 2.0 with substantial extensions including inter-ORB communication services31, global identifiers for components, and modular add-on ORB services including object persistence. The CORBA 2.0 standard is well respected, has links to The Open Group, and may well be formally endorsed by ISO. Commercial-quality CORBA 2.0 implementations are now available and are being deployed.

· ActiveX has similar objectives but very different roots. ActiveX is the new name for Microsoft’s COM/OLE technologies (see 4.1.2), reflecting the fact that these are being shifted from the desktop domain, where they are very widely implemented, to the network. Microsoft’s Distributed Common Object Model (DCOM) allows developers to use communications services rather than building connectivity into each application. Reflecting its different and incremental origins, the technology and terminology of ActiveX differ from CORBA’s, and are currently less complete. However, Microsoft is determined to see the widespread adoption of ActiveX at an enterprise level. This is illustrated by the fact that control of the ActiveX standard was recently transferred from Microsoft to The Open Group. ActiveX is currently integrated with technologies like HTML, TCP/IP and Java, and is available on Windows and Macintosh, with Unix to follow.

31 Both TCP/IP and DCE-based services are defined.

Although there are significant technical differences between CORBA and ActiveX, they provide broadly similar capabilities in relation to distributed objects. The expectation is that they will co-exist for the foreseeable future. Interworking between them has been demonstrated by at least one software vendor, and OMG has approved a specification for bi-directional communication with COM. CORBA is independent of any operating system and available on multiple platforms (although its roots lie closer to the Unix world). ActiveX is a child of Windows, but is being migrated to other platforms.

Given that operating systems will soon become fully object oriented, these technologies have considerable strategic importance. It is impossible to predict their future relative importance, but close integration with the operating system appears to be inevitable. As this implies, ActiveX has a significant advantage if Windows becomes near universal, but it is probably at a significant disadvantage if that is not the case.

4.1.6 Other Emerging Technologies

Three currently emerging technologies that will be influential in the future evolution of the delivery platform are briefly highlighted:

· voice input;

· video conferencing;

· Java and the Network Computer (NC).

The first two are destined to become an integral part of all desktop delivery platforms by 2005, while the last is currently challenging the very concept of a PC:

· Voice Input is rapidly becoming a practical proposition on a standard PC. IBM’s VoiceType 3.0 software can recognise, decode and display speech on the screen at a rate of between 70 and 100 words per minute with 95% accuracy, without the need for special hardware. Most new users do not need to train the software, but words must still be spoken in an over-deliberate way (with a tenth of a second pause between words). Custom voice controls can also be created, both to format word processor input and to control and dictate to other applications. VoiceType is bundled with OS/2, and is available for Windows 3.1 and Windows 95 at a UK cost of 1,170 ECU (110 ECU for an entry-level version). The major word-processing packages are supported. Hardware requirements are modest: a SoundBlaster-compatible card, and a 90 MHz Pentium with 16 MB of memory and 30 MB of hard disk space.

· Video Conferencing has also been maturing on more specialist hardware, and is now becoming a reality on the PC platform. Suitable video cameras are inexpensive, and standards such as H.320 and H.324 allow interoperability between different vendors’ systems32. Since video exceeds the bandwidth of current connections such as ISDN, hardware or software data compression is employed, and there is a trade-off between frame rate and picture quality. Less demanding in bandwidth, unless used in conjunction with video conferencing, and able to run without special hardware, are real-time collaboration tools based on the shared whiteboard concept. Such software complies with the T.120 standard and has already become an accepted extension to the operating system on many platforms. Microsoft has just released NetMeeting, an extension to the Internet Explorer 3.0 browser. NetMeeting comprises an electronic whiteboard, application-sharing tools and Internet phone capabilities (audible but grainy). Connections via modem, IPX and TCP/IP are supported. Built on ActiveX conferencing protocols, NetMeeting has good application-sharing and annotation features. Users can also draw on the shared whiteboard and exchange files.

32 Andrew Davis, Byte, October 1995.

Java and the NC may be regarded as cause and effect:

· Java: At one level Java is simply a new object-oriented language which is similar to C++, but which addresses some of the deficiencies of that language (for example, by providing automatic garbage collection and eliminating error-prone pointers). It was developed by Sun specifically to write portable programs, and applets, which can execute over the Internet. Significantly, Java programs are normally interpreted by a Java Virtual Machine for the target platform, and are run in conjunction with a Java-enabled Web browser. Java enables applications to be created that are both portable and Internet oriented. In the twelve months since it was released by Sun, Java has spread extremely quickly. Java development and run-time environments are now available for all delivery platforms. Issues such as the speed of Java applications are being addressed by the introduction of just-in-time compilers which, when embedded in a browser, convert the Java bytecode instructions into native instructions. Today Java is still a relatively immature technology, one which probably has to mature further before its full impact can be judged.
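A minimal, self-contained sketch (class and method names are illustrative only) of two of the language points above: objects are created with new and reclaimed automatically by the garbage collector, with no pointer arithmetic or explicit deallocation.

```java
// Minimal Java sketch: no header files, no pointers, automatic memory
// management. Each temporary object below is reclaimed by the JVM's
// garbage collector; nothing is ever freed explicitly.
public class GcDemo {
    static int totalLength(String[] words) {
        int length = 0;
        for (String word : words) {
            // In C++ a heap temporary like this would need an explicit delete.
            StringBuilder decorated = new StringBuilder(word).append("!");
            length += decorated.length();
        }
        return length;
    }
    public static void main(String[] args) {
        System.out.println(totalLength(new String[] {"write", "once"})); // 11
    }
}
```

Because the same compiled bytecode runs on any platform's Java Virtual Machine, a class such as this needs no porting between delivery platforms.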

· The NC is a logical consequence for the delivery platform of the expected widespread adoption of Java. It is predicated on the assumption that adequate network bandwidth will exist from the delivery platform to the supporting servers. If only Java applications are to be used, and these can be downloaded from the server on an as-needed basis (as are the user’s files), local disk storage is not needed. Thus the NC, or low-cost “thin client”, only needs a screen, keyboard, processor and memory. The centralisation of software and file storage is also expected to simplify administration and maintenance, thus substantially reducing the overall cost of ownership. The NC concept has the additional attraction for many IT vendors that it is not reliant on Microsoft or Intel.

Microsoft’s merging of DCOM and OLE technologies under the ActiveX label can be seen as a response to the threat posed by Java (and the NC).

4.2 Barriers to LSE uptake

IT delivery platforms, such as the PC and the Unix workstations, have been widely deployed within the LSE industries. The level of uptake varies significantly between LSE sectors and between individual companies.

The barriers to LSE uptake are generic IT deployment problems; none is peculiar to LSE. A check-list of the most significant barriers follows:

· Delivery platforms are still relatively expensive to purchase, and the cost of ownership is considerable. Inevitably, many companies also have a large inventory of older equipment. Few LSE companies have reached the strategically important milestone of equipping every member of staff with an IT platform.

· Supporting networking and infrastructure represent a substantial financial and strategic cost, particularly for the distributed and virtual enterprises found in LSE. Wireless technologies are appropriate for sites, but these remain limited in capacity and can be expensive.

· Standard desktop and portable PC platforms are unsuitable for many on-site situations. Available ruggedised platforms are expensive, limited in scope and capacity, and present compatibility problems.

· Application software, both engineering and administrative, frequently fails to provide the specific capabilities that a particular LSE company may need. However, in many cases it is no longer cost-effective to develop company-specific application software.

· Although systems and applications are becoming easier to use, the need for adequate training and operational experience (including at senior management level) is a limitation.

· Overall productivity gains are limited by a lack of integration and data exchange standards.

4.3 Trends and expectations

4.3.1 Microchip Technology

Some 30 years ago, in the very early days of microchip development, Gordon Moore, the co-founder of Intel, predicted that the number of transistors that could be placed on a single microchip would double every 18 months. This prediction, which has become known as Moore’s Law, has proved to be a remarkably good indicator of the approximate rate of progress.
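As a quick check of what this doubling rule implies, the growth factor over any period can be computed directly (an illustrative sketch; the 18-month doubling period is the figure quoted above):

```java
// Moore's Law as arithmetic: density grows by a factor of 2^(years / 1.5).
public class MooresLaw {
    static double growthFactor(double years, double doublingPeriodYears) {
        return Math.pow(2.0, years / doublingPeriodYears);
    }
    public static void main(String[] args) {
        // The eight years from 1997 to 2005 give roughly a 40-fold increase.
        System.out.printf("1997-2005 growth: about %.0f-fold%n",
                          growthFactor(8.0, 1.5));
    }
}
```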

Intel introduced the first microprocessor, the 4004, some 25 years ago. Microprocessor design and fabrication techniques have since improved to the extent that the current generation product contains over two thousand times more transistors:

The comparison below is drawn from Byte, December 1996:

                 4004          Pentium Pro
Transistors      2,300         5,500,000
Die size         12 mm2        196 mm2
Fabrication      10 microns    0.35 microns
Clock speed      0.75 MHz      300 MHz
MIPS rating      0.06          440
Address space    4 KB          64 GB
Package size     16 pins       387 pins

Similar increases have occurred in the density of memory chips, with DRAM today only costing about 6 ECU per MB to manufacture. Driven by competitive and marketing considerations, the improvements in microchips have been progressive and incremental, enabled by considerable technical innovation.

The critical question is whether this trend alters significantly between now and 2005, by when Moore’s Law suggests that microchip densities will have increased some 40-fold. No significant discontinuity is anticipated during this period33, but relevant considerations include:

· Densities cannot increase indefinitely. Quantum and other effects are already a concern, but these are unlikely to require radically new microchip technologies within the timeframe.

· Leading-edge microchip fabrication will become more capital intensive. This is unlikely to alter the established pattern of rapid price decay for volume products, but will increasingly limit who is able to compete34.

· The trend to multiprocessor systems will continue. Perhaps because of the need for backward software compatibility, processors have failed to increase in power as rapidly as the transistor count suggests.

· System integration will increase. The system chip-count will continue to fall, reducing costs and improving reliability, despite the fact that more tasks will be supported by hardware rather than software.

· Embedded systems. Integration will allow the production of the PC-on-a-chip and other innovative low-cost forms of delivery platform.

It can be inferred that rapid improvement in the underlying fabrication process will continue to empower the IT industry; the bigger question is the commercial direction in which it will move.

4.3.2 Disk Technology

After microchips, magnetic hard disks are probably the second most critical hardware technology. Capacities have been increasing by a factor of ten every five years, while drive performance and reliability are also increasing substantially35. With new technologies like magneto-resistive heads, there is no evidence that these trends will not continue beyond 2005.

There is little chance that the long-lived 1.4 MB diskette will survive; its capacity is simply too small. At present the two most likely candidates to replace it are the 120 MB capacity (Laser Servo) LS-120 format, which looks similar to existing diskettes (which the LS-120 drives will read and write), and the 100 MB capacity Iomega Zip format.

However, both face an increasing challenge from optical (and magneto-optical) disk technologies. CD-ROM discs currently dominate the read-only sector, and the growing write-once sector. Newer formats may extend optical technologies to dominate the whole removable-disk sector. CD-RW, a new rewritable disc based on phase-change technology, is due to be launched early in 1997. In the longer term, the emerging Digital Video Disk (DVD) discs are expected to replace CD discs. With a range of formats, DVD will offer capacities of up to 17 GB for a (read-only) DVD-ROM, 7.8 GB for a (write-once) DVD-R, and 5.2 GB for a (read-write) DVD-RAM disc. These formats should be in general use by 2000, implying that still greater capacities will be available by 2005.

4.3.3 Desktop Hardware

A major problem with today’s PC is that its underlying architecture is old and in need of

33 NEC have predicted that research on massive DRAM technology will result in a 4  Gbit memory chip product by the year 2000.34 The last factory Intel brought into production cost $2.5 billion, and the trend is upward to $10 billion, Andy Grove, Comdex Fall 1996.35 Edmund DeJesus, Byte, June 1996.

Rev. 1.0xlvi

Page 47: Introduction - Loughborough University  · Web viewCustom voice controls can also be created, both to format word processor input and to control and dictate to other applications

Restricted D301 - Task 3000: The Supporting Environment ESPRIT 20876

radical reform. Four new available technologies, have been proposed36 as been key to the evolution of a new more modern PC architecture over the next five years:

· Plug and Play: The ability of the system to configure itself automatically when new cards and peripherals are installed. Advanced plug-and-play is key to reducing the cost of ownership of PCs.

· Universal Serial Bus (USB): With an effective bandwidth of 8 Mb/s, the USB external I/O interface is just beginning to appear on PCs and is expected to displace current interfaces such as RS232 serial ports. It will allow multiple peripherals to be daisy-chained with “hot-plugging”. USB is fast, compact, and cheap to implement.

· Unified Memory Architecture (UMA): Existing graphics adapters incorporate a dedicated frame buffer which can be as much as 4 MB. UMA unifies the frame buffer with the main memory allowing economies to be made, particularly in low cost systems.

· Native Signal Processing: A key question is how future PCs will handle signal-processing tasks such as those presented by audio, video, and telephony. One solution, favoured by Microsoft, is to employ digital signal processors. However, Intel and a number of system vendors favour reducing the chip count and exploiting the power of the main processor to perform native signal processing.

The new PC was seen as being more compact and environmentally sensitive through replacement of the CRT display by an LCD, the use of low-voltage components and advanced power management, and a highly integrated design requiring minimal expansion slots. Networking, video, and audio support are also seen as integral.

Experts, drawn from leading IT companies, were recently asked37:

What will the standard PC offer in five years?

· It will have a thin, rich-colour screen; process spoken commands; integrated voice, video, and screen-sharing with powerful visualisation tools; run about 10 times as fast as today. [Netscape]

· Eight times faster per processor (with symmetric multiprocessor options), four times more memory, flat-panel touch displays, speech recognition, eye tracking, and video-conferencing. [Digital]

· Eight times more RAM, eight times more magnetic storage, four times faster processor speed. Built-in support for teleconferencing and intelligent databases. [Synaptics]

· Multiple video windows for desktop conferencing, intelligent agents for filtering and finding information on intranets and the Internet, rich 3D graphics. All managed remotely. [Intel]

· The office PC will have a speech-enabled human interface with the ability to access, process, and store information all over the world. [Cyrix]

· It will likely operate in the range of 300 to 700 MHz. Its performance will scale linearly with frequency, so a 500 MHz PC will be five times as fast as today’s 100 MHz PCs. [AMD]

What new features will come in ten years?

· Another factor of 10 in performance and storage capacity. People will wear computers tied to a lightweight “mirror shades” style display. [Netscape]

36 Tom Halfhill, Byte, October 1995.
37 The Future of Microcomputing, Byte, December 1996.

· Forty times faster per processor (four-way symmetric multi-processor gains depending on software), 16 times more memory, and who knows what else? [Digital]

· Seamless integration of computing and communication, pattern recognition for speech, and simultaneous language translation for simple sentences. [Synaptics]

· Any answer, from anyone, is science fiction. [Intel]

· The desktop PC will disappear, replaced by a pocket-sized mobile device with 10 times the power of contemporary PCs. [Cyrix]

· The networking infrastructure will begin to catch up to the capability of personal computers. Currently, this infrastructure hobbles their capability. [AMD]

A more concrete pointer to the immediate future is provided by Microsoft, which published a 378-page document, the PC 97 Design Guide, in September 1996.

· The basic PC 97 has a 120 MHz processor, 16 MB of RAM, hardware support for Plug and Play, OnNow (suspended power consumption with rapid resumption), a USB port, support for MPEG-1 playback, and a graphics adapter capable of 800x600x16 resolution.

· The Workstation PC 97 has a 166 MHz processor with an L2 cache of at least 256 KB, 32 MB of RAM with at least 28 MB free for system use, hardware support for Plug and Play, OnNow, a USB port, and a graphics adapter capable of 1024x768x16 resolution.

Unlike a previous attempt, this document was based on extensive consultations with hardware and software companies. It appears to have been a largely successful attempt to draw up cost-effective common specifications, and could become an influential yearly publication.

Early 1997 will see the introduction of MMX, a major extension to the x86 architecture, which adds 8 new registers and 57 new instructions to better support multimedia applications. These will substantially improve the speed of activities such as MPEG compression, wavelet compression, motion compensation, motion estimation, colour space conversion, texture mapping, 2D filtering, matrix multiplication, fast Fourier transforms, discrete cosine transformations and phoneme matching38. The anticipated timetable for MMX enhanced processors39 follows:

Company   Processor   x86 generation   Available    Comments
Intel     P55C        fifth            Q1 1997      improved Pentium
Cyrix     M2          sixth            Q1 1997      MMX compatibility?
AMD       K6          sixth            Q1-2 1997    critical for AMD
Intel     Klamath     sixth            mid 1997?    improved Pro
Intel     Deschutes   sixth            late 1997    should reach 300 MHz
IMS       Meta6000    sixth            late 1997    may never appear!
Intel     Merced      seventh          1998-99?     highly secretive, with HP

38 Tom Halfhill, Byte, July 1996.
39 Tom Halfhill, Byte, November 1996.
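Much of the MMX speed-up comes from applying one simple operation to several packed values at once. As an illustration only (plain scalar Java, hypothetical names), the following sketches the saturating 8-bit addition that a single MMX packed-add instruction performs on eight bytes simultaneously:

```java
// Saturating 8-bit add: the result clamps at 255 instead of wrapping,
// which is what MMX's packed add-with-saturation does for each byte.
public class SaturatingAdd {
    static int addSaturated(int a, int b) {
        return Math.min(a + b, 255);
    }
    public static void main(String[] args) {
        // Brightening an almost-white pixel saturates rather than wrapping.
        System.out.println(addSaturated(250, 20)); // 255, not a wrapped 14
        System.out.println(addSaturated(100, 20)); // 120 as usual
    }
}
```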

The first MMX processors to reach the marketplace will be the Intel P55C series of enhanced Pentiums. A complementary new supporting chipset will further enhance performance, and will include support for EDO and SDRAM memory, USB etc.

This should be followed by the Cyrix M2, an uprated 6x86, and the sixth-generation AMD K6, which will incorporate technology inherited from NexGen. The K6 will be critical to the competitive survival of AMD. Cyrix also has a low-cost 6x86 variant that integrates many other PC functions. Known as the Gx86, this could reduce the price of a full-featured 120 MHz Pentium-class PC to about 700 ECU.

Intel is also due to release MMX enhanced Pentium Pro processors in 1997 together with new more integrated, and substantially cheaper, supporting chipsets. Klamath, the first enhanced processor, will replace the expensive multi-chip module packaging of the existing Pentium Pro with cheaper (but incompatible) processor card technology40. Poor 16-bit performance will be addressed and the clock speed raised to 233 MHz. Deschutes will build on Klamath and feature larger caches with clock speeds of up to 300 MHz. Klamath will be aimed at desktop systems and low-end servers, while Deschutes will initially be aimed at the high-end server market.

With the possibility of IMS also releasing the Meta6000, the micro-architecture of the sixth-generation x86 processors will become increasingly diverse. The clock speed of a processor will become a poorer indicator of relative performance; the supporting chipset and overall system architecture will be increasingly critical. PCs with system buses running above the current Intel maximum of 66 MHz are likely.

By 1998 the sixth-generation processors will be reaching maturity, and attention will focus on the anticipated seventh generation. This will be a major change, extending the x86 architecture to 64 bits and introducing a new instruction set to be known as IA-64. Intel is collaborating with Hewlett Packard in this work. They are designing a new processor, known as Merced, that will run existing 16-bit and 32-bit x86 software without emulation, but which will run new native IA-64 programs considerably faster. There is an additional requirement that the Merced can run software written for Hewlett Packard’s existing Precision Architecture RISC processors.

Beyond that little is known. There has been much speculation that IA-64 will depart radically from the current x86 architecture by adopting very-long-instruction-word (VLIW) technology. This remains a possibility, but it would shift responsibility for optimising from the processor (as in the Pentium and the Pro) to the compiler. Given that the transition to 64-bit PC software will (in any event) take some time, it seems probable that the introduction of full seventh generation processors will be phased. The fruits of Merced are likely to appear in servers some time before they appear in desktop platforms. As ever, much will depend on commercial and marketing considerations.

The typical desktop PC of 2005 is likely to be based on x86 processor(s) and provide at least ten times the performance and capacity of today’s PC. It will be compact, highly integrated, and reliable, with high-bandwidth network, graphics and video capabilities.

40 Klamath is to be marketed as the Pentium II.

4.3.4 Desktop Operating Systems

As hardware platforms have become more powerful, so operating systems have become larger and more complex. Three drivers can be identified:

· The need to utilise the larger hardware resources effectively,

· The use of an increasing proportion of the available power to make the platform as a whole more productive, and

· The migration of emerging services into the operating system to make the implementation of still more complex applications possible.

32-bit hardware enabled all operating systems to incorporate features such as multi-tasking and GUIs, and these are now required of them. Internet services are also now rapidly being absorbed into the operating system. The evolution to substantially more powerful 64-bit hardware by 2005 will similarly oblige operating systems to provide new services such as voice input and control, voice output, video conferencing, and distributed objects.

Version 4.0 of Windows NT introduced DCOM, the distributed object support for Microsoft’s ActiveX technologies, and DCOM will soon be added to Windows 95. Windows is evolving towards becoming a fully object-oriented operating system. This was the expected outcome of Microsoft’s Cairo development project, but a more pragmatic outcome is now anticipated. Cairo technologies are now expected to be incorporated into Windows NT progressively over the next few years. These enhancements are likely to include41 support for plug-and-play, rationalised (common) hardware drivers for NT and Windows 95, continued evolution of the Windows GUI (including a new version of Explorer integrating the current disk and web browsers), a Bookmark API (which will allow the state of the screen and the applications to be saved and restored between sessions), and an object file system. The latter will enable files stored locally, and files on remote sites, to appear as being grouped together in a single folder for ease of location.

Version 5.0 of Windows NT is expected to be released in 1997 together with Windows 97 (Memphis). These will draw the Windows NT and 95 lines closer together with a common kernel and common hardware drivers. The descendants of Windows 95 are likely to have been eclipsed by the NT lineage well before 2005.

Microsoft also faces the difficult transition to a full 64-bit version of Windows. 64-bit addressing is expected soon within Windows NT to better exploit Digital’s Alpha processor, and Microsoft and Intel are collaborating on a full 64-bit version of Windows NT to exploit Intel’s new IA-64 architecture. However, this is an area where Unix has a clear lead. Digital, IBM, and Silicon Graphics already have full 64-bit versions of Unix, HP and SCO are collaborating on Gemini 64 for launch in 1998, and Sun is working towards a full 64-bit version of Solaris. The advantages of 64-bit operating systems include faster number crunching, very large address spaces, and support for more and larger files; however, better compilers and better applications are needed to gain the full benefits. Sun also has Neo, a new CORBA-based network-oriented object model for Solaris. This is an indication of how the Unix vendors will counter, and probably interwork with, DCOM.
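The scale of the "very large address spaces" mentioned above is easily quantified (an illustrative sketch): 32 bits can address 2^32 bytes (4 GB), while 64 bits can address 2^64 bytes, some seventeen billion gigabytes.

```java
// Comparing the 32-bit and 64-bit address spaces, expressed in gigabytes.
public class AddressSpace {
    static double addressableGB(int addressBits) {
        return Math.pow(2.0, addressBits) / (1024.0 * 1024.0 * 1024.0);
    }
    public static void main(String[] args) {
        System.out.printf("32-bit: %.0f GB%n", addressableGB(32)); // 4 GB
        System.out.printf("64-bit: %.0f GB%n", addressableGB(64)); // ~17 billion GB
    }
}
```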

The long-term future of the Macintosh environment is uncertain. Apple’s share of the market fell from 9% to 5% last year. They have abandoned Copland, their much-troubled next-generation operating system for the Macintosh. As an alternative, Apple has just bought NeXT Software with its object-oriented Unix-like operating system.

41 Mark Minasi, Byte, November 1996

Rev. 1.0


Other than the threat posed by the NC, Microsoft also faces the seemingly on-going possibility of a damaging antitrust case by the US Department of Justice.

Accepting that operating systems are becoming fully object oriented, existing distributed object technologies have considerable strategic importance. It is difficult to predict the relative future importance of CORBA and ActiveX. However, as the close integration of these technologies with the operating system appears to be inevitable, the outcome matters less. ActiveX clearly has a significant advantage if the existing market leadership of Windows is further consolidated. Alternatively, CORBA could become the standard inter-platform object bus, requiring native support by all operating systems.

By 2005, the operating system of the typical desktop delivery platform will probably be a derivative of Windows NT. 64-bit and fully object and network oriented, it will integrate a range of new services such as distributed object management, voice input and support for collaborative tools (such as video conferencing, shared whiteboards, and application sharing).

4.3.5 The Impact of Java and the NC

Irrespective of the success of the Network Computer (NC), the Java language already shows signs of replacing C++ as the primary programming language. Should this happen, the long term health of Windows will depend on Microsoft’s continued success in defining the APIs which the majority of developers use. This task will be easier if, as seems probable, Microsoft’s Visual J++ achieves the current popularity of Visual C++. For many application developers it seems that ActiveX controls and Java will coexist. However, Java will increasingly offer developers an alternative. Sun’s JavaBeans, for example, provides an object communications and component architecture that is comparable with, but differs from, Microsoft’s ActiveX and DCOM.

During 1997 Sun will launch their new picoJava family of RISC microprocessors. This will drive down the cost of Java platforms, help define more clearly what the term NC means, and open the possibility of Java being widely used in areas such as smart phones, PDAs, and set-top boxes. Companies including Novell, Northern Telecom, IBM and Intel have endorsed the Java Telephony API specification developed by Sun. This will enable the creation of Java applications that support voice response and call-centre telephone applications. IBM sees Java as the key to its future software business, providing a cross-platform Internet architecture for creating enterprise applications. IBM has invested heavily in the Java Virtual Machine to make it available across its entire line of operating systems including OS/2, AIX, OS/400 and MVS. This will, for the first time, provide application developers with the means of writing software that runs across the full IBM platform range.

A recent report by Bloor Research suggested that, to avoid the high cost of owning and managing networked PCs, the IT industry would turn full-circle and revert to a centralised computing model involving “thin clients”. However, the threat of the NC has already accelerated moves to reduce the costs of PC ownership. Initiatives by Microsoft, Intel, Compaq and others include the NetPC, networked PC management software, and both hardware and operating system support (the SmartPC). These will bring down management costs significantly over a period of years, reducing the cost advantage of the NC. Additionally, good Java-only applications will take time to mature, and current network installations lack the bandwidth to support the full potential of the NC concept.

The longer-term outcome will be that NCs take a proportion (how large a proportion is impossible to predict) of what will be a bigger desktop market. NCs will embrace the lower end of the desktop spectrum, but may well also address niches at other levels - for example, situations where hardware that incorporates a hard disk drive is inappropriate.


4.3.6 Portable Computers

Improved battery life and platforms that are lighter, more compact, and more robust can be confidently predicted. It is too soon to predict how influential Windows CE will be. However, having the operating system in ROM, for rapid booting, and the automatic synchronising of files with the user’s desktop PC are seen as important attributes.

The key development for portable delivery platforms will come when portable digital phone technology is fully and inexpensively integrated, making global wireless networks a practical reality. The strategic importance of the new markets this will open has already been recognised. For example, Psion is to incorporate Oracle Mobile Agents 2.0 in their hand-held computers to give access to corporate databases, intranets, or the Internet.

4.3.7 Servers

Even without the impact of the NC, demand for fast, robust and reliable servers will continue to expand rapidly. Drivers include increased networking of PCs, bigger applications and files, intranets and the Internet, the world wide web, databases (corporate and otherwise), client-server applications, and multimedia. Major high-performance growth areas include video servers.

Many of the hardware trends already described will probably be deployed first within servers, particularly the use of large amounts of memory, leading edge processors and SMP with high numbers of processors.

Novell NetWare will face increasing pressure from the Server edition of NT but, at an enterprise level, Microsoft will find it difficult to challenge Novell’s NetWare Directory Services (NDS). Both Microsoft and Novell are to introduce server clustering capabilities to increase robustness, increase scalability, and to facilitate load balancing.

4.4 Relevance to LSE

In 2005 the delivery platforms the LSE industries use, and the IT concerns they face, will still be similar to those of other IT users. By then, IT will be so much an integral part of LSE operations that the continued provision of appropriate and reliable IT delivery platforms, at competitive cost, will be a major strategic business concern. In response to this challenge, it is possible that many LSE companies will opt to out-source the provision of their IT infrastructure.

While the overall spectrum of delivery platforms may increase, it seems likely that the diversity within any area will continue to be constrained by software considerations. Indeed, the current distinction between PCs and CAD workstations may be further blurred by the general increase in processor and screen performance. CAD graphics standards such as OpenGL are entering mainstream computing, and are being integrated into silicon. By 2005 a standard PC will be quite capable of being used for serious CAD, and for collaborative distance working. This suggests widespread use of Windows derivatives within LSE, an outcome that could cause significant migration or coexistence problems in those sectors that are currently heavily committed to Unix.

The proportion of LSE staff with their own IT facilities seems likely to rise to approaching 100%. This implies substantial deployment of hand-held units on site. It is too soon to predict with confidence the nature of these platforms by 2005, or how they will be deployed. However, the use of some form of radio network, to enable a strong communications role, seems certain. As has been predicted, this will include the use of video to relay images back to office-based staff and for record purposes. The on-site collection of digital data will feature strongly and will include survey and site investigation data, monitoring manpower and plant usage, tracking and usage of materials and components, recording of test results, progress, and quality assurance. The use of hand-held devices for recording as-built information and viewing downloaded construction information is also anticipated.

· The key opportunities that improvements in delivery platforms will present to the LSE industries will be those which result from all employees having their own IT facilities. The expectation is that these platforms will be highly Internet and communications centric and heavily based on the object metaphor.

The implications for LSE of all staff having IT facilities are substantial. It implies that the migration to digital information centred working will be complete, enabling concurrent engineering to be realised and allowing closer control of site operations for more predictable production. Multimedia and video will be the norm on desktop platforms, allowing their use for presentation, training, and co-operative working.


5. Summary

This report has ranged widely in attempting to identify those supporting technologies, both industry specific Product Data Technology standards and universal IT infrastructure related, which will be influential in shaping LSE by 2005.

Rapid development of the universal IT infrastructure will continue, with no significant technical barriers. The market will still decide between competing technologies, and LSE will continue to have little real influence on the overall direction. However, major new opportunities for LSE can be predicted with confidence. Similar opportunities exist in the area of PDT standards but, because the LSE industries can (and should) exert substantial influence, both progress and direction are inherently less predictable in this area.

· Universal Digital Networks: The distributed and transient nature of LSE Projects makes accessible and open global digital infrastructures a prerequisite for the effective application of IT. As network, Internet, and World Wide Web users can confirm, the existing infrastructures are frequently inadequate for current usage. Rapid progress in performance and connectivity at all levels, including mobile users, will deliver substantial improvements. Even so, these may be insufficient to meet the cost-performance expectations of all LSE users.

· Delivery Platforms for All: Current desktop PCs are inexpensive to purchase and have the power for most tasks. Projected developments will considerably increase performance and substantially reduce the cost of ownership. This, together with new lower-cost platforms, will make it cost-effective for all LSE office staff to have a networked computer. Similarly, rapid evolution of small portable computers and digital “communicators” should also make it feasible for all LSE site operatives to have a mobile access point. Experience in the commercial sector suggests that the opportunity for IT to precipitate radical changes in an industry occurs after all participants are equipped.

· Information Centric Working: With these infrastructures in place, PDT can be exploited to the full. The advantages of EDI within the LSE supply chain have already been demonstrated in narrow fields. Overall, the potential for electronic trading within LSE is substantial (for example, by supporting just-in-time delivery). A wider shift from paper-based to digital information centred working is anticipated, providing considerable opportunities for more efficient and effective engineering processes. These lead towards the goal of a formal shared LSE Project Model, although many issues relating to PDT and PDT standards will first need to be resolved. By 2005 digital information will be the norm on LSE projects, although the degree of integration will vary between projects and partners.

· User Friendly Systems: Although the functionality of IT delivery platforms will continue to increase, they will also become easier to use. More powerful hardware plus progress towards the creation of conceptually integrated task-centred environments, enhanced by new capabilities such as voice input, will minimise the learning curve and increase productivity. This is essential if all the participants in an LSE project are to make effective use of IT. Such considerations will be a major force in the continued drive for a unifying metaphor across all IT platforms (be it Windows or Java).

· Distributed Systems: Shifts in the underlying nature of IT systems will better support LSE technical processes. The current transition towards distributed, object-oriented system architectures, with implicit network connectivity, will have reached maturity. This, combined with increased computational power and high bandwidth connectivity, will support a new generation of advanced distributed LSE applications software.


· Support for Collaborative Working: Similarly, support for collaborative work patterns will become an in-built feature of IT systems. This trend is already seen in the conceptual integration of web browsers and local file management utilities - the location of information is becoming transparent to the user. Systems for holding addresses and other contact information are becoming unified, and linked to utilities for diary management, scheduling meetings, etc. Collaboration through Email is already well supported, but will be extended to embrace telephone, video conferencing and shared access to applications and whiteboards. Such in-built support for collaborative working has considerable relevance to LSE, and could be utilised to support both distant and local collaboration.

· GPS and Remote Data Capture: The integration of geographic positioning systems and wireless data capture technologies into IT systems will provide new opportunities for LSE. For example, in the more effective utilisation of earthmoving plant and the monitoring of sensitive facilities.

The most significant barriers are likely to arise where LSE has particular needs that may not be cost-effective for the market to identify and address. The future ability of all segments of the IT industry to respond cost-effectively to more specialist markets will be an important factor.

· Open Industry Standards: The most difficult barrier facing LSE is the relative absence of agreed PDT standards, compounded by overlapping PDT technologies and the global nature of LSE. Open standards are essential if the IT systems used by the participants in LSE Projects are to share digital engineering information. Progress has been made in some sectors, process plant for example, by long term co-ordinated efforts. However, the overall picture is of uncoordinated initiatives and a failure to recognise the true scale of the task. Industrial leadership and commitment are needed to bring about convergent standards.

· Construction Sites: The availability of suitable cost-effective IT delivery platforms for use on site will influence uptake across the whole of LSE. Robustness will clearly be a major concern as will practicality, convenience, and both the reliability and the capability of wireless networks.

· Cultural Change: Large-scale deployment of IT implies the need for a major shift in culture within LSE42, raising education and staff-training issues (at all levels).

· Recurrent Costs: While the cost-benefit of IT continues to improve, the recurrent costs of supporting a substantial IT infrastructure will be a considerable burden for LSE. The cyclic nature of the construction sector underlines this problem.

· Industry Stratification: It is inevitable that LSE companies will embrace IT at different rates, and this may create new barriers.

42 This is also seen as a major opportunity for LSE.


6. Index

A

Abstract Test Suite 12
ActiveX 30; 40; 41; 42; 48; 49
ADSL 28
AIM (Application Interpreted Model) 12; 15; 18
analogue modem 22; 28
AP (Application Protocol) 11; 12; 15; 16; 18; 19
AP221 12; 15
AP225 12; 15
AP227 12; 15
AP230 12; 15
ARM (Application Reference Model) 12; 15; 18
ATM 21; 22; 23; 28; 29

B

broadband modem 28
browser 23; 24; 30; 31; 42

C

C++ 37; 42; 49
CAD 8; 13; 15; 17; 25; 38; 51
CD-R 25; 27
CD-ROM 25; 32; 35; 39; 44
CD-RW 45
CIS 12; 15; 17
CISC 36
Conformance Class 12
CORBA 40; 41; 49
CSMA/CD 21
CT (Computer-Telephony) 28

D

DCOM 41; 42; 48; 49
DMAC 13; 14
DOS 37; 38
DVD (Digital Video Disk) 32; 33; 45

E

EAN 10; 15
EANCOM 10
EDI (Electronic Data Interchange) i; iii; 6; 7; 8; 10; 14; 15; 16; 17; 19; 20; 52
EDIFACT iii; 7; 8; 10; 14; 17
email 23; 24; 25; 26; 27; 31; 40; 53
Ethernet 21; 28; 29
EXPRESS 11; 13; 18; 19
EXPRESS-X 18


F

Fast Ethernet 21
FDDI 21; 22
FTP 23; 24; 27

G

Gigabit Ethernet 28
GOPHER 24
GPS (Geographic Positioning System) 26; 33; 53
GSM 26; 29

H

H.320 42
H.324 42
HTML 17; 24; 30; 41

I

IAI (International Alliance for Interoperability) iii; 12; 13; 15; 16; 17; 18; 19
IFC (Industry Foundation Classes) iii; 6; 12; 13; 15; 16; 18
Intel 34; 35; 36; 42; 43; 44; 45; 46; 47; 49; 50
Internet iii; 6; 15; 16; 22; 23; 24; 25; 26; 27; 29; 30; 31; 32; 33; 38; 42; 46; 48; 49; 50; 51; 52
intranet 31; 32
ISDN 22; 23; 26; 42

J

J++ 49
Java iv; 6; 30; 37; 41; 42; 49; 50; 52

L

LAN 21; 22; 26; 28; 29

M

Microsoft 6; 13; 24; 28; 30; 34; 37; 38; 40; 41; 42; 45; 46; 48; 49; 50
MMX 46; 47
mobile phone 26; 29
MPEG 46
MPOA 22
multimedia 35; 36; 39; 46; 50

N

NC iv; 41; 42; 49; 50
NDS 50
NetPC 50
NetWare 21; 39; 50

O

object oriented 19; 41; 42; 48; 49; 52


OLE iii; 6; 13; 14; 15; 16; 38; 41; 42
OLE for D&M (OLE for Design & Modelling) iii; 13; 14; 15; 16
OMG 40; 41
optical disk 25
ORB 40
OS/2 37; 39; 41; 50

P

Part 106 (of ISO 10303) 18
Part 21 (of ISO 10303) 11; 13
PC 16; 21; 23; 24; 25; 28; 29; 32; 34; 35; 36; 37; 38; 39; 40; 41; 42; 43; 45; 46; 47; 48; 50; 51; 52
PDA 40
PDT (Product Data Technology) iii; 5; 6; 7; 52; 53
portable computer 39; 40; 52
product model iii; 10; 11; 12; 15; 17; 18; 19; 20

R

RAID 39
RAM i; 35; 40; 45; 46
RISC 36; 48; 49

S

satellite 26; 29; 33
SGML 17; 24
SMP 35; 36; 37; 39; 50
STEP (ISO 10303) iii; 6; 10; 11; 12; 13; 15; 16; 17; 18; 19; 32
Switched Ethernet 21; 22

T

TCP/IP 23; 29; 41; 42
telephone 9; 22; 24; 26; 28; 29; 40; 42; 49; 50; 53
TELNET 23; 24; 27
The Open Group 38; 41
thin client 42; 50
Token Ring 21

U

UMA 45
Unix 21; 23; 36; 37; 38; 39; 41; 43; 49; 51
USB 45; 46; 47

V

VAN 14
VC 22
video 21; 23; 29; 33; 34; 35; 36; 38; 41; 42; 45; 48; 49; 51; 53
voice 6; 22; 26; 28; 29; 34; 41; 45; 48; 49; 52
VRML 31

W

WAIS 24
WAN 22; 23
whiteboard 42; 49; 53


Windows 3.1 37; 38; 41
Windows 95 37; 38; 39; 40; 41; 48
Windows CE 40
Windows NT 28; 36; 37; 38; 39; 48; 49
WWW (World Wide Web) iii; 6; 17; 23; 24; 26; 27; 30; 32; 33; 39; 52
