1
Rajeev R. Raje, Andrew M. Olson,
Barrett R. Bryant, Carol C. Burt, Mikhail Auguston
Funded by the DoD and Office of Naval Research under the CIP/SW Program
2
Overview (SERC Showcase, Dec. 6, 7, 2001)
Objective
UniFrame Approach
Parts of UniFrame
UMM
Standards and OMG
QoS
Summary
3
Objective
To create a unified framework (UniFrame) that will allow a seamless integration of heterogeneous and distributed software components
4
Sounds like “Web Services”!
[Diagram: the Web Services triangle. A Service Provider publishes its service to a Service Broker; a Service Requestor finds the service through the Broker and then binds directly to the Provider.]
The Web Services focus is on using Internet protocols for messaging (SOAP/HTTP and XML), and a UDDI directory for locating services defined in WSDL.
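As a rough illustration, the publish/find/bind triangle can be sketched in plain Python. This is an in-memory stand-in: `ServiceBroker`, `publish`, and `find` are hypothetical names, not a real SOAP/UDDI API.

```python
class ServiceBroker:
    """Plays the UDDI-directory role: providers publish, requestors find."""
    def __init__(self):
        self._directory = {}

    def publish(self, name, endpoint):
        # Provider side: advertise a service under a name.
        self._directory[name] = endpoint

    def find(self, name):
        # Requestor side: look up a service; None if unknown.
        return self._directory.get(name)


broker = ServiceBroker()

# Provider publishes; the "endpoint" here is just a callable.
broker.publish("weather", lambda city: f"Forecast for {city}: sunny")

# Requestor finds the service, then binds (calls it directly).
weather = broker.find("weather")
result = weather("Indianapolis")
```

A real deployment would replace the callable with a WSDL-described, SOAP-invoked endpoint; the triangle of roles stays the same.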
5
That leaves a lot of interesting problems…
Need for a component meta-model in support of generative techniques for mappings to existing component models
Need for multi-approach, highly intelligent location services
QoS instrumentation and metrics
Unified approach to using generative techniques with strict QoS requirements
Validation of dynamic system compositions
6
7
UniFrame Approach
UMM • Components, QoS, Infrastructure
GDM • Domain model, Composition/Decomposition Rules, Generative Programming
TLG • Formalism based on two-level grammars
Process • For integration
8
Unified Meta-Model (UMM)
Component • Autonomous and non-uniform
Service and its guarantees • Offered by each component with QoS
Infrastructure • Environment
– Head-hunters
– Internet Component Broker
9
Aspects of Components
Computational • Reflects the tasks carried out by an object
Cooperative • Expected collaborators
Auxiliary • Mobility, Security, Fault-tolerance
10
Service and Guarantees
Each component must specify its QoS and ensure it
QoS Parameter Catalog
• static – design oriented
• dynamic – run-time oriented
QoS of a component/integrated DCS
• based on “event traces”
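A minimal sketch of how a dynamic QoS metric might be derived from an event trace. The trace layout and `mean_latency` are illustrative assumptions, not the UniFrame event-grammar formalism itself.

```python
from collections import defaultdict

def mean_latency(trace):
    """Compute mean request/response latency per component from a trace of
    (timestamp_seconds, component, event_kind) tuples."""
    starts = {}
    totals = defaultdict(float)
    counts = defaultdict(int)
    for ts, component, kind in trace:
        if kind == "request":
            starts[component] = ts
        elif kind == "response":
            totals[component] += ts - starts.pop(component)
            counts[component] += 1
    return {c: totals[c] / counts[c] for c in counts}

# Hypothetical trace for two components, S1 and S2.
trace = [
    (0.00, "S1", "request"), (0.25, "S1", "response"),
    (0.30, "S2", "request"), (0.90, "S2", "response"),
    (1.00, "S1", "request"), (1.35, "S1", "response"),
]
latencies = mean_latency(trace)
```

The same trace could feed other dynamic parameters (throughput, error rate) by changing only the aggregation.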
11
Infrastructure
Head-hunters
Pro-active discovery of new components
Multi-level and multi-approach
Internet Component Broker
Allows heterogeneous components to talk to one another
Analogous to an Object Request Broker
Generated adapter technology
Instrumentation as a part of the architecture
12
Architecture For Component Discovery
[Diagram: head-hunters pro-actively discover components (S1–S8) hosted in heterogeneous registries (an RMI Registry for the RMI model, an ORB for the CORBA model, and an EJB Container for the EJB model) and record them in per-model meta-registries. A client query is routed through the Search Engine / Internet Component Broker (ICB) and a Domain Security Manager before matching components are returned; the numbered steps 1–5 in the original figure trace this flow.]
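The discovery flow on this slide can be sketched as a toy federation. All class and component names here are hypothetical; real head-hunters would query live RMI/CORBA/EJB registries rather than in-memory dictionaries.

```python
class MetaRegistry:
    """Shared store of harvested component descriptions."""
    def __init__(self):
        self.entries = []  # (component_name, model, qos_score) tuples

    def register(self, name, model, qos):
        self.entries.append((name, model, qos))

    def search(self, predicate):
        return [e for e in self.entries if predicate(e)]


class HeadHunter:
    """Harvests one underlying component model into the meta-registry."""
    def __init__(self, model, registry_contents):
        self.model = model
        # Stands in for lookups against an RMI Registry, ORB, or EJB Container.
        self.registry_contents = registry_contents

    def harvest(self, meta):
        for name, qos in self.registry_contents.items():
            meta.register(name, self.model, qos)


meta = MetaRegistry()
HeadHunter("RMI",   {"S1": 0.95, "S2": 0.80}).harvest(meta)
HeadHunter("CORBA", {"S5": 0.99}).harvest(meta)

# A client query spans all component models transparently.
reliable = meta.search(lambda e: e[2] >= 0.90)
```

The point of the sketch is the indirection: the client never touches the per-model registries, only the meta-registry.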
13
Component Development and Deployment Process
[Diagram: a UMM specification, written in TLG and informed by a Domain Knowledge Base, passes through a Translator, Interface Generator (IG), Implementation (Imp), and QoS Validation (QV) to produce a component implementation. If the QoS requirements are satisfied, the component is deployed; if not, the specifications/implementations are refined and the cycle repeats.]
14
System Integration
[Diagram: a query with QoS constraints enters a Query Processor; the System Generator, drawing on the UMM, TLG, the Domain Knowledge Base, GDM generative rules, and head-hunters (HHs), iteratively composes candidate systems and runs experiments. If the query is satisfied, the system is deployed; if not, the query is refined or another option is selected.]
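One way to picture the generate/check/refine loop is the hedged sketch below; the candidate list, constraint, and relaxation strategy are invented for illustration and are not the UniFrame generator itself.

```python
def integrate(candidates, constraint, relax, max_rounds=3):
    """candidates: list of (name, qos_score); constraint: qos -> bool;
    relax: constraint -> weaker constraint (the 'Refine Query' edge)."""
    for _ in range(max_rounds):            # bounded iteration
        for name, qos in candidates:
            if constraint(qos):
                return name                # 'Satisfy? Yes' -> Deploy
        constraint = relax(constraint)     # 'Satisfy? No' -> Refine Query
    return None                            # no option satisfied the query

candidates = [("optA", 0.85), ("optB", 0.92)]

# Start by demanding QoS >= 0.95, then relax the threshold each round.
make_constraint = lambda t: (lambda q: q >= t)
thresholds = iter([0.90, 0.85])
chosen = integrate(candidates, make_constraint(0.95),
                   lambda _old: make_constraint(next(thresholds)))
```

Here the first round fails at 0.95, and the relaxed 0.90 threshold admits the second candidate, mirroring the "Refine Query / Select another option" edges in the figure.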
15
Leverage & Drive OMG work
Infrastructure & Interoperability
CORBA, CORBA Services, CCM, IIOP, COM/CORBA, SOAP/CORBA, CSIv2, Head-hunters, Internet Component Broker
Validation Metrics / Instrumentation
Model Driven Architecture
PIM to PSM mapping, consistent with our meta-model approach
Concept of a QoS Catalog & Interface generation
16
Interoperability & Infrastructure
Internet Component Broker
• Leverage “lessons learned” in the development of ORBs: standard protocol, standard component mappings & portable component adapters
• Leverage SOAP/CORBA and XML valuetypes
• Native protocol?
Headhunter
• Use Naming/Trading, Interface Repository
• Need a standard Implementation Repository?
• Federation, native protocol, API?
17
Model Driven Architecture
We need a standard QoS catalog for Model Driven Architectures
Static – design oriented
Dynamic – runtime oriented
We need to standardize the way that QoS parameters are used to generate interfaces (static QoS)
We need to standardize how QoS parameters are used for generated instrumentation (dynamic QoS)
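A sketch of what generated instrumentation for a dynamic QoS parameter might look like in Python. The decorator-based wrapper is an assumption made for illustration, not a standardized mechanism.

```python
import time
from functools import wraps

def instrument(samples):
    """Generated-style wrapper: record each call's wall-clock latency so
    observed QoS can later be checked against the advertised value."""
    def decorate(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                samples.append(time.perf_counter() - start)
        return wrapper
    return decorate

latencies = []

@instrument(latencies)
def service(x):
    # Stand-in for a component operation with a latency QoS parameter.
    return x * 2

results = [service(i) for i in range(5)]
```

In a generated interface, a tool would emit such wrappers from the QoS catalog entry rather than having the developer write them by hand.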
18
Quality of Service Reference Model
A general categorization of different kinds of QoS, including QoS that are fixed at design time as well as ones that are managed dynamically
Identification of the basic conceptual elements involved in QoS and their mutual relationships. This involves the ability to associate QoS parameters with model elements (specification)
19
Quality of Service Parameters
These are parameters that describe the fundamental aspects of the various specific kinds of QoS, based on the QoS categorization identified in the reference model. This includes but is not limited to the following:
• time-related characteristics (delays, freshness)
• importance-related characteristics (priority, precedence)
• capacity-related characteristics (throughput, capacity)
• integrity-related characteristics (accuracy)
• safety-related characteristics
• availability and reliability characteristics
20
QoS Catalog
Motivation:
Creation of a QoS Catalog for Software Components would help the component developer by:
• Acting as a reference manual for incorporating QoS attributes into the components being developed
• Allowing him to enhance the performance of his component in an iterative fashion by being able to quantify its QoS attributes
• Enabling him to advertise the quality of his components by utilizing the QoS metrics
21
QoS Catalog
It would also help the system developer by:
• Enabling him to specify the QoS requirements of the components that are incorporated into his system
• Allowing him to verify and validate the claims of the component developer
• Allowing him to make an objective comparison of the quality of components having the same functionality
• Empowering him with the means to choose the best-suited components for his system
22
QoS Catalog
The catalog is broadly based upon the software patterns catalog. Each entry follows this format:
Name, Intent, Description, Motivation, Applicability, Model Used, Metrics Used, Error Situation, Aliases, Influencing Factors, Evaluation Procedure, Evaluation Formulae, Result Type, Static/Dynamic, Consequence, Related Parameters, Domain of Usage, Resources
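The entry format above could be captured as a simple data structure. This sketch models only a subset of the fields; the field choices and default values are illustrative, not prescribed by the catalog.

```python
from dataclasses import dataclass, field

@dataclass
class QoSCatalogEntry:
    """A partial rendering of the catalog's entry format."""
    name: str
    intent: str
    description: str = ""
    static: bool = True                      # the Static / Dynamic field
    metrics_used: list = field(default_factory=list)
    aliases: list = field(default_factory=list)
    related_parameters: list = field(default_factory=list)

# Example instance loosely following the Dependability entry shown later.
entry = QoSCatalogEntry(
    name="Dependability",
    intent="Confidence that the component is free from errors",
    metrics_used=["Testability Score", "Dependability Score"],
    aliases=["Maturity"],
)
```

A tool-friendly encoding like this is one way the catalog could drive interface generation and instrumentation rather than remaining a paper reference.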
23
QoS Catalog
Incorporation of methodologies into the catalog is made based on their:
• Reproducibility
• Indicativeness (capability to identify parts of the component which need to be improved)
• Correctness
• Objectivity
• Precision
• Meaningfulness of measure
• Suitability to the component framework
• Error Situation
• Aliases
• Resources
24
QoS Catalog
Name: DEPENDABILITY
Intent: A measure of confidence that the component is free from errors.
Description: Defined as the probability that the component is defect-free.
Motivation:
It allows an evaluation of the degree of Dependability of a given component.
It allows the Dependability of different components to be compared.
It allows for modifications to a component to increase its Dependability.
Applicability:
Can be used in any system which requires its components to offer a specific level of dependability. Using the model, the Dependability of a given component can be calculated before it is incorporated into the system.
Model Used: Dependability model by Jeffrey Voas
Metrics Used: Testability Score, Dependability Score
Testability is a measure of the likelihood that a particular statement in a component will hide a defect during testing.
25
QoS Catalog
Influencing Factors:
1. Degree of testing
2. Fault-hiding ability of the code
3. The likelihood that a statement in a component is executed
4. The likelihood that a mutated statement will infect the component’s state
5. The likelihood that a corrupted state will propagate and cause the component output to be mutated

Evaluation Procedure:
1. Perform Execution Analysis on the component
2. Perform Propagation Analysis on the component
3. Calculate the Testability value of each statement in the component
4. From the Testability scores of each statement of the component, select the lowest score as the Testability score of the component
5. Calculate the Dependability Score of the component
26
QoS Catalog
Evaluation Formulae:
T = E * P
where T = Testability Score, E = Execution Estimate, P = Propagation Estimate
D = 1 - (1 - T)^N
where D = Dependability Score, N = Number of successful tests
Result Type: Floating point value between 0 and 1
Static/Dynamic: Static
Consequence:
1. Greater amounts of testing and greater Testability scores result in greater Dependability
2. Lower amounts of testing and lower Testability scores result in lesser Dependability
3. Doing additional testing can improve a poor score
27
QoS Catalog
4. Less testing is required to reach a fixed Dependability score when Testability scores are higher
Related Parameters: Availability, Error Rate, Stability
Domain of Usage: Domain Independent
Error Situation: Low dependability results in:
1. Unreliable component behavior.2. Improper execution/termination.3. Erroneous results.
Aliases: Maturity, Fault Hiding Ability, Degree of Testing
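The entry's evaluation formulae can be checked with a small worked example; the values of E, P, and N below are invented for illustration.

```python
# Worked example of the catalog's formulae: T = E * P and D = 1 - (1 - T)^N
# (Voas's dependability model, as cited in the entry).

E = 0.8   # execution estimate for the weakest statement (illustrative)
P = 0.5   # propagation estimate for that statement (illustrative)
N = 10    # number of successful tests (illustrative)

T = E * P                  # testability score: 0.4
D = 1 - (1 - T) ** N       # dependability score

# Doubling the amount of successful testing drives D closer to 1,
# consistent with the entry's Consequence list.
D_more = 1 - (1 - T) ** (2 * N)
```

With these inputs, D comes out a little above 0.99, and D_more exceeds D, matching consequences 1 and 3.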
28
Summary of Approach
Address key issues that need to be resolved to assist organizations in managing their distributed software systems
The meta-model allows a seamless integration of heterogeneous components
Formal specifications assist in the automated construction and verification of parts and the whole of a distributed computing system (DCS)
Support a unified approach to iterative development as a pragmatic solution for the software development of DCS
Incorporation and validation of QoS implies the creation of more reliable DCS
Interactions with industry and standards organizations provide practical feedback and enable proliferation of research results in a timely manner
29
Salient Features
A meta-model and a unified approach
QoS-based generative process
Generation based on distributed resources in the form of components – use of HHs
Event grammars for dynamic QoS metrics
Automation (to the extent feasible) for system generation
30
Webpage
http://www.cs.iupui.edu/uniFrame.html