

Autonomous Civil Unmanned Aircraft Systems: Software Quality Assessment and Safety Assurance

    (Web Version 2, September, 2007)

    Keywords:

autonomy, autonomous systems, unmanned aircraft (UA), Unmanned Aircraft System (UAS), System Safety Assessment (SSA), RTCA/DO-178B, software development assurance, software verification, software validation, software quality

    1. Introduction:

This document discusses methods used by the civil aviation community to assure the quality and safety of airborne software. Our objective is to consider these methods within the context of designing, and achieving regulatory acceptance for, autonomy software in safety-critical systems of civil Unmanned Aircraft Systems (UAS). We begin by noting that the methods considered here were developed to support airworthiness certification of conventional manned civil aircraft. In their current form, these methods anticipate neither UAS nor autonomy software.

Section 2: Presents a definition of system autonomy that is suited to the Unmanned Aircraft Systems context, and highlights the fact that system autonomy is realized in software;

Section 3: Outlines the civil aircraft methodology of RTCA/DO-178B for regulator acceptance of airborne software. Developed for manned aircraft, these methods are relevant to UAS;

Section 4: Relates conventional software Verification and Validation (V&V) processes to the DO-178B approach;

Section 5: Raises issues relating to the application of DO-178B to autonomy software, and points to a major revision of DO-178B, soon to be published, possibly as DO-178C.

    2. Definition of System Autonomy

Readers will be forgiven for questioning the need for another definition of autonomy in the UAS context, since the various definitions already in use by the UAS community cover a wide range of meanings. We hope to demonstrate the soundness and utility of our definition.

A consistent feature of all of the system-autonomy definitions is the understanding that autonomy is implemented in software; therefore, at the most fundamental level, system autonomy comprises sets of software programs. The following definition of system autonomy acknowledges that we are dealing with software, and it associates autonomous function with decision-making: specifically, performing decision tasks that human operators and software systems must perform to complete a UAS mission.

    AeroVations Associates 1 August, 2007


(a) Autonomy:

An Unmanned Aircraft system exhibits autonomy when the system software is capable of making, and is entrusted to make, substantial real-time decisions without human involvement or supervision.

In this definition statement:

substantial decisions are decisions that could affect the safe operation of the Unmanned Aircraft and would normally be made by human operators or mission managers; and

Unmanned Aircraft system, depending on context, can mean an individual sub-system of the UAS (for example, the autopilot or the health-monitoring system), or the complete UAS.

This definition is consistent with dictionary meanings of autonomy, which emphasize self-governing independence. In the near-term and medium-term future of civil UAS, such unqualified autonomy will be very scarce, and the term is likely to be more useful as a theoretical extreme than as a practical operational architecture.

    (b) Qualified Autonomy:

(i) Supervised Autonomy: The term Supervised Autonomy is used to depict an autonomous system operation that is observed by a human who has some visual representation of the system situation or state. In a UAS context, the supervision would be safety-related, and would imply potential human control over the UA in the event that a situation threatened the UAS, other aircraft, or persons or property on the ground. If the human supervisor has sufficient control of the vehicle to intervene in the threatening situation (the most likely arrangement), then the supervised autonomy is not autonomous at all! It is more accurately described as independent system control with an off-board (human) safety pilot.

(ii) Shared Decision-Making: Figure 1 depicts an operational UAS architecture that involves decision responsibility shared among aircraft, ground control computers and human system managers. The arrangement is often referred to as shared autonomy, but is more accurately called a shared-decision configuration. Shared decision-making is likely to be the standard configuration for all future civil UAS, perhaps with the exception of the very small and very simple systems.




Conceptually, the degree of autonomy sharing, or sharing of decision-making responsibility, between human operators and system software may range from: the case where human managers make all substantial decisions and the unmanned aircraft responds to the human demands; to the converse case, where the human managers authorize decisions when requested, and the UAS makes all mission decisions and initiates related actions.
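The spectrum described above can be sketched as a small model. This is our illustrative construction, not from the document or any UAS standard; the type and function names are invented for the example:

```python
# Illustrative model of the shared decision-making spectrum described above.
from enum import Enum

class DecisionAuthority(Enum):
    """Who holds authority for a substantial (safety-affecting) decision."""
    HUMAN_DECIDES = 1      # human managers make all substantial decisions
    HUMAN_AUTHORIZES = 2   # software proposes decisions; a human must authorize
    SOFTWARE_DECIDES = 3   # software decides and acts without human involvement

def is_autonomous(authority: DecisionAuthority) -> bool:
    """Under the definition in Section 2, only unsupervised software
    decision-making counts as autonomy."""
    return authority is DecisionAuthority.SOFTWARE_DECIDES
```

On this model, both supervised autonomy and shared decision-making fall short of autonomy in the strict sense, which matches the document's observation that unqualified autonomy will remain a theoretical extreme.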

    3. Airborne Software Safety Assessment

In manned civil aviation, system safety is addressed formally during airworthiness certification, or type certification, of the aircraft. In current airworthiness practice, airborne software is generally not certified as a separate entity, but is considered to be part of the physical system in which it is embedded; nevertheless, acceptance by the Regulator of flight-critical software is a distinct process. In its Advisory Circular 20-115B (Reference 1), the U.S. Federal Aviation Administration recognizes RTCA document DO-178B (Reference 2) as an acceptable means of compliance when securing FAA acceptance of software in airborne systems and equipment. DO-178B addresses Software Considerations in Airborne Systems and Equipment Certification. (See also References 3 and 4.)


[Figure 1: Unmanned Aircraft System, General: Shared Autonomy. The figure shows intelligent software aboard the Unmanned Aircraft; intelligent software and human intelligence (a Mission Manager with a laptop computer) at the Control Station; targets; and the datalink connecting the aircraft and the Control Station.]


We will describe briefly how the guidance of DO-178B is applied, and what this might mean if applied to UAS autonomy software.

    DO-178B Methodology

The stated purpose of RTCA DO-178B is: to provide guidelines for the production of software items for airborne systems and equipment that perform their intended functions with a level of confidence in safety that complies with airworthiness requirements. (Quoted from Reference 2.) When applying these methods to UAS, we would be concerned not only with airborne software, but also with Control Station software that may affect the air vehicle's safety of flight.

We can begin our understanding of DO-178B by considering the failure condition categories and required software levels that provide the foundation for safety assessment of software elements:

(a) First, DO-178B provides Failure Condition Categories, describing the effects of the failure or anomalous behavior of software items using the following scale:

Catastrophic: would prevent continued safe flight and landing; (Note that continued safe operation may be accomplished by an auto-flight system, without human interaction or supervision.)

Hazardous: would cause a large reduction in safety margins and functional capabilities;

Major: would cause a significant reduction in safety margins and functional capabilities;

Minor: would not significantly reduce aircraft safety;

No effect: would not affect capabilities.

(b) Secondly, a Software Level is associated with each of these Failure Condition Categories, specifically:

Software Level A: software whose anomalous behavior would cause or contribute to a failure of system function resulting in a Catastrophic failure condition;

Software Level B: software whose anomalous behavior would cause or contribute to a failure of system function resulting in a Hazardous failure condition;

Software Level C: software whose anomalous behavior would cause or contribute to a failure of system function resulting in a Major failure condition;

Software Level D: software whose anomalous behavior would cause or contribute to a failure of system function resulting in a Minor failure condition; and

Software Level E: software whose anomalous behavior would have no effect on UA operational capability.
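The category-to-level association above is a fixed mapping, and can be sketched directly. The code is our illustration; the names are invented, but the mapping itself follows the text:

```python
# The DO-178B association between failure condition categories and
# software levels, as listed above (identifier names are ours).
FAILURE_CONDITION_TO_SOFTWARE_LEVEL = {
    "Catastrophic": "A",
    "Hazardous":    "B",
    "Major":        "C",
    "Minor":        "D",
    "No effect":    "E",
}

def required_software_level(failure_condition: str) -> str:
    """Return the software level required for a software item whose
    anomalous behavior could produce the given failure condition."""
    return FAILURE_CONDITION_TO_SOFTWARE_LEVEL[failure_condition]
```

For example, `required_software_level("Catastrophic")` yields `"A"`, the most demanding level.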

For a specific airborne software module, therefore, we must be able to answer the question: Would the failure or anomalous behavior of this software item cause or contribute to a failure of system function resulting in a Catastrophic, Hazardous, Major, Minor or No Effect condition? The answer is found by determining the consequences of failure of the physical systems that interface with, and are activated by, the software item under consideration. In the certification process, this determination is the result of a System Safety Assessment, or SSA. (See Reference 5.) The SSA determines, first, the probability per flight hour that a specific physical system might fail or malfunction; and secondly, the probability that failure of that system will affect safety of flight.

(c) Finally, using the SSA results, DO-178B assigns a Required Software Level to the airborne software module under consideration.
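The two-step SSA reasoning can be sketched numerically. This is a hypothetical illustration of the combination, not an SSA procedure from DO-178B or ARP 4761; the probability values, and the 1e-9-per-flight-hour target drawn from manned-transport practice, are assumptions for the example:

```python
# Hypothetical sketch of the SSA's two-step estimate described above.
def safety_effect_probability(p_system_failure_per_fh: float,
                              p_failure_affects_safety: float) -> float:
    """Combine the probability per flight hour that the physical system
    fails with the probability that such a failure affects safety of flight."""
    return p_system_failure_per_fh * p_failure_affects_safety

# Example: a system failing at 1e-6 per flight hour, where 1 in 100
# failures affects safety of flight, gives 1e-8 per flight hour.
p = safety_effect_probability(1e-6, 0.01)
meets_catastrophic_target = p <= 1e-9  # illustrative target value only
```

In this example the combined probability (1e-8 per flight hour) would not meet the assumed catastrophic-condition target, so the hazard would have to be mitigated in the system design or the condition categorized accordingly.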

    Software Production and Management

With each airborne software item categorized by Software Level, DO-178B guidance ensures the required quality and safety of the software by demanding disciplined care, structured management and thorough documentation at each stage in the development and use of the software item. The rigor of the objectives and outputs prescribed by DO-178B for the associated processes increases as the Software Level progresses from Level D (Minor) to Level A (Catastrophic), as we would expect.

DO-178B objectives and outputs are specified in ANNEX A of DO-178B, entitled Process Objectives and Outputs by Software Level. This is the heart of DO-178B. For each software level, DO-178B ANNEX A provides tabulated guidance concerning process management and documentation throughout each process of software production and use.

    Implications for Unmanned Aircraft Systems

UAS applications of DO-178B must address anomalous behavior of some Ground Station software items (those that would affect flight safety) as well as airborne software. The extension to Ground Station software will present no fundamentally new challenges.



Furthermore, we believe that, in general, the DO-178B approach to software development assurance can be adapted to UAS software quality and safety assurance. Although some UAS software programming techniques and languages are not anticipated in the current DO-178B, UAS designers and regulators will, in due time, accommodate the new software architectures within DO-178 guidance. (More on this subject in Section 5.)

4. Verification and Validation: Does DO-178B Validate?

Both verification and validation are generally considered essential elements of software quality assessment. In the simplest terms, system designers distinguish verification and validation in the following way: verification answers the question "have we built the system right?", whereas validation answers the question "have we built the right system?"

Clearly, we must be able to affirm both of these "rights".

In a review of the DO-178B processes, although we see many references to verification techniques and processes, validation is nowhere indicated. This would appear, at first glance, to be an anomaly in the guidelines.

Specialists familiar with the development and use of DO-178B point to a validation equivalent that is integral to the guidelines. Specifically, validation is inherently performed through the use and management of system requirements.

To illustrate this point, we note that software developers define two levels of software requirements, defined in DO-178B as:

High-Level Requirements: software requirements that are developed from the system requirements, safety-related requirements and system architecture.

Low-Level Requirements: software requirements derived from high-level requirements and design constraints, from which software code can be directly implemented.

Throughout DO-178B, emphasis is placed on these requirements, to ensure at every stage that they are correct and complete. Clearly, assuring the correctness and completeness of the High-Level system requirements assures that we are building the right system. This is validation by another name!
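The requirements discipline described above rests on traceability: every low-level requirement must derive from a high-level one. A minimal sketch of such a check, entirely our construction (the requirement IDs and function are hypothetical, not DO-178B artifacts):

```python
# Illustrative traceability check: flag low-level requirements whose
# claimed parent is not an existing high-level requirement.
def untraced_requirements(low_level: dict, high_level: set) -> list:
    """low_level maps each low-level requirement ID to the high-level
    requirement ID it claims to derive from; return the IDs whose
    parent is missing from the high-level set."""
    return [rid for rid, parent in low_level.items()
            if parent not in high_level]

high = {"HLR-1", "HLR-2"}                   # from system/safety requirements
low = {"LLR-1": "HLR-1", "LLR-2": "HLR-9"}  # LLR-2 points at a missing parent
assert untraced_requirements(low, high) == ["LLR-2"]
```

An empty result from such a check is one piece of the correctness-and-completeness evidence that, as argued above, serves as validation by another name.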

    5. Applying DO-178B to UAS Autonomy Software

We have painted a rosy picture here: autonomy is an emergent property of what we have called autonomy software. Ostensibly, the quality and safety of UAS autonomy software will be assured by applying the SSA methodology of SAE ARP 4761 and the Software Development Assurance methods of DO-178B. In fact, however, the applicability of DO-178B to some system-autonomy architectures is problematic. Quoting from Reference 6, the Handbook for Object-Oriented Technology in Aviation (October, 2004):

When DO-178B was published in 1992, procedural programming was the predominant technique for organizing and coding computer programs. Consequently, DO-178B provides guidelines for software developed using a functional technique and does not specifically consider software developed using Object-Oriented Technology (OOT).

Reference 6 describes a study that the FAA, NASA and several industrial users of DO-178B undertook to identify concerns associated with assessing Object-Oriented Technologies in airborne software modules. Reference 6 gives extensive insight into the dangers inherent in adopting OOT for airborne systems software development.

This subject is still in an unsettled state, and the future may hold more troubles for the DO-178B Software Development Assurance methodology. Some software practitioners in the Artificial Intelligence (AI) community are generating UAS autonomy software using cognitive modeling to implement autonomous functions. The software in these cases is likely to be programmed using what is called Agent-Oriented Software, or AOS. (See Reference 7.) In a 2006 example, a small UAS was widely touted as having made the world's first truly autonomous UAS flight. The autonomy software in this case was developed using Jack, Intelligent Agent (see Reference 8), a so-called Belief, Desire, Intention, or BDI, cognitive-model architecture. Reference 9 provides a link to the "World First" announcement.
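To make the BDI idea concrete, here is a minimal deliberation loop in the BDI style. This is our own sketch for illustration; it does not reproduce the Jack framework or its API, and all names and the UAS-flavored facts are invented:

```python
# Minimal BDI-style deliberation: beliefs are facts about the world,
# desires are candidate goals with preconditions, and the agent commits
# to (returns) the first goal it believes achievable as its intention.
def deliberate(beliefs: set, desires: list):
    """desires is an ordered list of (goal, preconditions) pairs;
    return the first goal whose preconditions are all believed true,
    or None if no goal is currently achievable."""
    for goal, preconditions in desires:
        if preconditions <= beliefs:  # subset test: all preconditions believed
            return goal
    return None

beliefs = {"fuel_ok", "weather_ok"}
desires = [("survey_site_a", {"fuel_ok", "weather_ok", "notams_clear"}),
           ("return_to_base", {"fuel_ok"})]
# notams_clear is not believed, so the agent falls back to return_to_base
assert deliberate(beliefs, desires) == "return_to_base"
```

Even this toy makes the verification difficulty visible: the behavior depends on a run-time belief set rather than a fixed control path, which is precisely what strains DO-178B-style requirements-based verification.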

Whether this autonomous-mission claim has merit or is wildly exaggerated may be argued; however, the event does draw attention to the fact that system-autonomy projects are using cognitive modeling and Agent-Oriented Software. These models attempt to reproduce specific human characteristics such as beliefs, desires, intentions, reasoning and learning. The question is: Can autonomous system software that is based on a cognitive-modeling architecture be verified in the DO-178B sense? Could Agent-Oriented Software be accepted by airworthiness regulators using DO-178B methodology? These are questions that may be answered in the planned new DO-178C. The subject matter would certainly fit within the following DO-178C objectives, quoted from the website www.rtca.org (see link at Reference 10):

OBJECTIVES OF DO-178C:

(a) To promote safe implementation of aeronautical software;

(b) To provide clear and consistent ties with the systems and safety processes;

(c) To address emerging software trends and technologies;

(d) To address an approach that can change with the technology.



Items (c) and (d) would appear to place UAS and Agent-Oriented Software on the Subcommittee revision agenda. Reference 10 provides a link to the RTCA DO-178C project, where the project-specific revisions and progress are documented. The revision is being undertaken by RTCA Subcommittee SC-205 (jointly with European EUROCAE WG-71).

    6. Closure

To close, we will consider an imaginary UAS autonomy scenario: a UA awakens before dawn at its home field, near but outside the city; the UA checks fuel level, checks date and time, senses local weather and gets a regional forecast; after checking all systems, including sensors, and checking NOTAMs, it decides to carry out one of several possible sensing missions at one of several designated locations.

Following autonomous start-up and taxi, the UA then performs the mission with ATC data-link clearances, returns, lands, taxis and shuts down, all without mission-manager supervision or interaction. Technically, this can be done. Superficially, the operation could be safely performed in controlled airspace on an IFR flight plan, with ATC providing traffic separation and conflict resolution.

But in real life, the UA lacks an important capability, and that lack would keep it on the ground: it cannot adequately see and avoid other aircraft in flight. Conventional pilot-like see-and-avoid capability is a regulatory requirement even under IFR flight rules, and it is still beyond the ken of UAS. Although progress is being made on what is being called sense, detect and avoid, a system that meets a standard acceptable to the Regulators has not yet been achieved.

    Next-generation air traffic technologies (NGATS) may provide a way over this hurdle.

    (See related essay on NGATS.)

Returning to the real world, however, it seems clear that the very thought of an autonomous civil UAS will send shivers up the spines of airworthiness Regulators. As UAS knock on the door for access to unrestricted civil airspace, we believe that regulators will, for quite some time to come, have little time in their schedules for considering flight-critical software that implements autonomy. Civil UAS will incorporate ever-increasing levels of automation throughout the UA systems, but human supervision will remain essential.

    Finis.



    References:

1. Advisory Circular (AC) 20-115B, RTCA, Inc. Document RTCA/DO-178B, U.S. Federal Aviation Administration.

2. DO-178B, Software Considerations in Airborne Systems and Equipment Certification, RTCA, Inc. document, December, 1992.

3. FAA Order 8110.49.

4. The Job Aid, Conducting Software Reviews Prior to Certification, FAA Certification Service, published 16 January, 2004.

5. SAE document ARP 4761, Guidelines and Methods for Conducting the Safety Assessment Process on Civil Airborne Systems and Equipment, SAE International, issued 1996.

6. Handbook for Object-Oriented Technology in Aviation (OOTiA), co-sponsored by the U.S. FAA and NASA, published 26 October, 2004.

7. Agent-Oriented Software Engineering, Nicholas Jennings and Michael Wooldridge, Queen Mary and Westfield College, University of London, London, U.K. (1999).

8. Jack, Intelligent Agent(TM): Summary of an Agent Infrastructure, N. Howden et al., Agent Oriented Software Pty. Ltd., Victoria, Australia.

9. First Flight: True UAV Autonomy at Last. (Link below.)
http://www.agent-software.com/shared/profile/consulting.html

10. RTCA DO-178C: link to the RTCA project for revision of DO-178B.
http://www.rtca.org/comm/Committee.cfm?id=55

