
Decision Support System (DSS) for future Multi-Operational Wireless Ranging and Low Power LIDAR Exploitation of Subterranean Structures (MOWLES)

Omar AbuRealh, Systems Engineering
Robert Collier, Team Lead, Customer Liaison
Joseph Pack, Architect and Reporting

7 December 2017


Contents

1 Introduction
  1.1 Problem Context
  1.2 Tasking Description
2 System Description
  2.1 System Concept of Operations
  2.2 Platform
  2.3 Sensors
  2.4 Software Package
  2.5 Radio
  2.6 Control Unit
  2.7 User and Environmental Factors
3 Solution Methodology
  3.1 DSS Concept of Operations (CONOPS)
  3.2 Verification and Validation
  3.3 Methodology for Designing
4 Tool Design
  4.1 Package Diagram
  4.2 Block Definition Diagram (BDD)
  4.3 Parametric Diagram
5 Decision Support System (DSS) Prototype
  5.1 DSS Concept
  5.2 Tools
  5.3 Inputs
    5.3.1 Input Breakdowns
    5.3.2 Modifying Inputs
  5.4 Calculations
  5.5 Outputs
  5.6 Weighting Schema
  5.7 DSS Use and Operation
    5.7.1 Introduction Tab
    5.7.2 MOWLES Scenarios
    5.7.3 Environmental Scenarios
    5.7.4 Scoring & Weights
    5.7.5 Intermediate Calculations
    5.7.6 Results
    5.7.7 Hidden Enumerations Tab
  5.8 Performing Analysis
6 Findings
7 Recommendations
  7.1 Total System Accuracy
  7.2 Tactical/Operational Objectives for the System
  7.3 Development of Baseline Configurations
8 Summary
Appendix A: DSS Operation Use-Case
Appendix B: Detailed Input Descriptions
Appendix C: Detailed Output Descriptions
Appendix D: Calculations of Measures of Effectiveness


Figures

Figure 1: Strategic offsets with current GMU-ERDC effort highlighted
Figure 2: Notional MOWLES CONOPS
Figure 3: PackBot by iRobot as shown in US Army media handouts (2002)
Figure 4: A photogrammetry sensor such as could be considered for the MOWLES system
Figure 5: Methodology for creation of MOWLES DSS
Figure 6: Methodology for creation of MOWLES DSS
Figure 7: Revised Problem Statement
Figure 8: Subset of MOWLES DSS package diagram showing MOWLES components and external/environmental factors (with parameter definition suppressed)
Figure 9: Subset of MOWLES DSS package diagram showing sample of calculations implemented in the final DSS prototype
Figure 10: Block Definition Diagram for notional MOWLES design and external (contextual) factors (with parameter definition suppressed)
Figure 11: Subset of MOWLES DSS parametric diagram showing sample of calculations implemented in the final DSS prototype
Figure 12: Conceptual graphic demonstrating data flow into and through the DSS
Figure 13: Conceptual graphic demonstrating data flow into and through the DSS
Figure 14: Conceptual graphic demonstrating data flow into and through the DSS
Figure 15: MOEs, listed in the 'values' compartment, as identified in the MBSE model for the MOWLES DSS
Figure 16: Example of scoring and weighting schema implemented in the DSS prototype
Figure 17: Definitions and MBSE alignment of DSS prototype tabs contained within developed spreadsheet tool
Figure 18: Layout of the 'MOWLES Scenarios' tab within the DSS prototype
Figure 19: Layout of the 'Environmental Scenarios' tab within the DSS prototype
Figure 20: Screen capture of the 'Scoring Criteria' tab with sample values displayed
Figure 21: Screen capture of the 'Intermediate Calculations' tab with sample values displayed
Figure 22: Screen capture of the 'Results' tab with sample values displayed
Figure 23: Right-clicking the tab bar in Excel displays a menu which allows the user to 'unhide' the enumerations tab


1 Introduction

Students enrolled in master's programs for Systems Engineering and Operations Research at George Mason University (GMU) must complete a capstone project which pairs the academic community with industry partners to solve real-world problems. In 2017, the US Army Engineer Research and Development Center (ERDC) Geospatial Research Laboratory (GRL) Topographic Engineering Center (TEC) approached GMU with a problem related to ongoing science and technology (S&T) efforts focused on improving the situational awareness of soldiers in urban or enclosed spaces. Given the initial stage of ERDC's current S&T efforts, the challenge posed to the GMU team was whether a tool could be developed to assist in defining what material components were needed to ensure those S&T efforts resulted in an effective system for the warfighter across a myriad of operational environments.

1.1 Problem Context

Understanding the larger national defense picture helps in understanding the context surrounding ERDC's current development effort. The Department of Defense (DOD) strategically strives to achieve what it calls 'offsets', which are essentially advantages it seeks over peer competitors. Since the 1950s, these offsets have evolved into three distinct categories.

The first offset, developed in the 1950s, included the DOD's nuclear arsenal and nuclear triad of land-based missiles, submarine-launched missiles, and strategic bombers. In the days prior to the proliferation of nuclear technologies, this provided the US with a strategic deterrent against aggression from adversaries and near-peer rivals.

The second offset was established in the 1970s following the Vietnam War. During this time, carpet bombing, barrages, and other warfighting techniques designed to win wars of attrition gave way to precision technologies which allowed US forces to dominate numerically superior adversaries. This period saw the rise of phrases like 'force multiplier', indicating that a single unit is capable of the same warfighting impact as several of its competitors.

In 2014, the US began pursuing the third offset, which emphasizes the ability to operate in areas where adversaries are attempting to enforce anti-access and area-denial (A2/AD). A2/AD is achieved through a wide range of adversarial capabilities including electromagnetics, jamming, physical barriers, and improvised explosives. To achieve the third offset, the DOD has engaged in a series of technological advancements to determine how US forces can best operate within hostile A2/AD environments with minimal impact on warfighter safety or operational success.


Figure 1: Strategic offsets with current GMU-ERDC effort highlighted.

ERDC's efforts to develop an unmanned platform capable of surveying tunnels and buildings support the third offset by providing warfighters the ability to survey hidden or obstructed areas using advanced technologies without risking potential loss of human life. A myriad of sensors, platforms, and control systems could be used to support development of such a technology. ERDC's collaboration with GMU focuses on the ability to make an informed decision regarding which technologies would best meet the needs of the warfighter and support the third offset of accessing areas that would otherwise be dangerous due to A2/AD efforts by hostile forces.

1.2 Tasking Description

Members of ERDC articulated that they are currently working on a five-year effort to develop an unmanned system capable of surveying enclosed areas (e.g., buildings, caves, tunnels). Current efforts are focused on evaluating the relative benefit of different system components to optimize the overall system effectiveness delivered to the end user. Due to the complexity and wide breadth of options available to ERDC, some mechanism is needed to assist in rating or ranking different combinations of system components.

The graduate team from GMU proposed developing a decision support system (DSS) which accepts inputs from component measures of performance (MOPs) and determines the overall system measures of effectiveness (MOEs). This would provide the basis for a comparative analysis between system configurations and better inform ERDC's decisions on which components to select for initial system design. To develop the DSS, a list of component MOPs and total system MOEs would need to be developed in conjunction with the team at ERDC. In lieu of a formal statement of expected system capabilities, design, or requirements, the GMU team used model-based systems engineering (MBSE) to develop preliminary object-oriented block diagrams of the notionally proposed system and parametric diagrams to propose the MOP inputs and MOE outputs of the DSS.

To support ongoing research efforts with ERDC, the GMU team provided materials related to the DSS design, all MBSE modeling artifacts, and recommendations for how to best facilitate system design with academia in future phases of development.

2 System Description

The current system pursued by ERDC's S&T efforts is referred to as the Multi-Operational Wireless Ranging and Low Power LIDAR Exploitation of Subterranean Structures (MOWLES) system. ERDC initially defined the system as a sensor suite mounted on some form of ground-based platform and powered by some sort of software suite. In developing the MBSE models for the notional system design, the GMU team extended these initial three components into five separate components: the platform, sensor, software package, radio, and control unit.

2.1 System Concept of Operations

The intent of MOWLES is to be an unmanned system capable of accessing areas that would otherwise be difficult or hazardous for personnel to explore. This supports the concept of the third strategic offset as previously described.

The concept of operations (CONOPS) for MOWLES involves a user, likely attached to a unit and mounted in a vehicle, controlling the MOWLES unit remotely as it traverses a tunnel or other contested area.

Figure 2: Notional MOWLES CONOPS.

The system will need to transmit imagery data back to the user to scan for objects of interest. Several additional features, such as image recognition technologies, may aid the user in identifying potential threats. Whether the expected benefit of such features outweighs the expected increase in processing time is an example of a tradeoff with which the DSS could assist.

2.2 Platform

The platform is an entity (vehicle, robot, person, satellite, etc.) that carries the sensors to a location where the sensors can collect the necessary data. ERDC's current plans have focused on the use of readily available platforms, such as the PackBot from manufacturer iRobot. These platforms are plentiful in DOD inventories, have proven robust in a number of environments, and have already been adapted to host a wide variety of technologies, making development of a MOWLES system less complex from an integration perspective.

Figure 3: PackBot by iRobot as shown in US Army media handouts (2002).1

Current plans are to utilize the platform in a man-in-the-loop fashion, with long-term plans to expand capabilities into autonomous control. To assist in scoping the current GMU effort, this capstone project omits decisions associated with tradeoffs in autonomous control, since this represents a much more advanced field of study and would likely require a more focused effort solely on platform control.

2.3 Sensors

The sensors collect the raw data and transmit that data to a software package. The sensors will need to accomplish relatively unique tasks under sub-optimal environmental conditions (low light, smoke, dust, etc.) to provide the most accurate data of the terrain possible. In discussions with ERDC, the GMU team identified three basic sensors currently being considered for the initial S&T effort: Microsoft's Kinect, Microsoft's Kinect 2, and another photogrammetry option which optically scans an area.

1 "New robots well trained for war: Next generation learns lessons of Afghanistan", Associated Press, January 14, 2003. <http://www.nbcnews.com/id/3078710/ns/technology_and_science-science/t/new-robots-well-trained-war/#.WgcmdmhSxPY> Accessed: 11 November 2017.


Figure 4: A photogrammetry sensor such as could be considered for the MOWLES system.2

Each technology has benefits and challenges to overcome. Many of these tradeoffs are associated with the hardware requirements of each unit. The current DSS effort will focus more on sensor performance, but definition of the integration tradespace for each unit is an area of significance for the overall development effort and could represent follow-on work between ERDC and academia.

2.4 Software Package

The software package is used to process the raw data and provide the information to the soldiers in a manner that allows them to determine the location and nature of a possible threat. Processing includes any modifications to the raw sensor output needed to make it understandable, legible, and accessible to the user, and any image processing required to identify potential objects of interest within the picture. Software features under consideration include image processing, image recognition, and user-interface features which may provide more flexibility in the data presented to the operator but will increase software and user data processing time.

2.5 Radio

The radio will send data to the control unit and receive data on actions and orders. The radio will have to operate in underground conditions that will interfere with normal operations. While the radio is referred to as a singular, monolithic component, it is possible that certain hardware configurations will involve multiple radios on both the transmitter and receiver ends of the system. If this is the case, system MOPs will need to be built into the DSS in such a way as to accept values for both units. However, it is the assumption of the GMU team that most users will be equipped with a legacy unit with which the MOWLES will have to integrate. If this is the case, any latency or data processing delays experienced by the legacy equipment can be merged into delay variables identified for the user.

2 Lievendag, Nick. "Updated: Structure Sensor vs. Intel RealSense SR300 vs Kinect V2". 3DScanExpert, 2 February 2017. <https://3dscanexpert.com/structure-sensor-realsense-sr300-kinect-v2-3d-scanning/> Accessed: 11 November 2017.


2.6 Control Unit

The control unit is the central piece of computing hardware responsible for sending movement commands to the platform and sensor control signals to the sensor. It hosts the software packages as previously described and communicates externally using the available radio. Different hardware components within the control unit may impact the speed at which data is processed or disseminated to the user.

2.7 User and Environmental Factors

Though not components of the MOWLES system, several external factors will impact the system: the user, climate, adversaries, terrain, and the buildings or structures within which the system will have to operate. These factors do not carry measures of performance, but variables associated with how they impact the system will have to be considered when looking at total system effectiveness.

3 Solution Methodology

Developing a solution for the problem faced by ERDC requires identifying several challenges currently confronting the effort.

First, because the MOWLES system is in the initial stages of development, formal requirements engineering has not taken place. This complicates the development of a DSS since the standard of 'suitability' or 'effectiveness' for the total system has not been robustly explored. This drove the GMU team to define notional MOPs for each component and MOEs for the total system. As the MOWLES effort matures, leveraging the MBSE products to determine where changes in MOP/MOE definitions impact the overall decision analysis will become crucial.

Second, while a user community has been notionally identified, it is a wide-ranging community composed of users in many different environments with several different objectives. Within traditional DOD acquisitions, formal development of a system would begin with a Capability Development Document (CDD), which defines at the capability level what the proposed system is to accomplish. This would drive the development community toward a list of measures of effectiveness and eventual top-level requirements. Being an S&T effort, MOWLES is not at the stage where a CDD is mandated; however, development of a draft CDD would be beneficial as it would provide a framework to drive ongoing requirements definition.

Finally, due to unavoidable restrictions on the availability of MBSE tools and government information technology (IT) services, automated SysML tooling will not be widely usable by most of the ERDC community. This limits the amount of machine-to-machine interfacing that can take place using SysML to directly drive a DSS. To mitigate this issue, the GMU team elected to use MBSE to provide a visual design of the logical and computational relationships between MOPs and MOEs. The parametric models generated within the MBSE become the blueprint for the design of a spreadsheet-based tool accessible using common Microsoft Office products.


3.1 DSS Concept of Operations (CONOPS)

The DSS is designed to accept inputs from the user on system MOPs and on environmental factors related to the user and other external variables. The DSS then performs a series of calculations and outputs a number of measures of effectiveness for each combination of components. Across configurations, the DSS also provides the user with a series of rankings which assist in determining which configuration best meets the user's and program's combined needs.

Figure 5: Methodology for creation of MOWLES DSS.

3.2 Verification and Validation

In discussions with ERDC, the GMU team identified several lab-based testing capabilities which will allow an eventual prototype to be operated in a controlled environment. When prototypes have been developed, the system can be placed in this controlled environment and measured to determine how accurately the DSS predicts total system effectiveness.

At this point in development, ERDC does not have a fully integrated and functioning MOWLES prototype, so verification and validation (V&V) cannot be performed on a completed system. A detailed explanation of all equations and logic will be provided to ERDC so the logic can be reviewed and modified prior to delivery of the final DSS and in follow-on efforts.

3.3 Methodology for Designing

The GMU team's methodology centers on three basic thrust areas: client interaction, MBSE, and DSS prototyping.


Figure 6: Methodology for creation of MOWLES DSS.

The GMU team's initial interactions with the client sought to understand their circumstances and gather background details for the overall effort. ERDC explained they were in the initial stages of a multi-year effort and acknowledged that their study objectives were broad and required additional definition. The GMU team provided a problem statement and a revised problem statement to ensure the underlying issue was mutually understood. The problem statement, listed below, provided the necessary motivation to adequately scope this project.

Figure 7: Revised Problem Statement

Once the problem statement was agreed upon, the GMU team continued to define the basic components of the MOWLES system. The team's focus at this stage was to understand the component interactions within the system. The GMU team developed several MBSE products to illustrate their understanding of the MOWLES system. These products were progressively developed in consultation with the ERDC team.


Generation of MBSE products assisted in codifying the GMU team's understanding of ERDC's problem space, the definition of a notional MOWLES component design, and the design of the DSS. This was accomplished using a package diagram, a block definition diagram (BDD), and a parametric diagram. To facilitate interactions with the client, an MBSE reference guide was also created. This reference guide walks the reader through the products created and explains the purpose behind each model. The equations contained within the parametric diagram are built in such a way that the DSS can directly lift equations and references from each equation's "constraint block" and input them directly into the spreadsheet. This approach will also assist follow-on efforts in modifying or updating the DSS by providing a graphic roadmap of the computations contained within the prototype.

The prototype DSS will take inputs as defined in the MBSE models and manipulate them per the logic provided in the parametric diagram. Outputs will be provided for each MOE and a notional weighting schema will be presented to the user in a results tab which allows them to more effectively make decisions based on component configuration types. ERDC expressed a desire to maintain compatibility of the DSS with commonly installed software packages already present on typical government computer systems. This drove the GMU team to develop a spreadsheet-based solution which could be loaded in a commonly available software suite such as Microsoft Excel.

4 Tool Design

The MOWLES DSS was developed using an MBSE approach which included generation of several Systems Modeling Language (SysML) models.

One of the largest challenges faced by the GMU team was the lack of a defined system design or existing documentation on MOWLES design concepts. This drove the GMU team to develop a package diagram to establish what objects exist inside and outside of the system boundary and a block definition diagram (BDD) to establish the relationships between each block.

With objects inside and outside of the system boundary identified, the GMU team had to determine which component-level MOPs and total system MOEs would drive decision making for future system design. These parameters would become the inputs and outputs for the DSS, respectively. To relate inputs to outputs, the GMU team also developed a series of computations which became the blueprint for an executable spreadsheet tool, forming the basis for a functional DSS. These calculations, including all MOP inputs and MOE outputs, were robustly defined in a parametric diagram.


4.1 Package Diagram

The package diagram for the MOWLES DSS includes three basic packages: MOWLES system components, external or environmental factors which impact MOWLES performance, and a series of constraint blocks which represent the equations executed in the DSS tool.

Figure 8: Subset of MOWLES DSS package diagram showing MOWLES components and external/environmental factors (with parameter definition suppressed).

Notations in the package diagram indicate which components were added as a result of GMU's efforts to develop a DSS. As explained previously in this report, these components resulted in a more complete definition of the total system and allowed the GMU team to define more robust inputs than a higher-level abstraction of the notional MOWLES design would have afforded.

The external, or environmental, factors identified were those determined to impact performance of the total MOWLES system. Identification of these factors is a critical part of defining what may cause the operator to experience different system performance than would be expected in a more controlled environment. It should be noted that 'adversaries' were included in the package and subsequent BDD models, but no calculations were made based on actions an adversary may take to negate the system. This omission was made out of a desire to simplify the problem space within the available schedule for the current scope of work. However, it is reasonable to expect that ERDC will eventually need to consider what impact an adversary may have on the system and whether that impact presents a useful tradeoff between possible system configurations.

The calculations package contains a series of constraint blocks, the typical SysML element used to represent calculations.


Figure 9: Subset of MOWLES DSS package diagram showing sample of calculations implemented in the final DSS prototype.

The current set of calculations within the DSS was selected based on a cursory study of what a notional MOWLES would seek to accomplish, feedback from ERDC, and research into basic performance parameters for optical sensors. Follow-on tasking for the client would include (a) verifying the applicability of the calculations presented to their particular problem space and (b) identifying additional calculations which would reveal more about the relative trade-offs between system configurations. Within the MBSE model, editing an equation in the package diagram will automatically change the equation in the parametric model. It is recommended that someone with experience in SysML be responsible for making such changes to ensure that the implementation of constraint blocks remains consistent between the package diagram and the parametric models.

4.2 Block Definition Diagram (BDD)

The BDD expresses components of the DSS in terms of ownership. Lines, also called 'associations', represent relationships between blocks. Lines with black diamonds denote a composite relationship in which the element on the diamond end of the association is composed of the element on the arrow end.


Figure 10: Block Definition Diagram for notional MOWLES design and external (contextual) factors (with parameter definition suppressed).

The BDD for the MOWLES DSS serves two primary purposes. The first is to provide a baseline for the notional system design. Since no conceptual designs were provided to the GMU team, it became necessary to assert how components of MOWLES would interact. To confirm whether these assertions were reasonable, the GMU team used the BDD to facilitate conversations with the clients and to ensure everyone internal to the GMU team had the same system concept in mind. The second purpose of the BDD was to ensure all MBSE models were truly integrated. Each block within the BDD was defined in the package diagram, and the parametric model was generated from the 'DSS Model' block. This places the BDD between the diagram used to define DSS objects and the diagram used to define DSS computations. Without a BDD, the GMU team would risk having parametric computations which reference inputs or outputs not defined in each component's blocks.

4.3 Parametric Diagram

The parametric diagram serves as the blueprint for the MOWLES DSS. Computations executed in the prototype match directly with calculations presented in the parametric model. This model proved valuable in allowing the GMU team to identify what inputs are needed to generate an output and how many intermediate calculations must take place to determine an MOE.

The parametric diagram has the additional benefit of allowing builders of the DSS to define their logical statements graphically before transferring them to a more executable DSS mechanism. This is especially helpful when building a DSS which may contain dozens of intermediate calculations before resulting in a single MOE output. For example, the MOE for 'expected range' requires up to four separate intermediate calculations before a final expected range can be provided. This is much more easily organized graphically than as a string of spreadsheet code.
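To make this chaining concrete, the sketch below mirrors in Python how one MOE can depend on chained intermediate calculations. The formulas, factor names, and numbers are hypothetical placeholders for illustration only; the authoritative calculations are those defined in the parametric diagram's constraint blocks.

```python
# Hypothetical sketch of chained intermediate calculations feeding a
# single MOE, mirroring the structure (not the content) of the
# parametric diagram. All formulas and factors below are placeholders.

def radio_link_range(tx_power_w, tunnel_shape_factor):
    """Intermediate calculation 1: usable radio range, assumed to scale
    with transmit power and an environment-dependent shape factor."""
    return 1000.0 * tx_power_w * tunnel_shape_factor

def platform_endurance_range(battery_wh, draw_w, speed_fps):
    """Intermediate calculation 2: distance traversable on one charge."""
    hours = battery_wh / draw_w
    return hours * 3600.0 * speed_fps

def expected_range(tx_power_w, shape_factor, battery_wh, draw_w, speed_fps):
    """Final MOE: the system ranges only as far as its most
    constraining subsystem allows."""
    return min(radio_link_range(tx_power_w, shape_factor),
               platform_endurance_range(battery_wh, draw_w, speed_fps))

print(expected_range(2.0, 1.8, 360.0, 90.0, 2.0))  # 3600.0, radio-limited
```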


Figure 11: Subset of MOWLES DSS parametric diagram showing sample of calculations implemented in the final DSS prototype.

The parametric model also provides the GMU team the ability to identify calculations that are necessary but not yet defined. Note in the previous figure that the calculation for 'radio range' has not yet been defined. While the DSS prototype contains only completed calculations, the parametric model can identify calculations not yet implemented in the prototype that may nonetheless be significant contributors to a final decision on a MOWLES configuration.

5 Decision Support System (DSS) Prototype

To address the problem statement posed by ERDC, the GMU team developed a DSS prototype which implements several calculations as laid out in the MBSE products previously described. Furthermore, this DSS implements a weighting system which allows the user to compare several different design configurations. The final output of the prototype is a score and ranking for up to three different configurations in three user-defined environments.

5.1 DSS Concept

The DSS implements MBSE as described in the DSS CONOPS in Figure 5. Inputs for the DSS are defined in the BDD, while the computations executed in the DSS are identified in the parametric diagram. As outlined in the BDD, inputs come from a combination of component MOPs and environmental factors. The user will identify up to three different system configurations and input component MOPs. Recall that "components" refers to a notional MOWLES design consisting of a (1) platform, (2) sensor, (3) radio, (4) software package, and (5) control unit. The user will also input environmental factors for up to three different environments. The DSS will then execute several calculations, as modeled in the parametric diagram, and output the respective MOEs for each component configuration in each environment. This means that there will be nine sets of MOEs presented to the user (one set for each component configuration in each of the three environments).

The decision to support up to three different system configurations is based on the component candidates currently considered by ERDC. ERDC has identified a single platform, two computer systems, and three sensors which could potentially meet the needs of the MOWLES user base. Since the largest number of candidates currently considered for a single component is three, the GMU team decided to allow for up to three configurations in the initial DSS delivery. The decision to include up to three different environments was based on discussions with the client, in which the concept of comparing each system configuration in multiple environments was proposed and an initial commitment to analyze system effectiveness across three environments was established. Extending the DSS to include more than three environments or to compare more than three system configurations in parallel represents a basis for future collaboration with academia.

Comparing nine different sets of MOEs is difficult, even when the number of MOEs is relatively low. This is why the GMU team also implemented a user-configurable weighting system. This allows the DSS to generate a 'score' for each configuration for a quick determination of which notional system design performs best in each environment. The definition of 'performs best' is set using the configurable weighting system.
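A minimal sketch of this evaluation pattern follows. The component and environment data, and the body of compute_moes, are invented placeholders; in the DSS the equivalent calculations come from the parametric diagram.

```python
# Sketch of the DSS evaluation pattern: every configuration is evaluated
# in every environment, yielding 3 x 3 = 9 MOE sets. compute_moes stands
# in for the parametric-diagram calculations; its body is a placeholder.

from itertools import product

configs = {"MOWLES with Sensor A": {"sensor_scan_rate": 30.0},
           "MOWLES with Sensor B": {"sensor_scan_rate": 15.0},
           "MOWLES with Sensor C": {"sensor_scan_rate": 60.0}}

environments = {"Dry tunnel": {"dust_factor": 0.9},
                "Dusty tunnel": {"dust_factor": 0.5},
                "Collapsed structure": {"dust_factor": 0.3}}

def compute_moes(mops, env):
    # Placeholder for the real calculation set; one illustrative MOE.
    return {"effective_scan_rate": mops["sensor_scan_rate"] * env["dust_factor"]}

results = {(c, e): compute_moes(mops, env)
           for (c, mops), (e, env) in product(configs.items(), environments.items())}

for (config, env), moes in sorted(results.items()):
    print(f"{config} in {env}: {moes}")
```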

The conceptual graphic, below, provides a visual representation of data flows through the tool.


Figure 12: Conceptual graphic demonstrating data flow into and through the DSS.

5.2 Tools

While the MBSE products were created using a common architecture tool, ERDC expressed a desire to be able to run or execute the DSS using software tools typically available on government laptops. This drove the GMU team to rely heavily on Microsoft Office products, particularly Microsoft Excel. By implementing all input processes, calculations, and scoring in Excel without the use of macros or third-party plugins, the GMU team hopes to maximize the exposure ERDC and other stakeholders are able to achieve with the final product.

It should be noted that freeware exists which permits MBSE products within MagicDraw, the tool used to produce all MBSE products for this effort, to interface directly with Excel or other executable tools. Using this approach, a client without restrictions on software rights on their computer could achieve a much higher degree of integration between the MBSE products and the executable DSS. This would increase the efficiency with which a systems engineer could modify and validate the DSS and would assist the DSS and MOWLES design teams in reaping the benefits of proper MBSE practice. However, due to the limited data and software rights available to ERDC and most government stakeholders, these freeware options are not presently viable for the current effort.

5.3 Inputs

Inputs fall into two categories: MOPs and environmental factors. MOPs are inputs specific to the performance of a particular component, while environmental factors relate to the overall environment experienced by the total system.

It should be noted that a component may possess an MOP which appears similar to an MOE for the total system. For example, a sensor will have a scan rate specified by the hardware contained within the sensor; this is the kind of scan rate one would find on a specification document when purchasing the sensor from a vendor. The total system may also have a metric called "scan rate"; however, the scan rate of the total system is much more complex. The total system scan rate would include environmental factors, the speed of the vehicle, and other parameters which impact the ability of the fully integrated system to perform a sensor scan. During identification of component MOPs, this becomes problematic because it is easy to confuse component-level MOPs with total system MOEs, the latter of which the DSS is responsible for computing.

To mitigate this confusion, MOEs for the total system are contained within a single 'DSS' block in the MBSE's BDD. This block contains 'value' entries which correlate with MOEs for the total system. Each component block contains 'value' entries which correlate with component MOPs. This segregation between MOPs and MOEs is also reinforced in the DSS tool by physically separating MOPs in a tab entirely independent from total system MOEs. This helps eliminate confusion between metrics and provides the context for how each metric applies to a particular system component or to total system effectiveness.
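As an illustration of the MOP/MOE distinction, the sketch below derives a total-system scan rate from a component-level scan rate. The degradation model, penalty terms, and numbers are assumptions made for demonstration; they are not the DSS's actual equations.

```python
# Illustrative only: the component MOP is the vendor-specified scan rate;
# the system MOE folds in platform speed and environmental degradation.
# The degradation model below is an assumption for demonstration.

def system_scan_rate(sensor_scan_rate_hz, platform_speed_fps, visibility_factor):
    """System-level MOE: usable scans per second for the integrated system."""
    # Assume faster motion and poorer visibility both reduce the fraction
    # of sensor scans that yield usable coverage.
    motion_penalty = 1.0 / (1.0 + 0.1 * platform_speed_fps)
    return sensor_scan_rate_hz * visibility_factor * motion_penalty

mop = 30.0  # component MOP, as read from a vendor specification sheet
moe = system_scan_rate(mop, platform_speed_fps=3.0, visibility_factor=0.6)
print(f"Component MOP: {mop} Hz, total-system MOE: {moe:.1f} Hz")  # 13.8 Hz
```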


A detailed description of each input can be found in Appendix B.

5.3.1 Input Breakdowns

MOPs for each component can be found in the MBSE architecture or in the breakdown, below.

Figure 13: Conceptual graphic demonstrating data flow into and through the DSS.

Environmental factors can also be found in the MBSE models or in the breakdown, below.

Figure 14: Conceptual graphic demonstrating data flow into and through the DSS.

5.3.2 Modifying Inputs

Future efforts may determine that additional inputs are necessary to compute critical MOEs. Since the MBSE and DSS prototype are not connected through a machine-to-machine interface, it is possible to change the DSS without modifying the MBSE products. As a matter of practicality, due to the limited availability of systems engineering professionals and access to MBSE software, this may be the route taken by ERDC. However, it is highly advisable to modify, or at least track deviations from, the MBSE products provided. By adding additional inputs into the BDD and re-integrating them into the parametric diagram to demonstrate how they are invoked to compute an MOE, the user base at ERDC is less likely to implement redundant calculations or add highly coupled, or even duplicate, inputs.

The specific process for adding or deleting inputs will change depending on the MBSE software suite chosen. Using MagicDraw, inputs can be deleted by removing the parameter from the tool's containment tree or by highlighting the graphic box representing the input and pressing "CTRL + D". This removes every instance of that input throughout the model. If the deleted input was actively used by a constraint box, the user will see the missing input in the respective constraint box in the parametric model. If the user wishes to add an input, it must be added as a 'value' in the BDD, then displayed using the 'show all parts' feature in the parametric model. Again, these steps are specific to MagicDraw, but similar features can be found in most MBSE software packages.

5.4 Calculations

The calculations performed within the DSS can be found in the parametric model or in the constraint blocks of the package diagram within the MBSE products. These calculations connect MOPs and environmental factors to the MOEs for the total MOWLES system.

As with inputs, modifications to DSS calculations should be made within the MBSE set of products, specifically the parametric model. Making these changes in the parametric model allows the user to determine how changing a calculation will impact existing inputs. The user can define new calculations by creating a 'constraint' block in the BDD and integrating it with available inputs and outputs in the parametric model. This task is not intended to be conducted by someone without knowledge of SysML and MBSE and may represent the basis for future collaboration with academia. The need for someone trained in MBSE becomes even more apparent when adding a newly defined constraint block to the Excel spreadsheet: a systems engineer would need to add the new calculation to the appropriate tab within the DSS spreadsheet in accordance with the metadata defined in the parametric model.

If MagicDraw were integrated with the Excel sheet through some machine-to-machine interface, such as openly available freeware, changing the equation in the constraint block would automatically update the spreadsheet. However, in the absence of more integrated tools, changes in the DSS must be reconciled with the parametric model manually. As with inputs, the DSS can be changed without modifying the MBSE products, though this is not recommended due to the difficulty of managing the numerous inputs and outputs contained within the DSS without an integrated tool.

A detailed description of all calculations within the DSS can be found in Appendix D.


5.5 Outputs

Raw output corresponds with the MOEs for a given component configuration. This means that if the user inputs the maximum of three component configurations, one complete set of MOEs will be provided under each environmental condition, for a total of nine MOE sets.

The MOEs considered in the current iteration of the DSS can be found in the ‘values’ compartment of the ‘DSS Model’ block within the MBSE models.

Figure 15: MOEs, listed in the ‘values’ compartment, as identified in the MBSE model for the MOWLES DSS.

A detailed description of all outputs can be found in Appendix C.

5.6 Weighting Schema

After computing the raw scores, the user needs a way to contextualize the raw data for each configuration to help inform analysis between potential system configurations. To accomplish this, a schema must be implemented to compare raw MOE values against some form of scoring criteria related to requirements for total system performance. Furthermore, the scoring mechanism must also allow the user to prioritize the relative value or importance of specific metrics. The DSS accomplishes this by using a scoring system coupled with swing weights to generate both a 'raw' and a 'weighted' score for each configuration in a given environment.

MOEs for a given MOWLES configuration are computed based on the MOPs and environmental factors provided as inputs. The resulting values can be referred to as 'raw' performance values. These raw performance values are then placed into a set of ranked bins, which yields a 'raw score' for each MOE. This simplifies both the decision-making process and comparisons between MOWLES configurations by presenting the user with a single integer grade for each MOE rather than a series of multi-digit metrics in dissimilar units.


The DSS implements a scale ranging from 0 to 5, with the range of each numerical grade defined by the user. For example, the user may decide that a maximum system range between 4,000 ft and 5,500 ft is a grade 4. If the actual computed range for a single system were 4,750 ft, the user would be presented with both the empirical result and the much more easily comparable grade of '4'. Users could then determine whether other system configurations exceeded a grade of '4' to determine which combination achieved the best overall performance. This aspect of the scoring system speaks to ease of use, but there is also a practical reason for implementing it.

The actual benefit of a scoring system is realized when the scores are summed across a total system to create a total system 'grade'. If a system results in five MOEs with grades of 2, 4, 4, 3, and 5, the total system grade would be 18. This gives the user a single number by which to compare different system configurations. Furthermore, since MOEs are calculated uniquely in each environment, the DSS will also indicate how scores change as the environment changes. These summed scores are called 'unweighted' in the DSS.

While the unweighted score gives a general idea of the comparable effectiveness between system configurations, it ignores the fact that some MOEs may carry more importance to the user community. This is accommodated using weight values, which are essentially user-defined multipliers assigned to each MOE. These multipliers allow the user to define which MOEs are valued more highly in comparison to others. Each grade is multiplied by its corresponding weight value to produce a 'weighted score'. By summing each weighted score for a given component configuration, the DSS presents the user with a weighted total which can be used to compare different configurations:

$$\text{Weighted Total for Configuration} = \sum_{i \,\in\, \text{Configuration MOEs}} \left( \text{Raw Score}_i \times \text{Weight Value}_i \right)$$
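This formula can be implemented directly. The sketch below is a minimal Python rendering of the weighted and unweighted totals; the MOE names, raw scores, and weights are illustrative values, not figures from the DSS (note the weights sum to 1.00, consistent with the tab's recommendation).

```python
# Direct rendering of the weighted-total formula above. Raw scores are
# the integer grades already assigned per MOE; weights are the
# user-defined multipliers. All values here are illustrative.

raw_scores = {"expected_range": 4, "scan_rate": 3,
              "setup_time": 5, "data_latency": 2}
weights = {"expected_range": 0.40, "scan_rate": 0.25,
           "setup_time": 0.10, "data_latency": 0.25}

unweighted_total = sum(raw_scores.values())  # single 'grade' for the system
weighted_total = sum(raw_scores[m] * weights[m] for m in raw_scores)

print(f"Unweighted: {unweighted_total}, weighted: {weighted_total:.2f}")
# -> Unweighted: 14, weighted: 3.35
```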

An example of the score and weighting schema implemented in the DSS can be seen in the graphic, below.

Figure 16: Example of scoring and weighting schema implemented in the DSS prototype.


5.7 DSS Use and Operation

The DSS is operated by loading the prototype into Excel, providing inputs as prompted by the tool, and assessing the output in a tab labeled 'Results'. The tab structure, and the alignment between each tab and the larger MBSE effort, can be seen in the graphic, below.

Figure 17: Definitions and MBSE alignment of DSS prototype tabs contained within developed spreadsheet tool.

5.7.1 Introduction Tab

When first opening the file, the user is presented with an introduction tab containing the name of each input the user will modify, its definition, and its 'ID' or index. These 'ID' values correspond to indexes for each MOWLES component and environmental factor contained in the package diagram.

5.7.2 MOWLES Scenarios

The second tab available to the user is a 'MOWLES Scenarios' tab containing all currently implemented inputs for each MOWLES component. The tab contains three repeated input tables which allow the user to input up to three different sets of MOPs, each corresponding to a specific configuration of components. Each table is given a 'scenario ID' and a 'scenario name'.


Figure 18: Layout of the ‘MOWLES Scenarios’ tab within the DSS prototype.

For example, if the user were to have two MOWLES configurations identical except for a different sensor, the user might change the scenario name of the first configuration to 'MOWLES with Sensor A' and the second to 'MOWLES with Sensor B'. The user would then provide the MOPs as indicated, verifying that the MOPs corresponding to unique sensor performance differ between scenarios.

The DSS expects a value for every available input, and any input field not populated with a number will be read by the DSS as a zero. Due to the early stage of development for the ERDC team and the potentially substantial number of components to be considered over the next several years of development, default values cannot be established at this point for any given parameter. One recommendation from the GMU team following development of the DSS is to establish a baseline of performance for specific components where likely candidates have already been defined. For example, if platform candidates appear to be coalescing on the PackBot manufactured by iRobot, MOP values for the PackBot can be defaulted into the DSS, increasing ease of use for the user.
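The blank-as-zero convention can be expressed compactly. The helper below is a sketch assuming inputs arrive as raw cell contents; it is illustrative and not part of the delivered spreadsheet.

```python
# Sketch of the DSS convention that any empty or non-numeric input field
# is read as zero. Illustrative helper, not part of the delivered tool.

def read_input(cell):
    """Return the cell as a float, treating blank or non-numeric cells as 0."""
    try:
        return float(cell)
    except (TypeError, ValueError):
        return 0.0

print([read_input(c) for c in ["4.5", "", None, "n/a"]])
# -> [4.5, 0.0, 0.0, 0.0]
```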

5.7.3 Environmental Scenarios

The 'Environmental Scenarios' tab works in a similar manner to the 'MOWLES Scenarios' tab. Users identify up to three different environments to consider and input values for each environmental factor identified within the table provided.


Figure 19: Layout of the ‘Environmental Scenarios’ tab within the DSS prototype.

As with each MOP field in the MOWLES scenario tab, the DSS expects a value for each field and will read any empty fields as a ‘zero’. The GMU team recommends that ERDC establish a set of MOWLES system use-cases and operational situations to serve as design reference missions as development continues. Each design reference mission will allow ERDC to establish quantifiable environmental parameters within which the system is expected to perform. This would allow the GMU team, or follow-on organizations, to establish default values thereby increasing the ease-of-use for the user.

5.7.4 Scoring & Weights

The 'Scoring & Weights' tab provides the user with the ability to customize two aspects of the DSS: a set of scores for each MOE and weights across MOEs for a given system.

Scores give the user the ability to assign an integer, ranging from 1 through 5, to specific sets of values for a given MOE. This simplifies the analysis of a particular MOWLES configuration by allowing the user to assess an integer score for each MOE rather than interpreting empirical data which may overwhelm the user. Within the 'Scoring & Weights' tab, the user identifies the value which constitutes the minimum numerical performance required to achieve the corresponding 'score'. This can be seen in the 'scoring values' portion of the figure, below.


Figure 20: Screen capture of the ‘Scoring Criteria’ tab with sample values displayed.

Note that for values such as time, where lower numerical values correspond to higher performance, the empirical value represents the largest amount of time allowable to achieve a given score. This scoring mechanism is implemented within the DSS spreadsheet and is not found in the accompanying MBSE products since it is analytically driven rather than object-oriented.
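The binning step can be sketched as follows. The thresholds are invented for illustration (the grade-4 minimum of 4,000 ft echoes the earlier range example), and a direction flag handles 'lower is better' metrics such as time.

```python
# Sketch of grade binning against user-defined thresholds, assuming
# threshold lists like those on the 'Scoring & Weights' tab. Thresholds
# below are invented. For 'higher is better' MOEs each threshold is the
# minimum value for that grade; for 'lower is better' MOEs it is the
# maximum value allowed.

def grade(value, thresholds, lower_is_better=False):
    """Return 0 if no threshold is met, else the highest grade (1-5) met."""
    score = 0
    for g, t in enumerate(thresholds, start=1):
        if (value <= t) if lower_is_better else (value >= t):
            score = g
    return score

# Range in feet (higher is better): minimums for grades 1-5.
print(grade(4750.0, [1000, 2000, 3000, 4000, 5500]))                # -> 4
# Setup time in seconds (lower is better): maximums for grades 1-5.
print(grade(45.0, [300, 180, 120, 60, 30], lower_is_better=True))   # -> 4
```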

Weights give the user the ability to identify which MOEs weigh more heavily in determining which MOWLES configuration best meets the end user's needs. Default values are provided which weight all MOEs equally; however, ERDC expressed an interest in being able to weight specific MOEs more heavily. By defining numerical weights, ERDC can influence the final scoring for each MOWLES configuration, which is discussed more robustly in the DSS results section of this report. The user should note the recommendation provided in the 'Scoring & Weights' tab regarding weight values; ensuring all weights sum to 1.00 helps normalize the scoring and avoid customization bias in the final results.

5.7.5 Intermediate Calculations

Populating values in the environmental and MOWLES scenario tabs completes the user's obligation to provide data to the DSS. In practice, most users will move directly to the 'Results' tab to begin assessing the relative performance of each component configuration. Some users, however, may prefer to verify that the calculations producing the MOE outputs in the 'Results' tab are being performed correctly. This can be done using the 'Intermediate Calculations' tab. The word 'intermediate' reflects the fact that several calculations may be necessary to generate a single MOE, as can be observed by surveying the parametric model within the MBSE products.


Figure 21: Screen capture of the ‘Intermediate Calculations’ tab with sample values displayed.


Providing the user all intermediate calculations serves two purposes. First, this tab allows the user to directly modify an equation if review of the parametric diagram shows a calculation needs to be modified or updated. Each row in the tab corresponds to a different constraint block in the MBSE products; identifying the row that matches the constraint block requiring the update makes it easier to ensure the prototype remains in compliance with the MBSE-driven design. Second, the intermediate calculations provide a simple mechanism for verifying that each MOE is appropriately computed in the ‘Results’ tab. If an MOE output yields an unexpected result, the ‘Intermediate Calculations’ tab allows the user to determine where the unexpected result originates.
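A hypothetical sketch of how one intermediate calculation feeds an MOE, with one function standing in for each constraint block; the tunnel shape-factor de-rating and its values are assumptions for illustration (see the enumerations discussion in section 5.7.7), not the delivered parametric model.

```python
# Hypothetical sketch of an 'Intermediate Calculations' row feeding an
# MOE, mirroring one constraint block per function. The shape-factor
# adjustment and all values are illustrative assumptions.

def effective_radio_range(free_space_range_m, tunnel_shape_factor):
    """Intermediate: de-rate the radio's free-space range in a tunnel."""
    return free_space_range_m * tunnel_shape_factor

def moe_range(platform_range_m, radio_range_m, shape_factor):
    """MOE A: total system range is limited by the weaker of the
    platform's travel range and the radio's effective link range."""
    return min(platform_range_m,
               effective_radio_range(radio_range_m, shape_factor))

# If the 'Results' tab shows an unexpected Range, recompute each
# intermediate by hand and compare it with the matching row.
print(effective_radio_range(300.0, 0.6))  # -> 180.0
print(moe_range(500.0, 300.0, 0.6))       # -> 180.0
```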

5.7.6 Results

Values for each MOE can be found on the ‘Results’ tab along with a customizable weighting system which allows the user to determine a relative score for each MOWLES scenario. A scoring system has also been implemented to allow the user to identify how well each component configuration meets required MOE performance in each environment. More information on the scoring criteria is covered in a later section.


Figure 22: Screen capture of the ‘Results’ tab with sample values displayed.


The user may modify values in the ‘Weights’ column to influence how important different components are in determining total system performance. These weights are customizable for each individual configuration to allow the user maximum flexibility; however, most users will want a common weighting scale across all MOWLES combinations and should therefore verify that weighting values are consistent for all three combinations.
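A short follow-on to the weighting sketch above, showing how a common set of weights produces a consistent ranking across configurations; all scores and weights are sample values.

```python
# Illustrative ranking of three MOWLES configurations by weighted
# total, using one common weight set as recommended above. All scores
# and weights are sample values.

configs = {
    "Config A": {"Range": 4, "Time": 3, "Cost": 5, "Accuracy": 2, "Capacity": 4},
    "Config B": {"Range": 3, "Time": 4, "Cost": 3, "Accuracy": 4, "Capacity": 3},
    "Config C": {"Range": 5, "Time": 2, "Cost": 4, "Accuracy": 3, "Capacity": 5},
}
weights = {"Range": 0.20, "Time": 0.20, "Cost": 0.15,
           "Accuracy": 0.30, "Capacity": 0.15}

ranked = sorted(configs.items(), reverse=True,
                key=lambda kv: sum(kv[1][m] * w for m, w in weights.items()))
for name, scores in ranked:
    total = sum(scores[m] * w for m, w in weights.items())
    print(f"{name}: weighted total = {total:.2f}")
# -> Config C: 3.65, Config B: 3.50, Config A: 3.35
```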

5.7.7 Hidden Enumerations Tab

The DSS contains one tab which is hidden from the user by default. This tab contains two tables of values called ‘enumerations’. Enumerations are word phrases used to represent data which may impact calculation of an MOE. Enumerations also allow the GMU team to limit the number of inputs expected for a given environmental field by restricting permitted inputs to those present in a pre-defined list.

For example, platform range is computed using several intermediate calculations, one requiring a value known as a ‘radio tunnel shape factor’. This factor changes depending on the shape of the tunnel for a given environment. Since this is a value which most users may not understand, the DSS allows the user to select the general shape (e.g., circular, rectangular) of a tunnel’s cross-section. The DSS will then use the look-up tables in the hidden enumerations tab to determine the appropriate shape factor to use when computing range.
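A minimal sketch of the enumeration look-up just described; the shape-factor values are placeholders, not the calibrated constants stored in the hidden tab.

```python
# Illustrative sketch of the hidden enumerations look-up; the shape
# factors below are placeholder values, not the constants in the DSS.

TUNNEL_SHAPE_FACTOR = {
    "circular":    1.00,
    "rectangular": 0.85,
    "arched":      0.90,
}

def shape_factor(cross_section: str) -> float:
    """Resolve the user's enumerated cross-section to a shape factor,
    rejecting inputs outside the pre-defined list."""
    try:
        return TUNNEL_SHAPE_FACTOR[cross_section.strip().lower()]
    except KeyError:
        raise ValueError(
            f"'{cross_section}' is not a permitted terrain cross "
            f"section; choose one of {sorted(TUNNEL_SHAPE_FACTOR)}")

print(shape_factor("Rectangular"))  # -> 0.85
```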

The DSS currently implements enumerations for the ‘terrain cross section’ and ‘infrastructure material’ fields within the environmental scenarios tab. The enumerations provided in the DSS by the GMU team are intended to be sufficient for the environments discussed with the ERDC team. None of the values in the enumerations tab are intended to be altered, and the values serve no practical purpose in the analysis of DSS results; therefore, the tab is hidden by default.

Figure 23: Right clicking the tab bar in Excel displays a menu which allows the user to 'unhide' the enumerations tab.

5.8 Performing Analysis

Analysis is performed on the DSS results contained in the ‘Results’ tab. The intent of the ‘raw total’ value for each set of MOEs is to provide a relative score between configurations for how effective each system is across all of the criteria identified. The ‘weighted total’ applies the values from the ‘weights’ column (column D in the ‘Results’ tab) to normalize each score and apply the appropriate importance to each value.

Weights are applied to MOEs and not to particular environments. Given the wide range of operational communities which may want to employ a MOWLES system, it is difficult to determine how a DSS user would want to weight different environments. Therefore, the GMU team elected to compute raw and weighted scores for each environment independently rather than generating an average score across all environments. The concern with including a total score is that it may lead the DSS user to draw a false conclusion from the values provided, resulting in unwarranted confidence regarding a particular system’s expected performance.

The color-coding for each individual MOE is intended to give the user some ability to perform rudimentary requirements analysis. Once scoring criteria have been established for a given operational community and implemented in the hidden ‘Scoring Criteria’ tab, the red-yellow-green color schema provides instant feedback to the user regarding which total-system MOEs fail in a given environment. This becomes critical when defining requirements for the baseline system since it is reasonable to expect that the MOWLES system may be employed in environments where the total system performance differs greatly from what is experienced in more controlled environments.
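A minimal sketch of the stoplight schema, assuming score bands that are illustrative rather than the established scoring criteria.

```python
# Minimal sketch of the red-yellow-green feedback; the score bands are
# illustrative assumptions, not established scoring criteria.

def moe_color(score: int) -> str:
    """Map a 1-5 MOE score to the stoplight schema used on the
    'Results' tab: red flags a failing MOE in that environment."""
    if score <= 2:
        return "red"      # fails required performance
    if score == 3:
        return "yellow"   # marginal
    return "green"        # meets or exceeds requirement

print([moe_color(s) for s in (1, 3, 5)])  # -> ['red', 'yellow', 'green']
```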

6 Findings

The problem posed by ERDC is typical of science and technology (S&amp;T) efforts early in the design process. When an S&amp;T initiative is not explicitly driven by an operator need or context provided by a user-group, the design team is left with so few constraints that it becomes difficult to determine what constitutes a ‘good’ design and how alternative designs can be compared. The GMU team’s proposal was to develop a DSS assuming a number of total-system MOEs. This assumes that the MOEs asserted by the GMU team were accurate or reasonable. Through several interactions with the clients at ERDC, it was determined that the MOEs selected were reasonable for the purpose of demonstrating how a DSS could aid in the decision-making process.

Inputs were derived by determining what computations were necessary to produce the selected MOEs. This bottom-up approach of beginning with the MOE and working backwards to a DSS input revealed that many assumed MOPs were not useful. In other words, in the absence of an MOE, it was easy to assume a set of MOPs was needed to decide how effectively the total system would perform; however, these MOPs could not be traced to a single calculation and were therefore omitted or modified to support calculation of the MOEs. Beginning with determination of relevant MOEs also assisted the GMU team in determining a reasonable level of detail with which to consider the surrounding environment. Particularly when considering factors external to the MOWLES system boundary, it is easy to errantly assume many variables are critical that are not actually needed for the MOEs selected. In short, the GMU team identified that a critical first step in developing a DSS is to determine the factors to be computed before defining the scope of inputs to be included.

Once implemented, the final DSS can be loaded on any machine equipped with a current version of Microsoft Excel. Since software suites which allow the active modification of MBSE products are not commonly accessible to the ERDC user community, the GMU team determined that there was a high risk of the MBSE products being rarely referenced, particularly when modifying or adding parameters. For this reason, the team determined that it was best to develop an indexing system for inputs, outputs, and equations. By adding unique identifiers to all MBSE products consistent with the indexing system used in the spreadsheet, the GMU team enables the ERDC community to continue using the MBSE products as references as they continue to develop the DSS or pass the tasking on to another group.

Finally, the lack of availability of ERDC systems engineers became apparent in some of the initial discussions with the clients. Clearly, providing a series of MBSE products which require significant formalized training to read would be of limited use to an organization that does not currently possess engineers with a background in architecture or SysML. For this reason, the GMU team determined it necessary to deliver a reference guide to the clients at ERDC to assist them in reading and comprehending the data contained within the MBSE products provided. This gives the clients a basic understanding of the products produced and how they are intended to inform the DSS, which the clients can manipulate more readily on their standard work machines.

7 Recommendations

The GMU team felt reasonably equipped and informed to provide ERDC with products that could advance their understanding of the problem space and with artifacts which represent decision-making aids for design decisions or future development. With that said, several recommendations were identified which could greatly increase the ability of future academic teams to support development of a future MOWLES system.

7.1 Total System Accuracy

The MOE which intuitively carries the most weight is total system accuracy. Based on the academic and professional backgrounds of the GMU team, it is recommended that ERDC stakeholders place special emphasis on defining the accuracy requirements needed by the user-group to a level of detail that can be quantified and measured. The number of external factors which apply to total system accuracy will change depending on the level of detail required of the MOWLES system. To determine both the inputs necessary and the performance required of the total system, a survey of the user-base or of the intended objects of interest for the sensor should be performed to determine what quality is required of the sensor across all environment types.


7.2 Tactical/Operational Objectives for the System

Similar to the requirement for total system accuracy, the overall objectives for the MOWLES system appear, at this point, unclear. This makes it difficult to determine whether the current set of inputs is robust enough to examine all contributing factors for a given MOE. Furthermore, it becomes difficult to establish a credible weighting system without knowledge of which operational factors will contribute the most to overall mission success.

The MOEs considered in the current effort were selected based on a set of assumptions about what might be required to achieve the tactical and operational objectives of forces employing a MOWLES-like system. These determinations were made based on the professional backgrounds of the GMU team. However, these backgrounds are not fully informed on the specific community to which MOWLES will be deployed and do not consider the myriad of missions a MOWLES-type system could potentially support (e.g., searching for explosives, terrain mapping).

One artifact which is common in more formalized acquisition programs is a Capability Development Document (CDD). While most S&amp;T projects are typically exempt from creation of such products, these artifacts do serve a purpose which may meet the need for a tactical/operational objectives statement. It is recommended that ERDC pursue development of an informal or draft CDD to establish the tactical and operational capabilities to be achieved and the standard to which tasks must be performed by the system or by units operating the system.

7.3 Development of Baseline Configurations

Several components already appear to have likely candidates; a series of platforms manufactured by iRobot appear to be favorites for current platform considerations. By baselining a single component or set of components, ERDC should be able to determine a baseline performance for each MOE. This would provide ERDC a reference for future configurations and serve as a standard for “goodness” against which other configurations could be measured. Note that establishing a baseline does not require that the baseline be the intended deployable system, only that it is a configuration which is feasible and reasonable for an S&amp;T prototype. Using this baseline, the team at ERDC may be able to perform live tests in controlled environments, enabling them to determine in more detail the suitability of the MOEs within the DSS for tracking the effectiveness of an eventual MOWLES system.

8 Summary

Current DOD initiatives have been focusing on achieving a strategic “offset” from foreign forces related to the use of technology to achieve an advantage over potential adversaries. This offset includes the use of unmanned systems to gain situational awareness of regions that would otherwise be denied to US forces. ERDC has begun the effort of developing a MOWLES system capable of traversing terrain in regions of limited mobility (e.g., buildings, tunnels) but lacks a tool which will allow them to evaluate the different combinations of technologies necessary to build a complete system. The work proposed by the graduate team at GMU included development of a DSS based on MBSE principles. Development of the DSS required establishing initial MOEs and determining what inputs would be needed to compute each metric. This drove the GMU team to establish a list of MOPs for proposed system components and environmental factors which will influence the overall MOWLES design.

MBSE products were created using an integrated architecture tool and drove design of a DSS. The DSS was created in a common spreadsheet tool accessible by all stakeholders at ERDC. To aid in the understanding of MBSE practices, a reference guide was created and delivered to the ERDC clients as value added.

The total effort resulted in a series of findings for the GMU team and recommendations to be delivered to ERDC. The GMU team recommends that the ERDC team continue work with academia to determine all factors which contribute to total system accuracy and update the DSS and MBSE models as needed. The GMU team also notes that efforts to develop MOWLES could advance significantly by selecting baseline components against which future component configurations could be assessed. Along with a system baseline, the ERDC team would also benefit from definition of tactical and operational objectives for the system. This can be accomplished in conjunction with academia to include design reference missions, high level capability descriptions, and established operational requirements for each reference mission defined.

ERDC could benefit greatly from continued collaboration with academia, with academic teams providing technical and procedural input into ongoing MOWLES development.


Appendix A: DSS Operation Use-Case

The user should be allowed to run an analysis, add an object to be compared against the other objects in use, and change the parameters of an object. When a user runs an analysis, they enter the values of the environment that the MOWLES system will encounter. After finishing, they receive a ranked scoring from the DSS, which ranks and scores each combination on the basis of how well it meets the requirements laid out in the parameters describing the environment. From there, the user can decide how effectively each combination meets the mission requirements.

Use Case #1: User Runs Analysis

Characteristic Information

Goal In Context: User wishes to run an analysis

Scope: DSS System
Level: Primary (sea level)
Pre-Conditions: User must have access to the DSS; User must have access to MOWLES
Success End Condition: User runs an analysis of the environment the MOWLES system will operate in
Minimal Guarantees: DSS works
Primary Actor: User
Trigger Event: User wishes to run an analysis

Main Success Scenario

Step  Actor  Action Description
1     User   Opens DSS tool
2     User   Enters MOWLES Platform Parameters
3     User   Enters MOWLES Sensor Parameters
4     User   Enters MOWLES Software Parameters
5     User   Enters MOWLES Radio Parameters
6     User   Enters MOWLES Control Unit Parameters
7     User   Pages to Input Tab
8     User   Enters Environmental Parameters
9     User   Enters MOWLES User Parameters
10    User   Enters Terrain Parameters
11    User   Enters Infrastructure Parameters
12    User   Enters Measure of Effectiveness Weights
13    DSS    Outputs results
14    User   Reads Output

Related Information

Schedule: Anytime the DSS is available
Priority: Want
Performance Target: Access to a computer with the DSS
Frequency: Every time a User needs to run an analysis


Appendix B: Detailed Input Descriptions

The Platform is assumed to have associated values for the following parameters:

ID   Parameter          Description
1.a  Platform Cost      Cost to purchase a single Platform unit. Maintenance and other procurement costs are not included.
1.b  Platform Range     Maximum distance the platform can move in an environment without obstruction and with no physical interaction from the user (i.e., the user does not replace a battery after a certain amount of time or distance). This is an indirect measure of the platform's fuel/battery capacity and efficiency when fully configured with all necessary components to complete the MOWLES system.
1.c  Platform Speed     Maximum speed the platform can achieve when fully configured without obstructions in its path.
1.d  Platform Capacity  Maximum weight the platform can carry while maintaining minimum range and speed thresholds.

The Sensor is assumed to have associated values for the following parameters:

ID   Parameter             Description
2.a  Sensor Cost           Cost to purchase a single Sensor unit. Maintenance and other procurement costs are not included. A unit could consist of more than one device if image-collecting devices are combined in a single MOWLES system configuration (i.e., if a configuration utilized a forward-looking Kinect 2 and a rearward-looking Kinect 2, then the Sensor cost for that configuration would be the cost of two Kinect 2 sensors).
2.b  Sensor Weight         Weight of the Sensor component when fully configured on the Platform.
2.c  Sensor Scan Delay     Length of time required for the Sensor to collect an acceptable amount of information within its FOV.
2.d  Sensor Scan Range     Maximum distance at which the Sensor can collect usable data within its FOV.
2.e  Sensor Field of View  The Sensor's actual field of view (FOV) when fully configured.
2.f  Sensor Accuracy       Probability the Sensor will collect data associated with a threat along the targeted route.

The Radio is assumed to have the following associated factors:

ID   Parameter     Description
3.a  Radio Cost    Cost to purchase a single Radio unit. Maintenance and other procurement costs are not included.
3.b  Radio Weight  Weight of the Radio component when fully configured on the Platform.
3.c  Radio Range   Maximum distance at which the Radio can receive data from the User and send data from the Sensor to the User.


The Control Unit is assumed to have the following associated factors:

ID   Parameter            Description
4.a  Control Unit Cost    Cost to purchase a single Control Unit. Maintenance and other procurement costs are not included.
4.b  Control Unit Weight  Weight of the Control Unit component when fully configured on the Platform.

Finally, the Software loaded onto the MOWLES system has the following associated factors:

ID   Parameter                 Description
5.a  Software Cost             Cost to purchase a single Software build. Maintenance and other procurement costs are not included.
5.b  Software Processing Time  Length of time required for the Software to receive data transmitted by the Radio, process this data, and display it to the User.
5.c  Software Accuracy         Probability the Software will identify an actual threat.

Environmental factors that affect the performance of the system can be broken down hierarchically into the MOWLES User, the Terrain, the Atmospheric Environment, and the Infrastructure. Each of these has a set of parameters that influence the performance of the MOWLES system itself.

The MOWLES User is the user who guides and controls the MOWLES with the hardware at hand. The User is constrained by the following parameters:

ID   Parameter             Description
6.a  User Latency          Length of time required by the User component to determine the targeted area to collect data.
6.b  User Processing Time  Length of time required by the User component to make a decision based on Software output, i.e., the amount of time the User needs to determine whether the MOWLES system identified a threat.
6.c  User Accuracy         Probability that the User will correctly identify an actual threat from the Software interface.

The Terrain describes the physical environment in which the MOWLES system operates and is constrained by:

ID   Parameter              Description
7.a  Terrain Height         The lowest height present along the targeted route the MOWLES system will be required to travel through.
7.b  Terrain Cross Section  The largest cross section along the targeted route that the MOWLES system will need to collect data for.
7.c  Terrain Length         Maximum distance of the targeted route that the MOWLES system will need to operate along.
7.d  Terrain Obstacles      Determines whether there are obstacles present that will have an adverse impact on the MOWLES system's ability to operate in the targeted environment.
7.e  Terrain Surface        The ground material that the MOWLES system will operate on along the targeted route.

The Atmospheric Environment relates to how the quality of the air affects how well the sensor can read the surroundings.

ID   Parameter                Description
8.a  Environment Light        The minimum and maximum levels of ambient light present along the targeted route.
8.b  Environment Humidity     The maximum and minimum amount of humidity present in the air along the targeted route.
8.c  Environment Temperature  The maximum and minimum temperature present along the targeted route.
8.d  Environment Visibility   Minimum distance at which the MOWLES system Sensor can accurately collect data along the targeted route.

The Infrastructure of the Environment is defined by:

ID   Parameter                Description
9.a  Infrastructure Volume    The minimum size, by volume, of a room that the MOWLES will be required to move through and collect data in along the targeted route.
9.b  Infrastructure Walls     The maximum and minimum levels of wall thickness present along the targeted route.
9.c  Infrastructure Material  The material of the walls along the targeted route near which the MOWLES system will operate.


Appendix C: Detailed Output Descriptions

The Measures of Effectiveness (MOEs) for the system are broken down below, and from those MOEs the score is generated. For each MOWLES configuration, the MOEs listed here are used to form a score for the configuration. From the raw MOE scores, a swing-weight score is created to determine how well the configuration matches the conditions the user wants.

ID  Name      Description
A   Range     Determines whether or not the MOWLES configured system can move through the targeted route and back, as well as transmit data at the furthest point along the route. If the MOWLES system can cover the necessary distance, the calculation returns the amount of excess range the MOWLES can still achieve.
B   Time      Combines the time it will take the platform to move across the length of the terrain (average speed), plus the amount of time the sensor will need to stop along the route to take readings (a function of the sensor range and the length of the route), plus the processing time for the software, plus the User's time to move the platform to the appropriate place on the route to capture the image, plus the time it takes the User to interpret the data from the software.
C   Cost      Combines the Platform, Sensor, Radio, Control Unit, and Software cost inputs.
D   Accuracy  Combines the Sensor, Software, and User accuracy.
E   Capacity  Combines the Sensor, Radio, and Control Unit weight inputs and determines whether the Platform component can carry the combined component weight. If the Platform can accommodate the weight of the other components, the response provides how much additional weight the Platform can still carry.


Appendix D: Calculations of Measures of Effectiveness

Each of the MOEs has an associated calculation for determining its value. In each calculation below, a system of indexes is used to refer to each parameter; the interpretation of each index can be found in the other appendices of this document. The intent of this section is to demonstrate the varying levels of complexity common across all MOEs, which speaks to the need for a decision support tool to ensure all parameters, regardless of complexity, are considered equally.

For determining the maximum Range of the configuration:

$$MOE.A_{Range} = \min\left(MOP.1.b,\; MOP.3.c\right)$$

For determining the Time:

$$MOE.B_{Time} = \frac{MOP.7.c}{MOP.1.c} + \left(MOP.2.c + MOP.6.a\right)\cdot\frac{MOP.7.c}{MOP.2.d} + MOP.5.b + MOP.6.b$$

For determining the Cost of the configuration:

$$MOE.C_{Cost} = \sum_{x=1}^{5} MOP.x.a$$

For determining the Accuracy:

$$MOE.D_{Accuracy} = MOP.2.f \cdot MOP.5.c \cdot MOP.6.c$$

For determining the Capacity:

$$MOE.E_{Capacity} = \sum_{x=2}^{4} MOP.x.b$$
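For reference, a direct transcription of the five formulas above into runnable Python; the MOPs are keyed by the table IDs from Appendix B, and all sample values are illustrative only.

```python
# Transcription of the Appendix D formulas into runnable form. MOPs
# are keyed by the Appendix B table IDs; sample values are illustrative.

mops = {
    "1.a": 12000.0,  # Platform Cost (USD)
    "1.b": 500.0,    # Platform Range (m)
    "1.c": 1.5,      # Platform Speed (m/s)
    "2.a": 1500.0,   # Sensor Cost (USD)
    "2.b": 2.0,      # Sensor Weight (kg)
    "2.c": 5.0,      # Sensor Scan Delay (s)
    "2.d": 10.0,     # Sensor Scan Range (m)
    "2.f": 0.90,     # Sensor Accuracy
    "3.a": 800.0,    # Radio Cost (USD)
    "3.b": 1.0,      # Radio Weight (kg)
    "3.c": 300.0,    # Radio Range (m)
    "4.a": 2000.0,   # Control Unit Cost (USD)
    "4.b": 3.0,      # Control Unit Weight (kg)
    "5.a": 5000.0,   # Software Cost (USD)
    "5.b": 2.0,      # Software Processing Time (s)
    "5.c": 0.85,     # Software Accuracy
    "6.a": 3.0,      # User Latency (s)
    "6.b": 10.0,     # User Processing Time (s)
    "6.c": 0.95,     # User Accuracy
    "7.c": 200.0,    # Terrain Length (m)
}

def moe_a_range(m):      # MOE.A = min(MOP 1.b, MOP 3.c)
    return min(m["1.b"], m["3.c"])

def moe_b_time(m):       # MOE.B = travel + scan stops + processing
    return (m["7.c"] / m["1.c"]
            + (m["2.c"] + m["6.a"]) * (m["7.c"] / m["2.d"])
            + m["5.b"] + m["6.b"])

def moe_c_cost(m):       # MOE.C = sum of component costs 1.a-5.a
    return sum(m[f"{x}.a"] for x in range(1, 6))

def moe_d_accuracy(m):   # MOE.D = product of the accuracy chain
    return m["2.f"] * m["5.c"] * m["6.c"]

def moe_e_capacity(m):   # MOE.E = sum of payload weights 2.b-4.b
    return sum(m[f"{x}.b"] for x in range(2, 5))

print(moe_a_range(mops), round(moe_b_time(mops), 2), moe_c_cost(mops),
      round(moe_d_accuracy(mops), 4), moe_e_capacity(mops))
# -> 300.0 305.33 21300.0 0.7268 6.0
```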
