


Use of Dynamic Models and Operational Architecture to Solve Complex Navy Staffing Challenges

Darby Grande*, J. Todd Black**, Jared Freeman*, Tim Sorber**, Daniel Serfaty*

*Aptima, Inc., [email protected], [email protected], [email protected]

**Klett Consulting Group Inc., [email protected], [email protected]

Abstract. The United States Navy established 8 Maritime Operations Centers (MOC) to enhance the command and control of forces at the operational level of warfare. Each MOC is a headquarters manned by qualified joint operational-level staffs, and enabled by globally interoperable C4I systems. To assess and refine MOC staffing, equipment, and schedules, a dynamic software model was developed. The model leverages pre-existing operational process architecture, joint military task lists that define activities and their precedence relations, as well as Navy documents that specify manning and roles per activity. The software model serves as a "computational wind-tunnel" in which to test a MOC on a mission, and to refine its structure, staffing, processes, and schedules. More generally, the model supports resource allocation decisions concerning Doctrine, Organization, Training, Material, Leadership, Personnel and Facilities (DOTMLPF) at MOCs around the world. A rapid prototype effort efficiently produced this software in less than five months, using an integrated process team consisting of MOC military and civilian staff, modeling experts, and software developers. The work reported here was conducted for Commander, United States Fleet Forces Command in Norfolk, Virginia, code N5-OLW (Operational Level of War), which facilitates the identification, consolidation, and prioritization of MOC capabilities requirements, and implementation and delivery of MOC solutions.

1. INTRODUCTION

The Navy developed the Maritime Operations Center (MOC) concept to enhance its command and control of forces at the operational level of warfare [1]. To oversee the development of the MOC concept, the Navy gave United States Fleet Forces Command (USFFC) the responsibility to standardize MOC staff functions and processes. This standardization will enable interoperability with the joint community and promote commonality across all Fleet and principal headquarters.

USFFC code N5-OLW (Operational Level of Warfare) used the Department of Defense Architecture Framework (DoDAF) to develop Business Process Models (BPM) for MOC processes. These BPMs, called Operational Views (OV-6c) in DoDAF [2], define MOC processes, their sequence, the organizational elements that execute them, and the products of those work activities. The dynamic modeling work reported in this paper transformed the static DoDAF documents into an executable software model called the MOC Performance Assessment Tool (MOC-PAT). The MOC-PAT is designed to support decisions regarding MOC staffing, such as whether a staffing plan is adequate to execute the many MOC processes required to support a specific mission set at a specified operational tempo. The first application of this tool supports planning and execution of Navy exercises to accredit Fleet MOCs.

This paper outlines how a multi-disciplinary team developed an innovative solution for the Navy leveraging existing architecture products and software modeling approaches. Section 2 of this paper defines the problem. Section 3 describes the technical development of the initial version of the MOC-PAT. This is followed by a discussion of the data used to exercise the model and a presentation of initial results in Section 4. Finally, Section 5 presents our conclusions and the directions for our future work.

2. PROBLEM DEFINITION

The MOC concept is a recent development in the Navy. In order to ensure MOCs meet mission objectives for Fleet and Combatant commanders while implementing necessary interoperability standards, USFFC tasked Commander Second Fleet to establish a MOC Project Team to explore and document MOC doctrine, organization, training, material, leadership, personnel and facilities. As this effort evolved and the MOC Project Team transferred to USFFC as code N5-OLW, it was evident that a means of linking mission tasking to MOC manning and performance was needed to ensure MOCs are staffed and equipped to mission requirements.

USFFC N5-OLW developed BPMs of over 30 MOC processes, documenting hundreds of activities within each process. These BPMs were created using the DoDAF standard OV-6c format. Typically, these diagrams are developed to support acquisition decisions and reside in a central Navy architecture repository, the Syscom Architecture Development and Integration Environment (SADIE). The MOC-PAT leverages these preexisting BPM documents and uses them to develop accurate models of the operating MOC.

This is accomplished by linking MOC processes back to Joint Mission Essential Task Lists (JMETL), first identifying core missions a MOC staff is required to execute and then relating those mission tasks to the associated JMETL tasks. Manning information based on existing MOC manning documents and role data (developed from surveys and onsite observation) is combined with process activity workload observations (i.e., time to complete activities, or work products required to complete activities) to populate the OV-6c BPM documents in the MOC-PAT. These data are then available to support model runs to analyze MOC staff execution and support accreditation events.

3. MODEL DEVELOPMENT

The initial version of the MOC-PAT is designed to enable skilled analysts at USFFC N5-OLW to test the impact of MOC manning estimates on MOC performance at the Numbered Fleets executing Normal & Routine (N&R) Missions. In this section we introduce the model, its assumptions, the dynamics, and the output capability for the users.

3.1 Model Introduction

A mission scenario that the user constructs contains a number of processes, each of which is made up of a series of activities. While the processes are executed, a fixed schedule of battle rhythm events (BRE) occurs. It includes special working group meetings and regular briefs to senior staff. The BRE and the activities produce and consume (that is, require) information products, and these well-defined products serve as the linkages between different parts of the organization and their many processes and BRE. For example, a planning activity may produce a plan (a document) that is a required input to an assessment activity to communicate which indicators of progress should be monitored. The MOC organization that will accomplish this mission is made up of multiple organizational units (OU). Each OU has several billets (individuals) assigned to it, and each billet is assigned a collection of roles that it may take on, one at a time, throughout the mission. These roles currently serve as proxies for more detailed information about billets' associated knowledge and skills, which we hope to incorporate in future versions of the MOC-PAT.
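To make these relationships concrete, the sketch below shows one way the scenario entities could be represented in code. It is illustrative only; the class and field names are our own simplification, not the MOC-PAT data schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the scenario entities; not the MOC-PAT schema.

@dataclass
class InformationProduct:
    name: str
    shelf_life_hours: float      # user-specified shelf life before completeness decays
    completeness: float = 1.0    # 1.0 = fully complete and current

@dataclass
class Activity:
    name: str
    required_roles: List[str]                               # roles needed to execute the activity
    input_products: List[str]                                # information products consumed
    output_products: List[str]                               # information products produced or updated
    predecessors: List[str] = field(default_factory=list)   # activities that must conclude first

@dataclass
class Process:
    name: str
    activities: List[Activity]
    start_cycle_hours: float     # user-specified cycle at which the process is scheduled to begin

@dataclass
class Billet:
    billet_id: str
    org_unit: str                # organizational unit (OU) to which the billet is assigned
    roles: List[str]             # roles the billet can fill, one at a time
```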

The work discussed in this paper was conducted for an initial proof-of-concept phase, so a number of simplifying assumptions were necessary. As the work continues, we are re-visiting each of these to refine and enhance the model. We assume:

• Billets are available to work 24 hours each day

• The MOC is operating under Normal and Routine conditions

• Each process begins at scheduled times, according to user-specified cycles

• Each activity cannot begin until its preceding activities (within the process) are concluded

• If an activity is prompted to start at time t (by the schedule or by the conclusion of its preceding activities), then it must conclude at a deadline created by adding the longest required processing time by any of its roles to this earliest triggered start time (see the sketch following this list)

• Information products have a user-specified shelf life, after which their level of completion decays. This ensures that we capture the fact that an activity which is unable to update or produce an information product on time will affect the ability of an activity which requires the information product to execute completely.
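As a small illustration of the deadline assumption, the sketch below (our own simplification, with hypothetical names) computes an activity's completion deadline as its earliest triggered start time plus the longest time any of its required roles must spend on it.

```python
def completion_deadline(trigger_time: float, role_hours: dict) -> float:
    """Deadline for an activity prompted to start at trigger_time.

    role_hours maps role name -> hours that role must spend on the activity;
    the deadline is the trigger time plus the longest such requirement.
    Illustrative sketch only.
    """
    longest = max(role_hours.values()) if role_hours else 0.0
    return trigger_time + longest

# Example: an activity triggered at hour 10 needing 4 hours of planner time
# and 6 hours of analyst time must conclude by hour 16.
deadline = completion_deadline(10.0, {"planner": 4.0, "analyst": 6.0})
```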

3.2 Model Dynamics

The purpose of this effort is to help the Navy determine whether the MOCs as envisioned and instantiated are meeting the mission support and interoperability goals. This specific evaluation goal led us to implement our model of the MOC in a simulation, rather than pursue optimization of the many variables - staff size, schedule, process step configuration, communication strategies, etc. With this simulation, the MOC-expert user is able to configure the particular mission he would like the simulation to "play," and the MOC organization is then evaluated against this mission scenario.

More specifically, the model enables analysts to answer several questions about MOC activities:

• The Activities: Do activities get the resources they need? Which processes & activities began with incomplete resources: human, information, time? Which activities could not begin at all?

• The Organization: Do we have enough staff in the right roles? Which organizational elements & staff were overloaded? Which were under-loaded?

• The Information Products: Are the information products complete and current when they are needed? What information was incomplete or missing when it was needed?

Analysts answer these questions in a process that consists of four stages: (1) Populate a database, (2) Configure the data and the model that processes them, (3) Run a mission simulation, and (4) Analyze the results. The analyst then typically returns to step (2), to refine the configuration and continue analysis iteratively.

Data entry and configuration is conducted using a component, called the Adaptive Modeling Environment (AME), that imports data specifying mission activities (tasks), activity information requirements, activity schedules, human resources (number and roles of staff), and organizational structure. The AME provides users with standard lists and graph representations of these data, through which the user can add, delete, or edit most data objects.

The mission simulation, designed collaboratively by USFFC N5-OLW and the development team, is a discrete-event simulation engine. It drives the assignment and execution of the processes over the course of a mission. This simulation operates as follows.
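As an illustrative sketch (not the MOC-PAT engine itself), a discrete-event loop of this kind can be organized around a priority queue of timestamped events; the function and parameter names below are placeholders of our own choosing.

```python
import heapq
import itertools

def run_mission(initial_events, horizon_hours):
    """Minimal discrete-event loop sketch (not the MOC-PAT engine).

    initial_events is a list of (time, handler) pairs; each handler, called with
    the current simulation time, returns a (possibly empty) list of further
    (time, handler) pairs, e.g., activity starts, battle rhythm events, or
    information product updates.
    """
    counter = itertools.count()          # tie-breaker so equal timestamps never compare handlers
    queue = [(t, next(counter), h) for t, h in initial_events]
    heapq.heapify(queue)
    while queue:
        time, _, handler = heapq.heappop(queue)
        if time > horizon_hours:         # stop at the end of the mission
            break
        for new_time, new_handler in handler(time):
            heapq.heappush(queue, (new_time, next(counter), new_handler))
```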

For the purpose of the model, let

$d_{ij}$ = the amount of time role $i$ is required to spend on activity $j$ (if the role is not required for the activity, then $d_{ij} = 0$);

$d_{\max} = \max_{i=1,\ldots,R} d_{ij}$;

$c_m^t$ = the completeness of information product $m$ at time $t$;

$v_i$ = calculated completeness percentage attainable for the current execution of activity $i$;

$v_i'$ = completeness percentage attained in the most recent execution of activity $i$;

$\alpha$ = activity repair coefficient, that is, the rate at which deficient information products input to an activity are improved by that activity;

$\beta_i$ = minimum completeness threshold for activity $i$;

$T_i$ = minimum execution time for activity $i$;

$\sigma$ = completeness decay rate for activities;

$w_1, w_2, w_3$ = the weights used in calculating $v_i$ to balance the importance of preceding activity completeness, information product input completeness, and fulfillment of roles required, such that $\sum_{k=1}^{3} w_k = 1$;

$N_p$ = the number of activities in process $p$;

$M$ = the total number of information product types.

Additionally, we employ the following variables:

$r_{ij} = 1$ if role $i$ is required for activity $j$, and $0$ otherwise;

$x_{kj}^t = 1$ if billet $k$ is assigned to activity $j$ at time $t$, and $0$ otherwise;

$a_{mj} = 1$ if information product $m$ is an input to activity $j$, and $0$ otherwise;

$l_{ij} = 1$ if activity $i$ directly precedes activity $j$, and $0$ otherwise.

Each time an activity within a process is prompted to begin (either by the process schedule, or by the completion of all the preceding activities), the activity's potential completion score is computed to determine whether the activity has available the resources it needs: the required roles among available staff members; recently updated information products required by the activity; and required preceding activities. The score calculated for activity $i$ at time $t$ is

$$v_i = \frac{w_1}{\sum_{h=1}^{N_p} l_{hi}} \left( \sum_{h=1}^{N_p} l_{hi}\, v_h' \right) + \frac{w_2}{\sum_{m=1}^{M} a_{mi}} \left( \sum_{m=1}^{M} a_{mi}\, c_m^t \right) + \frac{w_3}{\sum_{n=1}^{R} r_{ni}} \left( \sum_{b=1}^{B} x_{bi}^t \right)$$

where $B$ denotes the total number of billets.

The activity begins immediately if $v_i \geq \beta_i$, that is, if the score is above the minimum completeness threshold. (The user can define this threshold differently for each activity to reflect varying priorities for the resources.) If the score is not sufficient (the activity does not have enough of the required resources available), the activity will delay its start. The required completion deadline for the activity remains fixed, so any delay in the activity start time reduces the overall duration of activity execution. As the duration of the activity is reduced, the overall quality of the actions, communications, and products of the activity declines.
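Written out as a sketch, the score is a weighted combination of three fractions. The variable names follow the definitions above; defaulting an empty term to 1.0 is our assumption for activities with no requirement in that category.

```python
def potential_completeness(weights, pred_completeness, input_completeness,
                           roles_filled, roles_required):
    """Potential completion score v_i for an activity, per the equation above.

    weights            : (w1, w2, w3), summing to 1
    pred_completeness  : list of v'_h for the directly preceding activities
    input_completeness : list of c_m^t for the required information products
    roles_filled       : number of required roles currently staffed (sum of x_{bi}^t)
    roles_required     : number of roles the activity requires (sum of r_{ni})
    A term with no requirements defaults to 1.0 (our assumption for the empty case).
    """
    w1, w2, w3 = weights
    term_pred = sum(pred_completeness) / len(pred_completeness) if pred_completeness else 1.0
    term_info = sum(input_completeness) / len(input_completeness) if input_completeness else 1.0
    term_roles = roles_filled / roles_required if roles_required else 1.0
    return w1 * term_pred + w2 * term_info + w3 * term_roles
```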

At each time interval after the initial time at which the activity was prompted to begin, the activity's score is recalculated: increased with the possible addition of any newly available resources, and decreased by the decay rate due to the shorter time for execution and any resources that have become unavailable during the delay. That is, the new score is computed after a starting delay, $\delta$, as $v_i(\delta) = v_i - (1 + \sigma)\delta$. If $v_i(\delta) \geq \beta_i$, then the activity may begin. Otherwise, the delay is continued until (1) the activity is able to begin, or (2) the delay has lasted too long ($d_{\max} - \delta < T_i$), at which time the activity fails.
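This delay-and-decay rule can be sketched as follows, assuming unit time steps and abstracting the rescoring of newly available resources into a callback (both are our simplifications).

```python
def start_delay_or_fail(score_at, beta_i, sigma, d_max, t_min):
    """Delay an activity until its decayed score clears the threshold, or report failure.

    score_at(delay) : raw score v_i recomputed after the given delay (new resources may arrive)
    beta_i          : minimum completeness threshold for the activity
    sigma           : completeness decay rate
    d_max, t_min    : longest required role time and minimum execution time
    Returns the delay at which the activity starts, or None once d_max - delay < t_min.
    """
    delay = 0.0
    while d_max - delay >= t_min:            # enough time remains to execute
        if score_at(delay) - (1 + sigma) * delay >= beta_i:
            return delay
        delay += 1.0                         # assumed unit time step between rechecks
    return None                              # the delay lasted too long: the activity fails
```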

The overall quality of an activity is measured by the activity's completeness score, which is conveyed to subsequent activities, thus propagating the effects of shortages of input resources and time. The staff of subsequent activities can partially repair the deficiencies of prior activities, and this is represented by a multiplier on incomplete input we call the repair rate, whose effect grows with the actual duration of the task.¹ This repair rate is employed to calculate the concluded activity's completeness: $v_i' = \min(\alpha \cdot v_i, 1)$. The calculations given for activities throughout this section are used similarly to compute completeness percentages for the BRE.

3.3 Model Output

The output of the simulation consists of several measures, which are presented graphically within the software tool to help the analyst rapidly diagnose deficiencies in the staffing plan and mission schedule, and to refine their configuration. These measures are:

• Activity Completeness: For each activity that is executed in the mission, we calculate its completeness as the weighted sum of the states of its required inputs at the start of the activity, augmented by a 25% repair rate. The Input Weights are configurable by the user for each activity in order to capture variations in requirements across the three input categories: required information products, required roles, and required completion of prior activities.

• Manning Employment: As a mission simulation evolves, organizations dedicate staff (billets) in suitable roles (specific knowledge and skill packages) to activities. Each staff member takes on one of (potentially many) roles at a time. For each organizational element, we return the percent employment (0 - 100%) of its staff over time. For each role, we return over time the percentage of all the billets capable of fulfilling the role that are currently employed in the role. Finally, for each billet, we return the instantaneous and average workload over the course of the mission. Instantaneous workload is currently dichotomous, as the billet is either employed or idle.

• Information Product Completeness: Each information product has a shelf life that is configurable by the user. Each time an information product is updated by an activity or battle rhythm event, its completeness returns to 100% and remains there for the duration of the shelf life. After this time, the completeness of the information product decays as the information becomes increasingly outdated. For each information product, we return its completeness measure each time that it was required as input by an activity or a battle rhythm event (see the sketch following this list).

¹ By design, none of the algorithms implemented in MOC-PAT are stochastic in nature at this time; that is, none injects variance into the dataset.
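As an illustration of the shelf-life rule in the last measure, the sketch below returns an information product's completeness at the moment it is required. Linear decay and the default rate are our assumptions for illustration; only the shelf life itself is described as user-configurable.

```python
def information_product_completeness(time_now, last_update, shelf_life, decay_per_hour=0.1):
    """Completeness of an information product when it is required as an input.

    The product stays at 100% for its shelf life after its last update, then
    decays as it becomes outdated. Linear decay and the default rate here are
    illustrative assumptions.
    """
    hours_past_shelf_life = max(0.0, (time_now - last_update) - shelf_life)
    return max(0.0, 1.0 - decay_per_hour * hours_past_shelf_life)
```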

The output of the model has thus far proved accurate and useful when compared to actual MOC staff process execution as observed by USFFC N5-OLW Subject Matter Experts, and during an initial application to a MOC accreditation exercise, as discussed below.

4. ACCREDITATION DATA AND RESULTS

In 2008, the Chief of Naval Operations mandated that each MOC be accredited to validate its proficiency at MOC core tasks. The MOC-PAT is used to support this process by analyzing the performance of selected MOCs during accreditation. MOC accreditation is accomplished by USFFC via on-site observation of the MOC staff during a "stressing" event such as a major military exercise. These exercises can span weeks and involve hundreds of MOC staff members exercising a complex combination of the processes based on an assigned mission. The accreditation team must place its few observers where and when stress is likely to show its effects, and conduct analyses that help the MOC refine its staffing, schedule, and processes. In the section below, we discuss the types and sources of the data used for the initial MOC-PAT demonstration and evaluation, and present our initial findings based on these data.

4.1 MOC Data Types and Sources

Because the emphasis of this work was to develop a model that could be in use by the end of its initial six-month development period, populating the model with operationally relevant data was of vital importance. The data required to run the model are billet information, which can be imported from existing command manning documents or manually input through the MOC-PAT configuration interface; role information, which specifies the jobs or roles needed to accomplish activities (note that multiple roles can be assigned to individual billets); process diagrams imported from the approved OV-6c diagrams; the "Battle Rhythm", or daily schedule of leadership meetings and roles of attendees; and the information products that each activity in the process model requires and creates. The data used in the MOC-PAT originate from authoritative sources: billet information from Activity Manning Documents; role information from on-site observation, as well as survey results and workshop interviews; process diagrams from the SADIE architecture repository; and the Battle Rhythm from the MOC's schedule. Additionally, an analysis of the assigned mission is conducted, and mission-specific tasks from the Universal Joint Task List (UJTL) are identified. The analyst selects these mission tasks in the software configuration editor, and the MOC-PAT then automatically identifies the processes to run based on a mapping by USFFC N5-OLW of tasks to processes. After this initial data import and input, the analyst can generate additional configurations easily in the model, and specify the length of a given mission to test the durability and reliability of an organizational configuration.
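The task-to-process lookup described above can be sketched as a simple mapping; the task and process names below are placeholders, not the actual USFFC N5-OLW mapping.

```python
# Placeholder table; the authoritative UJTL-task-to-process mapping is maintained by USFFC N5-OLW.
TASK_TO_PROCESSES = {
    "UJTL-TASK-A": ["Process-1", "Process-2"],
    "UJTL-TASK-B": ["Process-3"],
}

def processes_for_mission(selected_tasks):
    """Return the de-duplicated, sorted list of processes to run for the selected mission tasks."""
    processes = set()
    for task in selected_tasks:
        processes.update(TASK_TO_PROCESSES.get(task, []))
    return sorted(processes)
```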

The software typically runs each modeled mission simulation in less than a minute, allowing users to rapidly assess and reconfigure the organization as required. Each simulation run produces graphs illustrating workload on staff, process execution success, and the availability of information products to subsequent processes during the simulation. Analysts use these outputs to assess effectiveness of an organizational configuration, to diagnose potential failures, and to specify solutions.

4.2 MOC-PAT Initial Outcomes

The MOC-PAT was tested during a major Fleet exercise in the spring of 2009. Initial testing indicated that the MOC-PAT results are consistent with observed outcomes in the MOC when reliable data are used and processes in the model are adjusted to reflect how the MOC staff conducts its mission tasking.

During the spring 2009 exercise, the MOC-PAT identified several areas of interest that were not noted during on-site observation. These findings were discovered during the exercise because reconfiguring and running the MOC-PAT was so rapid. The findings helped focus the efforts and attention of on-site observers, and allowed identification of how the MOC staff had spontaneously developed workarounds for some issues. These discoveries were documented as "best practices" to share with other MOC staffs. Observers confirmed other problem areas identified in model runs during on-site observation. These discoveries provided confidence that the model was accurately describing how a MOC staff coped with an assigned mission set. The MOC-PAT was also used to explore how process synchronization and staffing issues might evolve over time, by running the model for mission sets that were far longer than those executed in the live exercise. This analysis identified issues for the MOC staff to explore after the exercise was complete.

5. CONCLUSIONS AND FUTURE WORK

This first iteration of the MOC-PAT proved the value of executing an operational architecture in software to assess complex Navy organizations and their processes. The MOC-PAT accurately modeled an operational staff's performance, and can provide analysts with insights into issues of staffing and scheduling of complex process flows. The speed of configuration and simulation enabled analysts to rapidly revise the model to diagnose performance failures and test alternative configurations of the organization.

The next iteration of the MOC-PAT will include more advanced analysis tools, including reports that will support analysis and reporting by a MOC assessment team. In addition, the model is being revised to show the impact of role experience and proficiency on process execution speed (e.g., inexperienced personnel in a billet should slow activity execution while experienced staff accelerate activities). The next version will also model shifts with greater fidelity than the current version.

In the Fall of 2009, the MOC-PAT will be used to support accreditation team observation of a Fleet MOC staff. The model is also intended to support determination of MOC manning levels using data from a separate effort to identify MOC staff competencies and activity durations.

The MOC-PAT makes innovative use of an operational architecture (DoDAF OV-6c) by providing a configurable, scalable, valid and executable representation of Fleet MOCs. This fusion of authoritative architectural data with simulation technology has proven to be a cost-effective way to analyze complex organizational structures and human interactions.

REFERENCES

1. Department of the Navy (2009). MOC Fact Sheet (Updated April 30, 2009). https://www.portal.navy.mil/mhq-moc/fltcomm/Shared%20Documents/Fact%20Sheets/MOC%20Fact%20Sheet_043009.pdf

2. DoD (2007). DoD Architecture Framework Version 1.5. 23 April 2007. http://jitc.fhu.disa.mil/jitc_dri/pdfs/dodaf_v1v1.pdf