Research Article | Open Access
Integrated Simulation Environment for Unmanned Autonomous Systems—Towards a Conceptual Framework
The paper initiates a comprehensive conceptual framework for an integrated simulation environment for unmanned autonomous systems (UAS) capable of supporting design, analysis, testing, and evaluation from a “system of systems” perspective. The paper also investigates the current state of the art of modeling and performance assessment of UAS and their components and identifies directions for future developments. All the components of a comprehensive simulation environment focused on the testing and evaluation of UAS are identified and defined through detailed analysis of current and future required capabilities and performance. The generality and completeness of the simulation environment are ensured by including all operational domains, types of agents, external systems, missions, and interactions between components. The conceptual framework for the simulation environment is formulated with flexibility, modularity, generality, and portability as key objectives. The development of the conceptual framework for the UAS simulation reveals important aspects related to the mechanisms and interactions that determine specific UAS characteristics, including complexity, adaptability, synergy, and the high impact of artificial and human intelligence on system performance and effectiveness.
The increasing need to avoid exposing humans to “dull, dirty, or dangerous” missions, in conjunction with sustained multidisciplinary technological progress over the past two decades, is the main reason behind the exponential growth in the development, deployment, and operation of unmanned autonomous systems (UASs). More than twenty countries have invested substantial resources towards the development and manufacturing of UASs for a wide range of applications, in both the military and civilian domains. Although human personnel are part of the overall system, the UAS includes as primary components one or several unmanned vehicles (UVs) accomplishing their role within the mission with different levels of autonomy, ranging from remote control to fully automated mission completion [3, 4], including adaptation and decision making in response to changing operational conditions. Most UAS applications rely on aerial vehicles (UAVs); however, land, maritime, and space vehicles are also commonly used, fulfilling primary or secondary roles within the system. The US military currently has more than three dozen UAV platforms in service, ranging from small-size to full-size aircraft with different propulsion systems, including fixed-wing UVs, rotary-wing UVs, and airships. The Predator and the Global Hawk (with the Air Force), the Hunter (with the Army), and the Pioneer (with the Marine Corps) are just a few of the best known. More than 20 systems are based on unmanned ground vehicles (UGVs) and more than 15 on unmanned maritime vehicles (UMVs).
The requirement for missions of increased complexity and for high levels of autonomy drives current and future UAS designs to become increasingly systemic and intelligent. The individual agents composing a UAS are numerous and sophisticated and feature different capabilities and characteristics. Additionally, UASs are called to operate within extended, uncertain, and, at times, extreme environmental conditions, possibly in complicated socio-political contexts, to fulfill missions and tasks requiring high levels of intelligence. The novelty and complexity of individual agent characteristics and interactions within and outside the UAS pose new challenges at all stages of the product lifecycle: research, design, testing, and operation. To fully benefit from the UAS systemic characteristics and synergistic capabilities, methodologies at each of these stages are necessary and need to be applied across platforms, subsystems, operational domains, environments, missions, and applications. These methodologies must be based on a “system of systems” perspective addressing issues related to complexity, integration, communicability, and interoperability within and outside the UAS. In this context, the Department of Defense (DOD) Information Technology Standards Registry (DISR) (formerly the DOD Joint Technical Architecture) formulates a general framework for information technology across all DOD systems. The 4D/RCS architecture establishes a reference model for the structure and organization of unmanned vehicle software to ensure system integration, effectiveness, and interoperability. In conjunction with such standards and architectures, the Joint Architecture for Unmanned Systems (JAUS) establishes “a common language consisting of well-defined messages, enabling internal and external communication between unmanned systems”, providing a framework that is independent of vehicle platform, mission, computer resource, technology, and operator use.
The strategy adopted for the design and operation of UAS must be extended to the development of adequate simulation tools, which are called to support the UAS developer in determining early to what extent the desired capabilities of the entire complex system within complex operational environments are reached. This goal can be achieved if the mechanisms through which the characteristics of the subsystems, from within the UAS and outside, impact system performance are correctly identified, thoroughly investigated, and well understood. Key aspects to be addressed are interdependencies, robustness, and synergy, as well as the potential for developing dedicated strategies, methodologies, metrics, and software for modeling and testing through simulation. To face all these challenges and find solutions to these diverse, complex, and systemic problems, simulation tools are needed that are themselves complex, flexible, maintainable, and designed from a “system of systems” perspective.
Typically, UAS modeling and simulation efforts have been directed towards supporting specific applications and were focused primarily on vehicle dynamics and control [10–15]. Commercially available flight simulation packages can be customized and used for limited-scope UAS analysis. Larger-scale efforts focused on generality, integration, and interoperability include the U.S. Army's Multiple Unified Simulation Environment (MUSE), the Naval Air Systems Command (NAVAIR) UAV/Unmanned Combat Aerial Vehicle (UCAV) Distributed Simulation Infrastructure, the joint United States and United Kingdom distributed simulation environment Project Churchill, Boeing's Man-In-the-Loop Air-to-Air System Performance Evaluation Model (MILAASPEM), MissionLab for the design and testing of robotic configurations, and the Modular Semi-Automated Forces (ModSAF), which evolved into OneSAF [21, 22].
The simulation tools currently available focus on limited or isolated design issues and/or on personnel training. In general, they lack the capability to address the high complexity and systemic nature of UAS across all types of agents, missions, and operational and environmental conditions. They also fail to provide adequate means for comprehensive testing and evaluation. In this effort, an attempt has been made to lay the foundation for the formulation of a conceptual framework for an integrated UAS simulation environment (UAS-SE) that will provide flexible and comprehensive simulation tools for the development, testing, verification, and validation of UAS. The main components and their interactions are analyzed from a “system of systems” perspective aimed at the testing and evaluation (T&E) of UAS across all operational domains, including all types of vehicles, on-board and external mission equipment, individual agent intelligence, human operators and managers, static and dynamic environments, communications and flow of information, and different mission scenarios and objectives. The availability of models and performance assessment tools through simulation is investigated and directions for future developments are identified.
In the next section, the main strategic objectives in defining the conceptual framework are outlined. The top level architecture of the simulation environment is shown in Section 3. In Sections 4 and 5, the input and output modules are described. The central components of the UAS-SE grouped in the Simulation Nucleus are described in Section 6 followed by conclusions and a list of references.
2. General Strategy for the Design of the Conceptual Framework for the UAS Simulation Environment
UASs typically consist of sets of sophisticated and diverse entities including several categories of human personnel. A comprehensive UAS-SE must model all these components and include specific characteristics related to system intelligence, complexity, autonomous operation, and collaborative operation. A hierarchical modular structure must be defined that is highly flexible such that it can be organized ad hoc depending on specific simulation modes and combinations of modes such as
(i) testing and evaluation,
(ii) system behavior analysis and prediction,
(iii) personnel training,
(iv) artificial intelligence components training,
(v) hardware-in-the-loop simulation,
(vi) software-in-the-loop simulation,
(vii) human-in-the-loop simulation.
Specific simulation scenarios are associated with each of these simulation modes. The capabilities of the UAS-SE must allow the selection and the customized and flexible formulation of these scenarios to include a variety of
(i) levels of agent and system autonomy,
(ii) levels of component model sophistication,
(iii) types of subsystem interaction,
(iv) types of missions,
(v) levels of risk and event occurrence probability,
(vi) levels of external agent and system intelligence,
(vii) testing and evaluation objectives,
(viii) auxiliary and connected actions for system analysis.
The primary strategic requirements for the conceptual framework can be summarized as follows.
(i) Address directly the complexity and systemic character of UAS.
(ii) Ensure generality and completeness by including all operational domains, types of agents, external systems, missions, and interactions between components.
(iii) Ensure flexibility and expandability through a modular architecture and standard internal interfaces between components that allow further development, additions, and modifications.
(iv) Accommodate testing and evaluation methodologies pertinent to all domains relevant to the operation of UAS.
(v) Manage uncertainties and the dynamics of the environment and assess their effects on overall system performance and operation.
(vi) Manage the evaluation of system synergy, effectiveness, autonomy, and intelligence.
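As a concrete illustration, the simulation modes and scenario options listed above could be captured in a modular configuration structure. The sketch below is a minimal, hypothetical Python illustration; the names `SimulationMode` and `ScenarioConfig` are assumptions for this example and not part of any existing UAS-SE implementation:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class SimulationMode(Enum):
    """Simulation modes listed above; combinations of modes are allowed."""
    TEST_AND_EVALUATION = auto()
    BEHAVIOR_ANALYSIS = auto()
    PERSONNEL_TRAINING = auto()
    AI_TRAINING = auto()
    HARDWARE_IN_LOOP = auto()
    SOFTWARE_IN_LOOP = auto()
    HUMAN_IN_LOOP = auto()

@dataclass
class ScenarioConfig:
    """A simulation scenario assembled ad hoc from the option categories."""
    modes: set                   # one or more SimulationMode values
    autonomy_level: int          # level of agent/system autonomy
    model_fidelity: int          # level of component model sophistication
    mission_type: str
    risk_level: float            # scaling of event occurrence probability
    external_intelligence: int   # level of external agent/system intelligence
    objectives: list = field(default_factory=list)

# Example: a hardware-in-the-loop testing and evaluation scenario
scenario = ScenarioConfig(
    modes={SimulationMode.TEST_AND_EVALUATION, SimulationMode.HARDWARE_IN_LOOP},
    autonomy_level=3, model_fidelity=2, mission_type="surveillance",
    risk_level=0.1, external_intelligence=1,
    objectives=["assess synergy", "evaluate fault tolerance"],
)
```

Keeping the mode and option categories as explicit, typed fields is one way to satisfy the modularity and flexibility requirements: a scenario becomes a data object that the input module can validate against the available data and routines.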
3. Top-Level Architecture of the UAS Simulation Environment
The top-level block diagram of a comprehensive UAS-SE is shown in Figure 1. The UAS-SE consists of a Simulation Nucleus, an Input Module, and an Output Module operated by two categories of personnel, the Simulation Human Manager and the Simulation Human Operator. For specific modes of operation of the UAS-SE, interactions with external hardware and/or human operators are possible.
Prior to the simulation, the Simulation Human Manager (SHM) provides all the necessary data for a complete characterization of all systems and subsystems as well as the complete menu of compatible simulation options according to objective requirements and availability of data and/or routines. Note that the SHM should not be confused with other automatic/virtual “managers”, which are components of the UAS or the external systems.
The Simulation Human Operator (SHO)—also referred to as the “user”—interactively defines the simulation process by providing specific information regarding the type and number of system components, general simulation mode, simulation sophistication level, specific objectives or missions, general mission conditions, mission details, and testing and evaluation scenario. Note that the SHO is a real person using the simulation and must not be confused with several other categories of “human operators” who are part of the UAS and/or the external systems and are modeled within the UAS-SE.
4. Input Module
The Input Module (see Figure 2) includes the following four submodules:
(i) Operator Interface,
(ii) General Systems Data Files,
(iii) Simulation Manager,
(iv) Simulation Modes.
The Operator Interface consists of all the software elements allowing user-program interaction for simulation definition and initialization. User-friendly graphical user interfaces are necessary to allow the setup of the simulation scenario by selecting all the relevant parameters such as simulation mode, number and type of system components, mission type and details, level of complexity of individual agent operation, and system integration.
The General Systems Data Files contain all the preloaded information necessary for correctly performing the entire set of eligible simulation configurations for all systems and components. They are provided by the Simulation Human Manager.
The Simulation Manager submodule prepares the basic simulation frame according to the operator's options. It identifies the simulation configuration, monitors the compatibility between the operator's interactive options and the available data, processes the data, and distributes them to the modules of the Simulation Nucleus.
The Simulation Modes submodule groups the simulation scenarios in several categories, as listed in Section 2, and manages the specific parameters and simulation configurations accordingly.
5. Output Module
The Output Module (see Figure 3) includes the following four submodules:
(i) Output Manager,
(ii) Output Data,
(iii) Output Interface,
(iv) Evaluation Metrics.
The Output Manager organizes the selection, processing, storage, and display of the data according to the operator's options.
The content of the Output Data is the result of the numerical simulation; its structure can be either predetermined or specifically requested by the operator.
The Output Interface consists of all the software elements allowing the program to present, display, and store the numerical results of the simulation according to the user’s needs and options.
The Evaluation Metrics submodule processes the output data to determine parameters relevant to performance and effectiveness assessment based on user input and the testing and evaluation scenario. The metrics and mission scenarios must be capable of capturing the systemic, adaptive, and intelligent nature of the UAS. The UAS-SE must provide the capability to assess not only component performance but also subsystem and system synergy, in other words, to evaluate the impact of individual agent interactions and cooperation on overall performance and capabilities.
With the assessment of overall operational effectiveness as the ultimate objective, the T&E of UAS must address the following issues at all architectural levels, that is, component, subsystem, and system:
(i) performance,
(ii) availability,
(iii) operational efficiency,
(iv) operational versatility.
Performance assessment implies the evaluation of the level of the significant capabilities of an agent or system in fulfilling its designed role. Some of these capabilities are easily expressed as physical quantities and form basic design parameters, such as maximum altitude and speed for an aircraft or measurement accuracy for a sensor. Many such capabilities of the individual components at the lowest hierarchical level are embedded into the simulation through adequate modeling. It is important to determine the level of component performance throughout the operational envelope and in the presence of extreme, adverse conditions as well as after the occurrence of failures or malfunctions. The more challenging task of defining metrics, algorithms, and testing scenarios for the quantitative evaluation of complex systems is currently under investigation [23, 24]. First steps towards a general framework for multiagent performance evaluation have been taken [25–27]. However, the formulation of more appropriate general metrics and procedures still needs to be addressed. The assessment of the level of intelligent behavior of the UAS and its impact on overall performance is a key aspect. Both human and artificial intelligence must be considered for this task. Formulating metrics for this purpose is very challenging considering that the mechanisms of cognitive processes and intelligent behavior are not well understood and adequate models are not yet available. In recent years, significant efforts have been directed towards the evaluation of human operator and UAS capabilities such as situational awareness [28, 29], perception, and human-robot team interaction [31, 32]. Although the social implications of general robotic applications are occasionally acknowledged, the social and political impact of UAS has only been discussed in the context of the acceptance of airborne UAS into unrestricted airspace. However, it is expected that UAS will affect more and more people in increasingly diverse ways.
It should also be recognized that the human decision process is very often influenced by social, cultural, and political aspects. It is reasonable to anticipate that artificial intelligence will have to incorporate such features too. Therefore, it is necessary to consider them in future comprehensive UAS-SE for proper modeling of the decision process and for generalized performance assessment.
The availability of a UAS and its components is determined by reliability, maintainability, and supportability. Reliability can be defined as the expectancy for a system to accomplish assigned tasks at a certain level of performance under a specific set of conditions. This characteristic can be measured using statistics of failure occurrence, such as the mean time between failures or the mean number of failures per hour of operation [15, 35]. The failures are assumed to prevent the completion of the task or to significantly reduce the level of performance. They can be classified as hardware failures or as the result of human or artificial intelligence failures due to lack of situational awareness, wrong or risky decisions, fatigue, and so forth. Maintainability is the characteristic of a system to sustain continuous full-capability operation under normal corrective, predictive, or preventive maintenance. Limited information is available in this area. Metrics for software maintainability have been developed. Analytical models to assess the impact of reliability and maintainability on the service quality of mass-transit systems were developed based on Markov chains. The supportability of a system can essentially be considered as its ability to sustain continuous full-capability operation with reduced logistical resources [38, 39].
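For illustration, the failure-occurrence statistics mentioned above (mean time between failures, mean number of failures per hour) can be computed directly from a simulated failure log. The following is a minimal sketch; the function names are assumptions for this example only:

```python
def mtbf(failure_times_h):
    """Mean time between failures from a sorted list of failure timestamps (hours)."""
    if len(failure_times_h) < 2:
        raise ValueError("need at least two failures to estimate MTBF")
    gaps = [b - a for a, b in zip(failure_times_h, failure_times_h[1:])]
    return sum(gaps) / len(gaps)

def failure_rate(failure_count, operating_hours):
    """Mean number of failures per hour of operation."""
    return failure_count / operating_hours

# Example: failures logged at 100 h, 250 h, and 400 h of a 500 h campaign
times = [100.0, 250.0, 400.0]
print(mtbf(times))                       # 150.0 hours between failures
print(failure_rate(len(times), 500.0))   # 0.006 failures per hour
```

In a UAS-SE, such estimates would be accumulated across Monte Carlo runs rather than a single log, so that availability can be reported with confidence bounds.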
Operational efficiency is determined by the amount and complexity of maintenance, logistic support, and system operations required to fulfill the UAS mission. Cost of ownership and operation is also an important aspect to be considered. Other evaluation metrics have been formulated with respect to system operation only .
Operational versatility incorporates system characteristics such as flexibility, adaptability, and composability. Flexibility represents the capability of a system to perform the same or very similar tasks with different components and/or architectures. Metrics for flexibility evaluation were proposed by Robby et al. . Adaptability is the ability of a system to modify, redistribute, or incorporate resources to maintain a certain level of performance in the presence of modified operational requirements, changes of internal parameters (such as failures), or different environmental conditions. Metrics have been proposed for software architecture only [41, 42]. Finally, composability can be defined as the capability of a system to use building blocks for reconfiguration to achieve specific objectives . This concept has recently received substantial attention in relation to the development of composable software and simulation environments. However, assessment of composability level for UAS has yet to be addressed.
It should be noted that the direct evaluation of UAS availability, operational efficiency, and operational versatility must be accompanied by an assessment of their impact on the general performance capabilities.
6. Simulation Nucleus
The Simulation Nucleus includes six major components, of which the first four form the UAS:
(i) Individual Agent Module,
(ii) Auxiliary Systems Module,
(iii) Mission Equipment Module,
(iv) UAS Management and Decision Module,
(v) Environment Module,
(vi) External Systems Module.
6.1. Individual Agent Module
All the components of the Individual Agent Module are assumed to be parts of the UAS. They are grouped in five submodules as presented in Figure 4.
(i) The Vehicle Submodule includes manned and unmanned (tele-operated, automatic, autonomous) vehicles, operating with different levels of autonomy and on-board intelligence, in various environments: in the air [2, 44, 45], on the ground, on the sea surface, underwater, or in space. An example of a simplified Individual Agent Module for an unmanned aerial vehicle is shown in Figure 5. The submodule includes models of the aircraft general dynamics [50, 51] and of subsystems such as on-board sensors, actuators, landing gear, engine, and control system [56, 57].
(ii) The Fixed Entities Submodule includes field ground stations (even if mobile), possibly with operator interfaces, command centers, and communication relays. Comprehensive models of these elements have not yet been implemented within UAS simulations.
(iii) The Humans Submodule includes all human personnel with active and passive attributions within the UAS. Most efforts to model intelligent behavior are based on the framework provided by Rasmussen. In this submodule, aspects such as operator attention, information perception and processing, human-machine interface issues, and operator workload are addressed.
(iv) The Communication Networks Submodule includes node interactions, material pathways, information flow, and processing. These aspects have been addressed mostly in socio-political contexts but not in relation to UAS. Analyses of technical characteristics such as latencies and errors are available for specific systems, such as in Kongezos and Allen for wireless communications. Aspects of UAV communications in urban environments are analyzed by Poh.
(v) The World Model Submodule includes the representation of the external world of each individual agent as a result of sensor fusion, intelligent data processing, and information transfer with other individual agents within and outside the UAS [66, 67]. The situational awareness of the system and its components has its source in this submodule.
6.2. Auxiliary Systems Module
The Auxiliary Systems Module includes submodules representing systems whose action may be considered passive in nature, secondary, temporary, limited in authority or time, replaceable/redundant, and so forth, from the point of view of the UAS operation and/or the simulation environment. This module includes the following submodules, as presented in Figure 6.
(i) Trajectory Planning Submodule. This submodule hosts algorithms to compute the desired trajectory, subject to an optimization process unless the trajectory is imposed by other requirements. Determining the trajectory is an important feature of system autonomy and has received careful consideration as part of navigation and control system development, resulting in a large variety of methodologies and algorithms [69, 70].
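One of the simplest members of this family of algorithms is grid-based shortest-path search. The sketch below uses plain Dijkstra search on a 2D occupancy grid as an assumed, deliberately minimal stand-in for the trajectory-optimization methods surveyed above; real UAS planners use far richer vehicle dynamics and cost models:

```python
import heapq

def plan_path(grid, start, goal):
    """Dijkstra shortest path on a 2D occupancy grid (1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if
    the goal is unreachable. Uniform unit cost per 4-connected move.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = [(0.0, start, [start])]  # (cost so far, cell, path)
    visited = set()
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1.0, (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# A wall across the middle row forces a detour through the right column
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (2, 0))
```

In a full UAS-SE, the same interface (world representation in, trajectory out) would be backed by kinodynamic planners or optimal-control solvers, and the cost function would encode mission requirements rather than path length alone.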
(ii) Transportation Network and Subsystems. The deployment of UAS may include an important phase requiring delivery of the UAS or some of its components to the operational area.
(iii) Maintenance and Support. General frameworks for the design, operation, and performance impact of logistics have been developed. However, modeling and simulation for UAS performance assessment is not yet available.
(iv) Human Operators. Modeling the human operator performing auxiliary system tasks will address issues similar to those of modeling the human personnel who are part of the UAS, possibly at a lower level of sophistication. This distinction is considered necessary to allow a more detailed analysis of performance and of component contributions to the complex system.
(v) Objective Mission Risk and Threats. One objective of this submodule is to manage the risk models associated with the components of the UAS-SE and to trigger events such as failures or changes in environmental conditions. These models rely mostly on statistics of event occurrence under specific conditions. Another objective is to process data for UAS mission risk assessment [73, 74].
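The statistics-based event triggering described above can be sketched as a per-time-step Bernoulli draw derived from an hourly occurrence rate. This assumes a simplified homogeneous Poisson model of event arrivals; the function names are illustrative only:

```python
import math
import random

def step_probability(rate_per_hour, dt_seconds):
    """Probability of at least one event in a time step of dt seconds,
    assuming events follow a Poisson process with the given hourly rate."""
    return 1.0 - math.exp(-rate_per_hour * dt_seconds / 3600.0)

def trigger_events(rate_per_hour, dt_seconds, n_steps, seed=0):
    """Return the step indices at which the risk model fires an event."""
    rng = random.Random(seed)  # seeded for reproducible simulation runs
    p = step_probability(rate_per_hour, dt_seconds)
    return [k for k in range(n_steps) if rng.random() < p]

# Example: actuator failure rate of 0.5 events/hour, 1 s steps, 1 h horizon
events = trigger_events(0.5, 1.0, 3600)
```

Condition-dependent rates (for instance, a higher failure rate in icing conditions) would make `rate_per_hour` a function of the current simulation state rather than a constant.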
(vi) Component Failures, Malfunctions, and Abnormal Conditions. These typically have a direct and important effect on system performance, capabilities, and mission success. Models for certain classes of actuator and sensor failures affecting the unmanned vehicle and on-board equipment are available [75–77]. Malfunctions of human operators, of subsystem interaction, and of man-machine interaction require further analysis and modeling.
6.3. Mission Equipment Module
The Mission Equipment Module has the following two main components, as presented in Figure 7:
(i) Mission Equipment Control Submodule,
(ii) Equipment Components Submodule.
The equipment that serves the purpose of the individual agents or the UAS as a whole in conjunction with its mission may consist of sensors and actuators, data acquisition system, data processing system, weapon systems, robotic arms, auxiliary robots, on-board laboratory, delivery systems, or other payload. It should be noted that only those elements and characteristics directly related to the UAS operation need to be modeled and not necessarily the specific functioning of the mission equipment systems.
6.4. UAS Management and Decision Module
The UAS Management and Decision Module incorporates models of the most important features of human and artificial intelligence required for the operation of the UAS. It hosts the following processes, as shown in Figure 8:
(i) UAS Management Submodule,
(ii) Individual Agent Intelligence and Decision Making,
(iii) Human Intelligence and Decision Making,
(iv) Intelligent Adaptive Control of Agents, Groups, and Systems,
(v) Agent Status Self-Assessment.
The models in this submodule focus primarily on cognitive processes leading to decision making [79–81], situational awareness [29, 82], agent status self-assessment, risk assessment, mission evaluation, system status assessment, and mission redefinition in response to system failures, environment changes, and/or operator intervention. The decision algorithms incorporate a certain level of authority and multicriterial strategies. Artificial intelligence techniques such as fuzzy reasoning, machine learning, swarm intelligence, and artificial neural networks can be combined to perform these tasks.
An important problem of UAS operation is airspace management (ASM). The main characteristics and capabilities of the real UAS in this respect must be adequately represented in the simulation environment. Using faster-than-real-time parallel computation, the future states of any dynamic subsystem within the simulation environment can be determined and used to model a “perfect” situational evaluation and prediction tool. Known statistics-based performance metrics of the real ASM system, such as detection rates, evaluation errors, and accuracy, can then be used to degrade the “perfect” system for a realistic simulation. However, for ASM analysis and design, more accurate simulation tools are needed. Advanced ASM development typically relies on machine learning [85, 86] and a four-dimensional trajectory representation (position and time) [87, 88] to support a variety of conflict detection and conflict resolution algorithms [89–93].
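The four-dimensional trajectory representation can be illustrated with a pairwise separation check: each trajectory is a list of (t, x, y, z) waypoints, linearly interpolated, and a conflict is declared when separation drops below a threshold within the shared time window. This is a simplified sketch; the conflict-detection algorithms in the cited literature are considerably more elaborate:

```python
def position_at(traj, t):
    """Linear interpolation of a (t, x, y, z) waypoint list at time t."""
    for (t0, *p0), (t1, *p1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return tuple(u + a * (v - u) for u, v in zip(p0, p1))
    raise ValueError("time outside trajectory window")

def detect_conflict(traj_a, traj_b, min_sep, dt=1.0):
    """Scan the shared time window; return the first conflict time or None."""
    t = max(traj_a[0][0], traj_b[0][0])
    t_end = min(traj_a[-1][0], traj_b[-1][0])
    while t <= t_end:
        pa, pb = position_at(traj_a, t), position_at(traj_b, t)
        if sum((u - v) ** 2 for u, v in zip(pa, pb)) ** 0.5 < min_sep:
            return t
        t += dt
    return None

# Two aircraft crossing the same point at t = 50 s, 100 m altitude
a = [(0.0, 0.0, 0.0, 100.0), (100.0, 1000.0, 0.0, 100.0)]
b = [(0.0, 500.0, -500.0, 100.0), (100.0, 500.0, 500.0, 100.0)]
conflict_t = detect_conflict(a, b, min_sep=150.0)
```

A production ASM module would replace the fixed-step scan with analytic closest-point-of-approach computations and account for trajectory uncertainty, but the 4D waypoint representation itself carries over unchanged.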
6.5. Environment Module
The Environment Module (shown in Figure 9) is expected to capture the complexity and diversity of the operational domains that UAS are or will be called to cover. The models required for a comprehensive implementation can be grouped in the following categories:
(i) Atmospheric Submodule,
(ii) Terrain and Underground Submodule,
(iii) Sea Surface Submodule,
(iv) Undersea Submodule,
(v) Extra-Atmospheric Submodule,
(vi) Urban and Indoors Submodule,
(vii) Social and Political Environment.
The first six submodules must include models of physical characteristics and of natural and artificial obstacles. Autonomous operation implies high situational awareness and adaptability with respect to the environment, including obstacle detection, identification, and evaluation followed by avoidance. Designing such a system is an extremely challenging task. Simulating it may be less complex if its main characteristics can be well estimated, such as computational delay, accuracy, correct identification probability, and operational range.
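The estimated characteristics listed above (operational range, correct-identification probability, accuracy, computational delay) can be wrapped into a simple statistical surrogate of the detection subsystem. The class below is a hypothetical sketch of this idea, not a model of any particular sensor:

```python
import random

class SurrogateObstacleSensor:
    """Statistical stand-in for an obstacle-detection subsystem, characterized
    only by operational range, detection probability, measurement accuracy
    (Gaussian sigma), and a fixed computational delay in simulation steps."""

    def __init__(self, max_range, p_detect, sigma, delay_steps, seed=0):
        self.max_range = max_range
        self.p_detect = p_detect
        self.sigma = sigma
        self.delay_steps = delay_steps
        self._rng = random.Random(seed)
        self._pipeline = []  # queued results awaiting the processing delay

    def step(self, true_range):
        """Feed one time step; return a delayed, noisy range or None."""
        detected = (true_range <= self.max_range
                    and self._rng.random() < self.p_detect)
        measurement = (true_range + self._rng.gauss(0.0, self.sigma)
                       if detected else None)
        self._pipeline.append(measurement)
        if len(self._pipeline) > self.delay_steps:
            return self._pipeline.pop(0)
        return None  # output not yet available

# Ideal sensor (p = 1, no noise) with a two-step processing delay
sensor = SurrogateObstacleSensor(max_range=200.0, p_detect=1.0,
                                 sigma=0.0, delay_steps=2)
outputs = [sensor.step(50.0) for _ in range(4)]  # → [None, None, 50.0, 50.0]
```

This is the sense in which simulating the detection system can be "less complex" than designing it: only its externally observable statistics need to be reproduced, not its internal algorithms.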
Modeling the social and political environment for UAS evaluation has not yet been attempted. However, there is a growing concern with respect to issues such as impact on the environment, impact on human general activity (e.g., traffic), acceptability of “intelligent” robots, artificial intelligence decisions with moral implications, collateral damage, social perception, and acceptability of risks [95, 96].
Modeling environmental uncertainty, variability, extreme, and unusual situations is critical for a comprehensive and detailed testing and evaluation.
6.6. External Systems Module
The External Systems Module includes entities that are not part of the UAS or the environment but interact with the UAS in a manner significant for the purpose of the mission (shown in Figure 10). They are categorized as
(i) Targets,
(ii) Hostile External Systems,
(iii) Friendly and Neutral External Systems.
The target models must correspond to the on-board detection algorithms and provide pertinent information such as infrared or radar signature to allow the performance evaluation of the detection algorithms.
7. Conclusions
A comprehensive conceptual framework has been initiated with the purpose of developing an integrated simulation environment focused on the testing and evaluation of UAS.
The high-level structure and building blocks of the simulation environment have been formulated based on current and future required capabilities and performance.
The conceptual framework addresses directly the complexity and systemic character of UAS. It empowers the development of tools for performance and effectiveness evaluation of highly adaptive intelligent systems.
The comprehensive simulation environment features a high level of generality and covers a diversity of unmanned autonomous systems and their missions over all operational domains.
The current state of the art of modeling and performance assessment of UAS and their components has been assessed, and needs and gaps have been identified for future investigation.
References
- Anonymous, Unmanned Systems Roadmap 2007–2032, Office of the Secretary of Defense, Washington, DC, USA, 2007.
- Anonymous, Unmanned Aircraft Systems Roadmap 2005–2030, Office of the Secretary of Defense, Washington, DC, USA, 2005.
- H. M. Huang, Ed., Terminology for Specifying the Autonomy Levels for Unmanned Systems, Version 1.0, NIST Special Publication 1011, National Institute of Standards and Technology, Gaithersburg, Md, USA, 2004.
- H.-M. Huang, K. Pavek, B. Novak, J. Albus, and E. Messina, “A framework for autonomy levels for unmanned systems (ALFUS),” in Proceedings of the AUVSI's Unmanned Systems North America, pp. 849–863, Baltimore, Md, USA, 2005.
- Anonymous, Information Technology Standards Registry, U.S. Department of Defense, 2007.
- J. Albus et al., 4D/RCS: A Reference Model Architecture for Unmanned Vehicle Systems. Version 2.0, U.S. Department of Commerce, National Institute of Standards and Technology, Technology Administration, Gaithersburg, Md, USA, 2002.
- Anonymous, The Joint Architecture for Unmanned Systems—Domain Model, Version 3.2, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, 2005.
- E. H. Page and R. Smith, “Introduction to military training simulation: a guide for discrete event simulationists,” in Proceedings of the Winter Simulation Conference, vol. 1, pp. 53–60, Washington, DC, USA, 1998.
- A. Verbraeck and E. C. Valentin, “The use of building blocks to enhance flexibility and maintainability of simulation models and simulation libraries,” in Proceedings of the 13th European Simulation Symposium (ESS '01), Gent, Belgium, 2001.
- N. M. Jodeh, P. A. Blue, and A. A. Waldron, “Development of small unmanned aerial vehicle research platform: modeling and simulating with flight test validation,” in Proceedings of AIAA Modeling and Simulation Technologies Conference, vol. 1, pp. 369–391, Keystone, Colo, USA, 2006.
- E. R. Mueller, “Hardware-in-the-loop simulation design for evaluation of unmanned aerial vehicle control systems,” in Proceedings of AIAA Modeling and Simulation Technologies Conference, vol. 1, pp. 530–543, Hilton Head, SC, USA, 2007.
- M. G. Perhinschi, M. R. Napolitano, G. Campa, and M. L. Fravolini, “A simulation environment for testing and research of neurally augmented fault tolerant control laws based on non-linear dynamic inversion,” in Proceedings of the AIAA Modeling and Simulation Technologies Conference, vol. 1, pp. 147–157, 2004.
- S. J. Rasmussen and P. R. Chandler, “MultiUAV: a multiple UAV simulation for investigation of cooperative control,” in Proceedings of the Winter Simulation Conference, vol. 1, pp. 869–877, San Diego, Calif, USA, 2002.
- M. Yasar, D. O. Bridges, G. Mallapragada, and J. F. Horn, “A simulation test bed for coordination of unmanned rotorcraft and ground vehicles,” in Proceedings of AIAA Modeling and Simulation Technologies Conference, vol. 1, pp. 400–412, Keystone, Colo, USA, 2006.
- C. Patchett and V. Sastry, “A preliminary model of accident causality for uninhabited autonomous air systems and its implications for their decision architectures,” in Proceedings of the UKSim 10th International Conference on Computer Modelling and Simulation (EUROSIM/UKSim '08), pp. 487–492, Cambridge, UK, 2008.
- J. Craighead, R. Murphy, J. Burke, and B. Goldiez, “A survey of commercial & open source unmanned vehicle simulators,” in Proceedings of IEEE International Conference on Robotics and Automation, pp. 852–857, Rome, Italy, 2007.
- I. Clark, R. Miksa, D. Childs, and K. Mead, “The benefits of simulation to support training,” Armour Bulletin, vol. 32, no. 1, 1999.
- J. Twesme and A. Corzine, “Naval air systems command (NAVAIR) unmanned aerial vehicle (UAV) unmanned combat aerial vehicle (UCAV) distributed simulation infrastructure,” in Proceedings of the 2nd AIAA “Unmanned Unlimited” Systems, Technologies, and Operations—Aerospace, Land, and Sea Conference and Workshop and Exhibit, San Diego, Calif, USA, 2003.
- K. E. Lawson and C. T. Butler, “Overview of Man-in-the-Loop Air-to-Air System Performance Evaluation Model (MIL-AASPEM) II, D658-10485-1,” The Boeing Company, 1995.
- D. C. Mackenzie, R. C. Arkin, and J. M. Cameron, “Multiagent Mission Specification and Execution,” Autonomous Robots, vol. 4, no. 1, pp. 29–52, 1997.
- J. M. Sardella and D. L. High, “Integration of fielded army aviation simulators with ModSAF: the eighth army training solution,” in Proceedings of Interservice/Industry Training, Simulation and Education Conference, Orlando, Fla, USA, November 2000.
- P. Drewes, “Simulation based approach for unmanned system command and control,” in Proceedings of Florida Conference on Recent Advances in Robotics, Miami, Fla, USA, 2006.
- S. O'Day, M. Steinberg, C. Yglesias et al., “Metrics for intelligent autonomy,” in Proceedings of Performance Metrics for Intelligent Systems Workshop (PerMIS '04), Gaithersburg, Md, USA, 2004.
- A. Steinfeld, T. Fong, D. Kaber et al., “Common metrics for human-robot interaction,” in Proceedings of the 1st ACM Conference on Human-Robot Interaction, vol. 2006, pp. 33–40, Salt Lake City, Utah, USA, 2006.
- J. Albus, “Metrics and performance measures for intelligent unmanned ground vehicles,” in Proceedings of Performance Metrics for Intelligent Systems Workshop (PerMIS '02), Gaithersburg, Md, USA, 2002.
- J. L. Harbour, D. J. Bruemmer, and D. A. Few, “Measuring unmanned vehicle system performance: challenges and opportunities,” in AUVSI Unmanned Systems North America, Orlando, Fla, USA, 2006.
- C. Dimou, A. L. Symeonidis, and P. A. Mitkas, “Towards a generic methodology for evaluating MAS performance,” in Proceedings of the International Conference on Integration of Knowledge Intensive Multi-Agent Systems (KIMAS '07), pp. 174–179, Waltham, Mass, USA, 2007.
- S. S. Mulgand, K. A. Harper, G. L. Zacharias, and T. Menke, “SAMPLE: situation awareness model for pilot-in-the-loop evaluation,” in Proceedings of the 9th Conference on Computer Generated Forces and Behavioral Representation, Orlando, Fla, USA, 2000.
- J. A. Adams, “Unmanned vehicle situation awareness: a path forward,” in Proceedings of the Human Systems Integration Symposium, Annapolis, Md, USA, 2007.
- B. Touchton, T. Galluzzo, D. Kent, and C. Crane, “Perception and planning architecture for autonomous ground vehicles,” Computer, vol. 39, no. 12, pp. 40–47, 2006.
- A. Freedy, E. Freedy, J. DeVisser et al., “A complete simulation environment for measuring and assessing human-robot team performance,” in Proceedings of IEEE Safety, Security, and Rescue Robotics Conference, and Performance Metrics for Intelligent Systems Workshop, Gaithersburg, Md, USA, 2006.
- P. E. Pina, M. L. Cummings, J. W. Crandall, and M. Della Penna, “Identifying generalizable metric classes to evaluate human-robot teams,” in Proceedings of the Metrics for Human-Robot Interaction Workshop at the 3rd Annual Conference on Human-Robot Interaction, Amsterdam, The Netherlands, 2008.
- Anonymous, Unmanned Aircraft System Operations in UK Airspace—Guidance, Civil Aviation Authority, Directorate of Airspace Policy CAP-722, London, UK, 2008.
- Anonymous, Unmanned Aerial Vehicle Reliability Study, U.S. Department of Defense, Office of the Secretary of Defense, Washington, DC, USA, 2003.
- Anonymous, Reliability Modeling and Prediction, Military Standard MIL-STD-756B, Department of Defense, Washington, DC, USA, 1982.
- D. E. Peercy, “A software maintainability evaluation methodology,” IEEE Transactions on Software Engineering, vol. 7, no. 4, pp. 343–351, 1981.
- P. Dersin and J. Durand, “Mass-transit system service quality: tradeoff analysis on reliability, maintainability and logistics,” in Proceedings of the Annual Reliability and Maintainability Symposium, pp. 515–528, Washington, DC, USA, 1995.
- D. E. Mortin, “Analysis of logistic supportability for complex systems,” in Proceedings of the Annual Reliability and Maintainability Symposium, pp. 250–253, Los Angeles, Calif, USA, 1990.
- C. G. John and R. E. Schultz, “Simulation modeling for military logistics and supportability studies,” in Proceedings of the Winter Simulation Conference, pp. 485–490, Phoenix, Ariz, USA, December 1991.
- Robby, S. A. DeLoach, and V. A. Kolesnikov, “Using design metrics for predicting system flexibility,” in Proceedings of the 9th International Conference on Fundamental Approaches to Software Engineering (FASE '06), vol. 3922 of Lecture Notes in Computer Science, pp. 184–198, Vienna, Austria, 2006.
- L. Chung and N. Subramanian, “Process-oriented metrics for software architecture adaptability,” in Proceedings of the 5th IEEE International Conference on Requirements Engineering, pp. 310–311, Toronto, Canada, 2001.
- O. P. Rotaru and M. Dobre, “Reusability metrics for software components,” in Proceedings of the 3rd ACS/IEEE International Conference on Computer Systems and Applications, pp. 81–88, Cairo, Egypt, 2005.
- M. D. Petty, “Two aspects of composability: lexicon and theory,” in Proceedings of Defense Modeling and Simulation Office Workshop on Composable Modeling and Simulation, Monterey, Calif, USA, 2002.
- H. Shim, T. J. Koo, F. Hoffmann, and S. Sastry, “A comprehensive study of control design for an autonomous helicopter,” in Proceedings of the IEEE Conference on Decision and Control (CDC '98), vol. 4, pp. 3653–3658, Tampa, Fla, USA, 1998.
- G. S. Sukhatme and J. F. Montgomery, “Heterogeneous robot group control and applications,” in Proceedings of the International Conference of the Association for Unmanned Vehicle Systems, Baltimore, Md, USA, 1999.
- H. Durrant-Whyte, “A critical review of the state-of-the-art in autonomous land vehicle systems and technology,” Sandia Report SAND2001-3685, Sandia National Laboratories, 2001.
- A. Leonessa, J. Mandello, Y. Morel, and M. Vidal, “Design of a small, multi-purpose, autonomous surface vessel,” in Proceedings of IEEE Oceans Conference, vol. 1, pp. 544–550, San Diego, Calif, USA, 2003.
- Anonymous, The Navy Unmanned Undersea Vehicle (UUV) Master Plan, U.S. Department of the Navy, Washington, DC, USA, 2004.
- A. W. Stroupe, S. Singh, R. Simmons et al., “Technology for autonomous space systems,” Tech. Rep. CMU-RI-TR-00-02, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pa, USA, 2002.
- P. H. Zipfel, Modeling and Simulation of Aerospace Vehicle Dynamics, AIAA Education Series, AIAA, Reston, Va, USA, 2000.
- M. E. Dreier, Introduction to Helicopter and Tiltrotor Flight Simulation, AIAA Education Series, AIAA, Reston, Va, USA, 2007.
- D. Garg and M. Kumar, “Sensor modeling and multi-sensor data fusion,” Final Progress Report W911NF-01-10434, Army Research Office, 2005.
- P. Capone, “Actuator modeling and aircraft related performance: selection of the appropriate actuation system,” in Proceedings of the AIAA Modeling and Simulation Technologies Conference, Denver, Colo, USA, 2000.
- J. Pritchard, “Overview of landing gear dynamics,” Journal of Aircraft, vol. 38, no. 1, pp. 130–137, 2001.
- M. Lichtsinder and Y. Levy, “Jet engine model for control and real-time simulations,” Journal of Engineering for Gas Turbines and Power, vol. 128, no. 4, pp. 745–753, 2006.
- M. L. Steinberg, “A comparison of intelligent, adaptive, and nonlinear flight control laws,” in Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, Portland, Ore, USA, 1999, AIAA 99-4044.
- K. A. Wise, “Guidance and control for military systems: future challenges,” in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, vol. 5, pp. 5382–5389, Hilton Head, SC, USA, 2007.
- F. De Crescenzio, G. Miranda, F. Persiani, and T. Bombardi, “Advanced interface for UAV (Unmanned Aerial Vehicle) ground control station,” in Proceedings of the AIAA Modeling and Simulation Technologies Conference and Exhibit, vol. 1, pp. 486–494, Hilton Head, SC, USA, 2007.
- M. L. Cummings, K. Meyers, and S. D. Scott, “Modified cooper harper evaluation tool for unmanned vehicle displays,” in Proceedings of the Unmanned Vehicle Systems Canada Conference, Montebello, Canada, 2006.
- J. Rasmussen, “Models of mental strategies in process plant diagnosis,” in Human Detection and Diagnosis of System Failures, J. Rasmussen and W. B. Rouse, Eds., Plenum Press, New York, NY, USA, 1981.
- U. Metzger and R. Parasuraman, “Effects of automated conflict cueing and traffic density on air traffic controller performance and visual attention in a datalink environment,” International Journal of Aviation Psychology, vol. 16, pp. 343–362, 2006.
- G. Barbato, G. Feitshans, R. Williams, and T. Hughes, “Operator vehicle interface laboratory: unmanned combat air vehicle controls & displays for suppression of enemy air defences,” in Proceedings of the 12th International Symposium on Aviation Psychology, Dayton, Ohio, USA, 2003.
- J. Keller, “Human performance modeling for discrete-event simulation: workload,” in Proceedings of the Winter Simulation Conference, vol. 1, pp. 157–162, San Diego, Calif, USA, 2002.
- V. K. Kongezos and C. R. Allen, “Wireless communication between A.G.V.'s (Autonomous Guided Vehicle) and the industrial network C.A.N. (Controller Area Network),” in Proceedings of IEEE International Conference on Robotics and Automation, vol. 1, pp. 434–437, Washington, DC, USA, 2002.
- S. C. T. Poh, Simulations of diversity techniques for urban UAV data links, M.S. thesis, Naval Postgraduate School, Monterey, Calif, USA, 2004.
- C. P. Evans III, Development of world modeling methods for autonomous systems based on the joint architecture for unmanned system, M.S. thesis, University of Florida, 2004.
- R. Touchton, D. Kent, T. Galluzzo et al., “Planning and modeling extensions to the joint architecture for unmanned systems (JAUS) for application to unmanned ground vehicles,” in Unmanned Ground Vehicle Technology VII, vol. 5804 of Proceedings of SPIE, pp. 146–155, Orlando, Fla, USA, March 2005.
- G. Vachtsevanos, L. Tang, G. Drozeski, and L. Gutierrez, “From mission planning to flight control of unmanned aerial vehicles: strategies and implementation tools,” Annual Reviews in Control, vol. 29, no. 1, pp. 101–115, 2005.
- M. Pachter and P. Chandler, “Challenges of autonomous control,” IEEE Control Systems Magazine, vol. 18, no. 4, 1998.
- S. A. Bortoff, “Path planning for UAVs,” in Proceedings of the American Control Conference, vol. 1, pp. 364–368, Chicago, Ill, USA, June 2000.
- J. S. Gansler and W. Lucyshyn, Evaluation of Performance Based Logistics, University of Maryland and Naval Postgraduate School, 2006, UMD-LM-06-040.
- Y. Haimes, Risk Modeling, Assessment, and Management, John Wiley & Sons, New York, NY, USA, 2nd edition, 2004.
- R. E. Weibel and R. J. Hansman Jr., “An integrated approach to evaluating risk mitigation measures for UAV operational concepts in the NAS,” in Proceedings of InfoTech at Aerospace: Advancing Contemporary Aerospace Technologies and Their Integration, vol. 1, pp. 509–519, Arlington, Va, USA, September 2005.
- C. Reece, R. Walker, N. Fulton, and D. Campbell, “A casualty risk analysis for unmanned aerial system (UAS) operations over inhabited areas,” in Proceedings of the 12th Australian International Aerospace Congress, and the 2nd Australasian Unmanned Air Vehicles Conference, Melbourne, Australia, 2007.
- J. D. Bošković, S.-M. Li, and R. K. Mehra, “On-line failure detection and identification (FDI) and adaptive reconfigurable control (ARC) in aerospace applications,” in Proceedings of the American Control Conference, vol. 4, pp. 2625–2626, 2001.
- M. G. Perhinschi, G. Campa, M. R. Napolitano, M. Lando, L. Massotti, and M. L. Fravolini, “Modeling and simulation of a fault tolerant control system,” International Journal of Modelling and Simulation, vol. 26, no. 1, pp. 1–10, 2006.
- B. J. Bacon and I. M. Gregory, “General equations of motion for a damaged asymmetric aircraft,” in Proceedings of the AIAA Atmospheric Flight Mechanics Conference and Exhibit, vol. 1, pp. 63–75, Hilton Head, SC, USA, 2007.
- K. Jha, R. Wray, C. Lebiere et al., “Towards a complete, multi-level cognitive architecture,” in Proceedings of the International Conference for Cognitive Modeling, Ann Arbor, Mich, USA, 2007.
- M. L. Hanson and K. A. Harper, “An intelligent agent for supervisory control of teams of uninhabited combat air vehicles (UCAVs),” in Proceedings of the Unmanned Systems Conference, Orlando, Fla, USA, 2000.
- R. Touchton, An adaptive planning framework for situation assessment and decision-making of an autonomous ground vehicle, Ph.D. thesis, University of Florida, 2006.
- P. Narayan, P. Wu, D. Campbell, and R. Walker, “An intelligent control architecture for unmanned aerial systems (Uas) in the national airspace system (Nas),” in Proceedings of the 2nd Australasian Unmanned Air Vehicle Systems Conference, Melbourne, Australia, 2007.
- C. E. Nehme, J. W. Crandall, and M. L. Cummings, “Using discrete-event simulation to model situational awareness of unmanned-vehicle operators,” in Proceedings of the Virginia Modeling, Analysis and Simulation Center Capstone Conference, Norfolk, Va, USA, 2008.
- K. M. Reichard and E. C. Crow, “Intelligent self-situational awareness for unmanned and robotic platforms,” in Proceedings of AUVSI Unmanned Systems North America, pp. 1221–1235, Baltimore, Md, USA, 2005.
- K. Pederson, R. Lethin, J. Springer, R. Manohar, and R. Melhem, “Enabling cognitive architectures for UAV mission planning,” in Proceedings of the 10th Annual Workshop High Performance Embedded Computing, Burlington, Mass, USA, 2006.
- X. Zhang, S. Yoon, P. DiBona et al., “An ensemble learning and problem solving architecture for airspace management,” in Proceedings of the 21st Innovative Applications of Artificial Intelligence Conference, pp. 203–210, 2009.
- S. Yoon and S. Kambhampati, “Hierarchical strategy learning with hybrid representations,” in Proceedings of the AAAI Workshop on Acquiring Planning Knowledge via Demonstration, pp. 52–56, 2007.
- L. Mazzucchelli and A. Monteleone, “A 4D immersive virtual reality system for air traffic control,” in Proceedings of the 7th EUROCONTROL Innovative Research Workshop & Exhibition, EUROCONTROL Experimental Centre, December 2008.
- M. Porretta, M.-D. Dupuy, W. Schuster, A. Majumdar, and W. Ochieng, “Performance evaluation of a novel 4D trajectory prediction model for civil aircraft,” Journal of Navigation, vol. 61, no. 3, pp. 393–420, 2008.
- J. Kosecka, C. Tomlin, G. Pappas, and S. Sastry, “Generation of conflict resolution maneuvers for air traffic management,” in Proceedings of IEEE International Conference on Intelligent Robots and Systems, vol. 3, pp. 1598–1603, Grenoble, France, 1997.
- J. K. Kuchar and L. C. Yang, “A review of conflict detection and resolution modeling methods,” IEEE Transactions on Intelligent Transportation Systems, vol. 1, no. 4, pp. 179–189, 2000.
- G. Dowek and C. Muñoz, “Conflict detection and resolution for 1,2,... N aircraft,” in Proceedings of the 7th AIAA Aviation Technology, Integration, and Operations Conference, vol. 1, pp. 403–415, Belfast, Northern Ireland, 2007.
- T. C. Farley and H. Erzberger, “Fast-time simulation evaluation of a conflict resolution algorithm under high air traffic demand,” in Proceedings of the 7th USA/Europe ATM R&D Seminar, Barcelona, Spain, July 2007.
- R. Sharma and D. Ghose, “Swarm intelligence based collision avoidance between realistically modelled UAV clusters,” in Proceedings of the American Control Conference, pp. 3892–3897, New York, NY, USA, July 2007.
- A. Kelly, A. Stentz, O. Amidi et al., “Toward reliable off road autonomous vehicles operating in challenging environments,” International Journal of Robotics Research, vol. 25, no. 5-6, pp. 449–483, 2006.
- M. T. DeGarmo, Issues Concerning Integration of Unmanned Aerial Vehicles in Civil Airspace, Federal Aviation Administration, MITRE Center for Advanced Aviation System Development, McLean, Va, USA, 2004.
- R. Sleeman, A. Cox, and J. Colledge, “Systems approach to unmanned air vehicle development and certification,” in Improvements in System Safety: Proceedings of the 16th Safety-Critical Systems Symposium, Bristol, UK, 2008.
Copyright © 2010 M. G. Perhinschi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.