Challenge 1: additional information and innovation challenge elements
Updated 10 August 2017
1. Additional information and innovation challenge elements
1.1 Allowing rapid and automated integration of new sensors
Military commanders, working at pace in Information-Age operations, need seamless access to, and the ability to exploit, a wide range of sensors, some of which are unknown prior to an operation.
We envisage a situation where available sensors (for example off-the-shelf products, or equipment from allies or host nations) are quickly adopted for use in military operations, rapidly integrated into military collection systems without prior knowledge of format standards, and their data processed and fused automatically to provide the critical information commanders need with the lowest possible cognitive burden.
Given the scale, diversity and complexity of this mix of sensor information, automation of integration, processing and sensor management is critical, but this brings key challenges that need to be solved.
Example within a military scenario:
An intelligent system rapidly integrates raw or processed data from a network of previously unknown autonomous vehicles and fixed sensors in the maritime environment, drawn from a variety of sources. An intelligent tasking and sensor management system considers all of the available sources at any particular time, computes the optimal collection tasks (possibly in collaboration with the human operator), and directs the sources to collect data in a manner which best meets mission objectives. The system scans the data from multiple input feeds and detects correlations in time and space to identify entities and their movements. It fuses inputs from multiple source types to deliver the best result. Over time, the sensing system also builds higher-level information such as patterns, tracks, hostile actor meetings and events, and likely liaisons and relationships.
2. What we are interested in specific to challenge 1
- solutions that address one aspect of the problem area, work across a broad range of input data types, and scale as the number of sensors and targets increases
- solutions that work in real (or near-real) time
- solutions that show awareness of the Dstl/Innovate UK funded project SAPIENT
- solutions that show strong linkages to challenges 2 and 3
3. What we are not interested in specific to challenge 1
- mechanisms to enable non-cooperative access to collection assets; in other words access to equipment that isn’t owned or controlled by UK Defence or cooperating parties
- poorly scalable solutions, in other words systems with fundamental constraints on the number of sensors they can handle
- distributed architectures. This challenge assumes a single integrating system and hence a centralised architecture. While fully distributed architectures (those with a non-tree network structure and no central or top-level node) are out of scope for this competition, hierarchical architectures are in scope
4. Technical areas of investigation: allowing rapid and automated integration of new sensors
Sensors are increasingly ubiquitous and diverse. Even now they range from conventional military sensors, such as radar and cameras; through sensors owned by allies or host nations, including civil sources; to commercially available sensors such as security systems or those mounted on quadcopters. To improve Defence’s ability to operate in the information age we need to be able to rapidly adopt and automatically integrate a multitude of sensing assets to meet a dynamically changing military scenario.
Defence exploitation of many of these sensors is limited by laborious data integration demands and by the scale and complexity of the data available. As a result, Defence understanding and decision making is currently based mainly on data from expensive, bespoke military sensors, which severely limits our ability to understand situations more broadly. The inflexibility of this approach also creates significant barriers to acting in an agile way in response to changing situations.
Integration and exploitation of these new collection assets brings the additional challenges of automated fusion and management of the assets, both of which will be essential to reduce the cognitive burden of dealing with this deluge of data.
5. Specific areas of interest for challenge 1
5.1 Integration of ‘Raw Data’ sensors
Automatic integration of data from sensors that provide unprocessed data (for example streaming video) presents challenges due to the large variety of formats, standards, data rates and compression schemes that exist. Of course it might be possible to pre-install a catalogue of drivers to allow connectivity with known systems. However, the validity of this approach would soon expire as new formats are developed.
This competition seeks innovative techniques that allow the connecting system to infer the format of incoming data automatically, using meta-level parameters of the data modality and recognisable scene elements. Where metadata accompanies the raw data, it should of course be put to best use.
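To make this concrete, the fragment below is a minimal, hedged sketch of one possible first step: recognising the container or modality of an unknown raw feed from well-known public byte signatures before falling back to metadata or scene-level analysis. The signature table and the `guess_container` helper are illustrative assumptions, not part of any Defence or SAPIENT standard.

```python
# Illustrative sketch only: guess the container/modality of an unknown raw
# feed from public magic-number signatures before falling back to metadata
# or scene-level analysis. The function and labels are hypothetical.

SIGNATURES = [
    (b"\xff\xd8\xff", 0, "jpeg still or motion-jpeg frame"),
    (b"\x89PNG\r\n\x1a\n", 0, "png still"),
    (b"RIFF", 0, "riff container (avi/wav)"),
    (b"ftyp", 4, "iso base media (mp4/mov)"),
    (b"\x1a\x45\xdf\xa3", 0, "matroska/webm"),
]

def guess_container(header: bytes) -> str:
    """Return a best-effort label for the stream, or 'unknown'."""
    for magic, offset, label in SIGNATURES:
        if header[offset:offset + len(magic)] == magic:
            return label
    # MPEG transport streams repeat a 0x47 sync byte every 188 bytes.
    if len(header) > 188 and header[0] == 0x47 and header[188] == 0x47:
        return "mpeg transport stream"
    return "unknown"

# Example usage: peek at the first kilobyte of an incoming feed.
# with open("unknown_feed.bin", "rb") as f:
#     print(guess_container(f.read(1024)))
```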
Where data has inadequate or error-prone geo- or time-tagging, small calibration errors can lead to the integrating system making incorrect associations between observations from different sensors. Therefore methods for automated registration between sensors are required.
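As a hedged illustration of automated temporal registration, the sketch below estimates a gross clock offset between two sensors by cross-correlating their binned detection counts; the binning, variable names and synthetic example are assumptions made purely for illustration.

```python
import numpy as np

def estimate_time_offset(counts_a, counts_b, bin_seconds=1.0):
    """Estimate, in seconds, how far series A trails series B by finding the
    lag that maximises the cross-correlation of binned detection counts.
    A positive result means sensor A reports matching events later than B."""
    a = np.asarray(counts_a, dtype=float) - np.mean(counts_a)
    b = np.asarray(counts_b, dtype=float) - np.mean(counts_b)
    corr = np.correlate(a, b, mode="full")
    lags = np.arange(-(len(b) - 1), len(a))   # lag 0 sits at index len(b) - 1
    return lags[np.argmax(corr)] * bin_seconds

# Synthetic example: sensor B sees the same activity as sensor A, 3 bins later.
rng = np.random.default_rng(0)
a = rng.poisson(2.0, 200).astype(float)
b = np.concatenate([np.zeros(3), a[:-3]])
print(estimate_time_offset(b, a))   # approximately 3.0 (seconds)
```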
5.2 Integration of ‘Intelligent’ information sources
As opposed to raw data sensors, ‘Intelligent’ sensors will typically have applied some detection, tracking, recognition or identification algorithms to produce ‘observations’ (for example, information about entities in the environment). However, the integrating system would not necessarily know the full performance details of the local algorithm, nor would the form of the observations produced at the sensor necessarily match that expected by the integrating system. Some work has already been done in this space in the Dstl/Innovate UK funded project SAPIENT, and applicants are recommended to show cognisance of it. The published SAPIENT Interface Control Document (ICD) is consultative, and work funded as a result of this competition could inform future versions.
Relevant issues include:
- can an integrating system automatically determine or learn sensors’ algorithm performance parameters (for example probability of detection and false alarm rate) given supporting data or observations? (a minimal sketch follows this list)
- integration of sensors that output new types of observations requires the system’s ontology to be updated on the fly without the need for any operator interaction
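As the first bullet above references, the fragment below is a minimal sketch of learning a sensor's probability of detection and false alarm rate on the fly from ground-truthed encounters, using simple Beta-Bernoulli updates. The encounter format, priors and example figures are assumptions for illustration only.

```python
# Illustrative sketch: learn a sensor's probability of detection (Pd) and
# false alarm rate from ground-truthed encounters using a Beta-Bernoulli
# model. The encounter format is a hypothetical assumption.

from dataclasses import dataclass

@dataclass
class DetectionModel:
    # Beta(1, 1) priors, i.e. uniform over [0, 1] until evidence arrives.
    det_hits: float = 1.0      # detections when a target was truly present
    det_misses: float = 1.0    # missed detections
    fa_hits: float = 1.0       # detections when no target was present
    fa_clear: float = 1.0      # correctly silent opportunities

    def update(self, target_present: bool, sensor_declared: bool) -> None:
        """Fold in one ground-truthed encounter."""
        if target_present and sensor_declared:
            self.det_hits += 1
        elif target_present:
            self.det_misses += 1
        elif sensor_declared:
            self.fa_hits += 1
        else:
            self.fa_clear += 1

    @property
    def pd(self) -> float:
        """Posterior mean probability of detection."""
        return self.det_hits / (self.det_hits + self.det_misses)

    @property
    def far(self) -> float:
        """Posterior mean false alarm rate (per opportunity)."""
        return self.fa_hits / (self.fa_hits + self.fa_clear)

# Example: eleven labelled encounters from a newly attached sensor.
model = DetectionModel()
encounters = [(True, True)] * 7 + [(True, False)] + [(False, True)] + [(False, False)] * 2
for present, declared in encounters:
    model.update(present, declared)
print(f"Pd ≈ {model.pd:.2f}, FAR ≈ {model.far:.2f}")
```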
Applicants are encouraged to adopt the W3C recommendation of JavaScript Object Notation for Linked Data (JSON-LD), a JSON-based format for serialising Linked Data.
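To make the JSON-LD suggestion concrete, the fragment below serialises a hypothetical observation as JSON-LD using only the Python standard library. The vocabulary (schema.org terms) and field names are purely illustrative assumptions and do not represent a Defence or SAPIENT schema.

```python
import json

# Illustrative only: a hypothetical sensor observation serialised as JSON-LD.
# The @context maps short names to full IRIs so independently built systems
# can agree on meaning; schema.org is used here purely as an example vocabulary.
observation = {
    "@context": "https://schema.org/",
    "@type": "Observation",
    "@id": "urn:example:observation:0001",
    "observationDate": "2017-08-10T09:30:00Z",
    "location": {
        "@type": "GeoCoordinates",
        "latitude": 51.5072,
        "longitude": -0.1276,
    },
    "description": "Possible small vessel, confidence 0.7",
}

print(json.dumps(observation, indent=2))
```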
5.3 Processing
Raw data requires processing to integrate it into the system. For this to occur autonomously, a suite of processing services should be available and automatically activated in order to create the processing stream required to service the mission objectives. This competition is not seeking the processing services themselves but is looking for ways to autonomously decompose the mission objectives and map these to a processing pipeline in order to satisfy those objectives. This is challenging in situations where the characteristics of the sensor are not known prior to the operation, and in situations that require dynamic re-optimisation as new sensors are tasked.
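The sketch below illustrates, under heavy simplification, one way such a decomposition might look: a hypothetical registry of processing services described only by their input and output data types, searched breadth-first to assemble the shortest chain from a newly attached sensor's native output to the product an objective requires. Service names and data types are invented for illustration.

```python
from collections import deque

# Hypothetical processing services, described only by input/output data types.
SERVICES = [
    ("video-stream", "video-frames", "frame-extractor"),
    ("video-frames", "detections", "object-detector"),
    ("detections", "tracks", "multi-target-tracker"),
    ("sar-image", "detections", "sar-detector"),
    ("tracks", "pattern-of-life", "pattern-analyser"),
]

def plan_pipeline(source_type: str, required_product: str):
    """Breadth-first search for the shortest chain of services turning the
    sensor's native output into the product a mission objective needs."""
    queue = deque([(source_type, [])])
    seen = {source_type}
    while queue:
        data_type, chain = queue.popleft()
        if data_type == required_product:
            return chain
        for in_type, out_type, name in SERVICES:
            if in_type == data_type and out_type not in seen:
                seen.add(out_type)
                queue.append((out_type, chain + [name]))
    return None  # no known pipeline satisfies the objective

# Example: a newly attached video sensor must feed a tracking objective.
print(plan_pipeline("video-stream", "tracks"))
# -> ['frame-extractor', 'object-detector', 'multi-target-tracker']
```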
5.4 Fusion
Multimodal fusion is the combination of data from multiple sources to achieve improved accuracy and more specific inferences than could be achieved by the use of a single sensor alone. This is not easy even when the data sources are known in advance, and it tends to require bespoke solutions. The challenge in this competition is to find innovative ways to perform fusion on data sources that are not fully characterised, particularly when the data sources show uncertainty or bias, or are not well trusted. Building sensor models through online learning could be involved, which links to challenge 2. Learning based on intensive operator supervision should be minimised, but assisted learning could be a feature of human-machine teaming proposals under challenge 3.
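As one hedged illustration of fusing sources that are not fully characterised, the sketch below combines scalar estimates by inverse-variance weighting and gives any source with an unknown variance a pessimistic default, so that poorly characterised or less trusted feeds carry little weight. The default value and example figures are assumptions, not a recommended method.

```python
import numpy as np

def fuse_estimates(estimates, variances, default_variance=100.0):
    """Inverse-variance weighted fusion of scalar estimates (e.g. range in metres).
    Sources with unknown variance (None) are given a pessimistic default so
    poorly characterised or untrusted feeds contribute little weight."""
    est = np.asarray(estimates, dtype=float)
    var = np.array([default_variance if v is None else v for v in variances], dtype=float)
    weights = 1.0 / var
    fused = np.sum(weights * est) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused, fused_variance

# Example: two characterised sensors plus one uncharacterised feed.
fused, fused_var = fuse_estimates([1010.0, 995.0, 1200.0], [25.0, 16.0, None])
print(fused, fused_var)
```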
5.5 Autonomous sensor management
Planning and decision making is not a static task. Sensors may be actively retasked to collect data from a different area or with different data collection parameters. To cope with a dynamically-changing situation characterised by high-volume data, sensor management must be autonomous. It must have the ability to “understand” a complex situation with its associated uncertainty, target dynamics, constraints and restrictions, and compute the optimal sensor taskings in real time. This is challenging even where the sensor resources are fully characterised, but it becomes a significant problem where the integrating system does not have a fully characterised model of how a sensor implements a task; for example, how long it takes a sensor system hosted on an unmanned vehicle to fly to an area and make observations.
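Purely to illustrate the flavour of the problem, the sketch below is a greedy allocator that gives each sensor the unassigned task with the highest expected information gain per expected minute, substituting a pessimistic duration wherever the sensor's task-execution time is uncharacterised. Every name, score and duration is a hypothetical assumption, and the greedy rule is a simplification, not a proposed solution.

```python
# Illustrative greedy allocator: each sensor takes the unassigned task with the
# best expected information gain per expected minute. Where transit/collection
# time is uncharacterised (None), a pessimistic default is assumed.

PESSIMISTIC_MINUTES = 60.0

def allocate(sensors, tasks, expected_minutes, expected_gain):
    """sensors, tasks: lists of names.
    expected_minutes[(sensor, task)]: float, or missing/None if uncharacterised.
    expected_gain[task]: expected information gain if the task is collected."""
    assignment = {}
    remaining = set(tasks)
    for sensor in sensors:
        if not remaining:
            break
        def rate(task):
            minutes = expected_minutes.get((sensor, task))
            if minutes is None:
                minutes = PESSIMISTIC_MINUTES
            return expected_gain[task] / minutes
        best = max(remaining, key=rate)
        assignment[sensor] = best
        remaining.discard(best)
    return assignment

# Example: two sensors, three candidate collection tasks.
print(allocate(
    sensors=["uav-1", "coastal-radar"],
    tasks=["harbour-mouth", "anchorage", "shipping-lane"],
    expected_minutes={("uav-1", "harbour-mouth"): 12, ("uav-1", "anchorage"): 30,
                      ("coastal-radar", "shipping-lane"): 1, ("coastal-radar", "anchorage"): 1},
    expected_gain={"harbour-mouth": 5.0, "anchorage": 3.0, "shipping-lane": 2.0},
))
```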
Again, the work performed in the SAPIENT project will provide applicants with a useful grounding.