Objective:
IED and landmine detection has been a research topic for many years, but progress is hindered by the lack of standardised benchmarks. Representative testing environments are needed to enable an objective and comparable evaluation of the developed systems.
Furthermore, field tests cannot be repeated at will and are not perfectly reproducible, especially for detection systems that involve artificial intelligence. Online tests of software components, for which measurements are easily reproducible and which enable short development cycles, should therefore also be organised. Since little data is readily available, the data for the online tests needs to be collected during field tests organised earlier in the challenge. This combination of field tests and online tests is needed to drive rapid progress towards operational goals.
Scope and types of activities
Scope
Proposals should address the organisation of a technological challenge on IED and landmine detection based on the preliminary evaluation plan provided as part of the call documents. This includes the collection of data recorded by the participating teams during field tests, the annotation of this data and the sharing of the resulting databases.
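Purely as an illustration of the kind of annotated database entry that could be shared, the sketch below defines a hypothetical record structure in Python; all field names, units and the JSON serialisation are assumptions and are not prescribed by the call documents.

```python
from dataclasses import dataclass, asdict
from typing import List
import json

@dataclass
class AnnotatedDetectionRecord:
    """Hypothetical ground-truth annotation for one buried object in a field-test lane."""
    team_id: str             # participating team that recorded the sensor data
    lane_id: str             # test lane or area identifier
    object_type: str         # e.g. "landmine", "IED", "clutter"
    easting_m: float         # position within the lane, in metres
    northing_m: float
    depth_m: float           # burial depth, in metres
    sensor_files: List[str]  # raw measurement files associated with this object

# Example entry, serialised to JSON so it could be exchanged between stakeholders.
record = AnnotatedDetectionRecord(
    team_id="team_03", lane_id="lane_A", object_type="landmine",
    easting_m=12.4, northing_m=3.1, depth_m=0.15,
    sensor_files=["gpr_run_017.h5"],
)
print(json.dumps(asdict(record), indent=2))
```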
Proposals should include clear descriptions of the criteria used to assess work package completion. These criteria should include the production of detailed evaluation plans agreed upon by all stakeholders, the production of the annotated databases needed for the evaluations, the production of measurements, following these plans, for all systems submitted to the tests by the participating teams, and the organisation of the necessary events.
Types of activities
The following types of activities are eligible for this topic:
Types of activities (art 10(3) EDF Regulation) | Eligible?
(a) | Activities that aim to create, underpin and improve knowledge, products and technologies, including disruptive technologies, which can achieve significant effects in the area of defence (generating knowledge) | Yes (optional) |
(b) | Activities that aim to increase interoperability and resilience, including secured production and exchange of data, to master critical defence technologies, to strengthen the security of supply or to enable the effective exploitation of results for defence products and technologies (integrating knowledge) | Yes (mandatory) |
(c) | Studies, such as feasibility studies to explore the feasibility of new or upgraded products, technologies, processes, services and solutions | Yes (optional) |
(d) | Design of a defence product, tangible or intangible component or technology as well as the definition of the technical specifications on which such design has been developed, including partial tests for risk reduction in an industrial or representative environment | Yes (optional) |
(e) | System prototyping of a defence product, tangible or intangible component or technology (prototype) | No |
(f) | Testing of a defence product, tangible or intangible component or technology | No |
(g) | Qualification of a defence product, tangible or intangible component or technology | No |
(h) | Certification of a defence product, tangible or intangible component or technology | No |
(i) | Development of technologies or assets increasing efficiency across the life cycle of defence products and technologies | No |
The proposals must address in particular the following as part of the mandatory activities:
Functional requirements
The proposed solutions should make it possible to measure the performance of the tested systems according to detailed evaluation plans based on the preliminary evaluation plan provided as part of the call documents. Key aspects of the foreseen detailed evaluation plans and the associated data management should be described in the proposals. Proposals should in particular describe:
The testing environment should be able to accommodate up to six participating teams.
During the challenge, drafts of the detailed evaluation plans should be submitted for discussion to the participating teams and to any stakeholder designated by the funding authority, early enough for their feedback to be taken into account in the actual evaluation campaigns. Any evolution of the evaluation plans should take into account the technical possibilities and cost, the scientific relevance of the measurements, and the representativeness of the metrics and protocols with respect to military needs. Any change on which no consensus is reached should be documented and justified.
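Purely as an illustration of the kind of measurement such a detailed evaluation plan might specify, the sketch below scores a system's reported detections against annotated ground truth using a probability of detection and a false-alarm rate per square metre; the matching rule, the chosen metrics and the data layout are assumptions and are not taken from the preliminary evaluation plan.

```python
import math

def score_detections(ground_truth, detections, surveyed_area_m2, match_radius_m=0.25):
    """Match reported detections to annotated targets and compute simple metrics.

    ground_truth / detections: lists of (easting_m, northing_m) tuples.
    A detection counts as a hit if it lies within match_radius_m of a
    not-yet-matched ground-truth target; the remaining detections are false alarms.
    """
    unmatched = list(ground_truth)
    hits, false_alarms = 0, 0
    for dx, dy in detections:
        best = None
        for i, (gx, gy) in enumerate(unmatched):
            if math.dist((dx, dy), (gx, gy)) <= match_radius_m:
                best = i
                break
        if best is not None:
            hits += 1
            del unmatched[best]
        else:
            false_alarms += 1
    pd = hits / len(ground_truth) if ground_truth else float("nan")
    far = false_alarms / surveyed_area_m2  # false alarms per square metre
    return {"probability_of_detection": pd, "false_alarm_rate_per_m2": far}

# Example: two annotated targets, one correctly located detection and one false alarm.
print(score_detections(
    ground_truth=[(1.0, 2.0), (5.5, 3.2)],
    detections=[(1.1, 2.05), (4.0, 4.0)],
    surveyed_area_m2=200.0,
))
```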
Expected Impact:
The expected impacts are: