Autonomous aircraft

{{short description|Aircraft which flies without a human pilot}}

{{Use American English|date=April 2020}}

{{Use dmy dates|date=October 2019}}

{{Update|date=February 2022}}

An autonomous aircraft is an aircraft which flies under the control of on-board autonomous robotic systems and needs no intervention from a human pilot or remote control. Most contemporary autonomous aircraft are unmanned aerial vehicles (drones) with pre-programmed algorithms to perform designated tasks, but advances in artificial intelligence technologies (e.g. machine learning) mean that autonomous control systems are maturing to the point where several air taxis, and the regulatory regimes to govern them, are under development.

History

=Unmanned aerial vehicles=

{{Main|History of unmanned aerial vehicles}}

[[File:Winston Churchill and the Secretary of State for War waiting to see the launch of a de Havilland Queen Bee radio-controlled target drone, 6 June 1941. H10307.jpg|thumb|Winston Churchill and others waiting to watch the launch of a de Havilland Queen Bee target drone, 6 June 1941]]

The earliest recorded use of an unmanned aerial vehicle for warfighting occurred in July 1849,[https://books.google.com/books?id=YSSPAgAAQBAJ&pg=PT43 The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives], Asser Press{{snd}} Springer, chapter by Alan McKenna, page 355 when Austrian forces launched incendiary balloons against Venice from a ship serving as a balloon carrier (the precursor to the aircraft carrier).{{cite book|last=Kaplan|first=Philip|title=Naval Aviation in the Second World War|url=https://books.google.com/books?id=pDARBQAAQBAJ&pg=PT19|year=2013|publisher=Pen and Sword|isbn=978-1-4738-2997-8|page=19}} Significant development of radio-controlled drones started in the early 1900s, originally focused on providing practice targets for training military personnel. The earliest attempt at a powered UAV was A. M. Low's "Aerial Target" in 1916.{{cite book|last=Taylor|first=John W. R.|title=Jane's Pocket Book of Remotely Piloted Vehicles}}

Autonomous features such as the autopilot and automated navigation were developed progressively through the twentieth century, although techniques such as terrain contour matching (TERCOM) were applied mainly to cruise missiles.

Some modern drones have a high degree of autonomy, although they are not fully capable and the regulatory environment prohibits their widespread use in civil aviation. However, some limited trials have been undertaken.

=Passengers=

As flight, navigation and communications systems have become more sophisticated, safely carrying passengers has emerged as a practical possibility. Autopilot systems are relieving the human pilot of progressively more duties, but the pilot currently remains necessary.

A number of air taxis are under development and larger autonomous transports are also being planned. The personal air vehicle is another class, typically carrying one to four passengers who are not expected to be able to pilot the aircraft themselves; autonomy is therefore seen as necessary for widespread adoption.

Control system architecture

The computing capability of aircraft flight and navigation systems has followed advances in computing technology, beginning with analog controls and evolving through microcontrollers to systems-on-a-chip (SoC) and single-board computers (SBC).

=Sensors=

Position and movement sensors give information about the aircraft state. Exteroceptive sensors deal with external information like distance measurements, while exproprioceptive ones correlate internal and external states.

Non-cooperative sensors are able to detect targets autonomously so they are used for separation assurance and collision avoidance.{{Cite journal|last1=Fasano|first1=Giancarmine|last2=Accardo|first2=Domenico|last3=Tirri|first3=Anna Elena|last4=Moccia|first4=Antonio|last5=De Lellis|first5=Ettore|date=1 October 2015|title=Radar/electro-optical data fusion for non-cooperative UAS sense and avoid|journal=Aerospace Science and Technology|volume=46|pages=436–450|doi=10.1016/j.ast.2015.08.010|doi-access=free|bibcode=2015AeST...46..436F }}

Degrees of freedom (DOF) refers to both the amount and quality of sensors on board: 6 DOF implies 3-axis gyroscopes and accelerometers (a typical inertial measurement unit{{snd}} IMU), 9 DOF refers to an IMU plus a compass, 10 DOF adds a barometer and 11 DOF usually adds a GPS receiver.{{Cite web|title = Arduino Playground – WhatIsDegreesOfFreedom6DOF9DOF10DOF11DOF|url = http://playground.arduino.cc/Main/WhatIsDegreesOfFreedom6DOF9DOF10DOF11DOF|website = playground.arduino.cc|access-date = 4 February 2016}}
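
To illustrate how data from such an IMU is typically fused, the sketch below shows a basic complementary filter that blends gyroscope integration with accelerometer-derived angles to estimate pitch and roll; the filter weight and loop rate are illustrative assumptions rather than values from any particular autopilot.

<syntaxhighlight lang="python">
import math

ALPHA = 0.98   # complementary filter weight (assumed tuning value)
DT = 0.01      # loop period in seconds (100 Hz, assumed)

def update_attitude(pitch, roll, gyro, accel):
    """Fuse 3-axis gyro and accelerometer readings (6 DOF) into pitch/roll estimates.

    gyro = (gx, gy, gz) in rad/s, accel = (ax, ay, az) in m/s^2.
    """
    # Integrate gyroscope rates: responsive but drifts over time
    pitch_gyro = pitch + gyro[1] * DT
    roll_gyro = roll + gyro[0] * DT

    # Angles from the gravity vector: noisy but drift-free
    pitch_acc = math.atan2(-accel[0], math.sqrt(accel[1] ** 2 + accel[2] ** 2))
    roll_acc = math.atan2(accel[1], accel[2])

    # Blend the two estimates
    pitch = ALPHA * pitch_gyro + (1 - ALPHA) * pitch_acc
    roll = ALPHA * roll_gyro + (1 - ALPHA) * roll_acc
    return pitch, roll
</syntaxhighlight>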

=Actuators=

UAV actuators include digital electronic speed controllers (which control the RPM of the motors) linked to motors/engines and propellers, servomotors (for planes and helicopters mostly), weapons, payload actuators, LEDs and speakers.

=Software=

UAV software is called the flight stack or autopilot. The purpose of the flight stack is to obtain data from sensors, control motors to ensure UAV stability, and facilitate ground control and mission planning communication.{{Cite journal|date=2018-01-01|title=Adapting open-source drone autopilots for real-time iceberg observations|url= |journal=MethodsX|language=en|volume=5|pages=1059–1072|doi=10.1016/j.mex.2018.09.003|issn=2215-0161|pmc=6139390|pmid=30225206|last1=Carlson|first1=Daniel F.|last2=Rysgaard|first2=Søren}}

UAVs are real-time systems that require rapid response to changing sensor data. As a result, UAVs rely on single-board computers for their computational needs. Examples include general-purpose single-board computers such as the Raspberry Pi or BeagleBoard fitted with autopilot shields such as Navio or PXFmini, or purpose-built flight controllers running real-time software such as NuttX, preemptive-RT Linux, Xenomai, Orocos Robot Operating System or DDS-ROS 2.0.

class="wikitable"

|+ Flight stack overview

Layer

!Requirement

!Operations

!Example

Firmware

|Time-critical

|From machine code to processor execution, memory access

|ArduCopter-v1, PX4

Middleware

|Time-critical

|Flight control, navigation, radio management

|PX4, Cleanflight, ArduPilot

Operating system

|Computer-intensive

|Optical flow, obstacle avoidance, SLAM, decision-making

|ROS, Nuttx, Linux distributions, Microsoft IOT

Civil-use open-source stacks include:

{{colbegin|colwidth=22em}}

{{colend}}

Due to the open-source nature of UAV software, it can be customized to fit specific applications. For example, researchers from the Technical University of Košice have replaced the default control algorithm of the PX4 autopilot.{{Cite book|last1=Lesko|first1=J.|last2=Schreiner|first2=M.|last3=Megyesi|first3=D.|last4=Kovacs|first4=Levente|title=2019 Modern Safety Technologies in Transportation (MOSATT) |chapter=Pixhawk PX-4 Autopilot in Control of a Small Unmanned Airplane |date=November 2019|chapter-url=https://ieeexplore.ieee.org/document/8944101|location=Kosice, Slovakia|publisher=IEEE|pages=90–93|doi=10.1109/MOSATT48908.2019.8944101|isbn=978-1-7281-5083-3|s2cid=209695691}} This flexibility and collaborative effort has led to a large number of different open-source stacks, some of which are forked from others, such as CleanFlight, which is forked from BaseFlight and from which three other stacks have themselves been forked.

=Loop principles=

File:UAV Flight control.jpg

UAVs employ open-loop, closed-loop or hybrid control architectures.

  • Open loop{{snd}} This type provides a positive control signal (faster, slower, left, right, up, down) without incorporating feedback from sensor data.
  • Closed loop{{snd}} This type incorporates sensor feedback to adjust behavior (reduce speed to reflect tailwind, move to altitude 300 feet). The PID controller is common (a minimal sketch follows this list). Sometimes, feedforward is employed, transferring the need to close the loop further.{{Cite web|url = http://www.nt.ntnu.no/users/skoge/prost/proceedings/ifac11-proceedings/data/html/papers/2327.pdf|title = The Navigation and Control technology inside the AR.Drone micro UAV|date = 2011|website = IFAC World Congress|last = Bristeau, Callou, Vissière, Petit}}
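
As a concrete (and deliberately minimal) sketch of the closed-loop approach above, the following PID controller regulates altitude from a barometric measurement; the gains, loop rate and the commented-out sensor call are hypothetical placeholders, not values from a real flight stack.

<syntaxhighlight lang="python">
class PID:
    """Minimal PID controller; gains here are illustrative, not tuned values."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        """Return the control output for the latest measurement."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Hypothetical usage: hold an altitude of 300 feet at a 50 Hz loop rate
altitude_hold = PID(kp=0.8, ki=0.1, kd=0.3, setpoint=300.0)
# throttle_adjust = altitude_hold.update(read_barometric_altitude(), dt=0.02)
</syntaxhighlight>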

Communications

Most UAVs use a radio for remote control and exchange of video and other data. Early UAVs had only a narrowband uplink; downlinks came later. These bi-directional narrowband radio links carried command and control (C&C) data and telemetry about the status of aircraft systems to the remote operator. For very long range flights, military UAVs also use satellite receivers as part of satellite navigation systems. In cases where video transmission was required, the UAV implemented a separate analog video radio link.

In most modern autonomous applications, video transmission is required. A broadband link is used to carry all types of data on a single radio link. These broadband links can leverage quality of service techniques to optimize the C&C traffic for low latency. Usually, these broadband links carry TCP/IP traffic that can be routed over the Internet.

Communications can be established with:

  • Ground control – a military ground control station (GCS). The MAVLink protocol is increasingly popular for carrying command and control data between the ground control station and the vehicle (a minimal example follows this list).
  • Remote network system, such as satellite duplex data links for some military powers.{{Cite web|url = http://www.barnardmicrosystems.com/media/presentations/IET_UAV_C2_Barnard_DEC_2007.pdf|title = Small UAV Command, Control and Communication Issues|date = 2007|website = Barnard Microsystems|last = Barnard|first = Joseph}} Downstream digital video over mobile networks has also entered consumer markets,{{Cite web|title = The Cheap Drone Camera That Transmits to Your Phone|url = https://www.bloomberg.com/news/videos/b/539e81ee-cefd-4810-86d7-c16f0fcffca5|website = Bloomberg.com|access-date = 3 February 2016}} while direct UAV control uplink over the cellular mesh and LTE have been demonstrated and are in trials.{{Cite web|title =Cellular enables safer drone deployments|url = https://www.qualcomm.com/invention/technologies/lte/advanced-pro/cellular-drone-communication|website = Qualcomm|access-date = 9 May 2018}}
  • Another aircraft, serving as a relay or mobile control station{{snd}} military manned-unmanned teaming (MUM-T).{{Cite web|url = http://apps.dtic.mil/dtic/tr/fulltext/u2/a565510.pdf|archive-url = https://web.archive.org/web/20160206104148/http://www.dtic.mil/dtic/tr/fulltext/u2/a565510.pdf|url-status = live|archive-date = 6 February 2016|title = Identifying Critical Manned-Unmanned Teaming Skills for Unmanned Aircraft System Operators|date = September 2012|website = U.S. Army Research Institute for the Behavioral and Social Sciences}}
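
Where MAVLink is the C&C protocol, a ground-side script might resemble the following sketch using the pymavlink library; the UDP connection string and the choice of an arm command are assumptions for illustration, not a recipe for any specific vehicle.

<syntaxhighlight lang="python">
from pymavlink import mavutil

# Listen for a vehicle on a local UDP port (connection string is an assumption)
link = mavutil.mavlink_connection("udpin:0.0.0.0:14550")

# Wait until a heartbeat identifies the vehicle's system and component IDs
link.wait_heartbeat()
print("Heartbeat from system %d, component %d" % (link.target_system, link.target_component))

# Send a COMMAND_LONG asking the vehicle to arm (param1 = 1 means arm)
link.mav.command_long_send(
    link.target_system,
    link.target_component,
    mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
    0,                      # confirmation
    1, 0, 0, 0, 0, 0, 0,    # param1 = 1 to arm; remaining params unused
)
</syntaxhighlight>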

As mobile networks have increased in performance and reliability over the years, drones have begun to use mobile networks for communication. Mobile networks can be used for drone tracking, remote piloting, over-the-air updates{{Cite patent|title=4G drone link|country=US|number=20170127245|status=application|pubdate=2017-05-04|inventor1-last=Adkins|inventor1-first=Timothy M.}} (the application is now abandoned), and cloud computing.{{Cite journal|last1=Sharma|first1=Navuday|last2=Magarini|first2=Maurizio|last3=Jayakody|first3=Dushantha Nalin K.|last4=Sharma|first4=Vishal|last5=Li|first5=Jun|date=August 2018|title=On-Demand Ultra-Dense Cloud Drone Networks: Opportunities, Challenges and Benefits|url=|journal=IEEE Communications Magazine|volume=56|issue=8|pages=85–91|doi=10.1109/MCOM.2018.1701001|hdl=11311/1063273 |s2cid=52019723|issn=1558-1896|hdl-access=free}}

Modern networking standards have explicitly considered autonomous aircraft and therefore include optimizations. For example, the 5G standard mandates a user-plane latency of 1 ms for ultra-reliable low-latency communications.{{Cite web|title=Minimum requirements related to technical performance for IMT-2020 radio interface(s)|url=https://www.itu.int/pub/R-REP-M.2410-2017|access-date=2020-10-08|website=www.itu.int}}

Autonomy

{{more citations needed section|date=May 2016}}

File:Autonomous control basics.jpg

Basic autonomy comes from proprioceptive sensors. Advanced autonomy calls for situational awareness, knowledge about the environment surrounding the aircraft from exteroceptive sensors: sensor fusion integrates information from multiple sensors.{{cite journal|last1=Floreano|first1=Dario|last2=Wood|first2=Robert J.|title=Science, technology and the future of small autonomous drones|journal=Nature|date=27 May 2015|volume=521|issue=7553|pages=460–466|doi=10.1038/nature14542|pmid=26017445|bibcode=2015Natur.521..460F|s2cid=4463263|url=http://infoscience.epfl.ch/record/208757}}

= Basic principles =

One way to achieve autonomous control employs multiple control-loop layers, as in hierarchical control systems. As of 2016 the low-layer loops (i.e. for flight control) tick as fast as 32,000 times per second, while higher-level loops may cycle once per second. The principle is to decompose the aircraft's behavior into manageable "chunks", or states, with known transitions. Hierarchical control system types range from simple scripts to finite state machines, behavior trees and hierarchical task planners. The most common control mechanism used in these layers is the PID controller which can be used to achieve hover for a quadcopter by using data from the IMU to calculate precise inputs for the electronic speed controllers and motors.{{citation needed|date=December 2016}}
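
A minimal sketch of this layering, reusing the PID class from the earlier closed-loop example, nests a slow outer position loop around a fast inner attitude loop; the loop rates, gains and setpoint are illustrative assumptions.

<syntaxhighlight lang="python">
# Reuses the PID class from the earlier closed-loop sketch; values are illustrative.
position_loop = PID(kp=0.5, ki=0.0, kd=0.2, setpoint=10.0)   # outer loop: hold 10 m north of home
attitude_loop = PID(kp=4.0, ki=0.5, kd=0.1, setpoint=0.0)    # inner loop: track commanded pitch

def control_step(measured_position, measured_pitch, dt_outer=0.1, dt_inner=0.0025):
    """One hierarchical step: the slow outer loop sets the fast inner loop's target."""
    # The outer loop turns a position error into a desired pitch angle...
    attitude_loop.setpoint = position_loop.update(measured_position, dt_outer)
    # ...and the inner loop, running at a much higher rate, tracks that pitch.
    return attitude_loop.update(measured_pitch, dt_inner)
</syntaxhighlight>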

Examples of mid-layer algorithms:

  • Path planning: determining an optimal path for the vehicle to follow while meeting mission objectives and constraints, such as obstacles or fuel requirements
  • Trajectory generation (motion planning): determining control maneuvers to take in order to follow a given path or to go from one location to another{{Cite journal|title = Comparison of Parallel Genetic Algorithm and Particle Swarm Optimization for Real-Time UAV Path Planning|journal = IEEE Transactions on Industrial Informatics|date = 1 February 2013|issn = 1551-3203|pages = 132–141|volume = 9|issue = 1|doi = 10.1109/TII.2012.2198665|first1 = V.|last1 = Roberge|first2 = M.|last2 = Tarbouchi|first3 = G.|last3 = Labonte|s2cid = 8418538}}{{Cite journal|title = Autonomous UAV path planning and estimation|journal = IEEE Robotics Automation Magazine|date = 1 June 2009|issn = 1070-9932|pages = 35–42|volume = 16|issue = 2|doi = 10.1109/MRA.2009.932529|first1 = J.|last1 = Tisdale|first2 = ZuWhan|last2 = Kim|first3 = J.K.|last3 = Hedrick|s2cid = 9696725}}
  • Trajectory regulation: constraining a vehicle within some tolerance to a trajectory

Evolved UAV hierarchical task planners use methods like state tree searches or genetic algorithms.{{Cite web|url = http://www.iaeng.org/publication/WCE2014/WCE2014_pp551-557.pdf|title = UAV Path Planning with Parallel Genetic Algorithms on CUDA Architecture|date = 2014|website = World congress on engineering|last = Cekmez, Ozsiginan, Aydin And Sahingoz}}
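
As a small illustration of the path-planning layer described above, the sketch below runs A* search over a 2-D occupancy grid; the grid, start and goal are made-up examples, and operational planners work with far richer cost models and constraints.

<syntaxhighlight lang="python">
import heapq

def astar(grid, start, goal):
    """A* over a 2-D occupancy grid (1 = obstacle); returns a list of (row, col) cells."""
    def h(a, b):  # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_set = [(h(start, goal), start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nbr = (current[0] + dr, current[1] + dc)
            if (0 <= nbr[0] < len(grid) and 0 <= nbr[1] < len(grid[0])
                    and not grid[nbr[0]][nbr[1]]):
                tentative = g[current] + 1
                if tentative < g.get(nbr, float("inf")):
                    came_from[nbr], g[nbr] = current, tentative
                    heapq.heappush(open_set, (tentative + h(nbr, goal), nbr))
    return None  # no path exists

# Hypothetical 4x4 grid with a small obstacle; plan from top-left to bottom-right
print(astar([[0, 0, 0, 0],
             [0, 1, 1, 0],
             [0, 0, 0, 0],
             [0, 0, 0, 0]], (0, 0), (3, 3)))
</syntaxhighlight>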

= Autonomy features =

File:Degrees of autonomy.jpg

UAV manufacturers often build in specific autonomous operations, such as:

  • Self-level: attitude stabilization on the pitch and roll axes.
  • Altitude hold: The aircraft maintains its altitude using barometric pressure and/or GPS data.
  • Hover/position hold: Keep level pitch and roll, stable yaw heading and altitude while maintaining position using GNSS or inertial sensors.
  • Headless mode: Pitch control relative to the position of the pilot rather than relative to the vehicle's axes.
  • Care-free: automatic roll and yaw control while moving horizontally
  • Take-off and landing (using a variety of aircraft or ground-based sensors and systems; see also:Autoland)
  • Failsafe: automatic landing or return-to-home upon loss of control signal
  • Return-to-home: Fly back to the point of takeoff (often gaining altitude first to avoid possible intervening obstructions such as trees or buildings).
  • Follow-me: Maintain relative position to a moving pilot or other object using GNSS, image recognition or homing beacon.
  • GPS waypoint navigation: Using GNSS to navigate to an intermediate location on a travel path (see the sketch after this list).
  • Orbit around an object: Similar to Follow-me but continuously circle a target.
  • Pre-programmed aerobatics (such as rolls and loops).
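
To make the waypoint-navigation feature concrete, the sketch below computes the great-circle distance and initial bearing from the aircraft's current GNSS fix to the next waypoint using the haversine formula; the coordinates are arbitrary examples.

<syntaxhighlight lang="python">
import math

EARTH_RADIUS_M = 6371000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two GNSS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    # Initial bearing, measured clockwise from true north
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Arbitrary example: current fix -> next waypoint
print(distance_and_bearing(48.1371, 11.5754, 48.1400, 11.5800))
</syntaxhighlight>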

= Functions =

Full autonomy is available for specific tasks, such as airborne refueling{{Cite news|title = Watch a step in Navy history: an autonomous drone gets refueled mid-air|url = https://www.washingtonpost.com/news/checkpoint/wp/2015/04/23/watch-a-step-in-aviation-history-an-autonomous-drone-getting-refueled-mid-air/|newspaper = The Washington Post|date = 23 April 2015|access-date = 3 February 2016|issn = 0190-8286|first = Christian|last = Davenport}} or ground-based battery switching; but higher-level tasks call for greater computing, sensing and actuating capabilities. One approach to quantifying autonomous capabilities is based on OODA terminology, as suggested in a 2002 US Air Force Research Laboratory report, and used in the table below:{{Cite web|url = http://apps.dtic.mil/dtic/tr/fulltext/u2/a515926.pdf|archive-url = https://web.archive.org/web/20160206104148/http://www.dtic.mil/dtic/tr/fulltext/u2/a515926.pdf|url-status = live|archive-date = 6 February 2016|title = Metrics, Schmetrics! How The Heck Do You Determine A UAV's Autonomy Anyway?|date = August 2002|website = US Air Force Research Laboratory|last = Clough|first = Bruce}}

class="wikitable collapsible"

|+United States Autonomous control levels chart

style="vertical-align: top;"

!Level

!Level descriptor

!Observe

!Orient

!Decide

!Act

style="vertical-align: top;"

| colspan="2" |

|Perception/Situational awareness

|Analysis/Coordination

|Decision making

|Capability

style="vertical-align: top;"

!10

!Fully Autonomous

|Cognizant of all within battlespace

|Coordinates as necessary

|Capable of total independence

|Requires little guidance to do job

style="vertical-align: top;"

!9

!Battlespace Swarm Cognizance

|Battlespace inference – Intent of self and others (allied and foes).

Complex/Intense environment – on-board tracking

|Strategic group goals assigned

Enemy strategy inferred

|Distributed tactical group planning

Individual determination of tactical goal

Individual task planning/execution

Choose tactical targets

|Group accomplishment of strategic goal with no supervisory assistance

style="vertical-align: top;"

!8

!Battlespace Cognizance

|Proximity inference – Intent of self and others (allied and foes)

Reduces dependence upon off-board data

|Strategic group goals assigned

Enemy tactics inferred

ATR

|Coordinated tactical group planning

Individual task planning/execution

Choose target of opportunity

|Group accomplishment of strategic goal with minimal supervisory assistance

(example: go SCUD hunting)

style="vertical-align: top;"

!7

!Battlespace Knowledge

|Short track awareness – History and predictive battlespace

Data in limited range, timeframe and numbers

Limited inference supplemented by off-board data

|Tactical group goals assigned

Enemy trajectory estimated

|Individual task planning/execution to meet goals

|Group accomplishment of tactical goals with minimal supervisory assistance

style="vertical-align: top;"

!6

!Real Time

Multi-Vehicle Cooperation

|Ranged awareness – on-board sensing for long range,

supplemented by off-board data

|Tactical group goals assigned

Enemy trajectory sensed/estimated

|Coordinated trajectory planning and execution to meet goals{{snd}} group optimization

|Group accomplishment of tactical goals with minimal supervisory assistance

Possible: close air space separation (+/-100yds) for AAR, formation in non-threat conditions

style="vertical-align: top;"

!5

!Real Time

Multi-Vehicle Coordination

|Sensed awareness – Local sensors to detect others,

Fused with off-board data

|Tactical group plan assigned

RT Health Diagnosis Ability to compensate for most failures and flight conditions;

Ability to predict onset of failures (e.g. Prognostic Health Mgmt)

Group diagnosis and resource management

|On-board trajectory replanning – optimizes for current and predictive conditions

Collision avoidance

|Self accomplishment of tactical plan as externally assigned

Medium vehicle airspace separation (hundreds of yds)

style="vertical-align: top;"

!4

!Fault/Event Adaptative

Vehicle

|Deliberate awareness – allies communicate data

|Tactical group plan assigned

Assigned Rules of Engagement

RT Health Diagnosis; Ability to compensate for most failures and flight conditions{{snd}} inner loop changes reflected in outer loop performance

|On-board trajectory replanning – event driven

Self resource management

Deconfliction

|Self accomplishment of tactical plan as externally assigned

Medium vehicle airspace separation (hundreds of yds)

style="vertical-align: top;"

!3

!Robust Response to Real Time Faults/Events

|Health/status history & models

|Tactical group plan assigned

RT Health Diagnosis (What is the extent of the problems?)

Ability to compensate for most failures and flight conditions (i.e. adaptative inner loop control)

|Evaluate status vs required mission capabilities

Abort/RTB is insufficient

|Self accomplishment of tactical plan as externally assigned

style="vertical-align: top;"

!2

!Changeable mission

|Health/status sensors

|RT Health diagnosis (Do I have problems?)

Off-board replan (as required)

|Execute preprogrammed or uploaded plans

in response to mission and health conditions

|Self accomplishment of tactical plan as externally assigned

style="vertical-align: top;"

!1

!Execute Preplanned

Mission

|Preloaded mission data

Flight Control and Navigation Sensing

|Pre/Post flight BIT

Report status

|Preprogrammed mission and abort plans

|Wide airspace separation requirements (miles)

style="vertical-align: top;"

!0

!Remotely

Piloted

Vehicle

|Flight Control (attitude, rates) sensing

Nose camera

|Telemetered data

Remote pilot commands

|N/A

|Control by remote pilot

Medium levels of autonomy, such as reactive autonomy, and high levels using cognitive autonomy have already been achieved to some extent and are very active research fields.

= Reactive autonomy =

{{See also|Perceptual control theory}}

Reactive autonomy, such as collective flight, real-time collision avoidance, wall following and corridor centring, relies on telecommunication and situational awareness provided by range sensors: optic flow,{{Cite journal|title = A bee in the corridor: centering and wall-following|url = https://hal-amu.archives-ouvertes.fr/hal-02294572/file/2008-Serres%20et%20al.%20Naturwissenschaften%20-%20preprint%20final.pdf|journal = Naturwissenschaften|pages = 1181–1187|volume = 95|issue = 12|doi = 10.1007/s00114-008-0440-6|first1 = Julien R.|last1 = Serres|first2 = Guillaume P.|last2 = Masson|first3 = Franck|last3 = Ruffier|first4 = Nicolas|last4 = Franceschini|pmid=18813898|year = 2008|bibcode = 2008NW.....95.1181S|s2cid = 226081}} lidars (light radars), radars, sonars.
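
As a hedged illustration of optic-flow-based corridor centring, the sketch below compares the average optical-flow magnitude in the left and right halves of the camera image and steers away from the side that appears to move faster (the nearer wall); it uses OpenCV's Farnebäck dense flow, and the steering gain is an arbitrary assumption.

<syntaxhighlight lang="python">
import cv2
import numpy as np

STEER_GAIN = 0.01   # arbitrary proportional gain (assumption)

def corridor_centering_command(prev_gray, curr_gray):
    """Return a yaw command that balances left/right optic-flow magnitudes."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)          # per-pixel flow magnitude
    half = magnitude.shape[1] // 2
    left, right = magnitude[:, :half].mean(), magnitude[:, half:].mean()
    # Faster flow on one side means that wall is closer; steer toward the slower side.
    return STEER_GAIN * (left - right)
</syntaxhighlight>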

Most range sensors analyze electromagnetic radiation, reflected off the environment and coming to the sensor. The cameras (for visual flow) act as simple receivers. Lidars, radars and sonars (with sound mechanical waves) emit and receive waves, measuring the round-trip transit time. UAV cameras do not require emitting power, reducing total consumption.

Radars and sonars are mostly used for military applications.

Reactive autonomy has in some forms already reached consumer markets: it may be widely available in less than a decade.

File:Autonomous-control-level-trend.png

= Simultaneous localization and mapping =

SLAM combines odometry and external data to represent the world and the position of the UAV in it in three dimensions. High-altitude outdoor navigation does not require large vertical fields-of-view and can rely on GPS coordinates (which makes it simple mapping rather than SLAM).{{Cite journal|title = Novel Aerial 3D Mapping System Based on UAV Platforms and 2D Laser Scanners|date = 2016|journal = Journal of Sensors|volume = 2016|pages = 1–8|last = Roca, Martínez-Sánchez, Lagüela, and Arias|doi = 10.1155/2016/4158370|doi-access = free}}
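
To show what such GPS-based "simple mapping" can look like, the sketch below projects GNSS fixes onto a local flat-earth frame around a reference point and accumulates them into a 2-D point map; the reference point and fixes are made-up example values.

<syntaxhighlight lang="python">
import math

EARTH_RADIUS_M = 6371000.0

def gnss_to_local(lat, lon, ref_lat, ref_lon):
    """Project a GNSS fix onto a local flat-earth frame (metres east/north of a reference)."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y

# Made-up reference point and a few fixes along a survey line
ref = (48.1371, 11.5754)
point_map = [gnss_to_local(lat, lon, *ref)
             for lat, lon in [(48.1372, 11.5755), (48.1373, 11.5757), (48.1374, 11.5759)]]
print(point_map)
</syntaxhighlight>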

Two related research fields are photogrammetry and LIDAR, especially in low-altitude and indoor 3D environments.

  • Indoor photogrammetric and stereophotogrammetric SLAM has been demonstrated with quadcopters.{{Cite web|title = ETH Zurich: Drones with a Sense of Direction|url = http://www.asctec.de/en/ethz-drones-with-a-sense-of-direction/|website = Ascending Technologies GmbH|access-date = 3 February 2016|date = 10 November 2015}}
  • Lidar platforms with heavy, costly and gimbaled traditional laser platforms are proven. Research attempts to address production cost, 2D to 3D expansion, power-to-range ratio, weight and dimensions.{{cite web|url=https://arstechnica.com/cars/2018/01/driving-around-without-a-driver-lidar-technology-explained/|title=Why experts believe cheaper, better lidar is right around the corner|date=1 January 2018|via=Ars Technica|author=Timothy B. Lee }}{{Citation|title = Autonomous Aerial Navigation in Confined Indoor Environments|url = https://www.youtube.com/watch?v=IMSozUpFFkU|date = 16 November 2010|access-date = 3 February 2016|last = Shaojie Shen}} LED range-finding applications are commercialized for low-distance sensing capabilities. Research investigates hybridization between light emission and computing power: phased array spatial light modulators,{{Cite web|title = SWEEPER Demonstrates Wide-Angle Optical Phased Array Technology|url = http://www.darpa.mil/news-events/2015-05-21|website = www.darpa.mil|access-date = 3 February 2016}}{{Cite web|title = LIDAR: LIDAR nears ubiquity as miniature systems proliferate|url = http://www.laserfocusworld.com/articles/print/volume-51/issue-10/features/lidar-lidar-nears-ubiquity-as-miniature-systems-proliferate.html|website = www.laserfocusworld.com|access-date = 3 February 2016|date = 13 October 2015}} and frequency-modulated-continuous-wave (FMCW) MEMS-tunable vertical-cavity surface-emitting lasers (VCSELs).{{Cite web|url = http://www-bsac.eecs.berkeley.edu/publications/search/send_publication_pdf2client.php?pubID=1364925404|title = Development of an FMCW LADAR Source Chip using MEMS-Electronic-Photonic Heterogeneous Integration|date = 2015|website = University of California, Berkeley|last = Quack, Ferrara, Gambini, Han, Keraly, Qiao, Rao, Sandborn, Zhu, Chuang, Yablonovitch, Boser, Chang-Hasnain, C. Wu}}

= Swarming =

{{Further|Swarm robotics}}

Robot swarming refers to networks of agents able to dynamically reconfigure as elements leave or enter the network. They provide greater flexibility than multi-agent cooperation. Swarming may open the path to data fusion. Some bio-inspired flight swarms use steering behaviors and flocking.{{clarify|date=May 2016}}
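
As an illustration of the steering-behaviour approach to flocking, the sketch below applies the classic three boids rules (separation, alignment, cohesion) to a set of agents; the weights and neighbourhood radius are illustrative assumptions, and real swarm controllers add collision constraints and communication limits.

<syntaxhighlight lang="python">
import numpy as np

SEPARATION_W, ALIGNMENT_W, COHESION_W = 1.5, 1.0, 0.8   # illustrative weights
NEIGHBOR_RADIUS = 10.0                                   # metres (assumption)

def flocking_step(positions, velocities, dt=0.1):
    """One update of the classic boids rules for an (N, 2) array of agents."""
    new_velocities = velocities.copy()
    for i in range(len(positions)):
        offsets = positions - positions[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbors = (dists > 0) & (dists < NEIGHBOR_RADIUS)
        if not neighbors.any():
            continue
        separation = -offsets[neighbors].mean(axis=0)                    # move away from close agents
        alignment = velocities[neighbors].mean(axis=0) - velocities[i]   # match neighbours' velocity
        cohesion = offsets[neighbors].mean(axis=0)                       # move toward the local centroid
        new_velocities[i] += (SEPARATION_W * separation
                              + ALIGNMENT_W * alignment
                              + COHESION_W * cohesion) * dt
    return positions + new_velocities * dt, new_velocities
</syntaxhighlight>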

Future military potential

{{Update section|date=February 2022}}

In the military sector, American Predators and Reapers are made for counterterrorism operations and in war zones in which the enemy lacks sufficient firepower to shoot them down. They are not designed to withstand antiaircraft defenses or air-to-air combat. In September 2013, the chief of the US Air Combat Command stated that current UAVs were "useless in a contested environment" unless crewed aircraft were there to protect them. A 2012 Congressional Research Service (CRS) report speculated that in the future, UAVs may be able to perform tasks beyond intelligence, surveillance, reconnaissance and strikes; the CRS report listed air-to-air combat ("a more difficult future task") as possible future undertakings. The Department of Defense's Unmanned Systems Integrated Roadmap FY2013-2038 foresees a more important place for UAVs in combat. Issues include extended capabilities, human-UAV interaction, managing increased information flux, increased autonomy and developing UAV-specific munitions. DARPA's project of systems of systems,{{Cite web|title = DARPA's Plan to Overwhelm Enemies With Swarming Drones – Drone 360|url = http://blogs.discovermagazine.com/drone360/2015/04/06/darpas-swarming-drones/#.VrHhSPnhDcc|website = Drone 360|access-date = 3 February 2016|date = 6 April 2015}} or General Atomics work may augur future warfare scenarios, the latter disclosing Avenger swarms equipped with High Energy Liquid Laser Area Defense System (HELLADS).{{Citation|title = US Air force STEALTH UAV armed with LASER GUN named General Atomics Avenger|url = https://www.youtube.com/watch?v=mPvBDlQOtqY|date = 17 January 2014|access-date = 3 February 2016|last = NewWorldofWeapons}}

= Cognitive radio =

Cognitive radio{{clarify|date=May 2016}} technology may have UAV applications.{{Cite journal|url = http://hdl.handle.net/10919/19295|title = Unified Multi-domain Decision Making: Cognitive Radio and Autonomous Vehicle Convergence|date = December 2012|access-date = September 18, 2020 |website = Faculty of the Virginia Polytechnic Institute and State University|last = Young|hdl = 10919/19295}}

= Learning capabilities =

UAVs may exploit distributed neural networks.

{{Mobile robots}}

{{Robotics}}

See also

References