Thursday, June 26, 2014

The Northrop Grumman (NG)-built U.S. Air Force RQ-4 Global Hawk is arguably the archetypal example of a high altitude, long endurance (HALE) unmanned aircraft system (UAS) capable of missions far removed from its control station. The Global Hawk is widely used for military intelligence, surveillance, and reconnaissance (ISR) missions and is capable of autonomous operation from taxi through flight and return to base on a programmed mission plan. The Global Hawk receives its command and control (C2) and relays captured data in real time to control stations or operational mission commands within line-of-sight (LOS) or beyond (Loochkartt, 2014).

The RQ-4 UAS comprises the air vehicle (AV), the forward-deployed Mission Control Element (MCE), and the Launch and Recovery Element (LRE), which work in concert to provide command and control as well as sensor control and data transmission. The LRE communicates with and provides C2 to the AV through a LOS common data link (CDL) and LOS ultra-high frequency (UHF) radios, and can reach beyond line-of-sight (BLOS) via UHF radios. However, the LRE cannot control the payload sensors or receive the data they capture. The MCE has all the capabilities of the LRE plus the ability to control sensors and to receive and disseminate data. The MCE communicates with and maintains situational awareness of the AV through LOS narrowband UHF radios and Ku-band satellite communications (Unmanned Aircraft Systems Roadmap 2005-2030, 2005).
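To make that division of responsibilities concrete, the following is a minimal Python sketch of the element capabilities described above; the field names, link labels, and helper function are illustrative assumptions, not an actual Global Hawk interface.

```python
from dataclasses import dataclass

# Illustrative only: link and element names follow the description above,
# not an actual Global Hawk interface specification.

@dataclass
class ControlElement:
    name: str
    links: set                         # links this element can use to reach the AV
    can_command_flight: bool = True    # basic C2: taxi, launch, waypoints, recovery
    can_control_payload: bool = False  # sensor tasking and data receipt

LRE = ControlElement(
    name="Launch and Recovery Element",
    links={"LOS common data link", "LOS UHF", "BLOS UHF"},
)

MCE = ControlElement(
    name="Mission Control Element",
    links={"LOS narrowband UHF", "Ku-band SATCOM"},
    can_control_payload=True,  # the MCE adds sensor control and data dissemination
)

def can_task_sensors(element: ControlElement) -> bool:
    """Only elements with payload control may task the ISR sensors."""
    return element.can_control_payload

print(can_task_sensors(LRE))  # False: the LRE provides C2 only
print(can_task_sensors(MCE))  # True: the MCE controls sensors and receives data
```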

In a typical Global Hawk mission, the LRE prepares and launches the AV from its base station or main operations airfield, maintaining contact with and C2 of the AV through taxi, launch, and recovery. After launch, the LRE hands over C2 to the corresponding MCE, which controls the AV for most of the mission. A Global Hawk Operations Center (GHOC) provides oversight and mission prioritization to MCEs and oversees handover procedures between the LRE and MCE, or between one MCE and another when mission re-tasking takes the AV outside its initial area of operations.
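That handover flow can be sketched as a simple sequence of C2 transfers gated by GHOC approval. The sketch below is hypothetical, intended only to illustrate the sequence described above, not the actual GHOC procedure.

```python
# Hypothetical sketch of the C2 handover flow described above; the real
# procedures are governed by the GHOC and are not reproduced here.

def hand_over(current: str, next_element: str, ghoc_approved: bool) -> str:
    """Transfer C2 only when the GHOC has confirmed the handover."""
    if not ghoc_approved:
        raise RuntimeError(f"GHOC has not approved handover {current} -> {next_element}")
    print(f"C2 transferred from {current} to {next_element}")
    return next_element

controller = "LRE"                                             # taxi and launch
controller = hand_over(controller, "MCE", ghoc_approved=True)  # mission execution
# Re-tasking outside the original area of operations may insert an
# MCE-to-MCE handover here, coordinated by the GHOC.
controller = hand_over(controller, "LRE", ghoc_approved=True)  # recovery
```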

Over the last decade of war in the Middle East, the Global Hawk demonstrated its BLOS capabilities as a worldwide ISR platform, using satellite communications (SATCOM) to relay data back to exploitation and processing centers located in the continental US (CONUS). These CONUS centers processed the raw data collected by the Global Hawk's sensors and forwarded it to forward-deployed customers and other operations centers. BLOS operation allows equipment and personnel at the GHOC and the exploitation and processing centers to work in relative safety at CONUS locations instead of in austere environments at forward locations. These BLOS capabilities are also attractive for civil uses such as ground mapping and high altitude visual observation. In 2013, Canada started a collaborative project with NG and NASA to use Global Hawks equipped with high-resolution cameras and synthetic aperture radar to conduct ground mapping and visual observation of the Arctic Circle (Bellamy, 2013).

Disadvantages of BLOS operations include the human factors (HF) involved in the handover procedures mentioned previously: loss of situational awareness between handover participants, loss of the communications link, gaps in tactical oversight, and miscommunications, among others, can all prove problematic. While the MCE and LRE provide redundancy for most C2 functions, the GHOC remains an essential layer of oversight and control for BLOS missions, ensuring safe and positive control of a multimillion-dollar aircraft like the Global Hawk.

References
Bellamy, W. (2013, December 19). Global Hawk UAS Performs First Canadian Civil Flight. Retrieved from Aviation Today: http://www.aviationtoday.com/av/commercial/Global-Hawk-UAS-Performs-First-Canadian-Civil-Flight_80896.html#.U6z6Y_ldWSo
Loochkartt, G. (2014, June 25). RQ-4 Global Hawk. Retrieved from Northrop-Grumman: http://www.northropgrumman.com/Capabilities/RQ4Block20GlobalHawk/Documents/HALE_Factsheet.pdf
RQ-4 Block 20 Global Hawk. (2007, March 1).

Unmanned Aircraft Systems Roadmap 2005-2030. (2005, August 4). Retrieved from Federation of American Scientists: http://fas.org/irp/program/collect/uav_roadmap2005.pdf

Tuesday, June 17, 2014


UAS Integration into the NAS

The Next Generation Air Transportation System (NextGen) is the transformation of the US national air transportation system, undertaken to alleviate current congestion in the air and at airports and in anticipation of future demands on the system. The US Congress enacted NextGen in 2003 under President Bush, creating the Joint Planning and Development Office (JPDO) to manage the agencies partnering to design and develop NextGen. The partnership includes private sector organizations, academia, and government agencies such as the Departments of Transportation, Commerce, Defense, and Homeland Security, as well as the Federal Aviation Administration, NASA, and the Offices of Science and Technology Policy and the Director of National Intelligence. The goals of NextGen are to develop new technologies while leveraging legacy technologies to support the transformation, and to create the capabilities and highly interdependent technologies that will change the operation of the air transportation system, reduce traffic and passenger congestion, and improve the overall flying experience (Fact Sheet – NextGen, 2014).

NextGen programs include Automatic Dependent Surveillance-Broadcast (ADS-B), System Wide Information Management (SWIM), NextGen Data Communications, NextGen Network Enabled Weather, and NAS Voice Switch. ADS-B is the backbone of the NextGen system; it is in use today, will be required on all commercial and general aviation (GA) aircraft by the year 2020, and takes aircraft sense and avoid capabilities to the next level (Automatic Dependent Surveillance-Broadcast (ADS-B), 2014). ADS-B can significantly enhance a UAS' ability to detect, sense, and avoid other aircraft in the airspace. Other benefits of NextGen technology include: trajectory-based operations that allow pilots and dispatchers to select their own efficient flight paths instead of following the existing "interstate in the sky" type routes; a collaborative air traffic management system between air traffic managers and flight operators; reduced weather impacts through information sharing and improved weather forecasting; higher-density airports through new and improved surface movement with reduced spacing and separation requirements; and greater flexibility in terminals and airports, increasing throughput by uncovering previously untapped system capacity (Fact Sheet – NextGen, 2014).
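As a rough illustration of how broadcast position reports like ADS-B could feed a UAS detect-and-avoid check, the sketch below flags traffic inside assumed lateral and vertical thresholds. The message fields and the 5 NM / 1,000 ft values are placeholder assumptions, not the actual ADS-B message format or any certified separation standard.

```python
import math

# Simplified illustration of using ADS-B-style position reports for a
# proximity check. Real ADS-B messages (1090ES / UAT) carry far more fields
# and are decoded by certified avionics; the values here are placeholders.

def horizontal_distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance between two positions, in nautical miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3440.065 * 2 * math.asin(math.sqrt(a))  # Earth radius in NM

def conflict(own, intruder, lateral_nm=5.0, vertical_ft=1000.0):
    """Flag traffic closer than the chosen lateral and vertical thresholds."""
    d = horizontal_distance_nm(own["lat"], own["lon"],
                               intruder["lat"], intruder["lon"])
    dz = abs(own["alt_ft"] - intruder["alt_ft"])
    return d < lateral_nm and dz < vertical_ft

ownship = {"lat": 39.05, "lon": -84.66, "alt_ft": 9500}
traffic = {"lat": 39.02, "lon": -84.60, "alt_ft": 9800}
print(conflict(ownship, traffic))  # True: within 5 NM and 1,000 ft
```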

However, this technology comes at a cost and may not be applicable to all UAV categories at the moment. The equipment necessary to use ADS-B adds weight and power demands to the air vehicle. While these demands may be negligible on medium-range or MALE/HALE UAV designs, they matter when incorporated into smaller unmanned platforms where space, weight, and power are at a premium; the added load must be weighed against endurance and payload capacity.
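One way to reason about that trade is a back-of-the-envelope endurance estimate for a small battery-powered UAV. Every number in the sketch below (battery capacity, cruise power, transponder draw) is an invented assumption for illustration, and the effect of the added weight on cruise power is ignored.

```python
# Back-of-the-envelope endurance impact of adding a transponder to a small,
# battery-powered UAV. All numbers are invented for illustration only; the
# extra weight's effect on cruise power is not modeled.

battery_wh = 90.0        # usable battery energy, watt-hours
cruise_power_w = 110.0   # baseline power draw at cruise
adsb_power_w = 4.0       # hypothetical draw of a low-power ADS-B unit

baseline_endurance_h = battery_wh / cruise_power_w
with_adsb_endurance_h = battery_wh / (cruise_power_w + adsb_power_w)
loss_min = (baseline_endurance_h - with_adsb_endurance_h) * 60

print(f"Baseline endurance: {baseline_endurance_h * 60:.0f} min")
print(f"With ADS-B fitted:  {with_adsb_endurance_h * 60:.0f} min")
print(f"Endurance penalty:  {loss_min:.1f} min")
```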

Additionally, while NextGen technology paves the way for increased integration of UAS into the NAS, UAS operators will still play a big role in preventing collisions with manned aircraft. As pilots, UAS operators must maintain situational awareness of their aircraft and also perform air traffic control (ATC)-like functions in conjunction with other operators to maintain separation in segregated airspace. In non-segregated airspace, UAS operators must comply with local ATC instructions if they are to operate safely in the vicinity of commercial and general aviation aircraft. NextGen aims to provide a comprehensive solution for all involved to maintain a high level of reliability and safety.

Looking toward integration of UAS into the NAS, the US Air Force issued a request for information to technology vendors to build sense and avoid systems for its drones, called the Common Airborne Sense and Avoid (C-ABSAA) program (Cooney, 2013). The Air Force seeks alternatives to the Certificate of Authorization process to increase its mission options as military and commercial use of UAS expands. This, however, addresses only one of the many issues facing UAV integration into the NAS, which also include ensuring reliable command, control, and communications; failsafe actions in lost-link situations; network security with anti-jamming and anti-spoofing capabilities; and interference in a saturated RF spectrum.
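As an illustration of what a lost-link failsafe might look like in software, the sketch below steps through escalating actions as silence on the C2 link grows. The timeouts and action names are hypothetical, not those of any fielded UAS.

```python
# Minimal sketch of lost-link failsafe logic of the kind the text alludes to.
# Timeouts and actions are hypothetical, not from any fielded UAS.

LINK_TIMEOUT_S = 30   # declare lost link after this much silence
RTB_TIMEOUT_S = 120   # give up loitering and return to base after this

def failsafe_action(seconds_since_last_c2: float) -> str:
    if seconds_since_last_c2 < LINK_TIMEOUT_S:
        return "CONTINUE_MISSION"
    if seconds_since_last_c2 < RTB_TIMEOUT_S:
        # Hold at a preplanned loiter point while attempting to re-acquire C2
        return "LOITER_AT_LOST_LINK_POINT"
    # Fly the preprogrammed lost-link route back to the recovery airfield
    return "RETURN_TO_BASE"

for silence in (5, 45, 300):
    print(silence, failsafe_action(silence))
```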

 

References

Cooney, M. (2013, September 23). Air Force wants technology that will let drones sense and avoid other aircraft. Retrieved from Network World: http://www.networkworld.com/article/2225425/security/air-force-wants-technology-that-will-let-drones-sense-and-avoid-other-aircraft.html

Fact Sheet – NextGen. (2014, June 17). Retrieved from FAA: http://www.faa.gov/news/fact_sheets/news_story.cfm?newsid=8145

 

Thursday, June 12, 2014


AeroVironment Ground Control Station
AeroVironment, Inc. (AV) develops and manufactures a wide range of unmanned aircraft and electric vehicle solutions. Among its family of unmanned aircraft systems (UAS) is a line of small UAS widely used in support of the wars in Iraq and Afghanistan; these ruggedized, compact, and portable UAS provide excellent intelligence, surveillance, and reconnaissance (ISR) coverage on the battlefield while gaining popularity in civil applications (Imagination, Passion, and Persistence, 2014).


All AV small UAS, such as the Raven, Puma, Wasp, and Dragon Eye, are controlled by a common ground control station (GCS), which provides the command and control (C2), communications, and data and video links to the air vehicle. AV's GCS is a small, lightweight, compact, dustproof, waterproof, and battle-tested unit capable of displaying real-time video captured by the air vehicle's payload cameras. Operators are able to capture screen images and to store, play back, and re-transmit video and metadata on the network. The GCS can also be used as a remote video terminal (RVT) at remote command centers, providing the same capabilities as the operator's GCS. Capable of both manual and autonomous C2, the AV GCS components fit in a small sack and take up only a small portion of a typical backpack (GCS - Joint Common Interoperable Ground Control Station, 2014).


Other features of the AV common GCS include:
  • An intuitive user interface built on the company's proprietary core operating system
  • Storage for up to 80 image captures and multiple preprogrammed missions
  • The ability to operate as a remote video terminal
  • Manual, Altitude-Hold, Navigate, Loiter, Home, Loss-of-Link, Follow Me, and Autoland modes of operation (see the mode-selection sketch after this list)
  • Operation on a common military BA-5590/U (or similar) battery
  • A fully packaged weight of only 7.42 pounds
  • Available options including a Panasonic Toughbook laptop, FalconView software, and an RVT Kit Antenna
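To show how such a mode set might be represented in software, here is a minimal Python sketch. The mode names mirror the bullets above, while the selection logic is a hypothetical illustration, not AeroVironment's implementation.

```python
from enum import Enum, auto

# Mode names mirror the feature list above; the selection logic is a
# hypothetical sketch, not AeroVironment's implementation.

class FlightMode(Enum):
    MANUAL = auto()
    ALTITUDE_HOLD = auto()
    NAVIGATE = auto()
    LOITER = auto()
    HOME = auto()
    LOSS_OF_LINK = auto()
    FOLLOW_ME = auto()
    AUTOLAND = auto()

def select_mode(link_ok: bool, operator_request: FlightMode) -> FlightMode:
    """A lost link overrides any operator-requested mode."""
    if not link_ok:
        return FlightMode.LOSS_OF_LINK
    return operator_request

print(select_mode(True, FlightMode.NAVIGATE))   # FlightMode.NAVIGATE
print(select_mode(False, FlightMode.NAVIGATE))  # FlightMode.LOSS_OF_LINK
```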

Although a single operator can work the AV GCS, a two-person team is preferred. A study at the Army Research Laboratory on Raven operations found that GCS operators are subject to high workloads in a typical 40-45 minute mission (Pomranky, 2006). Task saturation can lead to a loss of situational awareness, and Manning, Rash, LeDuc, Noback, and McKeon (2004) state that a loss of situational awareness is a leading causal factor in aviation mishaps. The same advantages that make the GCS popular military tactical gear, such as its portability and small design, may also create conditions detrimental to maintaining situational awareness. The vehicle operator uses a handheld controller about the size of a typical seven-inch tablet computer, with user control and input buttons and knobs on either side of the screen. It lacks a map display (maps are resident on the Toughbook computer), which is invaluable in determining the air vehicle's location. This is where multiple operators become valuable: mission tasks can be tackled more effectively when divided between two operators, one operating the hand controller and the other programming and monitoring the mission on the laptop.
Also, because of its small screen size, the hand controller display packs in a significant amount of flight information, which can overwhelm the operator. Add to that the need for a hood to shroud the screen from bright sunlight and give the operator a better view; switching away from the hooded view can cause momentary disorientation and a loss of situational awareness for a few seconds while the operator's vision adjusts.
Task saturation and poorly designed multifunction displays and control systems are human factors common to both manned aircraft and the AV small UAS mentioned above. Both can adversely affect the operator's performance, which may lead to a mishap.

References

GCS - Joint Common Interoperable Ground Control Station. (2014, June 10). Retrieved from AeroVironment, Inc.: http://www.avinc.com/downloads/AV_GCS_V10109.pdf

Imagination, Passion, and Persistence. (2014, June 10). Retrieved from AeroVironment, Inc.: http://www.avinc.com/

Manning, S., Rash, C., LeDuc, P., Noback, R., & McKeon, J. (2004). The Role of Human Causal Factors in U.S. Army Unmanned Aerial Vehicle Accidents. U.S. Army Aeromedical Research Laboratory.

Pomranky, R. (2006). Human Robotics Interaction Army Technology Objective Raven Small Unmanned Aerial Vehicle Task Analysis and Modeling. Army Research Laboratory.