Friday, February 20, 2015

Automatic Takeoff and Landing - Global Hawk & Boeing 777

The focus of this research paper is the Automatic Take-off and Landing System (ATLS) on both manned and unmanned aircraft.  The ATLS allows an aircraft to take off and land without pilot intervention.  This system can be very beneficial, allowing safe landings in precarious situations, including poor visibility or other forms of adverse weather (Larson, 2012).  This may be an ideal technology for commercial airliners, but that day has not yet come.  The belief that commercial airliners currently have the capability to take off and land automatically is a misconception; it remains a technological aspiration of future flight.  BBC Future’s Jon Stewart wrote an article, “Pilotless passenger planes prepare for take-off,” describing the gradual move in technology that could eventually remove the need for a pilot in the commercial airliner cockpit.  The article also noted that the phrase “this is your captain speaking” may soon become a thing of the past, “thanks to a new generation of robotic, passenger aircraft that will take to the skies by themselves” (Stewart, 2013).
            Unfortunately, automatic take-off for manned commercial aircraft is nonexistent at the moment.  The first autopilot feature was introduced roughly a hundred years ago.  The technology was designed to steady a plane during flight, providing a way to pre-program the aircraft’s attitude and heading (Stewart, 2013).  The advantage of stepping away from cockpit responsibilities gave commercial pilots a chance to do things we take for granted, something as simple as using the lavatory.  Yes, restroom privileges are important, not to mention mental breaks.  The Boeing 777 is equipped with an autopilot function that allows the aircraft to fly and land automatically; the autopilot can also be turned off so the pilot can land using the manual controls.  Boeing 777s are further equipped with a remote piloting feature called the Boeing Honeywell “Uninterruptible” Autopilot System (Helton, 2014).  The main purpose of this system is to counteract hijacking attempts and to stop any other unauthorized persons from gaining control of the aircraft.  The Instrument Landing System (ILS), by contrast, acts as a guide that keeps the aircraft aligned with the correct runway during landing.  It is not to be confused with the autopilot, but is considered an aid to it.  Heading, airspeed, altitude, and even a specific rate of climb are controlled by the autopilot system.  The ILS receives data from the flight director system and from the aerodrome; using the data from each source, the Boeing 777 determines its position with respect to the runway.  The human pilot, however, is still there as another link in the chain, and the combination of systems allows a very accurate and safe landing.  During an automatic landing under the anti-hijacking and autoland features, the pilot calls out altitudes and flap angles.
If those data calls are not correct, the copilot must take control and abort the landing sequence. The copilot is also responsible for the aircraft if the pilot is incapacitated.  Additional safeguards include preflight briefings and easy autopilot disengagement.  According to a discussion of the main differences in piloting Boeing versus Airbus aircraft, applying excessive pressure to the controls should automatically shut off the autopilot (What are the main differences piloting Boeing vs. Airbus aircraft?, 2013).
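The pressure-triggered disconnect mentioned above can be illustrated with a simple threshold check. This is a hedged sketch only: the 25-pound breakout force and all function names are assumptions made for demonstration, not actual Boeing 777 parameters.

```python
# Illustrative sketch of force-based autopilot disengagement.
# The 25 lb threshold and all names are assumptions for demonstration,
# not actual Boeing 777 specifications.

AP_DISCONNECT_FORCE_LB = 25.0  # assumed control-column breakout force

def autopilot_engaged(current_force_lb: float, engaged: bool) -> bool:
    """Return the new autopilot state given pilot control-column force.

    If the pilot applies force above the breakout threshold while the
    autopilot is engaged, the system disengages and hands control back.
    """
    if engaged and abs(current_force_lb) > AP_DISCONNECT_FORCE_LB:
        return False  # pilot override: autopilot disengages
    return engaged

# Light pressure leaves the autopilot engaged; a firm pull disengages it.
assert autopilot_engaged(5.0, True) is True
assert autopilot_engaged(40.0, True) is False
```

The design point is that the override requires no menu or switch action: any sufficiently firm control input by itself returns authority to the pilot.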
            The Global Hawk Unmanned Aerial System has a pilot just like any other aircraft; the difference is that those pilots are not collocated with the aircraft, which is remotely controlled.  The RQ-4 Global Hawk is equipped with autonomous take-off and landing abilities.  Launch and recovery element (LRE) pilots load the autonomous flight mission plans before flight and monitor operations during automatic take-offs and landings.  The Global Hawk’s flight control system includes GPS and an inertial navigation system (INS), both of which play a part in the automatic take-offs and landings of the UAS.  The RQ-4 uses SATCOM satellites to transfer INS and GPS data to the ground control systems to determine the location of the UAV within the airspace.  The Global Hawk has a forward-looking infrared camera for takeoff and landing, but the pilot relies on graphic displays to maintain situational awareness and must visualize the three-dimensional model (Colucci, 2004).  From a safety-of-flight standpoint, some issues might still need tweaking.  If the RQ-4 receives a transmission to end its mission, it will automatically start the termination sequence.  Also, if the vehicle receives a set of instructions that go against its programming, the Global Hawk has the ability to block those instructions.  “The aircraft's self-control goes well beyond simply following the orders of a human-programmed mission plan” (Weed & Schorr, 2002).  It does not have the ability to completely replace humans in the loop, but it does have some situational adaptability programmed into the system.  A single launch and recovery element crew handles takeoff and landing, with the benefit of a rest period (Colucci, 2004).
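The aircraft's ability to block instructions that conflict with its programming, and to begin a termination sequence on command, can be sketched as a command-screening step. Everything here, the limit values, field names, and message shapes, is hypothetical and for illustration only; it is not the Global Hawk's actual flight software.

```python
# Hypothetical sketch of onboard command screening, illustrating how a
# UAS might reject instructions that violate its programmed constraints.
# The altitude/airspeed limits and all names are invented for illustration.

LIMITS = {"altitude_ft": (0, 65_000), "airspeed_kt": (0, 350)}

def screen_command(command: dict) -> str:
    """Accept, block, or act on a ground-station command."""
    # A mission-termination transmission starts the termination sequence.
    if command.get("type") == "terminate_mission":
        return "begin termination sequence"
    # Any instruction outside programmed limits is blocked outright.
    for field, (lo, hi) in LIMITS.items():
        if field in command and not lo <= command[field] <= hi:
            return f"blocked: {field} outside programmed limits"
    return "accepted"

assert screen_command({"type": "climb", "altitude_ft": 55_000}) == "accepted"
assert screen_command({"type": "climb", "altitude_ft": 90_000}).startswith("blocked")
assert screen_command({"type": "terminate_mission"}) == "begin termination sequence"
```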
            Training for UAV pilots and sensor operators consists of concurrent three-and-a-half-month courses, each enrolling 10 to 12 students per class.  Although pilots are drawn from all facets of the military and civilian sectors, their training differs considerably.  UAV pilots receive extensive platform-specific training, on the Predator and other aircraft.  Training for lethal UAVs provides pilots with at least 20 days of classroom instruction and 50 to 60 hours of flying time (Colucci, 2004).  Initial training for LRE crews consists of takeoff and landing plus basic handling; follow-on UAV training incorporates mission reconnaissance, surface attack tactics, and strike coordination (Colucci, 2004).


References
Colucci, F. (2004). Air Force Refines Training Programs for UAV Operators, National Defense (NDIA). Retrieved 20 February, 2015, from http://www.nationaldefensemagazine.org/archive/2004/May/Pages/Air_Force_Refines3555.aspx
Helton, S. (2014, August 7). Flight control: Boeing’s ‘Uninterruptible Autopilot System’, drones & remote hijacking. Retrieved from 21st Century Wire - News for the Waking Generation: http://21stcenturywire.com/2014/08/07/flight-control-boeings-uninterruptible-autopilot-system-drones-remote-hijacking/
Lim, K. H. (2007, December 11). How does a pilot execute an auto landing during bad weather in a Boeing 777? Retrieved from Ask Captain Lim: http://www.askcaptainlim.com/flying-on-the-boeing-777-flying-91/439-how-does-a-pilot-execute-an-auto-landing-during-bad-weather-in-a-boeing-777.html
Stewart, J. (2013). Pilotless passenger planes prepare for take-off. BBC Future. Retrieved February 20, 2015, from http://21stcenturywire.com/2014/08/07/flight-control-boeings-uninterruptible-autopilot-system-drones-remote-hijacking/
Verver, G. (2013, December 7). Aircraft Accidents & Incidents. Retrieved from A Photographic History of NAF & VX-5 at NOTS China Lake: http://www.chinalakealumni.org/Accidents.htm
Weed, W. S., & Schorr, C. (2002, August). Flying Blind | DiscoverMagazine.com. Retrieved from About Discover Magazine | DiscoverMagazine.com: http://discovermagazine.com/2002/aug/featflying
What are the main differences piloting Boeing vs. Airbus aircraft? (2013, December 13). Retrieved from Aviation Stack Exchange: http://aviation.stackexchange.com/questions/149/what-are-the-main-differences-piloting-boeing-vs-airbus-aircraft


Thursday, February 12, 2015

A Modified Work Schedule for UAS Teams

The scenario for this assignment involves hiring the consulting expertise of HUM (Human Factors Management) for a United States Air Force (USAF) MQ-1B Predator Medium Altitude, Long Endurance (MALE) UAS squadron.  The squadron's crews are currently performing constant missions, with availability 24 hours a day, 7 days a week, 365 days a year.  This availability is necessary to provide armed Intelligence, Surveillance, and Reconnaissance (ISR) support for boots on the ground in combat zones.  To accomplish these missions, the Unmanned Aircraft System crews have been separated into four teams working continuous 8.5-hour shifts (day, swing, and night) on a cycle of 6 days on and 2 days off.  As each team completes its eight-day work/off cycle, it rotates to the next shift.  The consulting firm HUM was made aware of the Squadron Commander’s concern for his UAS crews.  The concerns are valid: the teams have reported extreme fatigue while conducting operations and have complained of inadequate sleep due to their current shift schedules.
Given the nature of the UAS crews’ schedule, several human factors issues can validly be raised.  The first issue of concern is the rotation schedule.  The UAS crews are unfortunately plagued with rotating through three completely different shifts for a prescribed period of time.  This type of shift work often leads to fatigue, physical and mental drain, and all-around instability.  Many recognize the underlying bodily function by the phrase “internal body clock” or “internal clock.”  This proverbial “clock” governs the body’s 24-hour cyclic biological processes: brain wave activity, hormone production, cell regeneration, and other biological activities (Circadian Rhythm Disorders: Shift Work, Jet Lag, and More, 2005).  Unfortunately, when shift workers disrupt this rhythm, it normally takes several days to upwards of a week for the human body to adjust to the change (Orlady & Orlady, 1999, p. 300).
The next human factors issue identified centers on extreme personnel fatigue.  A typical long-term schedule would be five days on followed by two days off for rest.  The designated cycle of six days of work followed by two days off is suspected to lead to chronic fatigue among crew members.  “Since workers in shift systems require more time to recover than those working only day shifts, the observed chronic fatigue is likely reflective of continued inadequate opportunity for restorative sleep” (Miller, Tvaryanas, Platte, Swigart, & Colebank, 2008, p. 20).  Given this information, it is clear that two days off are far too few to achieve the resting period the human body needs to recuperate.
To maintain mission standards while addressing the physical and mental needs of the squadron, a modified work schedule was created that still meets the 24-hour, 7-day-a-week, year-round requirement.  The modified schedule is based on three 12.5-hour shifts followed by three days off.  Eliminating the swing shift and permanently assigning each crew member to either the day or night shift would be more efficient and better suited to the stress and fatigue previously observed.  This modification can potentially foster a synchronized UAS crew, giving their circadian “internal clocks” time to adapt to a much more stable schedule.
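The trade-off behind the schedule change can be checked with quick arithmetic over a common window. The sketch below is purely illustrative; it simply counts workdays and hours for each cycle over 24 days (a window both cycle lengths divide evenly).

```python
# Sketch comparing the original rotating schedule (6 on / 2 off,
# 8.5-hour shifts) with the proposed fixed schedule (3 on / 3 off,
# 12.5-hour shifts) over a 24-day window. Purely illustrative.

def days_worked(on: int, off: int, window: int) -> int:
    """Count workdays in a repeating on/off cycle over `window` days."""
    cycle = [True] * on + [False] * off
    return sum(cycle[d % len(cycle)] for d in range(window))

WINDOW = 24  # days; divisible by both cycle lengths (8 and 6)

old_days = days_worked(6, 2, WINDOW)   # 18 workdays
new_days = days_worked(3, 3, WINDOW)   # 12 workdays
old_hours = old_days * 8.5             # 153.0 hours
new_hours = new_days * 12.5            # 150.0 hours

# Similar total duty hours, but the new plan doubles recovery days and
# keeps each crew member on a fixed (day or night) shift.
print(old_hours, new_hours)  # 153.0 150.0
```

The comparison shows why the proposal is plausible: duty hours stay nearly the same while rest days double and circadian disruption from shift rotation is removed.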





  


Circadian Rhythm Disorders: Shift Work, Jet Lag, and More. (2005). Retrieved February 13, 2015, from http://www.webmd.com/sleep-disorders/guide/circadian-rhythm-disorders-cause 

Miller, N.L., Tvaryanas, A.P., Platte, W., Swigart, C., & Colebank, J. (2008, January). A resurvey of shift work-related fatigue in MQ-1 Predator unmanned aircraft system crewmembers. Monterey, CA: Naval Postgraduate School.

Orlady, H.W., & Orlady, L.M. (1999). Human factors in multi-crew flight operations. Burlington, VT: Ashgate Publishing Company.


Tuesday, February 10, 2015

Predator – UAS operations Beyond Line of Sight

The MQ-1B Predator is a special Unmanned Aircraft System in that it has the capability of operating both in Line of Sight (LOS) and Beyond Line of Sight (BLOS).  Numerous UAS are capable of LOS operations, but only a select few can operate BLOS successfully through the completion of their missions.
The MQ-1B Predator is an armed, multi-mission, medium-altitude, remotely piloted aircraft system capable of long-endurance missions (MQ-1B Predator, 2010).  The primary mission of the Predator is to capture intelligence; its secondary mission is to execute targets dynamically (US Air Force, 2010).  The capabilities of the MQ-1B are impressive: significant loiter time, a large suite of sensors, highly precise and accurate weapons, and a multi-mode communications suite (MQ-1B Predator, 2010).  ISR, close air support, combat search and rescue, and other missions have successfully used the MQ-1B (MQ-1B Predator, 2010).  These capabilities make it uniquely qualified to conduct irregular warfare operations in support of combatant commander objectives.  The MQ-1B is not self-sufficient, however; it still requires system maintenance, and during its possible 24-hour operations humans must be involved at some point or another.
Operating one MQ-1B requires a three-person crew consisting of a pilot and two sensor operators.  The pilot maneuvers the aircraft using controls that transmit commands; during normal operations, commands are transmitted by way of a C-band line-of-sight data link (US Air Force, 2010).  Beyond-line-of-sight missions require a Ku-band satellite link for communication, command, and control of the UAS, accomplished using links from the UAS to a satellite and on to the Ground Control Station (US Air Force, 2010).  Depending on the stage of the mission, the crew can remotely control the aircraft from the ground control station (GCS) through the line-of-sight data link, or operate the UAS through the satellite data link once the aircraft has traveled beyond line of sight (US Air Force, 2010).  For BLOS missions, the MQ-1B Predator is equipped with an infrared sensor, a color daylight TV camera, a laser designator/illuminator, and an image-intensified TV camera (US Air Force, 2010).  The cameras allow viewing of full-motion video from each imaging sensor, which can be streamed independently or combined into one video stream.  Moreover, the Predator can utilize laser-guided missiles for target execution.  Operations differ slightly in LOS as opposed to BLOS.  LOS offers a smaller opportunity for signal loss or disruption because of the UAS's proximity to the Ground Control Station, while BLOS adds an additional communication hop.  The integration of the satellite could be seen as a disadvantage: communications no longer flow directly between the Ground Control Station and the MQ-1B, but must be sent to the satellite and relayed back through the satellite uplink vehicle.  A benefit of this method is that the data signal can also be sent to other military facilities (Valdes, 2004).
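The choice between the C-band line-of-sight link and the Ku-band satellite link can be pictured as a simple fallback rule. This is a hedged sketch: the 150-nautical-mile usable LOS range and the function names are assumptions for illustration, not published MQ-1B figures.

```python
# Hedged sketch of the link-selection logic described above: use the
# C-band line-of-sight link when the aircraft is close enough to the
# GCS, otherwise fall back to the Ku-band satellite link. The 150 nm
# range figure is an assumption for illustration, not a published spec.

LOS_RANGE_NM = 150.0  # assumed usable C-band line-of-sight range

def select_link(distance_to_gcs_nm: float, los_signal_ok: bool) -> str:
    """Pick the command-and-control link for the current aircraft position."""
    if distance_to_gcs_nm <= LOS_RANGE_NM and los_signal_ok:
        return "C-band LOS"
    return "Ku-band SATCOM"  # beyond line of sight, or LOS degraded

assert select_link(80.0, True) == "C-band LOS"
assert select_link(600.0, True) == "Ku-band SATCOM"
assert select_link(80.0, False) == "Ku-band SATCOM"
```

Note how the satellite path also serves as a degraded-signal fallback even within nominal LOS range, which mirrors why the extra hop, despite its latency, adds resilience.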
Commonly discussed human factors issues that occur when a pilot operates a UAS such as the MQ-1B are limited situational awareness, tunnel vision, fatigue, and boredom.  These arise because pilots must rely heavily on cameras to gain situational awareness.  The pilot may feel as though he or she is looking through a narrow tunnel when viewing the video stream, which can significantly limit the ability to readily gain the situational awareness required for safe flight during mission operations.  Additionally, fatigue is a common human factors issue associated with piloting UAS: pilots must essentially stare at a monitor for long hours, which often results in boredom and fatigue.
One application of interest for commercial UAS would be the use of BLOS when filming movies on location involving rough terrain, aerial maneuvers, or major stunts.  The shipping and retail industries could also take advantage of UAS BLOS capabilities when shipping goods and providing services to customers; Amazon is one example.



Figure 1. Predator UAV Communication System.  This figure illustrates the design of the Predator UAV communication system.  It is composed of three main parts: 1) the Ground Control Station, 2) the Predator drone, and 3) the satellite relay.  The satellite relay provides communication between the UAV and the GCS, particularly in beyond-line-of-sight missions.  Figure borrowed from Valdes (2004).


Federal Aviation Administration (2008). FAA surveillance and broadcast services. Retrieved November 7, 2014, from http://www.faa.gov/about/office_org/headquarters_offices/ato/service_units/enroute/surveillance_broadcast/

MQ-1B Predator. (2010, July 20). Retrieved February 11, 2015, from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104469/mq-1b-predator.aspx

Vu, K. L., Kiken, A., Chiappe, D., Strybel, T. Z., & Battiste, V. (2013). Application of part-whole training methods to evaluate when to introduce NextGen air traffic management tools to students. The American Journal of Psychology, 126(4), 433-447. doi:http://dx.doi.org/10.5406/amerjpsyc.126.4.0433


Valdes, R. (2004). How the Predator UAV works. How Stuff Works. Retrieved from http://science.howstuffworks.com/predator6.htm

Monday, February 9, 2015

Unmanned Aerial Systems Integration in the National Air Space

The Next Generation Air Transportation System (NextGen) is the Federal Aviation Administration's (FAA) and Congress’s way of implementing and handling the onset of new air traffic expected in the near future.  NextGen is, in theory, the much needed upgrade of the current procedures and technology used by air traffic controllers (ATC).  This newly designed, state-of-the-art technology is expected to provide a more efficient National Airspace System (NAS) capable of tracking and controlling the movement of a large influx of new aircraft into the air traffic system.  NextGen promises reliability, safety, security, the ability to handle increased capacity, and a way to minimize aviation's impact on the environment.  Air traffic controllers will use a GPS-based reporting system installed in both ground stations and aircraft across the United States.  The overall improvement over the old system would consist of shorter air travel routes, savings in fuel and money, and, a big plus for most passengers, reduced delays in air travel.  Increased safety is introduced by reducing air traffic controller workload, allowing the controller to shift to basic monitoring and oversight of flying aircraft.
The NextGen system consists of six different technologies that are touted to enhance safety and pilot awareness: Automatic Dependent Surveillance-Broadcast (ADS-B), Collaborative Air Traffic Management Technologies (CATMT), Data Communications (Data Comm), the National Airspace System Voice System (NVS), NextGen Weather, and System Wide Information Management (SWIM) (NextGen, n.d.).
NextGen technology uses space-based navigation and integrated surveillance of aircraft in the NAS. This system is known as Automatic Dependent Surveillance-Broadcast (ADS-B), a cooperative surveillance technology that determines an aircraft’s position by way of satellite navigation (NextGen, n.d.). The broadcast signals relayed by satellite give air traffic controllers the ability to track and monitor active aircraft's precise locations, and possibly predict future locations, for expedited take-off and landing processes (NextGen, n.d.).  In addition, ADS-B enhances pilot awareness by streaming flight information to the cockpits of aircraft properly equipped for data reception (NextGen, n.d.). CATMT is an upgraded version of the decision-support and data-sharing tools currently used by air traffic personnel. The goal of CATMT is to foster a more collaborative environment between air traffic controllers and aircraft operators, improving efficiency within the National Airspace System (NextGen, n.d.).  Data Communications (Data Comm) provides a means for controllers to transmit clearances and instructions to pilots digitally and is designed to supersede the old method of voice communications (NextGen, n.d.). To reduce pilot error, the aircraft’s flight computer can interact directly with the data displayed in the cockpit (NextGen, n.d.).  Much like Data Comm, the National Airspace System Voice System (NVS) will function the same way at the facility level, connecting the FAA's voice infrastructures.  System Wide Information Management (SWIM) is the real-time exchange network designed to relay digital information for NextGen (NextGen, n.d.).  NextGen Weather uses the SWIM network to provide controllers and operators with better weather information that can affect flight plans, decisions, and performance.  NextGen Weather is provided through interagency cooperation between NASA, NOAA, and the FAA (NextGen, n.d.).
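The ADS-B concept described above, an aircraft deriving its own position from satellite navigation and broadcasting it for anyone equipped to listen, can be sketched with a minimal message type. The field names, the flat comma-separated serialization, and the sample values are simplifications for illustration, not the actual DO-260B message format.

```python
# Minimal sketch of an ADS-B-style position report: the aircraft
# derives its position from satellite navigation and broadcasts it
# periodically for controllers and other equipped aircraft. Field
# names and serialization are simplified assumptions, not the real
# ADS-B (DO-260B) message format.

from dataclasses import dataclass

@dataclass
class PositionReport:
    icao_address: str   # 24-bit aircraft identifier (hex)
    lat_deg: float
    lon_deg: float
    altitude_ft: int
    groundspeed_kt: int

def broadcast(report: PositionReport) -> str:
    """Serialize the report as it might be shared over the air."""
    return (f"{report.icao_address},{report.lat_deg:.4f},"
            f"{report.lon_deg:.4f},{report.altitude_ft},{report.groundspeed_kt}")

msg = broadcast(PositionReport("A1B2C3", 29.1902, -81.0487, 35_000, 450))
assert msg == "A1B2C3,29.1902,-81.0487,35000,450"
```

Because the same broadcast is received by ground stations and by other equipped aircraft alike, one message serves both controller surveillance and cockpit traffic awareness.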
NextGen’s suite of data exchange and communications received by ground control stations and air traffic controllers replaces the need for secondary radar.  Technology of this magnitude provides greater situational awareness and the much needed capability for self-separation (FAA website, 2014).  All of these planned upgrades are well and good; however, if these new technologies do not include UAS in the NAS, the drive for safer airways may be for naught.  The UAS industry should consider the use of ADS-B for greater broadcast surveillance.  This GPS-based technology will give UAS operators the ability to detect, see, and avoid probable traffic conflicts.  The UAS industry must wholeheartedly buy into this concept and new technology for integration into the NAS to be successful.  Use of this technology may reduce or eliminate redundant flight procedures for lost data-link; a way to mitigate these procedures is critical, although they are still in development.  Understanding the human factors associated with UAS operations and with the air traffic controller is extremely important if integration is to be successful.  Not only should both operators and controllers be cognizant of air traffic, but research has also shown that when operators can relate to controllers, a heightened sense of awareness is induced. Beyond not being physically in the aircraft, it is important to take into account the human factors associated with the separation of man and machine. This separation can cause an inherent loss of sensory cues, unwanted delays in communication, and difficulty in scanning the visual environment surrounding the vehicle (Kenny & Fern, 2012). The UAS operator must deal with “sensory isolation” (McCarley & Wickens, n.d.).  Additional human factors connected with the control of UAS range from complacency to inadequate qualifications and training.
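The self-separation idea, using broadcast positions and velocities to detect probable traffic conflicts before they develop, can be illustrated with a straight-line closest-approach check. This is a hedged sketch: the flat-grid geometry, the 10-minute look-ahead, and the 5-nautical-mile threshold are illustrative assumptions, not actual separation standards.

```python
# Hedged sketch of self-separation: given ADS-B-style positions and
# velocities for two aircraft, project straight-line tracks and flag a
# conflict if minimum separation falls below a threshold. The flat
# grid, look-ahead horizon, and 5 nm threshold are simplifications.

import math

def min_separation_nm(p1, v1, p2, v2, horizon_min=10, step_min=0.25):
    """Minimum horizontal distance (nm) over the look-ahead horizon.

    p = (x, y) position in nm on a flat local grid; v = (vx, vy) in nm/min.
    """
    best = float("inf")
    t = 0.0
    while t <= horizon_min:
        dx = (p1[0] + v1[0] * t) - (p2[0] + v2[0] * t)
        dy = (p1[1] + v1[1] * t) - (p2[1] + v2[1] * t)
        best = min(best, math.hypot(dx, dy))
        t += step_min
    return best

def conflict(p1, v1, p2, v2, threshold_nm=5.0) -> bool:
    return min_separation_nm(p1, v1, p2, v2) < threshold_nm

# Head-on tracks converge; parallel well-separated tracks do not.
assert conflict((0, 0), (8, 0), (60, 0), (-8, 0)) is True
assert conflict((0, 0), (8, 0), (0, 20), (8, 0)) is False
```

The same computation works whether the "ownship" is manned or unmanned, which is exactly why shared ADS-B surveillance is attractive for integrating UAS into the NAS.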
Thanks to great efforts by industry professionals, improvements in and understanding of these needs continue to be fostered.

Kenny, C., & Fern, L. (2012). Varying levels of automation on UAS operator responses to traffic resolution advisories in civil airspace. Retrieved from http://human-factors.arc.nasa.gov/publications/LevelsofAutomationonUASOperators.pdf

McCarley, J., & Wickens, C. (n.d.). HUMAN FACTORS CONCERNS IN UAV FLIGHT. Retrieved February 10, 2015, from http://www.hf.faa.gov/hfportalnew/Search/DOCs/uavFY04Planrpt.pdf
Federal Aviation Administration (FAA) website. (2014). Retrieved from http://www.faa.gov/nextgen/programs/adsb/

Next Generation Air Transportation System. (2012, January 1). Retrieved February 10, 2015, from http://www.natca.org/legislative_current_issues.aspx?zone=Legislative-Current Issues&pID=200


NextGen. (n.d.). Retrieved February 10, 2015, from https://www.faa.gov/nextgen/programs/

Sunday, January 25, 2015

Unmanned Aerial Systems Ground Control Stations - Human Factor Issue

The topic of choice is the RQ-4 Global Hawk Unmanned Aerial System and its ground control stations.  The Global Hawk provides field commands with near-real-time, high-resolution imagery using synthetic aperture radar (SAR) and long-range electro-optical/infrared (EO/IR) sensors.  It is capable of carrying out various reconnaissance missions for multiple types of operations.  The Global Hawk has a range of over 14,000 nautical miles with a flight duration exceeding 42 hours.  The UAS is operable worldwide through the use of satellite and line-of-sight communications (RQ-4 Global Hawk, n.d.).  The Global Hawk and its variants are considered semi-autonomous because they occasionally require cross-checks and commands through human interfaces.
The current ground control stations are the Mission Control Element (MCE) and the Launch and Recovery Element (LRE).  These GCS are designed for mobility and self-sufficiency, meaning each trailer can function at a separate site and provide the control needed to ensure mission success.  The LRE and MCE workstations are generally manned by a minimum crew of three personnel.
The Mission Control Element is the Global Hawk's ground control station for reconnaissance operations.  From within the MCE, crews are able to direct where the UAS should go and what it should do once it reaches its location.  The MCE contains four computer-based workstations that provide human interaction for mission planning, command and control operations (CCO), communications, and sensor data collection and processing (RQ-4 Global Hawk, n.d.).

The Launch and Recovery Element does just what it says: it controls the launch and recovery of the UAS.  The LRE is responsible for “precision differential global positioning system corrections,” which provide the accuracy needed during mission navigation for take-off and landing of the Global Hawk.  The LRE also handles “coded GPS,” which incorporates an inertial navigation system for mission execution (RQ-4 Global Hawk, n.d.).
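The differential GPS idea behind those corrections can be sketched simply: a ground station at a surveyed location measures the current GPS error and shares a correction the aircraft applies to its own fix. The local-grid coordinates and all numbers below are invented for demonstration; real DGPS works on pseudorange corrections per satellite, so this is a deliberate simplification.

```python
# Illustrative sketch of the differential GPS concept used by the LRE:
# a ground station at a precisely surveyed location measures the current
# GPS error and broadcasts a correction for the aircraft to apply.
# Coordinates are on a made-up local grid in meters; real DGPS corrects
# per-satellite pseudoranges, so this is a deliberate simplification.

def dgps_correction(surveyed, measured):
    """Correction = true station position minus its GPS-measured position."""
    return tuple(s - m for s, m in zip(surveyed, measured))

def apply_correction(fix, correction):
    """Apply the station's shared correction to the aircraft's raw fix."""
    return tuple(f + c for f, c in zip(fix, correction))

# The station knows it sits at (1000.0, 2000.0) m, but GPS reads
# (1003.0, 1998.0): the shared error is (-3.0, +2.0).
corr = dgps_correction((1000.0, 2000.0), (1003.0, 1998.0))
assert corr == (-3.0, 2.0)

# The aircraft applies the same correction to its own raw fix.
assert apply_correction((5003.0, 7998.0), corr) == (5000.0, 8000.0)
```

The key assumption is that the aircraft and station see nearly the same error sources (satellite clock and atmospheric effects), which is why the technique is most accurate near the runway, exactly where the LRE needs it.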
The MCE and LRE pilot workstations are designed with control and display interfaces similar to an aircraft cockpit.  These workstations display UAS health and sensor status, and the pilot can alter the navigational course of the Global Hawk.  The workstations also include communications capabilities that allow the pilot to coordinate any mission with outside command team members, including air traffic control, airborne controllers, ground controllers, and additional Intelligence, Surveillance, and Reconnaissance personnel of value (RQ-4 Global Hawk, 2014).

The workstation designated for sensor operators furnishes the capability to task the sensors and continuously update numerous plans during real-time operations.  This workstation can initiate sensor calibration and monitor sensor progress and event quality throughout the mission (RQ-4 Global Hawk, 2014).  Additional responsibilities include sensor node exploitation with image quality control, allowing the UAS to provide the best image possible (RQ-4 Global Hawk, 2014).  The sensor operator is also responsible for target deck prioritization and the tracking of scenes for fluid operations (RQ-4 Global Hawk, 2014).

Below are Figures 1 through 4, which depict the RQ-4 Global Hawk and the internal set-up of the MCE and LRE.  These photos are provided to give a visual sense of the confined, close-quartered, non-ergonomic arrangements for the UAS pilots, which can lead to human factors issues during flight.

Figure 1.0 An RQ-4 Global Hawk gets prepared for a mission while deployed Nov. 23, 2010, at an air base in Southwest Asia. The RQ-4 and the Airmen are assigned to the 380th Expeditionary Operations Group.  Retrieved Jan. 25, 2015 from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx

Figure 2.0, GCS for the Global Hawk. Retrieved Jan. 25, 2015 from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx


Figure 3.0   GCS for the Global Hawk. Retrieved Jan. 25, 2015 from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx

Figure 4.0, GCS for the Global Hawk. Retrieved Jan. 25, 2015 from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx

Although the Global Hawk is capable of automation, it still requires human interfacing to monitor the health and status of the aircraft and to exchange information between the sensor operators and between the UAS and the GCS. Even though the Global Hawk “has no cockpit,” “flies itself,” “has no joysticks, throttles, or pedals,” provides no “pilot's-eye view from the plane,” and incorporates no “forward-facing camera,” it still requires a human interface (Schorr & Weed, 2002).  Yes, it is dubbed the “first man-out-of-the-loop airplane,” with in-flight mission monitoring left in the hands of two on-board computers (Schorr & Weed, 2002).  Long-endurance UAS operations such as the Global Hawk's require shift work schedules to operate the ground control station 24/7, causing fatigue (McCarley & Wickens, 2005).  These prolonged hours seriously affect the physical performance and mental stamina of UAS pilots.  Discussions have “identified automation as being central to many of the human factors issues that are of concern in the case of the Global Hawk UAV” (Burchat, Hopcroft, & Vince, 2006).  UAS operators have also voiced that “it is difficult to monitor the automated system closely over extended periods” (Burchat, Hopcroft, & Vince, 2006).  As a result, situational awareness and the resolution of faults and failures suffer, with pilots selectively monitoring certain cockpit instruments for system performance or bracing for unexpected changes.
Although the Global Hawk has an extremely low crash record, this does not mean that pilots can rely totally upon the system.  No matter the level of automation, when humans are involved, human factors must be monitored and addressed for continued mission success.  One way to ensure continued success would be continuous training that increases pilot understanding of the system and of possible system-failure scenarios, cultivating the proper, timely response required for such sophisticated technology.

  

Burchat, E., Hopcroft, R., & Vince, J. (2006, May). Unmanned aerial vehicles for maritime patrol: Human factors issues. Retrieved January 25, 2015, from http://www.dtic.mil/dtic/tr/fulltext/u2/a454918.pdf

McCarley, J., & Wickens, C. (2005). Human factors implications of UAVs in the national airspace. Savoy, Ill:  University of Illinois at Urbana-Champaign, Aviation Human Factors Division. Retrieved January 25, 2015, from http://www.tc.faa.gov/logistics/Grants/pdf/2004/04-G-032.pdf

RQ-4 Global Hawk. (2014, October 27). Retrieved January 25, 2015, from http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx
RQ-4 Global Hawk. (n.d.). Retrieved January 25, 2015, from http://air-attack.com/page/54/RQ-4-Global-Hawk.html

Schorr, C. & Weed, W. (2002, August 1). Flying Blind. Retrieved January 25, 2015, from http://discovermagazine.com/2002/aug/featflying


This blog will be used to fulfill an ERAU requirement for the ASCI Human Factors in Unmanned Aerial Systems course. I am currently a student of the university seeking a Master's in Aeronautical Science.