Saturday, October 26, 2019

ERAU AVRL Crash Lab



The ERAU AVRL simulation tool supports the design, assembly, and simulation of different Unmanned Aerial Systems (UAS) for different missions, such as searching for a missing hiker in Yosemite, an agricultural survey, and a crash lab inspection. For this activity, I selected the Virtual Crash Lab, which involves inspecting a crashed Boeing 737. I chose to assemble and simulate three different UAS to assess their performance in this crash site environment.

The first UAS I tried for this mission was the Tern fixed-wing UAS. I added a brushless electric motor, auto control, a GPS module, a dipole antenna, an EO camera with gimbal, a temperature sensor, and an ERA Powerhouse 10000 battery. For the GCS, I used the GCS Trailer and a large dipole antenna. However, I found the Tern fixed-wing UAS unsuitable: it flew past the crash site too fast to allow a good view or assessment, and because the crash site covered only a small area, the aircraft's turn-radius limit became a problem. When selecting waypoints for automated flights, a warning appeared that the planned turns were tighter than the aircraft's minimum turn radius.
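To see why a small site is hard to cover with a fixed-wing platform, here is a minimal sketch of the standard coordinated-turn formula, r = v^2 / (g * tan(bank)). The airspeed, bank limit, and site width below are illustrative assumptions, not values taken from the simulator:

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def min_turn_radius(airspeed_ms, max_bank_deg):
        # Coordinated level turn: r = v^2 / (g * tan(bank))
        return airspeed_ms ** 2 / (G * math.tan(math.radians(max_bank_deg)))

    # Illustrative numbers only -- not taken from the AVRL simulator.
    r = min_turn_radius(20.0, 30.0)   # 20 m/s cruise, 30-degree bank limit
    print(f"minimum turn radius: {r:.0f} m")   # ~71 m

    # Waypoints much closer together than about 2*r cannot be joined by a
    # level turn, which is why a tight waypoint pattern over a small crash
    # site triggers turn-radius warnings.
    site_width = 100.0   # m, illustrative
    print("too tight for this aircraft" if site_width < 2 * r else "ok")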


Next, I tried the Gadfly quadcopter UAV, with an empty weight of 1.6 lbs and a max weight of 3.562 lbs. I added an X5 Red Electric motor, auto control, NDVI and PSI 0015 IR cameras, an ERA Enterprise 2300 battery, a dipole antenna, and a GPS module. The total takeoff weight was 2.5 lbs, with a 900 m radio range and about 18 minutes of flight time. The big problem with the Gadfly quadcopter UAV was that both the NDVI and PSI 0015 IR cameras were fixed to the airframe. This hindered viewing the crash site from different angles, especially when the UAS was flying out of direct view of the crash site, and the result was an incomplete picture of the crash site.
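The roughly 18 minutes the simulator reported is consistent with a simple capacity-over-draw estimate. A minimal sketch, assuming the "2300" in the battery name refers to a 2300 mAh capacity, and picking an illustrative average current draw:

    def flight_time_min(capacity_mah, avg_current_a, usable_fraction=0.8):
        # Rough endurance: usable battery capacity divided by average draw.
        usable_ah = capacity_mah / 1000.0 * usable_fraction
        return usable_ah / avg_current_a * 60.0

    # 2300 mAh is my reading of the "Enterprise 2300" label; the 6 A average
    # draw and 80% usable fraction are illustrative assumptions.
    print(f"{flight_time_min(2300, 6.0):.0f} min")   # ~18 min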


The third UAS I tried was the Condor Octorotor (professional) UAS. The Octorotor allowed for the assembly of an X5 Red Electric motor, an infrared sensor with gimbal, a LiDAR camera, a GPS module, a dipole antenna, and an ERA Powerhouse 10000 battery. Unlike on the Gadfly quadcopter, the angles of the gimbaled infrared sensor and the LiDAR camera were adjustable, allowing for a better view and assessment of the crash site.
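To illustrate why an adjustable gimbal matters, here is a minimal sketch of the pointing geometry; the coordinate frame and all numbers are my own assumptions, not simulator values:

    import math

    def gimbal_angles(uav_xyz, target_xyz):
        # Pan/tilt angles to aim a camera at a ground target.
        # Frame: x east, y north, z up (metres); pan measured from north.
        dx = target_xyz[0] - uav_xyz[0]
        dy = target_xyz[1] - uav_xyz[1]
        dz = target_xyz[2] - uav_xyz[2]
        pan_deg = math.degrees(math.atan2(dx, dy))
        tilt_deg = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # negative = look down
        return pan_deg, tilt_deg

    # Illustrative: UAS hovering 40 m up, about 60 m south-west of the wreckage.
    print(gimbal_angles((0, 0, 40), (42, 42, 0)))   # ~ (45.0, -34.0)

With a fixed camera, the whole airframe would have to yaw and pitch to line up the same view, which is exactly the limitation I ran into with the Gadfly.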

The man-portable and handheld GCS let the operator keep a direct view of the UAS around the crash site, unlike the trailer-type GCS. This allows the operator to make adjustments when necessary, especially when flying in manual control mode.

Tuesday, October 15, 2019

UAS Human Factors




Human factors in Unmanned Aerial Systems concern how humans interact with UAS: those interactions can affect the system positively, or negatively when limitations in system design or human error come into play.
When it comes to the design and manufacture of unmanned-system ground control stations, it has been noted that many designers do not involve unmanned-system operators in the design process. This lack of aviator involvement has led to control stations that are misapplied or poorly suited to flight operations (McCarley & Wickens, n.d.). Designers also tend to borrow from video-game and smartphone designs when developing ground control stations, which in most cases diverge from aviation standards. The rush to put out a working model has also produced underdeveloped or incomplete mission requirements. The result of these inconsistent, non-standardized GCS designs is pilot error, confusion, and inappropriate responses during emergencies, which can lead to the loss of the UAS.
Another human factor in UAS operations is over-automation combined with inadequate or ineffective command interfaces. In many UAS, especially highly autonomous systems such as the RQ-4, most operations are automated, with the operator having access only to a throttle, a keyboard, and a mouse. In most of these highly autonomous systems, the pilot is more of a program manager than an operator. The pilot also relies heavily on text entry, which can cause distractions and errors as the pilot/operator has to shift eyes from keyboard to text box (Howe, 2017).

One solution to heavy reliance on autonomy is to make UAS command interfaces more intuitive: dedicated displays, buttons, and layouts, with button guards on critical controls to avoid mistakes under duress (see the sketch below). Challenges with video visuals and depth can be overcome by developing technologies that aid depth perception, such as heads-up displays and stereoscopic vision. In the long run, standardizing GCS designs, as has been done in manned aircraft cockpits, will lead to fewer of the confusion-driven errors that are most common in emergency situations.
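As one illustration of the button-guard idea carried over into a software interface, here is a minimal sketch; the class and command names are hypothetical and not from any real GCS:

    import time

    class GuardedCommand:
        # Software analogue of a physical button guard: the command must be
        # armed first, and the armed state is one-shot and expires quickly.
        def __init__(self, name, action, timeout_s=5.0):
            self.name = name
            self.action = action       # callable to run when confirmed
            self.timeout_s = timeout_s
            self._armed_at = None

        def arm(self):
            self._armed_at = time.monotonic()

        def execute(self):
            armed = (self._armed_at is not None and
                     time.monotonic() - self._armed_at <= self.timeout_s)
            self._armed_at = None      # always disarm after an attempt
            if armed:
                self.action()
            else:
                print(f"{self.name}: blocked (not armed, or guard expired)")

    # Hypothetical use: guarding a flight-termination command.
    kill = GuardedCommand("FLIGHT TERMINATE", lambda: print("motors stopped"))
    kill.execute()   # blocked -- a stray click does nothing
    kill.arm()
    kill.execute()   # runs, because it was deliberately armed first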

References
Howe, S. (2017). The leading human factors deficiencies in unmanned aircraft systems. Retrieved from https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20170005590.pdf
McCarley, J., & Wickens, C. (n.d.). Human factors concerns in UAV flight. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.551.6883