Georgia Institute of Technology

Research Horizons

Georgia Tech's Research Horizons Magazine
On Their Own

By Rick Robinson, Photos by Rob Felt

As a team of unmanned quadrotor aircraft hovers above, six small ground robots roll into an unfamiliar two-story structure. Soon, the rotorcraft are darting about mapping the upper floor, while the ground vehicles chart the lower floor, process both teams’ data, and produce a full building plan for computers outside.
Notably absent are human beings and radio control devices. This little squadron is fully autonomous.

In robotics, autonomy involves enabling unmanned vehicles to perform complex, unpredictable tasks without human guidance. Today, in the early stages of the robotics revolution, it’s among the most critical areas of research.

“The move to true autonomy has become highly important, and progress toward that goal is happening with increasing speed,” said Henrik Christensen, executive director of the Institute for Robotics and Intelligent Machines (IRIM) at Georgia Tech and a collaborator on the mapping experiment. “It won’t happen overnight, but the day is coming when you will simply say to a swarm of robots, ‘Okay, go and perform this mission.’”

Traditionally, robotic devices have been pre-programmed to perform a set task: Think of the factory robot arm that automatically performs a repetitive function like welding. But an autonomous vehicle must be fully independent, moving — without human intervention — in the air, on the ground, or in the water. Well-known examples include the prototype self-driving vehicles currently being tested in some U.S. cities.

photo - icefin underwater device with electronics exposed
 Icefin’s robust electronics enabled it to navigate under Antarctica’s Ross Ice Shelf and explore to a depth of 500 meters.

Vehicular autonomy requires suites of sensors, supported by advanced software and computing capabilities. Sensors can include optical devices that use digital camera technology for robotic vision or video reconnaissance; inertial motion detectors such as gyroscopes; global positioning system (GPS) functionality; radar, laser, and lidar systems; pressure sensors; and more.

At Georgia Tech, researchers are developing both commercial and defense-focused technologies that support autonomous applications. This article looks at some of the Georgia Tech research teams focused on improving performance of autonomous surface, air, and marine systems.
 

Autonomous in Antarctica

Radio-controlled devices have been replacing humans in hazardous situations for years. Now, autonomous vehicles are starting to take on complex jobs in places where radio frequency signals don’t work.

An autonomous underwater vehicle (AUV) known as Icefin has already ventured deep under the vast Ross Ice Shelf in Antarctica, simultaneously testing a unique vehicle design and gathering new information on conditions beneath the ice. Icefin was designed and built in just six months by a team led by researchers from the Georgia Tech School of Earth and Atmospheric Sciences collaborating with an engineering team from the Georgia Tech Research Institute (GTRI).

“GPS signals are blocked by water and ice under the Ross Ice Shelf, and, at the same time, the water is quite murky, making navigation with optical devices a challenge,” explained Mick West, a principal research engineer who led the GTRI development team and participated in the recent Icefin mission in Antarctica. “In some ways, this kind of marine application is really helping to push robotic autonomy forward, because in an underwater environment unmanned devices are completely on their own.”

The research team used tools suited to underwater navigation, including Doppler velocity detection; sensors that perform advanced sonar imaging; and conductivity, temperature, and depth (CTD) readings. They also added inertial navigation capability in the form of a fiber-optic gyroscope.

Photo - three people picking up Icefin to lower it into large pool.
GTRI principal research engineer Mick West, graduate student Jacob Buffo of the School of Earth and Atmospheric Sciences, and undergraduate Matthew Meister of the School of Mechanical Engineering handle the 210-pound Icefin at Georgia Tech’s Underwater Acoustics Research Laboratory.


The sensors were supported by a software-based technique known as simultaneous localization and mapping (SLAM) — computer algorithms that help an autonomous vehicle map an area while simultaneously keeping track of its position in that environment. Using this approach, Icefin could check its exact position at the surface via GPS, and then track its new locations relative to that initial reference point as it moved under the ice. Alternatively, it could navigate using local features such as unique ice formations.
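The position-tracking half of this scheme can be illustrated with simple dead reckoning from the initial GPS fix. The sketch below is purely illustrative (hypothetical function and variable names, not Icefin's actual software) and assumes speed-over-ground from the Doppler log and heading from the gyroscope are sampled at a fixed rate:

```python
import math

def dead_reckon(start_xy, samples, dt):
    """Track position relative to a known starting fix (such as a GPS
    point taken at the surface before the vehicle dives) by integrating
    speed-over-ground and gyro heading samples."""
    x, y = start_xy
    for speed, heading_rad in samples:
        # Project the body-frame speed into the map frame using the
        # gyro heading, then step the position forward one interval.
        x += speed * math.cos(heading_rad) * dt
        y += speed * math.sin(heading_rad) * dt
    return x, y

# One minute of travel at 0.5 m/s on a constant heading, sampled at 1 Hz,
# puts the vehicle 30 meters from the surface fix.
pos = dead_reckon((0.0, 0.0), [(0.5, 0.0)] * 60, dt=1.0)
```

A full SLAM system also folds landmark observations (here, distinctive ice features) back in to correct the drift that pure integration inevitably accumulates.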

Icefin successfully ventured down to 500 meters under the Ross Ice Shelf and found that the frigid waters were teeming with life. On this expedition, data was sent back from under the ice via a robust fiber-optic tether; novel technologies to transmit information wirelessly from under the water are being studied.

West’s GTRI group was part of a larger team led by Britney Schmidt, an assistant professor in the School of Earth and Atmospheric Sciences and principal investigator on the Icefin project. Technologies developed for Icefin could someday help search for life in places like Europa, Jupiter’s fourth-largest moon, which is thought to have oceans similar to Antarctica’s. (See Instruments to Europa, from this issue.)

“We’re advancing hypotheses that we need for Europa and understanding ocean systems here better,” Schmidt said. “We’re also developing and getting comfortable with technologies that make polar science — and eventually Europa science — more realistic.”

Photo - Fumin Zhang
Fumin Zhang, associate professor in the School of Electrical and Computer Engineering, works with small airborne blimps that can simulate the behavior of underwater vehicles for research purposes.

Real-World Performance

Autonomous technologies are generally developed in laboratories. But they must be tested, modified, and retested in more challenging environments.

Cédric Pradalier, an associate professor at the Georgia Tech-Lorraine campus in Metz, France, is working with aquatic vehicles to achieve exacting autonomous performance.

“I’m an applied roboticist. I bring together existing technologies and test them in the field,” Pradalier said. “Autonomy really becomes useful when it is precise and repeatable, and a complicated real-world environment is the place to develop those qualities.”

Pradalier is working with aquatic robots, including a 4-foot-long Kingfisher unmanned surface vessel modified with additional sensors. His current research aim is to refine the autonomous vehicle’s ability to closely monitor the shore of a lake. Using video technology, the craft surveys the water’s edge while maintaining an exact distance from the shore at a consistent speed.

As currently configured, the boat performs a complex mission: taking overlapping photos of the lake’s periphery. It can autonomously stop or move to other areas of the lake as needed, matching and aligning sections of the shore as they change seasonally.

Applications for such technology could include a variety of surveillance missions, as well as industrial uses such as monitoring waterways for pollution or environmental damage.

One of the advantages of autonomy for mobile applications is that a robot never gets tired of precisely executing a task, Pradalier said.

“It would be very tedious, even demanding, for a human to drive the boat at a constant distance from the shore for many hours,” he said. “Eventually, the person would get tired and start making mistakes, but if the robot is properly programmed and maintained, it can continue for as long as needed.”

photo - two people standing on small boat in lake
Graduate students Dmitry Bershadsky and Pierre Valdez of the School of Aerospace Engineering prepare a 16-foot wave adaptive modular vehicle (WAM-V) for testing at Georgia’s Sweetwater Creek State Park.

 

Ocean-Going Software

An autonomous underwater vehicle (AUV) faces unknown and unpredictable environments. It relies on software algorithms that interact with sensors to address complex situations and adapt to unexpected events.

Fumin Zhang, an associate professor in the Georgia Tech School of Electrical and Computer Engineering (ECE), develops software that supports autonomy for vehicles that delve deep into the ocean gathering data. His work is supported by the Office of Naval Research and the National Science Foundation.

“Underwater vehicles often spend weeks in an ocean environment,” Zhang said. “Our software modules automate their operation so that oceanographers can focus on the science and avoid the stress of manually controlling a vehicle.”

The ocean is a challenging and unpredictable environment, he explained, with strong currents and even damaging encounters with sea life. Those who study underwater autonomy must plan for both expected conditions and unexpected events.

Among other things, the team is using biologically related techniques, inspired by the behavior of sea creatures, to enhance autonomous capabilities.

It won’t happen overnight, but the day is coming when you will simply say to a swarm of robots, ‘Okay, go and perform this mission.’

Zhang has also devised an algorithm that analyzes collected data and automatically builds a map of what underwater vehicles see. The resulting information helps oceanographers better understand natural phenomena.

In 2011, a student team led by Zhang designed and built an AUV from scratch. Working with Louisiana State University, the team used the craft to survey the Gulf of Mexico and assess underwater conditions after a massive oil spill off the U.S. coastline.

Among other novel AUVs developed by Zhang’s team is one constructed entirely of transparent materials. The design is aimed at testing optical communications underwater.

To facilitate underwater testing of AUVs, Zhang and his team have developed a method in which autonomous blimps substitute for underwater vehicles for research purposes. The blimps are flown in a large room, lessening the time needed to work in research pools.

“The aerodynamics of blimps have many similarities to the conditions encountered by underwater vehicles,” Zhang said. “This is an exciting development, and we are going full speed ahead on this project.”
 

Undergraduates Tackle Autonomy

At the Aerospace Systems Design Laboratory (ASDL), numerous Georgia Tech undergraduates are collaborating with professors, research faculty, and graduate students on autonomous vehicle development for sponsors that include the Naval Sea Systems Command (NAVSEA) and the Office of Naval Research (ONR).

For instance, student teams working with the Navy Engineering Education Center (NEEC) are helping design autonomy for marine vehicles used in naval surface applications.

This year, the researchers plan to work on ways to utilize past ASDL discoveries in the areas of autonomy algorithms and the modeling of radio frequency behavior in marine environments. The aim is to exploit those technologies in a full-size robotic boat, enabling it to navigate around obstacles, avoid other vehicles, find correct landing areas, and locate sonar pingers like those used to identify downed aircraft.

Among other things, they are working on pathfinding algorithms that can handle situations where radio signals are hampered by the humid conditions found at the water’s surface. They’re developing sophisticated code to help marine networks maintain communications despite rapidly shifting ambient conditions.

In addition to the NEEC research, ASDL undergraduates regularly compete against other student teams in international autonomous watercraft competitions such as RobotX and RoboBoat.

“The tasks in these competitions are very challenging,” said Daniel Cooksey, a research engineer in ASDL. “The performance achieved by both Georgia Tech and the other student groups is really impressive.”

These competitions have inspired other spin-off undergraduate research efforts. In one project, an undergraduate team is developing autonomous capabilities for a full-size surface craft. The resulting vehicle could be used for reconnaissance and other missions, especially at night or in low-visibility conditions.

“Basically, we study the ways in which adding autonomy changes how a vehicle is designed and used,” said Cooksey. “We’re working to achieve on the water’s surface some of the performance that’s being developed for automobiles on land.”

ASDL, which is part of the Daniel Guggenheim School of Aerospace Engineering, is directed by Regents Professor Dimitri Mavris.

photo - Professor Henrik Christensen standing in exit doorway with robot.
Professor Henrik Christensen of the School of Interactive Computing follows a rescue robot designed to guide people to safety in a low-visibility crisis situation such as a fire.
 

Diverse Robots Team Up

A major goal of today’s autonomous research involves different robots cooperating on complex missions. As part of the Micro Autonomous Systems and Technology (MAST) effort, an extensive development program involving 18 universities and companies, Georgia Tech has partnered with the University of Pennsylvania and the Army Research Laboratory (ARL) on developing heterogeneous robotic groups. The work is sponsored by the ARL.

In the partners’ most recent joint experiment at a Military Operations on Urban Terrain (MOUT) site, a team of six small unmanned ground vehicles and three unmanned aerial vehicles autonomously mapped an entire building. Georgia Tech researchers, directed by Professor Henrik Christensen of the School of Interactive Computing, developed the mapping and exploration system as well as the ground vehicles’ autonomous navigation capability. The University of Pennsylvania team provided the aerial autonomy, and ARL handled final data integration, performance verification, and mapping improvements.

“We were able to successfully map an entire two-story structure that our unmanned vehicles had never encountered before,” said Christensen. “The ground vehicles drove in and scanned the bottom floor, and the air vehicles scanned the upper floor, and they came up with a combined model for what the whole building looks like.”

The experiment used the OmniMapper software, developed by Georgia Tech, for exploration and mapping. It employs a system of plug-in modules that handle multiple types of 2-D and 3-D measurements, including rangefinders, RGB-D computer vision devices, and other sensors. Graduate student Carlos Nieto of the School of Electrical and Computer Engineering (ECE) helped lead the Georgia Tech team participating in the experiment.

The research partners tested different exploration approaches. In the “reserve” technique, robots not yet allocated to the scanning mission remained at the starting locations until new exploration goals cropped up. When a branching point was detected by an active robot, the closest reserve robot was recruited to explore the other path.

In the “divide and conquer” technique, the entire robot group followed the leader until a branching point was detected. Then the group split in half, with one robot squad following the original leader while a second group followed their own newly designated leader.
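The “reserve” recruitment rule described above can be sketched in a few lines. The data structures here are hypothetical stand-ins, not the actual MAST codebase:

```python
import math

def recruit_reserve(branch_point, reserves):
    """When an active robot reports a branching point, pull the
    closest waiting robot out of the reserve pool to explore the
    newly discovered path. Returns the chosen robot's id."""
    def distance(robot):
        x, y = robot["pos"]
        return math.hypot(x - branch_point[0], y - branch_point[1])
    chosen = min(reserves, key=distance)
    reserves.remove(chosen)  # the robot is now active, no longer in reserve
    return chosen["id"]

# Two robots wait at the start; a branch appears near the second one,
# so "r2" is recruited while "r1" stays in reserve.
reserves = [{"id": "r1", "pos": (0, 0)}, {"id": "r2", "pos": (5, 0)}]
picked = recruit_reserve((6, 1), reserves)
```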

In other work, mobile robots’ abilities to search and communicate are being applied to keeping people safe during crisis situations. Technology that can locate people in an emergency and guide them to safety is being studied by a team that includes GTRI research engineer Alan Wagner, ECE Professor Ayanna Howard, and ECE graduate student Paul Robinette.

Dubbed the rescue robot, this technology is aimed at locating people in a low-visibility situation such as a fire. Current work is concentrated on optimizing how the rescue robot interacts with humans during a dangerous and stressful situation.

When development is complete, the robot could autonomously find people and guide them to safety, and then return to look for stragglers. If it senses an unconscious person, it would summon help wirelessly and guide human or even robotic rescuers to that person’s location.

Photo - Eric Johnson
Eric Johnson, associate professor in the School of Aerospace Engineering, holds a 1.1-pound quadrotor designed and built by his students that can replicate the autonomous navigational and sensing performance of the 200-pound commercial helicopter behind him.

Controlling Robot Swarms

Magnus Egerstedt, Schlumberger Professor in the School of Electrical and Computer Engineering (ECE), is focused on cutting-edge methods for controlling air and ground robotic vehicles. He’s investigating how to make large numbers of autonomous robots function together effectively, and how to enable them to work harmoniously — and safely — with people.

In one project for the Air Force Office of Scientific Research, Egerstedt is investigating the best methods for enabling autonomous robots to differentiate among interactions with other robots and humans. Another important issue: getting robots to organize themselves so they’re easier for a person to control.

“Think about standing in a swarm of a million mosquito-sized robots and getting them to go somewhere or to execute a particular task,” he said. “It’s clear that somehow I need to affect the entire flow, rather than trying to manipulate individuals. My research shows that the use of human gestures is effective for this, in ways that can resemble how a conductor guides an orchestra.”

Egerstedt is using graph theory — mathematical structures used to model relations between objects — and systems theory to design multi-agent networks that respond to human prompts or similar control measures. He’s also concerned with preventing malicious takeover of the robotic swarm. To address this issue, he’s programming individual robots to recognize and reject any command that could lead to unacceptable swarm behavior.
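The flavor of these graph-based designs can be seen in the standard linear consensus protocol, in which each agent repeatedly nudges its state toward the states of its graph neighbors. This is a textbook sketch of the kind of multi-agent dynamics involved, not Egerstedt's own code:

```python
def consensus_step(values, neighbors, gain=0.1):
    """One step of the standard linear consensus protocol: each agent
    nudges its value toward those of its graph neighbors. neighbors[i]
    lists the agents that agent i can sense or communicate with."""
    return [v + gain * sum(values[j] - v for j in neighbors[i])
            for i, v in enumerate(values)]

# Three agents on a line graph (0 -- 1 -- 2) start far apart...
vals = [0.0, 1.0, 5.0]
nbrs = {0: [1], 1: [0, 2], 2: [1]}
for _ in range(200):
    vals = consensus_step(vals, nbrs)
# ...and converge toward the average of their initial values (2.0).
```

A human prompt, such as a gesture mapped to an external input term, can then steer the whole flow rather than any individual robot.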

One of Egerstedt’s major goals involves establishing an open access, multi-robot test bed at Georgia Tech where U.S. roboticists could run safety and security experiments on any autonomous system.

photo - quadcopter
A research quadrotor built by Associate Professor Eric Johnson and graduate students Dmitry Bershadsky and Stephen Haviland is being used at the School of Aerospace Engineering to study obstacle avoidance and mapping using vision sensors. Here it maneuvers during a flight test at the Fort Benning Army base in Georgia.
 

Developing Aerial Autonomy

At the Daniel Guggenheim School of Aerospace Engineering (AE), faculty-student teams are involved in a wide range of projects involving autonomous aerial vehicles.

Eric Johnson, who is Lockheed Martin Associate Professor of Avionics Integration, pursues ongoing research efforts in fields from collaborative mapping to autonomous load handling. Sponsors include Sikorsky Aircraft, the Defense Advanced Research Projects Agency (DARPA), National Aeronautics and Space Administration (NASA), and the National Institute of Standards and Technology (NIST).

“An early major research effort in aerial robotics was the DARPA Software Enabled Control Program, which used autonomy to support vertical takeoff and landing of unmanned aircraft,” Johnson said. “Many of our subsequent projects have been built on the foundation started at that time.”

Johnson and his student team are pursuing multiple projects in areas that include:

  • Vision-aided inertial navigation — This technology uses cameras, accelerometers, and gyroscopes to allow an autonomous aerial vehicle to navigate in conditions where GPS information isn’t available. Such situations include navigating inside buildings, flying between buildings, and operating when satellite signals are jammed or spoofed — falsified — by hostile forces.
  • Sling load control — Delivering large loads suspended underneath an aircraft is tricky even for manned rotorcraft; autonomous vehicles can encounter major stability issues during such missions. Johnson and his team are tackling these load control challenges, and are even working on a project that involves an autonomous aircraft delivering a load to a vehicle that’s also moving.
  • Collaborative autonomy — Johnson and his team recently demonstrated a collaborative mapping application that lets two autonomous rotorcraft use onboard laser scanners to cooperatively map an urban area. In a flight experiment at Fort Benning, Georgia, the two aircraft succeeded in not only sharing the mapping chores, but were also able to warn each other of hostile threats nearby.
  • Fault tolerance control — These techniques allow aircraft to autonomously recover from major unanticipated failures and continue flying. In one demonstration, Johnson and his team enabled an unmanned aircraft to remain aloft after losing half of a wing.
  • Infrastructure inspection — Working with the Georgia Department of Transportation, Johnson and Javier Irizarry, associate professor in the School of Building Construction, are studying ways in which the state of Georgia might use unmanned vehicles to perform routine functions such as inspecting bridges and monitoring regulatory compliance in construction.
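The vision-aided inertial idea in the first bullet can be illustrated with a bare-bones complementary blend, in which a fast but drifting inertial estimate is pulled toward a drift-free camera-derived fix. This is a toy sketch with hypothetical names; fielded systems typically use more sophisticated Kalman-style estimators:

```python
def fuse(inertial_xy, vision_xy, alpha=0.98):
    """Blend a fast but slowly drifting inertial position estimate
    with a noisier, drift-free camera-derived fix. A high alpha
    trusts the inertial sensors between vision updates."""
    return [alpha * i + (1 - alpha) * v
            for i, v in zip(inertial_xy, vision_xy)]

# A drifted inertial estimate is nudged toward a camera landmark fix.
state = fuse([10.4, 5.1], [10.0, 5.0])
```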

Meanwhile, AE student teams have placed strongly in recent competitions involving autonomous aerial vehicles, noted Daniel Schrage, an AE professor who is also involved in autonomy research. They have secured first place finishes in:

  • The American Helicopter Society International 3rd Annual Micro Air Vehicle Student Challenge. Directed by Johnson, the Georgia Tech team reproduced the functionality of a 200-pound helicopter in a tiny 1.1-pound autonomous rotorcraft, which accurately completed the competition’s required tasks.
  • The American Helicopter Society International 32nd Student Design Competition. Along with a team from Middle East Technical University (METU), the Georgia Tech undergraduate team developed an autonomous rotorcraft capable of delivering 5,000 packages a day within a 50-square-mile area.
     

Bio-Inspired Autonomy

Certain self-guiding aerial robots could someday resemble insects in multiple ways. Professor Frank Dellaert of the School of Interactive Computing and his team are taking inspiration from the world of flying creatures as they develop aerial robots.

The research involves mapping large areas using two or more small unmanned rotorcraft. These autonomous aircraft can collaboratively build a complete map of a location, despite taking off from widely separated locations and scanning different parts of the area.

This effort includes Dellaert’s team and a team directed by Nathan Michael, an assistant research professor at Carnegie Mellon University. The work is part of Micro Autonomous Systems and Technology (MAST), an extensive Army Research Laboratory program in which Georgia Tech and Carnegie Mellon are participants.

“This work has a number of bio-inspired aspects,” Dellaert said. “It supports the development of future autonomous aerial vehicles that would have size and certain capabilities similar to insects and could perform complex reconnaissance missions.”

Called Distributed Real-Time Cooperative Localization and Mapping, the project’s technology uses a planar laser rangefinder that sweeps a laser beam through almost 180 degrees, instantly providing a 2-D map of the area closest to the vehicle. As each quadrotor flies, it collaborates by radio with its partner vehicles to build an image of an entire location, even exchanging the laser scans themselves to cooperatively develop a comprehensive map.

This work can be performed outdoors — and indoors where receiving GPS signals is a problem. Using built-in navigational capability, the collaborating vehicles can maneuver autonomously with respect to each other, while also compensating for the differences between their takeoff points so they can create an accurate map. This performance was made possible by novel algorithms developed by the two teams and by Vadim Indelman, an assistant professor at Technion - Israel Institute of Technology.
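Compensating for different takeoff points amounts to expressing each vehicle's scan in a shared map frame via a rigid transform. A minimal sketch, with hypothetical names rather than the project's actual algorithms:

```python
import math

def to_global(scan_xy, origin, heading):
    """Express one vehicle's scan points (given in its own takeoff
    frame) in a shared map frame, using the estimated position and
    heading of its takeoff point. Compensating for these offsets is
    what lets separately launched vehicles fuse their maps."""
    ox, oy = origin
    c, s = math.cos(heading), math.sin(heading)
    # Rotate each point by the takeoff heading, then translate it
    # by the takeoff position.
    return [(ox + c * x - s * y, oy + s * x + c * y) for x, y in scan_xy]

# Vehicle B took off 10 m east of vehicle A, rotated 90 degrees, so a
# point at (1, 0) in B's frame lands at (10, 1) in the shared frame.
merged = to_global([(1.0, 0.0)], origin=(10.0, 0.0), heading=math.pi / 2)
```

In practice the offsets themselves must be estimated, which is where the teams' localization algorithms come in.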

END

Rick Robinson is a science and technology writer in Georgia Tech’s Institute Communications. He has been writing about defense, electronics, and other technology for more than 20 years.

Research projects highlighted in this article are supported by sponsors that include the National Science Foundation (NSF), Office of Naval Research (ONR), U.S. Navy (USN), Air Force Office of Scientific Research (AFOSR), Army Research Laboratory (ARL), Defense Advanced Research Projects Agency (DARPA), National Aeronautics and Space Administration (NASA), National Institute of Standards and Technology (NIST), Georgia Department of Transportation (GDOT), and Sikorsky Aircraft. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the principal investigators and do not necessarily reflect the views of the NSF, ONR, USN, AFOSR, ARL, DARPA, NASA, NIST, GDOT, or Sikorsky.

On the Air

Collaboration with CNN Investigates Use of UAVs for Newsgathering

By T.J. Becker

In June 2014, the Georgia Tech Research Institute (GTRI) and CNN launched a joint research initiative to study the use of unmanned aerial vehicles (UAVs) for newsgathering. In January 2015, CNN signed an agreement with the Federal Aviation Administration (FAA) to share the results of the research. The project is now gaining momentum as researchers shift their focus from evaluating UAV equipment to developing potential protocols for safe operations.

The issue: Hobbyists can fly drones without FAA oversight as long as the aircraft weighs 55 pounds or less, flies in unpopulated areas, and remains within the operator’s line of sight. Yet flying drones for commercial purposes requires review and approval by the FAA. The only ways to get a thumbs-up from the FAA are to pursue airworthiness certification (an expensive and complicated process that can take up to a year) or to secure a “Section 333 exemption.”

A Section 333 exemption allows the FAA to waive the airworthiness requirement as long as the commercial UAV flights are conducted under a number of restrictions. Among these restrictions: Drone operators must notify local aviation authorities two or three days prior to flight — and operations over people or near airports are off-limits.

“Securing a 333 exemption is doable for the movie industry since obtaining aerial footage can be planned far in advance,” observed Mike Heiges, a GTRI principal research engineer who leads the CNN project. “Yet journalists can’t operate under these rules for breaking news and chaotic situations where there may be emergency responders, police helicopters, or the National Guard.”

Granted, drones aren’t needed for every news story, but they provide a unique perspective in many situations, said Greg Agvent, senior director of news operations for CNN/US.

“Being able to fly over an area after an earthquake or tornado hits would provide a deeper understanding of how widespread the devastation is,” Agvent explained, pointing to the May 12 Amtrak train derailment in Philadelphia. “Part of the issue with the accident was the speed going into the curve. The ability to get footage from 200 feet in the air would have presented a better sense of the curve — context that you simply couldn’t get from the ground.”

Safety of news personnel is another benefit of drone journalism, Agvent added. “In many cases, such as a flood, safety would trump context. We could capture footage of an event without putting our people in harm’s way.”

Some of the research that comes out of the project will be helpful beyond newsgathering, observed Dave Price, a GTRI senior research technologist working on the project. “Commercial drones are of interest for crop monitoring and inspection of bridges and railroad tracks,” he explained. “Railroads and agriculture agencies will be able to see the results of CNN’s camera selection and stabilization systems and take advantage of this for their own applications.”

Photo - Greg Agvent and Cliff Eckert standing in CNN control room

The Georgia Tech Research Institute (GTRI) and CNN have been working together to study the issues affecting the use of unmanned aerial vehicles for newsgathering. Shown in CNN’s World Headquarters are (left) Greg Agvent, senior director of news operations for CNN, and Cliff Eckert, a GTRI senior research associate who’s working on the project. They are shown with an AirRobot AR 180, one of the devices that may be suitable for CNN’s use.
 

The Right Stuff

During the past year, the researchers, including GTRI and CNN staff, have been investigating different UAVs that could carry the type of camera systems journalists need to shoot and transmit aerial footage.

That’s easier said than done. For one thing, the commercial drone industry is in its infancy: Manufacturers come and go, and few have long track records. Another challenge is finding the right equipment — airframes and payloads that match up. “It’s a trade-off,” Heiges explained. “You have to factor in size, weight, and power of what you want to put on the aircraft with what the aircraft can carry.”

Flight times for many commercial drones aren’t long enough for CNN’s purposes, nor is video quality high enough. “To install a better camera, you need a bigger vehicle for endurance,” Heiges said. “And that means stepping up to UAVs that were developed for the military, which dramatically increases price.”

GTRI has been testing drones since 2006 through the FAA’s certificate of authorization process, which enables public institutions to operate drones in national airspace for research purposes. Currently, GTRI holds 28 certificates of authorization for specific locations in five states. For the project with CNN, GTRI provides pilots to fly the drones in approved areas, plans the flight tests with CNN’s participation, collects data, and prepares reports with recommendations.

One of CNN’s takeaways from the flight tests: Drone journalism is no one-person show. “In most cases, especially for live video, you need three people,” Agvent said. This includes a pilot to guide the actions of the UAV and an operator for the camera, which is usually suspended under the drone and sits on gimbals for stabilization.

“The third person, a spotter, is particularly important in urban areas,” Agvent continued. “The spotter focuses solely on situational awareness and communicates to the pilot about people and other aircraft that may be in the area. In some cases, you could get by with a two-person team — a pilot/cameraman and a spotter — but a trio is best to ensure both high quality and safety.”
 

Advancing to Operational Protocols

“We’ve hit a lot of milestones in the past year,” Agvent said. “Now, we begin to work on the finer points of flight operations and coordinating with air traffic control.”

One of the FAA’s chief concerns with drones is getting the word out to manned aircraft about a UAV’s presence in the area. The current practice is to file a “notice to airmen” two or three days in advance.

A new technology known as automatic dependent surveillance-broadcast (ADS-B) could provide a just-in-time alternative to the notice to airmen. Developed by the FAA, this technology enables aircraft to broadcast their GPS coordinates to anyone in the local airspace who has ADS-B equipment, and vice versa, so the drone operator can see other aircraft.

“It’s like having an air traffic radar map inside your cockpit,” Heiges said. “Even better, unlike conventional radar, ADS-B works all the way to the ground.” That’s important because, in some situations, journalists may need to cooperate with police helicopters or medical aircraft flying at low altitudes to pick up patients.

Geo-fencing technologies, which prevent UAVs from entering airports and other restricted areas, could add another layer of safety, Heiges added.
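At its simplest, a circular geo-fence reduces to a great-circle distance test between the drone's GPS fix and each restricted point. The sketch below is illustrative only, with made-up coordinates and radius rather than any operational rule set:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes
    given as (latitude, longitude) in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(drone_fix, center, radius_m):
    """True if the drone's GPS fix falls within a circular no-fly
    zone of radius radius_m around center."""
    return haversine_m(*drone_fix, *center) <= radius_m

# A fix at the zone center is inside a 5 km fence; a fix one degree of
# latitude away (roughly 111 km) is well outside it.
```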

Because FAA rules prohibit drones from flying over people, crowd-control issues must also be resolved. For example, are journalists responsible for blocking off the area where they wish to fly drones — or do they communicate with on-scene commanders to find out where they can operate?

Over the next few months, GTRI and CNN will meet with regional emergency responders and other stakeholders to address these questions and develop an operational framework. Then GTRI will work with law enforcement agencies to test the procedures at remote locations. “We’ll hold mock trials and simulate circumstances that would happen in a breaking news situation,” Heiges explained.

Creating appropriate regulations for various types of UAV flights is important, as the flight landscape has changed dramatically in recent years.

“When people built radio-controlled airplanes out of balsa wood, they learned the rules for flying and flew aircraft at sanctioned sites,” Heiges said. “But in the past few years, multi-rotors and quad-rotors with automatic stabilization have arrived that don’t require the same skills. People are flying them out of the box without knowing the rules. That can be dangerous, especially when the aircraft are flown beyond visual range. Any significant accident will set back the industry, punishing those who do follow the rules.”

Even a small drone could bring down a helicopter or airplane if it were caught in a rotor or pulled into an engine. Indeed, drones made news this past summer for interfering with firefighting efforts in California, including a San Bernardino wildfire where drones operated by curious hobbyists forced firefighting pilots to pull out of the fray for 30 minutes, allowing the fire to spread.

“The one thing that doesn’t get talked about enough is the differentiation between hobbyists and commercial drone users — and that most of the problems are caused by laymen,” said Agvent. “Our goal is to create a framework that allows for safe integration of commercial drones for newsgathering. It’s about having trusted vendors, trusted aircraft, and trusted procedures in place to act in a safe manner.”

END

T.J. Becker is a freelance writer based in Michigan. She writes about business and technology issues.
