
VOLUME 2 NO. 4, WINTER 2012

AUVSI, 2700 South Quincy Street, Suite 400, Arlington, VA 22206, USA

Sensors show the way

Inside this issue:

MEMS go unmanned
Localizing with lidar
Talking to robots


Mission CritiCal

Winter 2012

Not the Same Old Briefs


DARPA's Robotics Challenge
Commercial Applications for UGVs
Precision Agriculture
ACC Perspectives on UAS Ops
UMVs in Offshore Oil & Gas
UMVs & COLREGS

REGISTER NOW: auvsi.org/uspr

12-14 Feb. 2013

THE RITZ-CARLTON, TYSONS CORNER
McLEAN, Va., USA

"AUVSI's Unmanned Systems Program Review continues to be an invaluable forum for understanding the nuances of the defense and commercial autonomous robotic market."


Rob Hughes, Rockwell Collins

Speaker Lineup Includes


Maj. Gen. Charles Lyon, Director of Operations, HQ ACC, U.S. Air Force
Mr. Steve Markofski, Corporate Planning, Yamaha Motor Corp., U.S.A.
Dr. Missy Cummings, Program Officer, AACUS, ONR, U.S. Navy
Dr. Robert Ambrose, Division Chief, Software, Robotics and Simulation, NASA
Dr. Karlin Toner, Director, JPDO, FAA
Dr. Kathryn D. Sullivan, Deputy Administrator and Acting Chief Scientist, NOAA

3 DAYS | 3 DOMAINS | ALL SYSTEMS

Ground Day: Tuesday, 12 Feb.
Air Day: Wednesday, 13 Feb.
Maritime Day: Thursday, 14 Feb.

... WHERE BUSINESS HAPPENS

CONTENTS
VOLUME 2 NO. 4, WINTER 2012

Tiny and everywhere


A look at the unmanned MEMS movement

Page 8

Essential components
News from the sensors market

14 Q&A
John Marion, director of persistent surveillance at Logos Technologies

22 State of the art
A look at the security cameras watching cities around the world

25 Pop culture corner
Sensor ideas imagined by Star Trek that became reality

On the cover:
How a self-driving car sees the world. The circles re-create how a lidar would perform a 360-degree scan of the surroundings. The boxes are objects and the green path is a map of the road ahead. For more on lidar technology, see the feature on Page 16. AUVSI image.


26 Timeline
The sensors paving the way for self-driving cars

35 Market report
Pivot to Asia drives new sensors

Page 16
Lost in space?
How lidar ensures robots know more about their surroundings

39 Testing, Testing
Mesh networking: robots setting up communications

41 Technology gap
ADS-B tests may help expedite UAS flights in public airspace

43 End users
IHMC's tongue sensor fills in for human sight

Page 29
Talking to robots
Researchers look for novel ways to communicate with unmanned systems.

Mission Critical is published four times a year as an official publication of the Association for Unmanned Vehicle Systems International. Contents of the articles are the sole opinions of the authors and do not necessarily express the policies or opinions of the publisher, editor, AUVSI or any entity of the U.S. government. Materials may not be reproduced without written permission. All advertising will be subject to the publisher's approval, and advertisers will agree to indemnify and relieve the publisher of loss or claims resulting from advertising contents. Annual subscription and back issue/reprint requests may be addressed to AUVSI.


Editor's message
Editorial
Vice President of Communications and Publications, Editor: Brett Davis, davis@auvsi.org
Managing Editor: Danielle Lucey, lucey@auvsi.org

Contributing Writers: Ramon Lopez, David Rockwell
Advertising: Senior Advertising and Marketing Manager Lisa Fick, fick@auvsi.org, +1 571 255 7779

Sensors don't always get the credit they deserve. Without them, robots and unmanned systems would mostly be really expensive toys, incapable of detecting and moving about their surroundings. Sensors of various types enable all the smart and sophisticated motions and learning that will one day make robots as sophisticated as their human creators. In an effort to shine the spotlight on this often overlooked sector of robotics, AUVSI dedicated this entire issue of Mission Critical to the topic.

Freelance writer, and former AUVSI editor, Ramon Lopez tackled how MEMS, or microelectromechanical, sensors are making their way into a multitude of smarter projects. The sensors themselves are notable because of their tiny size, producing astonishingly small products, like photovoltaic cells for collecting solar energy that are the size of a fleck of glitter. He also explores how the company Xsens is proliferating these micro-sensors into unmanned technology. That story is on Page 8.

I spoke with AUVSI member company Velodyne about its lidar, which aids robots like the Google self-driving cars by helping them detect the many moving objects in their surroundings. The company got its roots in the DARPA Grand Challenges, and now its product is featured on an endless list of large military ground vehicles. Leveraging this laser-based, radar-like technology enables object detection within a centimeter of accuracy, and it could one day be featured on every car on the road. Read more about that on Page 16.

Brett Davis, editor of Mission Critical and Unmanned Systems magazine, tackles robotic communication, which leverages many more senses than a simple satellite transmission. Computer giant IBM aims, within five years, to be able to relay textures, the quietest of sounds and even smell and taste over computers and wireless networks. This technology could smell disease before a person even thinks to visit a doctor or hear a mudslide days before it actually occurs. Look for that story on Page 29.

In addition to those features, we have many more departments that encompass other aspects of sensing, like the possibly ubiquitous Automatic Dependent Surveillance-Broadcast system, a sensor that can be placed on blind people's tongues to relay visual information and a well-rounded market report written by Teal Group's David Rockwell that explores where future hot spots in sensors will be. We hope you enjoy it!

Danielle Lucey
Mission CritiCal

A publication of

President and CEO Michael Toscano Executive Vice President Gretchen West AUVSI Headquarters 2700 South Quincy Street, Suite 400 Arlington, VA 22206 USA +1 703 845 9671 info@auvsi.org www.auvsi.org


Essential Components

When Penguin B made its record-breaking endurance flight, a custom Gill Sensors fuel detector played a pivotal role. Photo courtesy UAV Factory.

Longest UAS flight aided by sensor


Although the company UAV Factory broke the record for the longest recorded flight by a small UAS, its supplier Gill Sensors played a part in ensuring the platform was able to make its historic flight. Gill Sensors developed a fuel level sensor that let the Penguin B UAS accurately monitor the fuel left in its 7.5-liter tank, helping it stay in the air for 54.5 hours. The task was challenging, since the fuel tank had an irregular shape, and space was extremely limited, so the company could not mount the sensor through the top of the tank, as is custom. Engineers at Gill created a unique sensor that could instead be mounted to the side of the tank's wall. The sensor used an angled probe to take measurements of the tank depth. "Key to the excellent performance and suitability of the Gill fuel sensor for this aviation application is the use of new microelectronics that offers a 50 percent space saving compared to standard electronics," said the company in a press release.
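Because the tank was irregularly shaped, a raw reading along the angled probe does not map linearly to liters remaining. The article doesn't describe Gill's conversion method; the sketch below shows the kind of calibration lookup such a system might rely on, with all calibration points invented for illustration.

import numpy as np

# Hypothetical calibration table for an irregular 7.5-liter tank:
# readings along the angled probe (mm) vs. bench-measured fuel volume (liters).
PROBE_MM = np.array([0, 40, 80, 120, 160, 200])
VOLUME_L = np.array([0.0, 0.9, 2.1, 3.8, 5.9, 7.5])

def fuel_remaining_l(probe_mm: float) -> float:
    """Convert an angled-probe reading to liters by interpolating the table."""
    return float(np.interp(probe_mm, PROBE_MM, VOLUME_L))

print(f"{fuel_remaining_l(100):.2f} L")  # 2.95 L for this invented table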

Gill also had to cut down on the weight of the sensor, slicing it to 60 grams. "We were delighted when we were told about this fantastic achievement by UAV Factory," says Mike Rees, head of marketing at Gill Sensors. "Our design engineers relished the challenge when we first met UAV Factory at AUVSI's Unmanned Systems [North America] conference in Washington in 2011, and were able to utilize the proven microelectronic level sensor technology that is currently supplied by Gill into other specialist applications, such as Formula 1 race cars and military land vehicles. It was a great achievement for UAV Factory, and we are proud to be a part of it."


Infrared sensor to fill Japan security gap


Japan is developing an unmanned aircraft outfitted with an infrared sensor after its existing sensor suite failed to pick up an attempted satellite launch by North Korea in April, according to an IHS Jane's report. Japan's Ministry of Defense wants 3 billion yen ($37.6 million) for a replacement of the ballistic missile defense system, slotted for fiscal year 2013. The existing system consists of land-based early warning radars and destroyers outfitted with Aegis, which have RIM-161 Standard Missile 3 systems attached. Japan is interpreting its failure to locate the North Korean launch, despite the fact that the satellite never got very high before crashing, as a security gap. The UAV would have a long-endurance capability, according to a November budget request, operating over the Sea of Japan at around 44,000 feet. Japan would like a prototype of the unmanned system by 2014, with initial deployment not scheduled until 2020 or 2021.

Scan it or click it: To see a video of CNRS' work, click or scan this barcode with your smartphone.


A thinking man's robot


Researchers at the CNRS-AIST Joint Robotics Laboratory and the CNRS-LIRMM Interactive Digital Human group are working on creating a robot that could be controlled entirely by thought. The interface they are planning will use flashing symbols that will tell the robot how to move and interact with its environment. "Basically we would like to create devices which would allow people to feel embodied, in the body of a humanoid robot," says Abderrahmane Kheddar, professor and director at the robotics lab. "To do so we are trying to develop techniques from Brain Computer Interfaces so that we can read people's thoughts and then try to see how far we can go from interpreting brain wave signals, to transform them into actions to be done by the robot."

The user wears a cap covered in electrodes. That electric brain activity is transferred to a computer. A signal-processing unit takes what the user is thinking and then assigns different frequencies to icons on the screen. The researchers then instruct the robot which task is related to which icon, so it can perform the thought. "Basically what you see is how with one pattern, which is the ability to associate flickering things with actions, we associate actions with objects and then we bring this object to the attention of the user," he says. "Then by focusing their intention, the user is capable of inducing which actions they would like with the robot, and then this is translated."
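The article describes assigning a distinct flicker frequency to each on-screen icon and decoding which one the user is attending to. CNRS' actual pipeline is not public here; a minimal sketch of that frequency-detection step follows, with the flicker rates, sample rate and simulated signal all invented for illustration.

import numpy as np

FS = 256                             # EEG sample rate, Hz (invented)
ICON_FREQS = [6.0, 8.0, 10.0, 12.0]  # one hypothetical flicker rate per icon

def selected_icon(eeg: np.ndarray) -> int:
    """Return the index of the icon whose flicker frequency dominates the EEG."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    power = [spectrum[np.argmin(np.abs(freqs - f))] for f in ICON_FREQS]
    return int(np.argmax(power))

# Simulated 4-second recording of a user attending to the 10 Hz icon.
t = np.arange(4 * FS) / FS
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)
print(selected_icon(eeg))  # prints 2, the index of the 10 Hz icon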

Air Force, Raytheon put sense and avoid to the test


The U.S. Air Force and Raytheon Co. have conducted concept evaluation demonstrations, showing that existing air traffic equipment could be modified to include ground-based sense and avoid to track the presence of UAS.

The pair recently completed the testing near Edwards Air Force Base at Gray Butte Airfield using a moving dynamic protection zone, a collision avoidance alert system. This zone creates a series of alerts sent to the UAS pilot as an object approaches the aircraft, to avoid near midair collisions. They used a sense-and-avoid system based on the Airport Surveillance Radar Model-11 and a repurposed Standard Terminal Automation Replacement System for air traffic control. Using these two items reduces the need for new infrastructure to integrate a sense-and-avoid system. "Our solution provides the Federal Aviation Administration and the Department of Defense with a cost-effective and safe approach to handle the thousands of unmanned aerial systems that'll be flying in our airspace in the next few years," says Joseph Paone, director of Air Traffic Management for Raytheon's Network Centric Systems business. "Our system properly notifies controllers and pilots of intrusions and accurately shows aircraft altitude, which is important in keeping commercial aircraft, unmanned aerial systems and other hazards safely separated." Raytheon says it will continue this testing at other sites around the United States.
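The report doesn't spell out the geometry of the moving dynamic protection zone. Purely as an illustration, a zone check might size an alert radius from the UAS' own speed and flag intruders inside it; every threshold in this sketch is invented.

import math
from dataclasses import dataclass

@dataclass
class Track:
    x_m: float    # east position, meters
    y_m: float    # north position, meters
    vx_ms: float  # east velocity, m/s
    vy_ms: float  # north velocity, m/s

def protection_radius_m(ownship: Track, lookahead_s: float = 30.0,
                        buffer_m: float = 500.0) -> float:
    """Size the zone by how far the UAS can travel in the lookahead window."""
    return math.hypot(ownship.vx_ms, ownship.vy_ms) * lookahead_s + buffer_m

def should_alert(ownship: Track, intruder: Track) -> bool:
    """Alert the pilot when an intruder penetrates the moving zone."""
    dist = math.hypot(intruder.x_m - ownship.x_m, intruder.y_m - ownship.y_m)
    return dist < protection_radius_m(ownship)

uas = Track(0, 0, 30, 0)          # UAS cruising at about 60 knots
other = Track(1200, 300, -50, 0)  # converging traffic
print(should_alert(uas, other))   # True: inside the 1,400 m zone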



HEARBO can hear you now

The Honda Research Institute-Japan has developed a robot that can differentiate between four sounds at once, including voices, and tell where they are coming from. Such a capability could one day lead to robots that are able to respond to various verbal commands. In one experiment with the robot, it took food orders from four people speaking at once and knew which person ordered which dish.

The robot is named HEARBO, for Hearing Robot, and the audio system is named HRI-Japan Audition for Robots with Kyoto University, or HARK. The university is a partner on the team developing the system. "We have the ability to consciously or unconsciously listen to what we want to hear when there is noise around (cocktail party effect), but this is not the case in robots and their systems," HRI-Japan says on its website. "Furthermore, the systems have a severe limitation. In general voice recognition systems, all sounds input are recognized as voices. Therefore, not only human voices but music and sounds from a television set are also recognized as voices."

HARK overcomes that limitation, allowing the robot to recognize human voices as being distinct from other sounds. "By using HARK, we can record and visualize, in real time, who spoke and from where in a room," HRI-Japan says. "We may be able to pick up voices of a specific person in a crowded area, or take minutes of a meeting with information on who spoke what by evolving this technology."
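HARK itself is a full microphone-array framework. As a much simpler illustration of the underlying idea of localizing who spoke from where, here is a two-microphone time-difference-of-arrival sketch; the array spacing, sample rate and test delay are invented.

import numpy as np

FS = 16000              # sample rate in Hz (invented)
MIC_SPACING_M = 0.15    # distance between the two microphones (invented)
SPEED_OF_SOUND = 343.0  # m/s

def direction_of_arrival(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate the bearing of a sound source, in degrees off the array's
    forward axis, from the time delay between two microphone signals."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # positive: left mic hears it later
    tdoa = lag / FS
    # Clamp to the physically possible range before taking the arcsine.
    s = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING_M, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Simulated test: the same noise burst reaches the right mic 3 samples earlier.
sig = np.random.randn(4000)
right = sig
left = np.concatenate([np.zeros(3), sig[:-3]])
print(f"{direction_of_arrival(left, right):.1f} degrees")  # ~25 degrees off-axis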

Integration is the name of the UAS game


Two recent announcements showcase how unmanned systems companies are teaming to integrate new sensors and capabilities onto existing platforms, expanding their capabilities. Insitu of Bingen, Wash., has teamed with Melbourne, Australia-based Sentient to incorporate its Kestrel land and maritime software detection systems into Insitu's unmanned aircraft, including the ScanEagle and Integrator. The Kestrel software is able to automatically detect moving targets on land or on the surface of the water. "Many ScanEagle customers already use Kestrel to provide an automated detection functionality and are very satisfied with the results," says Simon Olsen, Sentient's head of sales and marketing. "This agreement allows customers to benefit from the two technologies working together seamlessly to enhance airborne ISR missions."
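Sentient doesn't describe Kestrel's internals here. Classical frame differencing is the simplest form of automated moving-target detection and stands in for it below purely as an illustration; it is not Kestrel's actual method.

import numpy as np

def moving_target_mask(prev_frame: np.ndarray, frame: np.ndarray,
                       threshold: int = 25) -> np.ndarray:
    """Mark pixels that changed between two co-registered frames as movers."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

# Given two stabilized 8-bit frames, the mask highlights moving vehicles or boats.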

In one demonstration, HEARBO could play rock-paper-scissors by listening to people's voices and determining who won. Image courtesy HRI-Japan.

Scan it or click it: To see and hear HEARBO in action, click or scan this barcode with your smartphone.



Saab integrates Roke altimeter onto Skeldar


Roke Manor Research Ltd. of the United Kingdom has worked with Saab to integrate its miniature radar altimeter into Saab's Skeldar unmanned helicopter, increasing its performance. Roke's MRA Type 2 will be integrated into the Skeldar's landing system to enable it to determine its height above ground, even in misty or dusty conditions.

Saab's Skeldar.

"Roke's MRA will deliver the very high accuracy required in order to be a part of the avionics suite in Skeldar. This will effectively support Skeldar's high autonomy during landing to maximize the safe conclusion of missions," says Jonas Carlsson, senior product manager at Sweden's Saab. "The MRA's compact size and light weight also allows us to free up space on Skeldar and maximize payload."

Roke Manor's miniature radar altimeter, now standard equipment on Saab's Skeldar. Photo courtesy Roke Manor.


Tiny and everywhere: The unmanned MEMS movement

By Ramon Lopez
MEMS devices, tiny machines with moving parts, are everywhere these days, and they have wrought a revolution in shrinking the sensors that operate unmanned systems. An acronym for microelectromechanical systems, the shrunken sensors can be found throughout daily technologies. Arrays of micromirrors, for instance, enabled digital film projectors, and MEMS gyros and accelerometers like those in Nintendo's Wii controller have changed gaming forever. MEMS accelerometers provide orientation for smartphones and image stabilization for digital
cameras. And smartphone speakers incorporate one or more MEMS microphones. MEMS devices monitor air pressure in car tires, and auto GPS devices won't work without their MEMS-based inertial navigation systems. Airbag crash sensors and side-impact airbags are lifesavers because of MEMS accelerometers, as are MEMS-based stability control systems that activate during hydroplaning or skids. MEMS accelerometers control auto parking brakes, and MEMS-based anti-rollover systems are becoming standard fit in automobiles.

Meanwhile, automakers are stepping up efforts to see if a car can monitor driver stress or illness, saving the operator from having an accident. Vehicles with MEMS-based biometric sensors would keep tabs on drivers' pulse and breathing. The steering wheel would sense sweaty palms, a possible prelude to a heart attack or a fainting spell. The driver's vital health signs would be fed into a car's safety system that would take action in an emergency. Cars wouldn't start if a drunk driver got behind the wheel. Already, some autos have steering sensors that detect drowsy drivers.

Devices, such as seatbelt-based respiration sensors, are getting cheaper and smaller through the magic of MEMS. The technology could also lead to self-driving cars that combine artificial intelligence software, a global positioning system and an array of sensors to navigate through traffic. Taxicabs might shuttle fares without a driver; people with medical conditions who are ineligible for a driver's license would get around with a virtual chauffeur. Digital health feedback systems use MEMS sensors the size of a grain of sand to detect medications and record when they were taken. And one day, electro-responsive fibers in sleepwear and soft electronics in pillows will monitor your blood pressure, sleep patterns and stress levels while you slumber.

Researchers in Europe have developed a vest embedded with sensors that measure the wearer's muscle tension and stress level. At the core of the vest is wearable electronics consisting of sensors woven into the fabric that register the electrical excitation of the muscle fibers and thin conducting metallic fibers that pass the signals to an electronic analysis system. Muscle tension changes with stress level; the change is barely perceptible, but the electrodes register it. Conventional electrodes affixed to test subjects' chests themselves induce stress, making clinical test results of very little use, so the smart vest was developed for inconspicuous measuring during stress studies. The vest can also contribute to workplace safety, and sports

coaches could use it to measure whether athletes have reached their performance limits. MC10, a startup U.S. company that makes flexible electronics, recently unveiled a new product: a sports skullcap that measures the contact-sport impacts that could cause severe concussions. The device is thought to incorporate accelerometers wired up with the firm's stretchable electronics. The device can also support research into combat brain trauma. The technology could lead to skin patches that monitor whether the wearer is sufficiently hydrated and other adhesive patches that monitor heartbeat, respiration, temperature and blood oxygenation. The skin patches can wirelessly transmit the



medical data to a smartphone. One day, an inflatable balloon catheter equipped with sensors will snake through the heart to treat cardiac arrhythmias. Surgery to treat strokes, hardened arteries or blockages in the bloodstream may be helped by MEMS-based micromotors small enough to be injected into the human bloodstream. Australian researchers are harnessing piezoelectricity to power microbot motors just a quarter of a millimeter wide. Remote-controlled miniature robots small enough to swim up arteries could save lives by reaching parts of the body, like a stroke-damaged cranial artery, that catheters are unable to reach. With the right sensors attached to the microbot motor, a surgeon's view of a patient's troubled artery can be enhanced, and the ability to work remotely also increases the surgeon's dexterity. Researchers at Louisiana Tech University are taking a different tack regarding piezoelectricity. They have developed a technology that harvests power from small generators embedded in the soles of shoes. It is based on new voltage regulation circuits that efficiently convert a piezoelectric charge into usable voltage for charging batteries or for directly powering electronics. The technology, for example, could power emergency locators for lost hikers or cell phones. Energy harvesting is an attractive way to power MEMS sensors and locator devices such as GPS. However, power-harvesting technologies

often fall short in terms of output, as many of today's applications require higher power levels. The Louisiana Tech breakthrough uses a low-cost polymer transducer that has metalized surfaces for electric contact. Unlike conventional ceramic transducers, the polymer-based generator is soft and robust, matching the properties of regular shoe fillings. The transducer can therefore replace a regular heel on shoes. Scientists at the University of Pennsylvania think along the same lines, having developed a power-generating backpack. The suspended-load backpack converts mechanical energy from walking into electricity. It incorporates a rigid frame pack. Rather than being rigidly attached to the frame, a sack carrying the load is suspended from the frame by vertically oriented springs. It is this vertical movement of the backpack contents that provides the mechanical energy to drive a small generator mounted on the frame. Meanwhile, Sandia National Laboratories scientists have developed tiny glitter-sized photovoltaic cells

that could revolutionize the way solar energy is collected and used. The tiny cells fastened to clothing could turn a person into a walking solar battery charger. The cells are fabricated using MEMS techniques.

Norway's Northern Research Institute has developed an unmanned fixed-wing aircraft, named CryoWing, which can be used for power line inspection, environmental monitoring (land and sea), aerial mapping and meteorological measurements. The CryoWing is well suited for operations in extremely cold weather. Xsens provides the CryoWing's heading and attitude control. Photo courtesy of Xsens.

MEMS goes unmanned


Nowhere has MEMS penetration been more pronounced than in the area of sensors and avionics for unmanned systems. Founded in 2000, Xsens is a privately held company with headquarters in Enschede, Netherlands, and a U.S. subsidiary in Los Angeles. The founders were interested in measuring the performance of athletes, and a company was born with the launch of a measurement unit used for human motion and industrial applications. Clients include Sony Pictures Imageworks, Daimler, Sagem, Siemens, Saab Underwater Systems and Kongsberg Defence & Aerospace. Xsens is a leading innovator in 3-D motion tracking technology and products based upon MEMS inertial sensor technology. Since its inception in 2000, several thousand motion sensors and motion capture solutions have successfully been deployed in areas such as 3-D character animation, rehabilitation and sports science, and robot and camera stabilization. Xsens officials have found new uses for MEMS sensors initially designed for rollover detection and impact detection in cars and MEMS gyroscopes used in smartphones and game controllers. It is a market leader in MEMS inertial measurement units (IMUs), attitude and heading reference systems (AHRS) and inertial navigation systems (INS). The Xsens IMU consists of 3-D gyroscopes, 3-D accelerometers and a 3-D magnetometer. The AHRS adds filtering to that, estimating 3-D orientation based on the IMU sensor data. An INS additionally uses the accelerometers to find velocity and position, using GPS as a reference. Xsens offers an alternative to bulky and heavy fiber optic IMUs and ring-laser gyros, shrinking similar tracking performance into a significantly smaller package. Xsens is able to offer high performance in a package that is tens of times smaller than the traditional IMUs and INS used for sonar and unmanned aircraft, according to company officials.

Marcel van Hak, Xsens product manager for industrial applications, says his product line wouldn't exist if not for MEMS technology. Using MEMS subcomponents allows Xsens to produce IMUs, AHRS and INS that average 2 inches in length, 1.5 inches in width and 1 inch in height. A traditional IMU, for example, snugly fits into a 4-inch cube. He said Xsens uses the same MEMS hardware used by the automotive industry, such as smart seatbelts, but for a different application: stabilization and control of unmanned systems, whether air, maritime or ground vehicles. Xsens also applies the technology for camera systems or platform systems that need to be stabilized. Xsens, says van Hak, provides systems for the smaller unmanned aircraft, weighing between 3 and 300 pounds. The firm is aboard unmanned aerial systems made by Delft Dynamics and Area-I's PTERA (Prototype Technology Evaluation Research Aircraft). He said his equipment is also on several robotic underwater vehicles.

Xsens makes systems that keep telecommunications satellites and roving vehicles, whether trucks or maritime vessels, connected. He said half of the firm's earnings come from that application. The Dutch company's current MTi product portfolio includes the MTi-10 IMU, the MTi-20 VRU (Vertical Reference Unit) and the MTi-30 AHRS. The MTi 100-series includes the MTi-100 IMU, MTi-200 VRU and MTi-300 AHRS.

The MTi-G-700 GPS/INS is the successor of the MTi-G introduced in 2007. Deliveries of the MTi-G-700 GPS/INS started in December 2012. The MTi-100 series can serve as a cost-effective replacement unit for high-grade IMUs, making the end product more economically viable. The MTi-G-700 GPS/INS is now being used to navigate an unnamed European target drone, replacing fiber optic gyros in test aircraft. Xsens established that the unit can cope with very high accelerations during launch and cornering. With similar performance to the fiber optic gyro it replaced, the unit is 15 to 20 percent lower in cost, produces a weight savings and provides more room for payload, says van Hak. He said the MTi-G-700 GPS/INS will work with other target drones and unmanned air systems. "We are searching for additional customers. We are in discussions with three other customers who are actively considering the MTi-G-700 GPS/INS for their target drones."

Xsens MTi units are used for navigation and control on Saab's multipurpose underwater vehicles. Photo courtesy of Xsens.

Area-I's PTERA provides a bridge between wind tunnel testing and manned flight by providing a low-risk, low-cost platform to flight test high-risk technologies. The 200-pound aircraft has a 50-pound payload capacity. The unmanned aircraft operates with an Xsens MTi-G INS. Photo courtesy Xsens.

The MTi OEM is a board-only version of the Xsens MTi. The housing-less MTi OEM is a small and ultra-light (11-gram) AHRS with the same functionality as the regular MTi. Photo courtesy Xsens.
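Xsens' actual filters are proprietary. To illustrate what "adding filtering" means in the IMU-to-AHRS step described above, here is a classic complementary-filter sketch for a single pitch axis; the blend gain and rates are invented.

import math

def complementary_pitch(pitch_prev: float, gyro_rate: float,
                        ax: float, az: float, dt: float,
                        alpha: float = 0.98) -> float:
    """One filter step: integrate the gyro (fast but drifts) and correct it
    with the gravity direction seen by the accelerometer (noisy, drift-free)."""
    gyro_pitch = pitch_prev + gyro_rate * dt
    accel_pitch = math.atan2(ax, az)  # pitch implied by the gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Called at each IMU sample, e.g. at 100 Hz:
# pitch = complementary_pitch(pitch, gyro_y_rad_s, accel_x, accel_z, dt=0.01)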



"We have integrated the Xsens MTi-G AHRS sensor with a range of products designed for installation on land, sea and air platforms, including tactical and rotary wing aircraft," says Paul Wynns, aircraft systems program manager at Argon ST, a wholly owned subsidiary of Boeing. "We value the Xsens MTi product line for its ease of integration, reliability and accuracy, along with its small size and rugged packaging."

Xsens is not alone in supplying MEMS-based sensors to the unmanned systems industry. MicroStrain is a Vermont business specializing in combining microsensors with embedded processors to autonomously track operational usage and to navigate and control unmanned systems. It has the 3DM-GX3-45 GPS/INS for vehicle tracking, camera pointing, antenna pointing, and unmanned aerial and micro vehicle navigation, and the 3DM-GX3-35 AHRS with GPS. MicroStrain also offers the 3DM-GX3-15 IMU and Vertical Gyro. The 3DM-GX3-15 is a miniature IMU that utilizes MEMS sensor technology and combines a triaxial accelerometer and a triaxial gyro to maintain the inertial performance of the original GX3-25. Applications include unmanned vehicle navigation and robotic control. Two other players in the field are De Leon Springs, Fla.-based Sparton, with its AHRS-8 MEMS-based attitude heading reference system, and Dallas-based Tronics, which has introduced a high-performance angular rate sensor (gyrometer) for demanding applications such as platform stabilization. The Tronics product is based on the company's long-standing expertise in high-end inertial sensors using MEMS-on-SOI and high-vacuum wafer-level packaging technologies.

Northrop Grumman supplies the fiber optic, gyrocompassing LCR-100 AHRS for Embraer Legacy 500 and Legacy 450 aircraft. The LCR-100 AHRS provides navigation information regarding the aircraft's position, heading and attitude. Photo courtesy Northrop Grumman.

MEMS technology is also behind the growing use of disposable medical devices and respiratory monitoring. The most common medical pressure sensor is the disposable catheter used to monitor blood pressure. Another type of disposable, low-cost MEMS pressure sensor is in the infusion pump used to introduce fluids, medication and nutrients into a patient's circulatory system. MEMS pressure sensors are used in respiratory monitoring, such as the Continuous Positive Air Pressure device used to treat sleep apnea, and oxygen therapy machines. MEMS devices will proliferate as cheaper manufacturing techniques for the micro machines are developed. Massachusetts Institute of Technology researchers have found a way to manufacture them by stamping them on plastic film, opening up the possibility of coating large areas with tiny sensors. That should significantly reduce their cost, but it also opens up the possibility of large sheets of sensors that could, say, cover the wings of an airplane to gauge their structural integrity. The printed devices are also flexible, so they could be used to make sensors with irregular shapes.

Trends in manufacturing
MEMS have revolutionized every market in which they play, but the trend for the still-nascent mini technology is just beginning. Analysts predict rapid growth for the types of MEMS now in widespread use and in the making. MEMS devices, especially motion sensors like accelerometers, have changed consumer electronics forever and, more recently, have enabled an emerging market for facial recognition, motion-controlled apps, location-based services, augmented reality and pressure-based altimeters.

The Delft Biorobotics Lab's FLAME robot is an active walker that uses the MTi for its stability. Photo courtesy Xsens.


And since the stamping process dispenses with the harsh chemicals and high temperatures ordinarily required for the fabrication of MEMS, it could allow them to incorporate a wider range of materials. Conventional MEMS are built through the same process used to manufacture computer chips, called photolithography: different layers of material are chemically deposited on a substrate, usually a wafer of some semiconducting material, and etched away to form functional patterns. Photolithography requires sophisticated facilities that can cost billions of dollars, so MEMS manufacturing has high initial capital costs. And since a semiconductor wafer is at most 12 inches across, arranging today's MEMS into large arrays

requires cutting them out and bonding them to some other surface. Besides serving as sensors to gauge the structural integrity of aircraft and bridges, sheets of cheap MEMS could also change the physical texture of the surfaces they're applied to, altering the airflow over a plane's wing or modifying the reflective properties of a building's walls or windows. How they did it: The MIT process begins with a grooved sheet of a rubbery plastic, which is coated with the electrically conductive material indium tin oxide. The researchers use what they call a transfer pad to press a thin film of metal against the grooved plastic. Between the metal film and the pad is a layer of organic molecules that weaken the metal's adhesion to the pad. If the

researchers pull the pad away fast enough, the metal remains stuck to the plastic. Once the transfer pad has been ripped away, the metal film is left spanning the grooves in the plastic like a bridge across a series of ravines. Applying a voltage between the indium-tin-oxide coating and the film can cause it to bend downward, into the groove in the plastic: The film becomes an actuator, the moving part in a MEMS device. Varying the voltage would cause the film to vibrate, like the diaphragm of a loudspeaker. Selectively bending different parts of the film would cause them to reflect light in different ways, and dramatically bending the film could turn a smooth surface into a rough one. Similarly, if pressure is applied to the metal film, it will generate an electric signal that the researchers can detect. The film is so thin that it should be able to register the pressure of sound waves.

Australia's EM Solutions was awarded a contract to develop a Mounted Battle Command Ka-band Satcom On-The-Move System by the Australian Defence Force. The system employs an MTi-G AHRS. Photo courtesy Xsens.

Next steps
The researchers are working on better ways to bond the metal films to the plastic substrate, so that they don't have to rely on tearing the transfer pad away quickly to get the film to stick. They're also developing prototypes of some of the applications they envision for the technology.

Ramon Lopez is an aviation, aerospace and defense journalist who previously served as editor-in-chief of Air Safety Week, editor of AUVSI's Unmanned Systems and Washington correspondent for Flight International, Jane's Defence Weekly and International Defense Review.


Q&A with John Marion

John Marion is the director of the persistent surveillance division of Logos Technologies in Fairfax, Va., which offers systems for wide-area surveillance, remote sensing, cybersecurity and other areas.

Q:

What does persistent surveillance bring to the table, both for military and civilian users?


A: While standard full-motion video cameras only have a soda-straw field of view, wide-area persistent surveillance systems can provide video coverage of city-sized areas. They do this at medium resolution, enough to track vehicles and people in real time. On the battlefield, these systems provide overwatch, giving the warfighter greater situational awareness and the user the ability to monitor multiple areas or targets at one time, from one sensor.
Wide-area persistent surveillance systems also give analysts a way of back-tracking events. For example, suppose an IED was found by the side of the road; the sensor operator could use the stored sensor imagery to go back in time to discover when the IED was emplaced. He could then go even further back to find out where the emplacer came from. Finally, he could fast-forward to where the emplacer went after planting the IED. By using clues gleaned from the stored sensor data, we could eventually map out a whole network of individuals, right up to the group's leadership. And that's true both in a military setting, where the target is the Taliban, and in a local police scenario, where the target is an urban drug smuggling operation.

Q:

Has the military's use of these systems changed in the years since they first became available?

John Marion

A: In terms of basic uses, much has stayed the same since the U.S. Army deployed the first wide-area persistent surveillance system, Constant Hawk, on turboprop planes back in 2006. What has improved is how we task assets, use persistent surveillance imagery with other intelligence sources and cross-cue different sensors. In addition, we are now putting a strong emphasis on the automation and efficiency of analysis tools a concept we call intelligent persistent surveillance, or IPS.


Q:

What is the best way to cope with the massive amounts of data such systems can provide?

A: The issue of big data is usually framed the following way: How can we possibly look at all this data? That's the wrong way to think about the problem.
We collect all this data because we don't know when, where or what sort of bad things will happen.

But even as we collect imagery of a city-sized area, we don't intend to look at all the data, only the parts that matter, directed by clues from other sources. We can think of big data from a couple of different angles. For example, there is the storage issue. But as disks get cheaper and denser, this becomes less of a problem. Then there is the data transfer issue. But by using novel compression techniques, we can compress the imagery by 50 times. And we can compress the data by 1,000 times if we just represent the moving targets and don't update the background map.

Q:

Can you describe how intelligent persistent surveillance systems work?

A: Many in the persistent surveillance field tend to focus on the platform, be it fixed wing, rotary wing or lighter than air. Others look at the sensors that go on those


platforms. However, the real challenge with the new persistent surveillance systems is the data analysis. That's why attention should be directed to IPS. IPS tools index data by transactions, geo-temporally tagging the starts and stops of all the targets within a field of view and then storing that information. When geo-temporal tagging is done across various intelligence sources, analysts can quickly search recorded sensor data for targets at a specific location and over a specific time period, efficiently exploiting those intelligence sources. So, as you can see, IPS goes way beyond the platform, the sensor and mere data collection. It gives analysts a means of efficiently extracting the intelligence value from the available data. That means fewer analysts producing better products, much faster.
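Logos doesn't publish its IPS data structures. As a rough illustration of geo-temporal transaction tagging, an index keyed by coarse location cells and time buckets might look like the sketch below; the field names and bucket sizes are invented.

from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Transaction:
    track_id: int
    start_s: float   # when the target started moving
    stop_s: float    # when it stopped
    lat: float
    lon: float

CELL_DEG = 0.01   # coarse geographic cell size (invented)
BUCKET_S = 600.0  # coarse time bucket, 10 minutes (invented)
index = defaultdict(list)

def _key(lat: float, lon: float, t_s: float) -> tuple:
    return (round(lat / CELL_DEG), round(lon / CELL_DEG), int(t_s // BUCKET_S))

def tag(tx: Transaction) -> None:
    """Geo-temporally tag a movement start so it can be found later."""
    index[_key(tx.lat, tx.lon, tx.start_s)].append(tx)

def query(lat: float, lon: float, t_s: float) -> list:
    """Find recorded target starts near a place and time without scanning video."""
    return index[_key(lat, lon, t_s)]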



Q:

Is there any commercial potential for such systems as well?

A: There's definitely a strong domestic market for them. Besides local law enforcement, wide-area persistent surveillance could be used for disaster relief, public event security and environmental missions, like mapping the location of oil slicks in an offshore spill or counting polar bears over a large swath of Arctic wilderness.


Q:

Are aerostats the best platform for such systems, or do they make sense for smaller systems as well?

A: The platform choice really depends on the application. Aerostats are great for surveilling fixed locations, such as the perimeter area of a forward operating base, or FOB, in Afghanistan and urban areas along the U.S.-Mexico border. By contrast, unmanned aircraft are best used when the target location changes frequently or where friendly forces don't control the ground.
This is why we have developed intelligent persistent surveillance systems for both aerostats and unmanned aircraft. Our Kestrel system is mounted on an aerostat located at a FOB. The system collects about 350 megapixels of day/night data per second in the air, while the processing can be performed on the ground with relatively large processing computers. We can do this because we send the imagery data down through a fiber-optic cable in the aerostat tether.

We also developed an IPS system for tactical, fixed-wing unmanned aerial systems. Called LEAPS, it provides ISR to ground forces on the move, so it cannot pump persistent surveillance imagery down a tether to large computers on the ground. Instead, LEAPS performs all the processing, geo-registration, nonuniformity correction, etc., in an 11-pound processor that shares a gimbal with the wide-area sensor.

Q:

You've said that such systems have homeland security applications. Can you describe a couple of them?

A: We have demonstrated both aircraft- and aerostat-based wide-area persistent surveillance along the southern border. With these systems, we can track illegal activity in both rural and urban areas, focusing on illegal border crossings and mapping networks of drug traffickers operating in the urban areas.

This past March, the Department of Homeland Security conducted a weeklong test of an aerostat system in Nogales, Ariz. The demonstration was very successful. The Customs and Border Protection agents found it easy to work with the wide-area persistent surveillance system, and within seven days, they nabbed 100 suspects. Likewise, a couple of years ago, we demonstrated LEAPS on a manned aircraft for more dynamic border security operations.

Q:

Assuming there is commercial potential, how can the issue of privacy best be handled?

A: I think it's good that the UAS industry is thinking about the privacy issue. In the case of persistent surveillance systems used for law enforcement, I would point out that they are like any other police tool, and their use will have to be governed by strict rules and regulations. We already have police helicopters; airborne persistent surveillance systems just stay in the air longer.

Q:

What technological hurdles, if any, remain to be overcome for persistent surveillance?

A: We will continue to improve the sensors, miniaturizing them and expanding beyond black-and-white imagery into the multi- and hyperspectral area. Still, the largest challenges are in IPS as we develop the tools to make sensor analysts faster, more efficient and able to deliver better products. That's the area that needs the most focus.


Lost in space?

How lidar ensures robots know more about their surroundings

By Danielle Lucey

Warning: Objects in your mirror are closer than they appear. And robotics has the answer for bringing that archaic notion into the 21st century. Most drivers might currently use a series of mirrors to determine their surroundings, but for many robots, including the Google car, lidar is proving a better substitute than a quick glance and a prayer. "If you're driving on the street and somebody passes you, you want to know if somebody comes from behind before you start a passing maneuver," says Wolfgang Juchmann, product marketing manager at Velodyne Acoustics' lidar division. "Essentially each time you look in your rearview mirror, you want to look backwards." Velodyne Lidar's sensors provide this capability on a lot of high-profile projects. It makes the sensor of choice for Google's self-driving car program, Oshkosh's TerraMax, Lockheed Martin's Squad Mission

Support System and TORC Robotics' Ground Unmanned Support System, to name a few. They also were tapped by rock band Radiohead to create their Grammy-nominated "House of Cards" music video. The company got its start as a spin-off of the DARPA Grand Challenges, where company founders David and Bruce Hall entered the competitions as Team Digital Audio Drive, or DAD. The brothers had previous robotics experience in competitions such as BattleBots, Robotica and Robot Wars in the beginning of the 2000s. After the first Grand Challenge, the Halls realized all the teams had a sensor gap they could fill. Stereovision was not good enough for the task, so they invented the HDL-64 lidar in 2005 and entered the second Grand Challenge with the sensor, though a steering control board failure ended their run prematurely. By 2006, the company started selling a more compact version of the sensor, the HDL-64E. By then, the teams were gearing up for DARPA's Urban Challenge event. Instead of entering the competition themselves, the brothers sold their device to other competitors. Five out of the six teams that finished used their lidar, including the top two teams.

Scan it or click it: Click or scan this barcode with your smartphone to see Radiohead's "House of Cards" video, which was shot using Velodyne's lidar. The video shows how many robots use the sensor to perceive their environment.


How lidar works


Though the device proved a breakthrough in autonomous sensing technology, lidar is not a new concept. "The lidar itself is a technology that's been around for a long time," says Juchmann. "The laser beam hits an object and the object reflects light back. The time this takes tells us how far away the object is, and the amount of light reflected back gives us an idea about the reflectivity of the object." Lidar works in a similar way to radar, in that it measures the time it takes for a signal to return to its point of origin, though it ditches radio waves for laser beams. Because of the different nature of the two mediums, while radar excels at measuring faraway objects, Velodyne's sweet spot is in the 100-meter radius range, says Juchmann. However, lidar overall has a better angular resolution.

What makes Velodyne's product different from simple lidar technology, explains Juchmann, is that instead of using one laser to determine an object's range, it uses 64. "Instead of just shooting one laser to the wall, we shoot 64 all on top of each other, so if you look at the wall you'll see a [vertical] line of dots," says Juchmann. "This means you can see a wall with a resolution of 64 lines in a vertical field of view of about 26 degrees." Instead of measuring the time-to-distance correlation of this series of dots at the same time, Velodyne measures them one after the other, in a series, to capture the distance data from each point. If you were shooting the lasers toward a flat wall, it would be a fairly easy measurement, says Juchmann, because the laser data would return almost simultaneously. However, if the series of laser points were flashed toward a staircase, it would mark faster returns on the lower-level stairs and longer returns as the steps ascend, giving the user an idea of the varying distances.

The measurement of a single vertical line in space is not very useful though, especially to large cars trying to navigate their environment at fairly high speeds. Velodyne's sensor also spins these 64 points, so there are 64 lines moving through the whole room. The amazing part is the amount of data that is measured in a very short time, he says. A human blink lasts about two-fifths of a second. In that time span, Velodyne's lidar has done a 360-degree scan of its surroundings four times. This 10-times-per-second scan produces 1.3 million data points per second. At this speed, lidar can get within a centimeter's range of accuracy in measuring an object's location. While much older methods, like surveying, can measure an object's position in the smaller, millimeter range, high-definition lidar's speed versus breaking out some tripods is no contest. After the success of the company's HDL-64E, it has also released the HDL-32E, which uses the same concept but uses 32 laser points instead of 64. This is useful for smaller vehicles, because Velodyne's HDL-32E lidar weighs 1 kilogram, versus 15 kilograms for double the laser points. "This is a huge factor when people want to mount their lidar on something lighter," explains Juchmann. It's also less than half the price.
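The arithmetic behind those figures is simple to check. The sketch below uses the round-trip time-of-flight relation and the scan rates quoted in the article; the 667-nanosecond example value is ours.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_round_trip(t_s: float) -> float:
    """Time of flight to distance: the pulse travels out and back."""
    return SPEED_OF_LIGHT * t_s / 2.0

# A return after about 667 nanoseconds puts the object roughly 100 m away,
# the sweet spot quoted above.
print(f"{range_from_round_trip(667e-9):.1f} m")

# The article's data-rate arithmetic: 64 lasers, 10 full revolutions per second,
# 1.3 million points per second overall.
lasers, revs_per_s, points_per_s = 64, 10, 1_300_000
print(f"{points_per_s / (lasers * revs_per_s):.0f} firings per laser per revolution")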

Velodyne's lidar mounted atop Google's self-driving Lexus. Photo courtesy Google.



How the Google car sees the path and obstacles ahead, using lidar integrated with other data and sensors. Photo courtesy Google.

To make all this data useful, companies integrate Velodyne's lidar data with GPS and IMU data to determine how their robots should move. "The vehicle needs to know where exactly it is," says Juchmann. "Typically you have to feed in GPS information so you know where you actually are. With our sensor you can integrate and synchronize GPS information in order to determine not only the range, but also where you are." The IMU compensates for movements and angles that inherently occur when the sensor is moved in real life. The key to all this data, though, is the software each company creates to analyze it all. The Google self-driving car, for instance, integrates this data with its Google Maps product so the robot will know the long-range terrain data and also can detect if, for example, a bicyclist is coming up behind the car as it is about to turn.
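The article doesn't give the math, but the fusion step it describes, placing each lidar return in the world using GPS position and IMU attitude, reduces in 2-D to a rotate-and-translate. A minimal sketch, with all coordinates invented:

import numpy as np

def lidar_point_to_world(p_sensor, yaw_rad, vehicle_xy):
    """Rotate a 2-D lidar return by the IMU heading, then translate it by the
    GPS position, giving the point's location in world coordinates."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ np.asarray(p_sensor) + np.asarray(vehicle_xy)

# An object 20 m dead ahead of a car heading along +x at world position (1000, 500):
print(lidar_point_to_world([20.0, 0.0], 0.0, [1000.0, 500.0]))  # -> [1020. 500.]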

"If you have a robot or a self-driving car that moves around, it's important to see what's around it," says Juchmann. Not all of the technological hurdles of lidar have been overcome, though. Lidar sensors are affected, the same way human eyes are, by low-visibility situations. For instance, the laser beam can detect drops of rain, but if the rain is heavy enough it might view a downpour as an object. Juchmann likens it to watching an antenna TV with some white noise. You still see a picture, but only once in a while do you have the full picture. If the rain becomes really, really heavy, you have more rain than picture. The same is true for fog and snowfall. "If you have a little bit of that it's all fine, and computer algorithms can figure out the once-in-a-while reflection, but if it's heavy snowfall the reflections will outweigh the actual picture," explains Juchmann.
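Juchmann doesn't detail those algorithms. One toy way to "figure out the once-in-a-while reflection" is to keep only ranges that most recent scans agree on, as in this invented sketch:

import numpy as np

def filter_sporadic_returns(recent_ranges_m, spread_m=0.5):
    """Keep a range only when most recent scans agree on it; isolated
    rain or snow hits show up as outliers and are discarded."""
    ranges = np.asarray(recent_ranges_m)
    med = float(np.median(ranges))
    agreeing = np.abs(ranges - med) < spread_m
    return med if agreeing.mean() > 0.5 else None

# Five scans of the same beam; the 3.2 m reading is a raindrop, not a wall.
print(filter_sporadic_returns([40.1, 40.0, 3.2, 40.2, 40.1]))  # -> 40.1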

Other applications
Lidar has a lot of applications outside robotics. Right now, Velodyne is addressing the security and surveillance market, says Juchmann, which could use lidar to monitor military perimeters and border fences. Today, many fences are monitored with cameras, which at best have around 130-degree fields of view. Another big market that uses lidar is mobile mapping. Transportation department contractors put the sensors on manned vehicles and, using cameras and other sensors, give state transportation departments information on the conditions of bridges and roads. The accurate mapping provides an idea of the roadwork and maintenance that needs to be done. AAI Textron uses Velodyne's lidar on its Common Unmanned Surface Vehicle to determine if there are intruders in the immediate vicinity and for collision avoidance.


REACH THE UNMANNED SYSTEMS AND ROBOTICS COMMUNITIES

Capture the attention of the most influential leaders on a daily, weekly and monthly basis. Advertise in Unmanned Systems magazine, unmanned systems:mission critical, Unmanned Systems eBrief or AUVSI.org.

For more information call +1 571 225 7779 or email marketing@auvsi.org

Aside from Google, Juchmann says nearly every single major car manufacturer in the world uses one or two of the company's lidars to test out some of the other sensors that have made their way onto cars in the last 10 years. Auto companies will compare the results of the lidar with their backup warning signals, lane keeping and blind-spot detection to measure their accuracy. Juchmann predicts, however, that the auto industry will be the big boon for lidar once the sensors are adopted on every vehicle. "The next big step is to get integrated into the large-volume products," he says. For this to happen, the cost needs to come down and the sensors have to get smaller. Also, many cars still rely on outdated computing technology that isn't adequate for modern sensor technology anymore. While this isn't a problem for Google, which uses its own computers, in traditional cars these old systems can be "a bottleneck to how we actually use all this data."

And there is one more big hurdle. "People don't want to have that thing on the top [of their car], and that's where the balance between form and function needs to be found," says Juchmann. "The first thing all the car designers say is, 'There's no way in hell this thing is going to be on top of the car.'" No one has come up with an answer for that yet, says Juchmann, but similar problems have been solved in the past. He points to satellite radio, which originally required large antennas. "But at some point somebody made the decision that we're going to have satellite radio inside the car. That's the future we need, and the only way that's physically going to work is to have an antenna on the outside," he says. "Let's come up with a design that doesn't look really bad," so they came up with the shark fin design. The small fin is now on the back of most cars with satellite radio. The best spot for lidar remains at the top of a vehicle, though, so how this final challenge plays out is still a question.

Danielle Lucey is managing editor of Mission Critical.

AAI/Textron's CUSV uses a Velodyne lidar, the small sensor at the very top of the vessel, to image the maritime landscape. Photo courtesy Textron.


Need to Learn More? No Time to Leave Your Desk?

Tune into AUVSI Webinars


AUVSI webinars are a new and fast-growing service that delivers expert speakers and hot topics in unmanned systems to your desktop. Topics over the coming year will include all facets of this dynamic industry.

Members receive a special discount on all webinar content. Listen to the webinars live, or download them to your computer later.

Check our website for the latest schedule: http://www.auvsi.org/publications/auvsiwebinarseries.



State of the art

Smile! Surveillance cameras by city

While UAS are known for their 60,000-foot view of areas of interest around the globe, many surveillance cameras are eyeing the residents of major cities mere feet from street level. While it's difficult to get authoritative numbers, here is a compilation of what the Mission Critical staff could find.

4,468 cameras in Manhattan (2005, New York Civil Liberties Union)
17,000 cameras in Chicago (2005, VinTechnology.com)
4,775 cameras in Washington, D.C. (2008, Los Angeles Times)
13,000 cameras in Mexico City (2011, Los Angeles Times)
378 publicly owned security cameras in Rio de Janeiro (2011, study: "Cameras in Context: A Comparison of the Place of Video Surveillance in Japan and Brazil")
2,000 cameras in Buenos Aires (2011, InfoSurHoy.com)
422,000 cameras in London (2012, CCTV.co.uk)
300 cameras in Paris, with plans to install more than 1,100 more (2012, France 24)
400,000 cameras in Beijing (2011, Beijing Daily)
500,000 cameras in Chongqing, China (2012, VinTechnology.com)
2,200 cameras in Sydney (2009, Daily Telegraph)
184 cameras in Johannesburg's central policing district (2003, book: "Rainbow Tenement: Crime and Policing in Inner Johannesburg")


MAXIMIZE YOUR VISIBILITY

BECOME A CORPORATE MEMBER TODAY!

Discounts on Exhibits, Sponsorships, Advertising and AUVSI Products

Access to Members Only networking, education and Select VIP Events

Listing in Unmanned Systems Magazine and AUVSI's Online Directory

Complimentary Job Listings on AUVSI's Online Career Center

Research reports and knowledge sharing through AUVSI's Online Community

Chapters and Regions around the Globe

Join today at www.auvsi.org/membership

From Star Trek to your house:

Communicators, phasers and other ideas that came true

Pop culture corner


They had some cool stuff in the TV show Star Trek, even in the original show, where the sets were sometimes cardboard and the aliens looked a lot like humans wearing body paint. One memorable piece of equipment was the communicator, a flip-top walkie-talkie that was truly revolutionary in the late 1960s. Back then, when most homes had party-line rotary phones, being able to flip open a little box to talk was miraculous. In the intervening decades, it has become much less so. While the original cell phones of the early 1980s were clunky beasts that barely made phone calls, they have morphed into designs that would make Capt. Kirk quite envious. Well, in most ways: the Star Trek communicators could operate over vast distances and rarely seemed to drop calls. Martin Cooper, who created the first personal cell phone while working at Motorola, has cited the Star Trek communicator as his inspiration. He hated talking on wired devices and envied the freedom he saw on TV, so he helped create it.

Another nifty device was the Tricorder, a doodad about the size of a tape recorder (now an obsolete piece of equipment) that could scan a surrounding area and analyze it. Various versions appeared on the TV show and its offspring, including a medical version that could diagnose illnesses. Alas, this is one area where science has yet to catch up, though not for lack of trying. Various researchers have built something resembling the Tricorder, but if you'd like to try your hand at it, the X Prize Foundation this year kicked off the Tricorder X Prize, a $10 million competition to develop a mobile solution that could diagnose patients better than a panel of board-certified physicians. The prize is a collaboration with Qualcomm Inc., and the team used the son of Star Trek creator Gene Roddenberry to promote it. "It's great to see two amazing organizations bring the technology of Star Trek to life and make the Tricorder a reality for people everywhere," Eugene Wesley "Gene" Roddenberry Jr. said in a press release.

Star Trek also had the very futuristic transporters, which could beam anybody most anywhere. Like the Tricorder, such an invention has also proven to be a bridge too far, although here, too, science is giving it a whirl. In the November issue of AUVSI's Unmanned Systems magazine, writer Dianne Finch reported on the phenomenon of quantum entanglement, where particles, such as photons, can be linked over great distances. If you change the state of one, the other changes to match. While this has given rise to technologies that may be able to use this effect, such as quantum computers, the teleporter remains well out of reach for now.

The field of spooky science has also tackled another Star Trek technology, the Romulan cloaking device, which could render an entire spaceship invisible (Harry Potter has since borrowed the idea on a smaller level). While this, too, has not yet come to pass, the field of metamaterials is taking a look at it, so to speak, by altering the path of light as it moves through special materials. Numerous universities around the world are working on it, some funded by government agencies. Scientists at the University of Texas in Austin recently revealed that they had cloaked a cylinder from the microwave part of the energy spectrum, although, sadly, the scientists could still see it. Eventually, however, such an application could be useful to warplanes, which is essentially what the Romulans used it for. "What we are thinking about is not necessarily cloaking the whole warplane but some hot spots, a part such as the tailplane that you would want to cloak because it reflects most of the energy from microwaves," one of the researchers said in the New Journal of Physics.

Star Trek also had the Phaser, a ray gun that could be set to stun or kill. At the time, the only existing technology was the regular gun, which had only one setting. In 1969, right about the time the original Star Trek series was canceled, a NASA employee named Jack Cover began working on a stun gun that used small tethered darts to disable opponents. In the mid-1970s he had finished his work on the Taser.

Mission CritiCal

Winter 2012

Timeline

Driving factors

Although Google and auto manufacturers have made a lot of inroads into self-driving cars, technologies like lidar and Google Maps rest on the shoulders of a lot of sensor work that's been going on under the hood for decades. Here's a look at some of the formative sensor suites that have enabled more autonomy in our automobiles.

1968, Electronic cruise control: Automotive electronic cruise control was invented in 1968 by an engineer for RCA's Industrial and Automotive Systems Division. One of the two patents filed describes digital memory, where electronics would play a role in controlling a car, an industry first.

1971, Antilock braking system: Though the technology was originally developed for aircraft in 1929, antilock braking systems got their automotive debut in 1971 through a technology called Sure Brake on that year's Chrysler Imperial.

1995, Adaptive cruise control: The Mitsubishi Diamante was the first to use laser-based adaptive cruise control; however, instead of applying the brakes, the car would simply throttle down to a lower speed. Toyota added braking control to its radar-based cruise control system in 2000.

1996, Backup warning signals: In 1996, the National Highway Traffic Safety Administration tested backup warning signals, in which ultrasonic sensors on a rear bumper and audible warnings work together to give drivers a sense of how close an object is to the back of their car. Using these systems, the average driver is able to stop a vehicle from hitting an object in 1.5 seconds, with little difference in response times by age group.

1998, GPS: Though GPS existed long before 1998 as military technology, that year President Bill Clinton signed a law requiring the military to stop scrambling civilian GPS signals, so the general public could benefit from the technology. This move paved the way for in-car navigation devices, which Google's fleet of self-driving cars relies on for mapping.

2001, Lane keeping: Nissan was the first company to offer a lane-keeping system, on the Cima it sold in Japan. The first car with the feature available stateside didn't debut until 2004, and Europe got the technology in 2005.

2003, Parking assist: Lexus and Toyota introduced the world to the Intelligent Parking Assist System, which uses a rear-facing camera to guide a car into a spot and also helps avoid objects. The system has a series of arrows that help the driver tell how the car is aligned in a space. Using these arrows, the driver determines the parameters of the spot and presses Set, allowing the car to park on its own. The system debuted in the United States in 2006.

2004, Advanced front-lighting system: The pan-European research and development network EUREKA worked to develop front-lighting systems, which use sensors to automatically aim a car's headlights directionally. This around-the-corner lighting was actually featured on cars dating back to the late 1920s; however, it was mechanical rather than automated.

2005, Blind spot detection: In 2005, Volvo introduced its Blind Spot Information System, which used a camera-based system to keep an eye on the area alongside and near the rear of its XC70, V70 and S60 models. The system uses warning lights to inform the driver when another vehicle enters this area.

2014, Traffic jam assist: Volvo recently announced that its traffic jam assist feature would be ready by 2014, allowing drivers to keep their hands off the wheel in low-speed, high-congestion situations. The technology will work in traffic flowing at less than 30 mph.

VAULT.AUVSI.ORG
AUVSI's Knowledge Vault: Your Go-To Industry Search Engine

AUVSI's Knowledge Vault is now live. Use one search bar to easily explore all AUVSI proceedings and publications. Did you miss AUVSI Unmanned Systems North America 2012? Visit the Vault to purchase and access proceedings from the conference. COMING SOON: The world's largest database of unmanned vehicles and company information.

Air: 2,400+ platforms, 700+ companies
Ground: 750+ platforms, 250+ companies
Maritime: 700+ platforms, 200+ companies

Researchers look for novel new ways to communicate with unmanned systems
By Brett Davis

Researchers and end users are constantly seeking new ways to communicate with robots and unmanned systems. One goal is to make such interactions as easy and intuitive as interaction with other humans, but that poses tough challenges for engineers and programmers. Research continues, however, on new ways to talk to robots.

Five in five

For the past seven years, IBM has been releasing a list of five technologies its researchers think have the potential to change the way people live and work. While not specific to robotics, most of the 2013 technologies singled out could lead to a revolution in the way people interact with unmanned systems of all kinds.

The first is touch: In the next five years, you'll be able to touch through a phone. "You'll be able to share the texture of a basket woven by a woman in a remote village halfway across the globe," says IBM retail industry expert Robyn Schwartz in a company video. "The device becomes just as intuitive as we understand touch in any other form today."

The second is sight. In five years, IBM posits, computers won't just be able to look at images but will understand them. A computer could, for example, scan photos of skin melanomas taken on patients over time, possibly diagnosing cancer before physical problems result. This could be a boon for the emerging market of medical robotics.

Dmitri Kanevsky, an IBM master inventor who lost his hearing at age three, says in another video that in five years computers will be able to hear what matters, such as monitoring mountainsides in Brazil for audible signs that a mudslide is imminent. "It can hear that a flood is coming," Kanevsky says. "This is an example of how hearing sensors can help to prevent catastrophes."

Another sense coming to computers is smell, according to the IBM researchers. This could lead to sensors in the home that literally can smell disease and then communicate that to a doctor. "Smelling diseases remotely, and then communicating with a doctor, will be one of the techniques which will promise to reduce costs in the healthcare sector," says Hendrik Hamann, a research manager of physical analytics, who adds that your phone might know you have a cold before you do. IBM further predicts that computers will be able to detect how food tastes, helping create healthier diets and even developing unusual pairings of food to help humans eat smarter.

"These five predictions show how cognitive technologies can improve our lives, and they're windows into a much bigger landscape: the coming era of cognitive systems," says Bernard Myerson, IBM's chief innovation officer. As an example, he cites a track-inspecting robot doing its work inside a train tunnel. A current robot could evaluate track but wouldn't understand a train barreling down that same track. "But what if you enabled it to sense things more like humans do, not just vision from the video camera but the ability to detect the rumble of the train and the whoosh of air?" he asks on the IBM website. "And what if you enabled it to draw inferences from the evidence that it observes, hears and feels? That would be one smart computer, a machine that would be able to get out of the way before the train smashed into it." In the era of cognitive systems, he says, humans and machines will collaborate to produce better results, each bringing their own superior skills to the partnership. "The machines will be more rational and analytic. We'll provide the judgment, empathy, moral compass and creativity."

Verbal commands

DARPA has been working for years with the Legged Squad Support System, or LS3, the follow-on to the legendary BigDog robotic mule. In a new video, the defense research agency demonstrated how a ground robot could obey verbal commands, giving it roughly the same capability to follow a soldier as an animal and handler would have. In December, the LS3 was put through its paces, literally, at Virginia's Fort Pickett, where it followed a human soldier and obeyed voice commands.

"This was the first time DARPA and MCWL [the Marine Corps Warfighting Lab] were able to get LS3 out on the testing grounds together to simulate military-relevant training conditions," Lt. Col. Joseph Hitt, DARPA program manager, says in a DARPA press release. "The robot's performance in the field expanded on our expectations, demonstrating, for example, how voice commands and follow-the-leader capability would enhance the robot's ability to interact with warfighters. We were able to put the robot through difficult natural terrain and test its ability to right itself with minimal interaction from humans." In a DARPA video, the LS3 turns itself on after a voice command and then begins following the human leader.

The LS3 program seeks to demonstrate that a highly mobile, semiautonomous legged robot can carry 400 pounds of a squad's equipment, follow squad members through rugged terrain and interact with troops in a natural way, similar to a trained animal with its handler, DARPA says. LS3 is being developed by Boston Dynamics, leading a team that includes Bell Helicopter, AAI Corp., Carnegie Mellon, the Jet Propulsion Laboratory and Woodward HRT. The December testing was the first in a series of demonstrations planned to continue through the first half of 2014, according to DARPA.

The LS3 goes through its paces at Virginia's Fort Pickett. Photo courtesy DARPA.

Social interactions
Interacting with robots in a social manner could become more important in the future, as service robots take on a greater role in everyday life.

An IBM chart showing how computers could understand photographs in the next five years.


Researchers at Carnegie Mellon University have been working on what seems like a simple problem: how to let a robot tell where people are looking. It's a common question in social settings, because the answer identifies something of interest or helps delineate social groupings, the university's Robotics Institute says. The institute developed a method for detecting where people's gazes intersect by using head-mounted cameras. By noting where gazes converged in three-dimensional space, the researchers could determine whether people were listening to a single speaker, interacting as a group or even following the bouncing ball in a ping-pong game, the institute says. The algorithm used for determining social saliency could be used to evaluate various kinds of social cues, including people's facial expressions or body movements.

"This really is just a first step toward analyzing the social signals of people," says Hyun Soo Park, a Ph.D. student in mechanical engineering, who worked on the project with Yaser Sheikh, assistant research professor of robotics, and Eakta Jain of Texas Instruments, who was awarded a Ph.D. in robotics last spring. "In the future, robots will need to interact organically with people, and to do so they must understand their social environment, not just their physical environment," Park said in a university press release. Head-mounted cameras, as worn by soldiers, police officers and search-and-rescue officials, are becoming more common. Even if they don't become ubiquitous, they could still be worn in the future by people who work in cooperative teams with robots.
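The geometry behind that convergence step is simple enough to sketch. What follows is a hypothetical illustration, not the Robotics Institute's code: treat each person's gaze as a 3-D ray leaving a head-mounted camera and solve for the single point that minimizes the total squared distance to all of the rays at once (the positions and directions below are invented).

import numpy as np

def gaze_convergence_point(origins, directions):
    # origins, directions: (N, 3) arrays of head positions and gaze directions.
    # Least squares: find the point closest to every gaze ray simultaneously.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)        # unit gaze direction
        P = np.eye(3) - np.outer(d, d)   # projects onto the plane normal to the ray
        A += P                           # accumulate the normal equations
        b += P @ o
    return np.linalg.solve(A, b)         # singular only if every gaze is parallel

# Three viewers whose gazes all pass through roughly (1, 1, 0):
origins = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [1.0, -1.0, 0.5]])
directions = np.array([[1.0, 1.0, 0.0], [-1.0, 1.0, 0.0], [0.0, 2.0, -0.5]])
print(gaze_convergence_point(origins, directions))   # ~ [1. 1. 0.]

A tight cluster of such convergence points over time is one plausible signal that a group is attending to the same speaker or object.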

Tapping the phones


Ground robots have sometimes been plagued by issues of bandwidth and range. These problems are especially pronounced in urban areas, particularly in modern, multistory buildings, where communications can drop off fast. A research team from the U.S. Army, the University of Washington and Duke University has demonstrated one way to expand the communications bandwidth of ground robots inside buildings: using a building's existing electrical wiring as a super antenna to achieve wireless, non-line-of-sight communications.

The concept is based on the idea of power line networking, or using the bandwidth in electrical connections to send information as well as power. Such applications are already in use for streaming high-definition television and music and even providing high-speed Internet service using existing wall plugs. "The power line's ability to receive wireless signals is a well-known phenomenon, but only recently has it been exploited for in-building communication," says a paper presented by the Army's David Knichel at AUVSI's Unmanned Systems North America 2012.

The downside of current power line systems is that users on both ends of such a connection have to be plugged into a wall, not a viable concept for a moving, stair-climbing robot. A team led by Shwetak Patel of the University of Washington, which included the U.S. Army and Duke University, has developed a concept that takes the power line idea and makes it mobile. According to the paper presented at AUVSI's Unmanned Systems North America 2012, the concept is called Sensor Nodes Utilizing Power line Infrastructure, or SNUPI. SNUPI uses tiny, lightweight sensor nodes containing antennas that couple wirelessly to the power line infrastructure, dramatically boosting their transmission range. A soldier could be on the bottom floor of a building, or even outside it, and use a single base station connected to the system to control and communicate with a robot exploring the upper floors. SNUPI features a low-power microcontroller that can provide coverage for an entire building while consuming less than one milliwatt of power. The initial prototype of the system is just 3.8-by-3.8-by-1.4 centimeters and weighs only 17 grams, including the battery and antenna.

Brett Davis is editor of Mission Critical.


Market Report

Pivot to Asia to drive new sensors

By David L. Rockwell

Retention of the current administration in the U.S. will mean some consistency regarding defense spending. A decade of war has taught U.S. and European services an unforgettable lesson: scout with your unmanned aircraft, not with your soldiers. This applies to no-boots-on-the-ground conflicts such as Libya, where Europe's painful intelligence, surveillance and reconnaissance inadequacies finally inspired NATO's $1.7 billion Alliance Ground Surveillance buy, as well as to grueling attrition battles like Afghanistan, where dominant ISR at all levels, from tactical to strategic, has prevented a bloodbath like Vietnam.

President Barack Obama's pivot to Asia will require new sensor capabilities much more than new striker platforms. Just as in the geographical pivot after the Cold War, the West's new paradigm will not be arming against an adjacent land threat with thousands of tanks and fighters, but monitoring a potential threat with limited power projection capability, requiring ISR rather than bulked-up defensive lines on the Rhine.

Tomorrow's need for improved capability with decreased spending will lead to new UAS sensors, electronics upgrades and funding increases, even while manned shooter fleets shrink and nonsensor upgrades, such as new engines for manned JSTARS aircraft, are put on hold.

Electro-optical/infrared
Teal Group Corp. forecasts substantial growth in UAS EO/IR system funding available to U.S. manufacturers once the pivot is well underway, rising from $754 million in fiscal year 2013 to $1.2 billion in fiscal year 2021, but with a slow decrease in funding over the next few years as current systems and programs wind down. Production has now ramped up for the U.S. Army Gray Eagle, and Teal Group expects continuing orders beyond current plans, but with hundreds of Air Force Predators and Reapers already in service, and Block 30 Global Hawk production likely to end soon even if current air vehicles are not retired, endurance UAS electro-optics spending will shrink in the near term. New technologies like wide field-of-view (WFOV) and hyperspectral imaging systems have a strong future, and development and production of increasingly sophisticated sensors for smaller tactical and mini/micro-UAS will continue, but if there is any segment of the UAS sensor market likely to suffer losses in the near term, the already-ubiquitous gimbaled EO/IR sensor ball is it.

Electro-optical/infrared funding forecast (Teal Group, US$ millions)

                           FY13   FY14   FY15   FY16   FY17   FY18   FY19   FY20   FY21   FY22   Total
Global Hawk                  97     12      8     10     12     16     18     10     14     16     213
BAMS                         10     23     21     25     23     30     28     29     26     28     243
Predator/Warrior            259    280    246    280    310    330    335    345    330    334   3,049
UCAV                         28     40     32     38     52    100    123    120    118    126     777
Tactical                     77     81     81     68     90     90     92     91    101    112     883
Mini/Nano                    80     78     97    112    114    120    130    150    170    168   1,219
Other U.S.                  108    112    130    134    144    170    188    201    220    222   1,629
Available international      95    106    108    100    122    120    156    130    144    154   1,235
Total                       754    732    723    767    867    976  1,070  1,076  1,123  1,160   9,248

With the Air Force already beginning to wonder what it is going to do with all those non-stealthy but not expendable Predator and Reaper orbits once the U.S. leaves Afghanistan, interest is moving to next-generation systems and sensors. In mid-2012, Northrop Grumman cold-called Canada to offer three Block 30 Polar Hawks for Arctic surveillance, but there are few big opportunities out there (except the U.S. Navy). Instead, the vultures are already circling. In mid-2012, General Atomics offered its new extended-range Predator B as an alternative to Global Hawk. The new version adds two fuel pods and a lengthened 27-meter wingspan, allowing a claimed 42-hour maximum endurance at 45,000 feet, versus Global Hawk's 30-36 hours. But nonstealthy UAS at 45,000 feet offer neither the safety nor the discretion of a Global Hawk at 60,000. Instead, the Predator C Avenger offers a much better future for near-peer UAV ISR, especially for a pivot to Asia. In January 2012, General Atomics flew its second Avenger. The Air Force has bought one, to be delivered by the end of 2014, to evaluate its performance characteristics. General Atomics has also considered developing a carrier-borne Avenger, with folding wings and a tail hook, for the Navy's stealthy UCLASS (Unmanned Carrier-Launched Airborne Strike and Surveillance) development program.

Regarding sensors, it is also a whole new ballgame. In mid-2012, the Avenger was in testing with a Goodrich MS-177 multispectral EO targeting system, a follow-on to the SYERS sensor on the U-2. In February 2012, the Air Force acquired one BAE Systems SPIRITT hyperspectral system for the U-2, with more buys likely and transition to Global Hawk or Avenger possible. In 2012 the Army also evolved plans for a wide-area surveillance capability for Gray Eagle, with autonomous scanning for its EO/IR payload. And General Atomics has suggested an internal WFOV sensor for Avenger. But all these programs are big future possibilities with little production planned for the next few years. Instead, the fastest growth will be seen in synthetic aperture radar (SAR) and electronic warfare systems.

NATO expects to spend 2 billion euros over the next two decades to operate its five AGS Global Hawks. Photo courtesy Northrop Grumman.

Synthetic aperture radars


In January 2012, the USAF completed an analysis of alternatives for its next-generation SAR/Ground Moving Target Indicator fleet, calling for a mix of Block 40 Global Hawks with the Multi-Platform Radar Technology Insertion Program (MP-RTIP) radar and a manned, business jet-based ISR aircraft. But the Air Force also decided it did not have the money for a new manned program and would keep JSTARS flying indefinitely. MP-RTIP testing is to continue through 2013, and while fleet numbers are not certain (Teal Group's best guess is 19 for the Air Force), expect MP-RTIP to remain the world's most important SAR for decades.

Synthetic aperture radar funding forecast (Teal Group, US$ millions)

                           FY13   FY14   FY15   FY16   FY17   FY18   FY19   FY20   FY21   FY22   Total
Global Hawk MP-RTIP         135    214    270    367    377    326    367    355    246    148   2,805
BAMS MFAS                    42     72     78     90     84     98    110     98    102    108     882
Lynx/Starlite               160    137    142    148    139    120    110     97     99     92   1,244
Other endurance             100    123    142    143    156    160    194    190    198    210   1,616
UCAV                         26     36     34     40     60     68     88    102    126    128     708
Tactical UAV                 75     81     76     96    108    134    134    144    158    168   1,174
Mini/Micro/Nano-UAV          10     22     28     32     36     44     60     58     66     74     430
Available international      44     50     54     56     58     60     62     75     90     98     647
Total                       592    735    824    972  1,018  1,010  1,125  1,119  1,085  1,026   9,506

In May 2012, NATO finally signed a $1.7 billion contract for five MP-RTIP Global Hawks for AGS. The first air vehicle is to arrive at Sigonella air base in Sicily around 2015, with IOC in 2016. In February 2012, a NATO official also stated that NATO expects to spend 2 billion euros over the next two decades to operate its five Alliance Ground Surveillance (AGS) Global Hawks. Germany may independently buy a few more.

With a worldwide 20-30 air vehicle MP-RTIP fleet now pretty much settled, the biggest SAR wild card may be the Navy's Global Hawk Broad Area Maritime Surveillance (BAMS) program. In mid-2012, the Navy spoke of the successful use of BAMS demonstrators providing maritime surveillance for the Navy's 5th Fleet in the Persian Gulf region. Despite a BAMS-D crash in testing in 2012, the Navy plans to acquire 68 Global Hawks with the marinized Multi-Function Active Sensor SAR/inverse SAR to maintain a standing operational fleet of 22 aircraft in five-aircraft orbits and to allow for attrition and depot maintenance requirements. But if only 22 operational aircraft are needed, Teal Group sees considerable scope for a reduced BAMS buy, especially as the Navy is becoming something of a lame-duck user of non-MP-RTIP Global Hawks (South Korea canceled its planned Global Hawk buy due to cost increases, and Australia is no longer in the BAMS program). With Boeing's 737-based manned P-8A Poseidon (and P-8I for India) ISR maritime patrol aircraft now ramping up a large production run to replace P-3C Orions, Teal Group sees BAMS as a top sequestration target, especially if the Navy wants to buy manned Joint Strike Fighters. As AGS plans and JSTARS history show, upgrade and operations and maintenance funding for a 68-aircraft fleet with a unique radar will be massive. BAMS initial operational capability is currently planned for December 2015, but Teal believes this will be stretched, reduced or simply canceled.

The biggest growth market for UAS SARs in the second half of the forecast period will likely be tactical and smaller UAS. In September 2012, the Army awarded Northrop Grumman a contract option for an additional 44 Starlite SARs for Gray Eagle, bringing the total number of systems under contract to 174, and Starlite is also being downsized to about 45 pounds for the Shadow tactical UAS. Expect a much broader expansion of SARs to small UAS over the next decade, especially as small UAS endurance increases. All-weather radio frequency sensors will offer great benefits compared to EO/IR when opponents can no longer hide behind clouds or smoke.

Overall, Teal Group forecasts the UAS SAR market will grow from $592 million in fiscal year 2013 to $1 billion in FY22, with an 11.3 percent compound annual growth rate from fiscal 2013 to 2018 and 6.3 percent from fiscal 2013 to 2022.
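Those growth rates follow directly from the totals in the table above. A quick sketch of the arithmetic (figures in US$ millions):

def cagr(start, end, years):
    # Compound annual growth rate between two fiscal years.
    return (end / start) ** (1.0 / years) - 1

print(f"FY13-FY18: {cagr(592, 1010, 5):.1%}")   # ~11.3 percent
print(f"FY13-FY22: {cagr(592, 1026, 9):.1%}")   # ~6.3 percent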

SIGINT & EA
Teal Group has forecast UAS signals intelligence (SIGINT) and electronic attack (EA) as the fastest-growing UAS sensor market, but the future of several major programs, Northrop Grumman's Advanced Signals Intelligence Payload (ASIP) and BAE Systems' Tactical SIGINT Payload (TSP), has recently become uncertain. ASIP is tied to the endangered Block 30 Global Hawk, and in early 2012 the Army issued an RFI for TSP production, requesting no more than 95 systems at a mere $955,000 per unit. Quick Reaction Program T-Pod systems were bought from BAE Systems for Gray Eagle UAS for just $12.3 million.

SIGINT and EA funding forecast (Teal Group, US$ millions)

                           FY13   FY14   FY15   FY16   FY17   FY18   FY19   FY20   FY21   FY22   Total
Global Hawk/Predator ASIP    40     72     70     72     80     80     76     88     84     86     748
TSP/T-Pod                    32     34     36     38     36     42     44     40     46     44     392
Other endurance              78     84     80    120    134    148    160    180    212    242   1,438
UCAV + EA                    70     68     80     92    142    156    160    188    200    212   1,368
Other tactical               50     54     66     64     80     88     80     94    102    108     786
Available international      70     84     86    100    114    118    136    140    146    144   1,138
Total                       340    396    418    486    586    632    656    730    790    836   5,870

Teal Group still sees SIGINT sensors (essentially radio frequency ISR) migrating to nearly all types of UAS, but many small systems from urgent non-program-of-record developments are already in service. With coming budget cuts, these may just suffice, with expensive major programs such as ASIP being considerably reduced or eliminated. As examples, BAE Systems has developed the company-funded NanoSIGINT payload for small UAS, and the Office of Naval Research has developed the Software Reprogrammable Payload C4I/SIGINT system for Marine Corps Shadow and other small UAS.

A new market, likely to see much classified funding, is electronic attack systems for stealthy UAS. The Air Force is undoubtedly working on this, while the Navy will lead the nonblack market with its UCLASS or a dedicated carrier-borne EA UCAV. In mid-2012, reports indicated the Navy planned to subject its X-47B UCAS-D to a burst of electromagnetic interference (EMI) of 2,000 volts per meter, about 10 times the level used for most carrier-based aircraft testing. This reportedly indicates plans to develop an EMI-resistant EA platform, perhaps with a high-power microwave weapon intended to damage opposing electronics systems.

But as with SIGINT, smaller programs have already put EA systems into service. The Army's Communications Electronic Attack with Surveillance and Reconnaissance (CEASAR) pod, based on Raytheon's AN/ALQ-227 Communications Countermeasures Set from the EA-18G Growler, has been flying in Afghanistan since 2011 on two Beechcraft King Air manned ISR aircraft. The Army plans to integrate CEASAR on the MQ-1C Gray Eagle in 2013.

All told, Teal Group forecasts the UAS SIGINT and EA market growing from $340 million in fiscal 2013 to $840 million in fiscal 2022, a fairly steady 10 to 13 percent compound annual growth rate throughout.

David L. Rockwell is senior electronics analyst for Teal Group Corp., a provider of aerospace and defense competitive intelligence based in Fairfax, Va. His email address is drockwell@tealgroup.com.

Increasinghumanpotential.org promotes the use of unmanned systems and robotics in the following categories:

By Land, Air and Sea
Jobs and Economy
Enhancing Public Safety
Mitigating and Monitoring Disasters
Helping the Environment
Fostering Education and Learning
Increasing Efficiency in Agriculture
FAA Flight Restrictions

Discover the Endless Benefits of Unmanned Systems



Testing, Testing

Mesh networking: Robots set up networks where there aren't any

Having a wireless network can be key in remote areas that lack infrastructure and in dangerous areas like battle zones; unfortunately, those are some of the very places where such networks aren't likely to be found.

Several companies and universities have been looking into the use of mesh networks, in which robots either deploy network nodes or become part of the network themselves. One such effort was undertaken by students at Northeastern University, who chose a robotic deployment system as part of their capstone design program, a project intended to display their computer and electrical engineering knowledge. The work began in the summer of 2011 and wrapped up in December, with the results published on a website later that month.

The group's idea was for a ground robot to not only deploy network nodes to create a network, but to be controlled over that network at the same time. The students had hoped to use tiny Wi-Fi repeaters but found them too expensive, so they instead resorted to off-the-shelf Linksys units running open-source firmware. The robot was built from scratch and ended up being more than 3 feet long, 2 feet wide and weighing 150 pounds. It proved capable of dropping two Linksys routers encased in Pelican waterproof cases, giving it a total range of one kilometer. And, in the words of one of its inventors, it proved to be "utterly badass."
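The core deployment decision is easy to picture. Here is a minimal sketch of the idea, assuming the robot watches the signal strength back to its last node and drops the next repeater before the link fails; the threshold and readings are invented, and this is not the students' code.

DROP_THRESHOLD_DBM = -80    # assumed threshold: deploy before the link gets this weak
MAX_NODES = 2               # the robot carried two Linksys routers

def should_drop_node(rssi_dbm, nodes_left):
    # Deploy the next repeater when the link back to the network weakens.
    return nodes_left > 0 and rssi_dbm < DROP_THRESHOLD_DBM

# Simulated signal readings as the robot drives away from its base station;
# RSSI recovers after each drop because a fresh node is now nearby.
readings = [-55, -62, -70, -78, -83, -60, -71, -84]
nodes_left = MAX_NODES
for step, rssi in enumerate(readings):
    if should_drop_node(rssi, nodes_left):
        nodes_left -= 1
        print(f"step {step}: RSSI {rssi} dBm, dropping repeater ({nodes_left} left)")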

The Northeastern University node-dropping ground robot. Photo courtesy Northeastern University.

Scan it or click it: To see a video of the students' work, click or scan this barcode with your smartphone.

The Northeastern robot is also described by its inventors as a beast, and indeed it could carry multiple copies of another robot intended to create mesh networks. That would be iRobot's FirstLook 110, the tiny, throwable robot that has made a splash at recent AUVSI Unmanned Systems North America conferences. The 5-pound robot, now being evaluated by the Joint Improvised Explosive Device Defeat Organization, which ordered 100 of them last spring, is partly designed to use mesh networking to allow multiple robots to relay communications over greater distances, the company says.

The U.K.-based Cobham specializes in radios that provide flexible mesh networks, which can be installed on mobile robots, manned vehicles or at fixed locations, and which can shift as the mobile nodes move. The Cobham technology was recently used by the Los Angeles police force to monitor the crosstown progress of the space shuttle Endeavour as it made its way to the California Science Center in October. The network was hooked up to a network of Axis cameras that could leapfrog coverage as the shuttle moved along, according to Fox News.

In the cloud
Some efforts to develop mesh networking rely on small unmanned aircraft instead of ground vehicles. Germany's Project AirShield, demonstrated last spring at Rotterdam Harbor, was intended to show how a swarm of small UAS could share information with each other, and with operators on the ground, to study the content of burning clouds of hazardous smoke. Such measurements can be made on the ground, but in the case of fires, most pollutants are in the air and moving.

"At present the fire brigade personnel are provided with special handheld devices that can only measure the concentration of different pollutants at ground level but are unable to survey and quantify the level of contamination carried in the atmosphere by winds and/or ascending columns of smoke," says a project briefing prepared by small UAS maker Microdrones, one of the partners in the program. Such a measurement is critical to the safety of outlying communities that may be affected by these aerial pollutants.

The demonstration, which involved setting fires in three surface bins on the docks, used a single Microdrones md4-1000 vehicle, but the concept calls for a swarm of such systems using mesh network software to communicate with each other and the ground. The vehicle used in the demonstration employed a tiny Gumstix microcomputer with mesh networking software supplied by Germany's TU Dortmund University, which has sponsored a series of conferences on robotic mesh networking.
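The relaying scheme such a swarm depends on can be sketched generically. This illustrates the basic flooding concept, not AirShield's actual software: each node rebroadcasts any message it has not seen before, so a smoke reading hops from aircraft to aircraft until it reaches the ground, and duplicate suppression keeps the flood from echoing forever (node names and the reading are invented).

class MeshNode:
    def __init__(self, name):
        self.name, self.neighbors, self.seen, self.log = name, [], set(), []

    def link(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)   # radio links are bidirectional

    def receive(self, msg_id, payload, hops=0):
        if msg_id in self.seen:        # already relayed: drop the duplicate
            return
        self.seen.add(msg_id)
        self.log.append((msg_id, hops))
        for n in self.neighbors:       # rebroadcast to everyone in radio range
            n.receive(msg_id, payload, hops + 1)

ground, uav1, uav2 = MeshNode("ground"), MeshNode("uav1"), MeshNode("uav2")
uav1.link(uav2); uav2.link(ground)     # ground is out of uav1's direct range
uav1.receive("smoke-001", {"co_ppm": 40})
print(ground.log)                      # [('smoke-001', 2)]: two hops to reach the ground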

An overview of Project AirShield, funded by Germany's Federal Ministry of Education and Research.

Scan it or click it: To see a video of the demonstration, click or scan this barcode with your smartphone.

The Microdrones md4-1000, a version of which was used in a test of AirShield.

File sharing
Not all of the network-in-the-sky ideas are aimed at generating networks in war zones or for first responders. One recent effort, in fact, was undertaken to allow file sharing. The group Pirate Bay, a Sweden-based site that allows users to swap content (much of it copyrighted, hence the name), announced in 2012 that in the future some of its network could be based on clouds of small UAS. Pirate Bay has its own motives for becoming mobile (14 countries have ordered their Internet providers to block its site, and its founders were found guilty of allowing illegal file sharing), but supporters of the idea say mobile, airborne networks could also be useful in places like Egypt and Syria, where governments moved to shut down Internet access.

Thinking along those same lines, the London-based think tank Tomorrow's Thoughts Today said it has already created a small fleet of Internet-capable UAS, which it dubbed an aerial Napster, and demonstrated it in late 2011 at a festival in the Netherlands. At that event, the UAS hovered over the crowd, which could interact with the aircraft using cell phones. "As we signal the drones they break formation and are called over," the think tank says on its website. "Their bodies illuminate, they flicker and glow to indicate their activity. The swarm becomes a pirate broadcast network, a mobile infrastructure that passersby can interact with. Impromptu, augmented communities form around the glowing flock."

Scan it or click it: To see a video of Tomorrow's Thoughts Today's electronic countermeasures UAS network, click or scan this barcode.


Technology Gap

ADS-B tests may help expedite UAS flights in public airspace

Sense and avoid has become a key technological goal for allowing flights of unmanned aircraft in uncontrolled airspace around the world. One technology expected to help with sense and avoid is ADS-B, or Automatic Dependent Surveillance-Broadcast. It's part of the U.S. Federal Aviation Administration's NextGen system and is basically a GPS-based transponder that reports an aircraft's position, including heading and altitude.

Several demonstrations have been conducted in recent months showing the utility of ADS-B for sense-and-avoid use. One took place at Camp Roberts, Calif., and involved cooperation between two AUVSI members: Arcturus UAV, based in Rohnert Park, Calif., and Sagetech Corp., based in White Salmon, Wash. In the demonstration, Kelvin Scribner, the president and founder of Sagetech, piloted a Cirrus SR-22 aircraft, taking off from nearby Paso Robles airport. An Arcturus T-20 UAV took off from Camp Roberts' McMillan Airfield via its rail launch system, and the two aircraft then flew an aerial ballet, although Scribner didn't stray into the restricted airspace and the T-20 didn't stray out of it.

The system used Sagetech's tiny XP transponders to broadcast ADS-B position messages, which were then received by the company's Clarity receivers, which relayed them via Wi-Fi to an iPad running Hilton Software's WingX application. The aircraft then appeared over a terrain map, identified by name. The display also identified other aircraft flying nearby if they were using ADS-B. The companies don't claim that ADS-B is a magic bullet that solves the sense-and-avoid issue for unmanned aircraft, but they say it's a tool that could be used to speed the use of UAS in some instances, such as aircraft firefighting operations under temporary flight restrictions or during military range operations.

The Arcturus T-20 before the flight. AUVSI photo.

The ADS-B information was available both on computer monitors and iPad and iPhone screens. AUVSI photo.

ADS-B in the news

Sagetech and Arcturus aren't the only companies testing the uses of ADS-B. General Atomics Aeronautical Systems, the builder of the Predator, Reaper and Gray Eagle line of UAS, says it recently tested ADS-B on the Guardian, Customs and Border Protection's marinized Predator platform. The aircraft used a prototype of BAE Systems' Reduced Size Transponder, which has the military designation AN/DPX-7. It's a friend-or-foe transponder that can operate with both military and civilian air traffic control systems and is capable of sending and receiving ADS-B signals. In a test off the Florida coast on 10 Aug., the Guardian detected other ADS-B-equipped aircraft in the vicinity, displaying their locations on a ground control station display, and also sent its own location via ADS-B out.

R3 Engineering, based in Lusby, Md., said in October that it had successfully tested its All Weather Sense and Avoid (AWSAS) system, which commanded an unmanned aircraft's autopilot to depart from its flight path to avoid another aircraft. The flight was conducted on 10 Aug. in Argentia, Newfoundland, Canada, following earlier demonstrations in Arizona, California and North Dakota. The development and testing of AWSAS has been funded by the Office of Naval Research, the Defense Safety Oversight Council's Aviation Safety Technologies program and Naval Air Systems Command. R3 plans to conduct further testing that will include sensor data tracking noncooperative aircraft, that is, aircraft without transponders showing their location.

On 20 Sept., aircraft flown by NASA and the University of North Dakota took to the skies to demonstrate a new kind of unmanned sense-and-avoid technology. Over the course of two weeks of testing, NASA, UND and the MITRE Corp. worked on technology that could one day help unmanned aircraft better integrate into the National Airspace System. MITRE and UND developed automatic sense-and-avoid software algorithms that were uploaded onto a NASA Langley Cirrus SR-22 general aviation aircraft. A supporting UND Cessna 172 flew as a simulated intruder aircraft. The Cirrus, which was developed as a test bed to assess and mimic unmanned aircraft systems, had a safety pilot in the cockpit, but researchers say the computer programs developed by MITRE and UND automatically maneuvered the aircraft to avoid conflicts. NASA and its partners are planning additional test flights in 2013. Follow-on testing is to feature additional advanced software by MITRE and UND, as well as sense-and-avoid software managed by a task automation framework developed by Draper Laboratory.

The ADS-B display inside the Cirrus aircraft used in the North Dakota demonstration. Photo courtesy NASA.
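To make the sense-and-avoid idea concrete, here is a hypothetical sketch of what a ground station could do with incoming ADS-B position reports: estimate the range to each reported aircraft and flag anything inside an alert volume. The thresholds, positions and the flat-earth range approximation are invented for illustration and are not from any of the systems described above.

import math

NM_PER_DEG_LAT = 60.0          # nautical miles per degree of latitude

def horizontal_range_nm(lat1, lon1, lat2, lon2):
    # Equirectangular approximation, adequate at short sense-and-avoid ranges.
    dlat = (lat2 - lat1) * NM_PER_DEG_LAT
    dlon = (lon2 - lon1) * NM_PER_DEG_LAT * math.cos(math.radians((lat1 + lat2) / 2))
    return math.hypot(dlat, dlon)

def adsb_alerts(own, traffic, range_nm=5.0, alt_ft=1000.0):
    # Yield callsigns of ADS-B targets inside the assumed alert volume.
    for t in traffic:
        r = horizontal_range_nm(own["lat"], own["lon"], t["lat"], t["lon"])
        if r < range_nm and abs(own["alt"] - t["alt"]) < alt_ft:
            yield t["callsign"], round(r, 1)

own = {"lat": 35.72, "lon": -120.76, "alt": 3500}      # hypothetical unmanned aircraft
traffic = [{"callsign": "N123SR", "lat": 35.75, "lon": -120.70, "alt": 3200}]
for callsign, rng in adsb_alerts(own, traffic):
    print(f"ALERT: {callsign} within {rng} nm")

Because ADS-B reports are cooperative, this kind of logic only sees transponder-equipped traffic, which is exactly why programs like AWSAS are also pursuing sensors for noncooperative aircraft.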


End Users

Lip reading: IHMC's tongue sensor fills in for sight

The key to overcoming blindness might be on the tip of our tongues.

That's the approach Dr. Anil Raj at the Florida Institute for Human and Machine Cognition is taking with research subjects who lost their sight serving in the military. Raj is repurposing a tactile sensor placed on the tongue, along with an array of wearable infrared sensors, to give a blind person an impression, quite literally, of his surroundings. The project uses a commercial tactile tongue sensor called the BrainPort, meaning it creates a touch-based impression on the tongue of what its camera detects in the visible light spectrum. If a subject were, for example, focused on the letter E, like on an eye chart, he could feel the contrast of the E on his tongue.

"The big advantage of the tongue for vision is its high resolution, and that allows you to use the camera in a way that is very similar to the central vision that we're all familiar with, that we read with or that we recognize faces with," explains Raj. The perception is high resolution because the tongue and pharynx are tied into a lot of brain matter via the nervous system, the reason humans can talk and similar species cannot. After the retina, the tongue and hands account for the largest amount of the brain's ability to process senses.
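The display principle is simple to sketch in code. What follows is a toy illustration, not BrainPort's actual pipeline: collapse each camera frame into a coarse grid of stimulation intensities, one per tongue electrode. The grid size and test image are invented.

import numpy as np

def to_electrode_grid(image, rows=20, cols=20):
    # image: 2-D grayscale array -> (rows, cols) stimulation intensities in 0..1.
    h, w = image.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = image[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            grid[r, c] = patch.mean()          # one electrode per image patch
    return grid / max(grid.max(), 1e-9)        # normalize to the stimulation range

# A bright bar on a dark background, "felt" as a 20-by-20 pattern:
img = np.zeros((200, 200))
img[40:60, 40:160] = 1.0                       # the top stroke of a letter E, say
print(to_electrode_grid(img).round(1))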

But the tongue has a leg up on the hands: its position on the body. The tongue's short distance from the brain allows for very low signal lag between its location and where an image is processed, much like the eyes. "The signal delay caused by nerve conduction velocity is not terribly significant, so if we can present a signal there on the tongue, it is perceived quickly relative to, say, sending that same tactile signal somewhere else on the body."

These signals have quickly translated into a fill-in for sight in the subjects Raj has tested. "The blind individuals we've tested it in fairly consistently describe their perception in visual terms," he says. "They'll say 'I looked at this,' 'I saw that he moved this.' They think in terms of it being visual after they get used to it." For one individual, the experience of wearing the tongue display was so akin to sight that he kept seeking sensory inputs even after the experiment was over. "We had one gentleman that after he took the tongue display out, after we were done with training for the day, just off the cuff asked, 'Well, when does this image go away?'" says Raj. "Well, what image? And he said ever since he became blind, everything seemed pitch black, the coldest, darkest black you can ever imagine. That was his entire visual perception. But after using the system for six hours in one day, he said it felt like there was a television screen in his visual perception that was just showing signals [and] that the station had gone off the air."

IHMC's work focuses on those who have recently lost their sight, who tend to have a better visual memory than people who lost their vision decades ago or who have never had a visual memory, says Raj. "If you have somebody that's been born blind, they've never seen the letter E," he says. "They have no sense of what the relationship is in general. They might have felt it, a raised letter E somewhere, but to translate it to what a camera picks up, I don't necessarily think that maps directly." When Raj wears the sensor himself, he suspects he's filling in missing information with his own visual knowledge. "We had an interest from a military standpoint of trying to address the problems of our service members that are coming back blinded," he says. "And so these are all young, healthy individuals who now are blind and trying to get on with their lives. We found there's a big difference in their ability to incorporate the information. For somebody that's been blind or has been born blind, their visual memories are not great or aren't that rich anymore, or they might not have any visual memories if they were born blind."


The BrainPort sensor that IHMC is using to reroute sight to the tongue for the blind. AUVSI photo.

Filling in the periphery

Focusing on a person's central vision doesn't cover the entire picture, though. Raj's work incorporates peripheral vision with what the tongue display's camera sees to give a person a wider field of view. "If we're reading the menu at the restaurant, we can notice when a waiter approaches us," explains Raj. "We're not startled when they come up, whereas with something like the BrainPort for a blind person, they would have to be zoomed in tightly enough on the menu to be able to read the text, and with that setting they wouldn't necessarily be able to perceive that someone walked up."

To accomplish this, Raj is using a series of infrared emitters that use a frequency-modulated beam to detect a person's surroundings, like doors and hallways. The sensors work the same way a television remote does, only accepting a modulated signal that doesn't consider other light sources. Raj uses 24 pairs of these sensors in an array around the head, which, through an algorithm the institute created, can tell how far away objects are by measuring the echoes of infrared beams. "But our real interesting part is our software algorithm that more or less creates this streaming map of the environment via multiple sources and sensors." The software filters out errors, like strange reflections, so it can work in nearly any environment.

The algorithm also has another task because of the infrared array's location: canceling out any unintentional head motion. "So if you're walking, your head is bobbing around a little bit, or if you just turn your neck around, it doesn't really change how things are relative to your body in the world around you. So we don't necessarily want to reflect every single one of those changes." To do this, Raj takes other measurements from accelerometers and gyroscopes and cancels out updates that are strictly related to head movements.

The focus is to improve the sensory interactions between robotic systems, for example, and the humans interacting with those systems. This kind of focus is the essence of Raj's work at IHMC, he says. "Robots don't go off and do things by themselves," he says. "They do what we ask them to do or what we command them to do, and what we're working on is when we can make that level of interaction more interactive and more functional with less cognitive effort. And that goes both ways."
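The head-motion cancellation step described above can be sketched simply. This is an assumed illustration, not IHMC's algorithm: integrate the gyroscope's yaw rate to track how far the head has turned, then fold that rotation back into each sensed bearing so objects stay fixed relative to the body while the head scans. The sample rate and numbers are invented.

def body_frame_bearings(head_yaw_rates, sensed_bearings, dt=0.02):
    # head_yaw_rates: gyro yaw rate in deg/s per tick;
    # sensed_bearings: object bearing in deg, measured in the head's frame.
    head_yaw = 0.0
    out = []
    for rate, bearing in zip(head_yaw_rates, sensed_bearings):
        head_yaw += rate * dt                   # integrate the gyro to get head heading
        out.append((bearing + head_yaw) % 360)  # re-express the bearing body-relative
    return out

# The head sweeps right at 50 deg/s while a doorway stays fixed in the world,
# so the door appears to drift left in the head frame:
rates = [50.0] * 5
sensed = [90.0 - 50.0 * 0.02 * (i + 1) for i in range(5)]
print(body_frame_bearings(rates, sensed))   # stays ~[90, 90, 90, 90, 90]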


SAVE THE DATE

12-15 August
Walter E. Washington Convention Center, Washington, D.C.

Conference from 12-15 August; tradeshow from 13-15 August
550+ exhibiting companies; 40+ countries represented; 8,000+ attendees

Promoting and Supporting Unmanned Systems and Robotics Across the Globe

auvsishow.org
