Preventing Fatal Pilot Error Is Always a Challenge

July 5, 2016
The much-publicized accident involving Tesla’s Model S was a sobering reminder that, while technology is great when it works, it can be fatal when it does not.

The recent tragic accident involving Tesla’s Model S running in autopilot mode was just one of many fatal accidents to occur that day. The reason this accident made headlines is that the autopilot is designed to prevent accidents. Keep in mind that the technology was still in beta testing. Drivers using it are supposed to keep their hands on the wheel and remain aware of their surroundings, just as they would if they were driving. Of course, the whole purpose of the autopilot is to let the computers be aware of the surroundings so the pilot/driver can do other things.

Tesla’s autopilot is cutting-edge, but it is not the only automated feature available on cars: We have had cruise control for decades, as well as various braking assists. The main difference to date is the level of control the driver hands over to the system and the amount of attention or expertise the driver still needs to provide. For example, newer cruise-control systems incorporate forward-looking radar that slows the vehicle if the distance to something in front of the car decreases. This feature is just one part of the Model S’s autopilot repertoire.

The question is how quickly the car will respond to changes in its environment, and just how that response will occur. Does it avoid closing on a vehicle or obstruction based on very generous parameters, such as leaving more than a full car length per 10 mph of speed, which leaves plenty of room for other cars to slip into the gap? Or does it drive on the assumption that it can react faster than the vehicle in front and reduce the gap between them? The latter assumes that the lead vehicle is not involved in a crash that drops its velocity to zero faster than any braking system could manage.
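As a rough illustration of the first, more conservative policy, a rule of one car length per 10 mph can be reduced to a few lines of code. The constants below (a 15-ft car length and the sample speeds) are assumptions for the sketch, not parameters from any production autopilot.

    #include <stdio.h>

    /* Illustrative following-distance rule: one car length (~15 ft)
     * for every 10 mph of speed. The constants are assumptions for
     * this example only. */
    static double min_gap_feet(double speed_mph)
    {
        const double CAR_LENGTH_FT = 15.0;
        return (speed_mph / 10.0) * CAR_LENGTH_FT;
    }

    int main(void)
    {
        for (double mph = 30.0; mph <= 70.0; mph += 20.0)
            printf("%.0f mph -> keep at least %.0f ft of gap\n",
                   mph, min_gap_feet(mph));
        return 0;
    }

At 60 mph this works out to roughly 90 ft, which is generous enough that other cars will routinely slip into the gap, which is exactly the trade-off described above.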

Convoy, or platooning, vehicle systems (Fig. 1) try to have it both ways by providing communication between vehicles, so that the lead vehicle keeps a longer distance between itself and other traffic while the distance between vehicles within the convoy can be much shorter. In theory, the communication between the convoy vehicles is almost instantaneous, so braking initiated by the lead will ripple immediately through the system, stopping or slowing it as a whole. That is not something achievable with human drivers.

1. NXP Semiconductors and DAF Trucks demonstrated self-driving technologies in automated trucks as part of the European Truck Platooning Challenge.
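In software terms, that ripple can be pictured as a broadcast message from the lead vehicle that each follower acts on immediately. The message layout and function names below are hypothetical; a real platooning stack would add authentication, latency compensation, and redundancy.

    #include <stdio.h>
    #include <stdbool.h>

    /* Hypothetical platooning message: the lead vehicle broadcasts its
     * braking state and requested deceleration so followers can react
     * in lockstep rather than waiting to see the gap close. */
    struct platoon_msg {
        unsigned vehicle_id;   /* sender's position in the convoy  */
        bool     braking;      /* lead has commanded a brake event */
        double   decel_mps2;   /* requested deceleration, m/s^2    */
    };

    /* Each follower applies the same deceleration as soon as the
     * message arrives, so the convoy slows essentially as a unit. */
    static void follower_on_msg(unsigned my_id, const struct platoon_msg *m)
    {
        if (m->braking)
            printf("vehicle %u: braking at %.1f m/s^2 (commanded by %u)\n",
                   my_id, m->decel_mps2, m->vehicle_id);
    }

    int main(void)
    {
        struct platoon_msg msg = { .vehicle_id = 0, .braking = true,
                                   .decel_mps2 = 3.5 };
        for (unsigned id = 1; id <= 3; id++)   /* three followers */
            follower_on_msg(id, &msg);
        return 0;
    }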

The Great Airbag Recall

Technology is great when it works. It can also be fatal when it does not. Additional technology can be employed to address errors that occur within a system. For example, some elevators can detect when they are falling too quickly and engage mechanical braking. Redundant computers are the norm for avionics and space systems.

Unfortunately, not all systems have or can incorporate alternate or redundant systems to prevent problems due to all types of errors. In addition to the Model S accident, you probably also heard about the plethora of airbags being recalled. These, too, can and have caused fatalities.

The difference here is that the airbags being recalled have a design or construction flaw that causes them to operate improperly, often exploding with too much force, deploying at inappropriate times, or ejecting shrapnel along with the bag.

Airbags are now ubiquitous in cars, with multiple airbags surrounding a single occupant in many vehicles. I, for one, prefer having airbags in my car, but I want them to work as designed.

The Challenge of Corner Cases and Sensor Integration

As design engineers and programmers, we have the challenge of developing these systems. Others are involved in specifying, testing, implementing, deploying, and servicing the systems. That is a lot of people and machines involved in making something work and keeping it operational over the lifetime of the system. There are many points where errors can creep in, and typically plans are in place to address these issues (quality control, training, and so on).

Today’s designs tend to be more complex because the computers within these systems can handle more sensor and control subsystems. That makes it possible to acquire more accurate information about the surrounding environment and to perform more advanced analysis of that information.

Keep in mind that the Model S carries multiple sensors to provide its autopilot capability. Some of those sensors measure distance more accurately than a Mark One Eyeball. The system can also react faster than a person, but many of these sensor systems provide more limited information, albeit more accurate within their scope. For example, a radar system will not indicate what color an object is, only its range.

In this particular accident, a combination of issues caused the system not to recognize that a tractor-trailer had crossed the path of the Tesla. The radar was actually looking under the trailer, where the bumper of a car or truck would normally be located. The vision system was also fooled by the bright background and the white trailer. The radar would have detected the wheels at the front or rear of the rig, but I do not know whether it detected the truck as it passed in front.
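One way to picture the failure mode is as a cross-check between two imperfect detectors: if the radar is looking under the trailer and the camera is washed out by a bright sky, neither reports an obstacle and the fused answer is “path clear.” The sketch below is a deliberately simplified illustration of that logic, not Tesla’s implementation.

    #include <stdio.h>
    #include <stdbool.h>

    /* Simplified, hypothetical fusion: each sensor reports whether it
     * sees an obstacle in the lane ahead. Requiring agreement ("AND")
     * keeps false alarms down, but it also means that if each sensor
     * misses the object for its own reason, the fused output misses
     * it too. */
    static bool obstacle_ahead(bool radar_return, bool camera_detection)
    {
        return radar_return && camera_detection;
    }

    int main(void)
    {
        /* Radar looks under the high trailer; the camera is blinded by
         * the bright background. Both report "nothing there." */
        bool radar  = false;
        bool camera = false;

        printf("fused decision: %s\n",
               obstacle_ahead(radar, camera) ? "BRAKE" : "path clear");
        return 0;
    }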

In any case, this particular situation is one of the many corner cases that arise when dealing with complex systems. Many applications simply ignore these cases, or they might signal an error condition. The challenge with safety-critical systems like this one is deciding what to do with conditions that can lead to fatal outcomes. The answers are not always good, and the question remains whether the exceptional cases are serious enough to prevent use of the system.
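For safety-critical code, one common pattern is to treat an unrecognized or contradictory sensor state as a fault in its own right and degrade to a conservative behavior (slow down, alert the driver, hand back control) rather than silently lumping it in with “all clear.” A minimal sketch, using hypothetical names, follows.

    #include <stdio.h>

    /* Hypothetical classification of what the perception stack reports. */
    enum perception_state {
        PERCEPTION_CLEAR,      /* sensors agree: path is clear        */
        PERCEPTION_OBSTACLE,   /* sensors agree: obstacle ahead       */
        PERCEPTION_CONFLICT    /* sensors disagree or data is missing */
    };

    /* The point of the sketch: the conflicting/unknown case gets an
     * explicit, conservative action instead of defaulting to "clear." */
    static const char *plan_action(enum perception_state s)
    {
        switch (s) {
        case PERCEPTION_OBSTACLE: return "brake";
        case PERCEPTION_CLEAR:    return "maintain speed";
        case PERCEPTION_CONFLICT:
        default:                  return "slow down and alert driver";
        }
    }

    int main(void)
    {
        printf("conflicting sensors -> %s\n",
               plan_action(PERCEPTION_CONFLICT));
        return 0;
    }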

Going back to airbags, there is a problem with their use around children and small adults because of the force of the bags when they explode. Some systems can detect the weight of an occupant and adjust the amount of force based on that information. The alternative, used early on, is simply to indicate that a child should not sit in the front seat.
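In code, that kind of occupant-aware deployment can be as simple as a threshold table keyed off the seat’s weight sensor. The weight limits and stages below are made-up values purely for illustration; real calibrations come from crash testing and regulatory requirements.

    #include <stdio.h>

    /* Hypothetical deployment stages keyed to the seat weight sensor. */
    enum deploy_stage { DEPLOY_SUPPRESS, DEPLOY_LOW, DEPLOY_FULL };

    static enum deploy_stage stage_for_weight(double occupant_kg)
    {
        if (occupant_kg < 30.0) return DEPLOY_SUPPRESS; /* child/empty seat */
        if (occupant_kg < 55.0) return DEPLOY_LOW;      /* small adult      */
        return DEPLOY_FULL;
    }

    int main(void)
    {
        static const char *name[] = { "suppress", "low force", "full force" };
        double samples[] = { 20.0, 45.0, 80.0 };

        for (int i = 0; i < 3; i++)
            printf("%.0f kg -> %s\n", samples[i],
                   name[stage_for_weight(samples[i])]);
        return 0;
    }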

Autopilot Successes

There are instances where autopilot functionality is commonly used safely. Many planes have advanced autopilot capabilities. The high cost of these systems allows for more advanced sensor systems, such as radar, along with plane-to-plane communication systems that exchange position information. There are also restrictions on where and when planes can fly, limiting the number of collision possibilities.

Higher-end quadcopters can use autopilots to fly via waypoints, usually avoiding collisions with fixed objects like trees and buildings. Flying them above the height of people potentially takes people out of the equation as well, but what happens when objects wind up in the path of the quadcopter? It depends on the sensor suite and the computational capabilities of the autopilot application.

Newer, more-advanced systems incorporate object-detection systems, allowing the software to recognize objects and adjust the flight plan to avoid them (Fig. 2). Having moving objects around the quadcopter makes the problem more complex, since the quadcopter is often the one that needs to avoid the other objects. As with the Model S, the sensor suite may not “see” an object, or part of an object, causing problems with avoidance.

2. IFM Technologies uses the StereoLabs ZED camera and NVidia’s Jetson to orient the quadcopter using SLAM (simultaneous localization and mapping).
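A very reduced version of that flight-path check is to ask whether a detected object lies within some clearance radius of the next waypoint leg. The sketch below uses 2D geometry and an arbitrary clearance value purely for illustration.

    #include <stdio.h>
    #include <math.h>

    struct vec2 { double x, y; };   /* simplified 2D positions, meters */

    /* Distance from point p to the segment a-b (the next leg of the
     * planned path), via the standard projection-and-clamp method. */
    static double dist_to_leg(struct vec2 p, struct vec2 a, struct vec2 b)
    {
        double dx = b.x - a.x, dy = b.y - a.y;
        double len2 = dx * dx + dy * dy;
        double t = len2 > 0.0 ? ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2 : 0.0;
        if (t < 0.0) t = 0.0;
        if (t > 1.0) t = 1.0;
        double cx = a.x + t * dx - p.x, cy = a.y + t * dy - p.y;
        return sqrt(cx * cx + cy * cy);
    }

    int main(void)
    {
        const double CLEARANCE_M = 2.0;            /* illustrative margin */
        struct vec2 here = { 0, 0 }, next = { 20, 0 };
        struct vec2 tree = { 10, 1.5 };            /* detected obstacle   */

        if (dist_to_leg(tree, here, next) < CLEARANCE_M)
            printf("obstacle within %.1f m of path -> replan\n", CLEARANCE_M);
        else
            printf("leg is clear\n");
        return 0;
    }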

Human eyes and ears, combined with a brain that has lots of experience, are hard to compete with using simpler sensors and artificial intelligence (AI). These sensors and the AI continue to improve, though, and like programming-language compilers, they may eventually replace human experience and expertise for many common and mundane jobs.

Still, AI—especially deep learning—requires training. A system that is not trained in various corner cases may not recognize or handle these cases in the field.

The typical car will eventually sprout half a dozen cameras, radar, and other sensors. Some high-end systems already have these in place. There will also be more V2X support, including vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I), providing real-time communication that will allow a vehicle to know about its environment without needing direct sensor input. Still, these systems will take time and money to come into existence.
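The core of V2V/V2I support is the periodic broadcast of a small state message (position, speed, heading, brake status) so nearby vehicles learn about each other without line-of-sight sensing. The layout below is a loose sketch in the spirit of a basic safety message; it is not the actual SAE J2735 structure or encoding.

    #include <stdio.h>
    #include <stdbool.h>
    #include <stdint.h>

    /* Loose sketch of a periodic V2V state broadcast. The field choices
     * echo the idea of a "basic safety message," but this is not the
     * real standardized layout. */
    struct v2v_state {
        uint32_t vehicle_id;
        double   latitude_deg;
        double   longitude_deg;
        double   speed_mps;
        double   heading_deg;
        bool     brake_active;
    };

    int main(void)
    {
        struct v2v_state msg = {
            .vehicle_id = 42, .latitude_deg = 40.22, .longitude_deg = -74.75,
            .speed_mps = 26.8, .heading_deg = 90.0, .brake_active = true
        };

        /* A nearby receiver can react to the brake flag before its own
         * sensors would register the change in gap. */
        printf("vehicle %u at %.2f,%.2f braking=%d\n",
               (unsigned)msg.vehicle_id, msg.latitude_deg, msg.longitude_deg,
               msg.brake_active);
        return 0;
    }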

Accidents like this recent Tesla one should be avoided when possible, but that is not always feasible given the various constraints of money, convenience, liability, and practicality. We need to make sure we neither condemn new technologies out of hand nor adopt them too soon.


About the Author

William Wong, Senior Content Director

Bill Wong covers Digital, Embedded, Systems, and Software topics at Electronic Design. He writes a number of columns, including Lab Bench and alt.embedded, plus the hands-on Bill's Workbench column. Bill is a Georgia Tech alumnus with a B.S. in Electrical Engineering and a master's degree in computer science from Rutgers, The State University of New Jersey.

He has written a dozen books and was the first Director of PC Labs at PC Magazine. He has worked in the computer and publication industry for almost 40 years and has been with Electronic Design since 2000. He helps run the Mercer Science and Engineering Fair in Mercer County, NJ.
