Even though they’re barely on the road, self-driving cars have been talked about so much that they already seem like last year’s model. Google has been working on one for years. Apple is allegedly, possibly, working on one, too. And there’s even speculation that everyone from Uber to Tesla could join the race.
But before you give up the wheel, get familiar with the technology driving autonomous vehicles.
There are three things required to turn a regular car into an automated one, according to Sridhar Lakshmanan, a self-driving auto expert and engineering professor at the University of Michigan-Dearborn. The first is a GPS system pretty much like the ones found in many vehicles today. The second is a system to recognize dynamic conditions on the roads. And the third is a way to turn the information from the other two systems into action on your ride.
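To make that division of labor concrete, here is a minimal Python sketch of how the three subsystems might be wired together. The class and method names (RoutePlanner, Perception, Controller) are illustrative assumptions for this article, not any manufacturer’s actual architecture.

```python
# Sketch of the three-subsystem design described above.
# All names and values are hypothetical; real systems are far more complex.

class RoutePlanner:
    """Subsystem 1: GPS-style mission planning (start -> destination)."""
    def plan(self, start, destination):
        return [start, "Main St", "Oak Ave", destination]  # placeholder route

class Perception:
    """Subsystem 2: senses dynamic road conditions (cars, pedestrians, detours)."""
    def sense(self):
        return {"obstacle_ahead": False, "lane_offset_m": 0.1}  # placeholder readings

class Controller:
    """Subsystem 3: turns the combined picture into steering, throttle, and brakes."""
    def act(self, route, observations):
        if observations["obstacle_ahead"]:
            return {"brake": 1.0, "throttle": 0.0, "steer": 0.0}
        return {"brake": 0.0, "throttle": 0.4, "steer": -observations["lane_offset_m"]}

planner, perception, controller = RoutePlanner(), Perception(), Controller()
route = planner.plan("Home", "Office")
command = controller.act(route, perception.sense())
print(route, command)
```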
While having a GPS may seem like a no-brainer, it’s actually a vital part of a self-driving car’s overarching technology. This system, which is essentially no different from Google Maps’ driving directions, defines the “mission” of the autonomous vehicle by setting the starting and ending point of the drive. It looks at all the roads, chooses the best path, and often does it better than a human driver can.
“Human beings are not equipped to process tremendous amounts of prior data like maps,” says Lakshmanan.
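As a flavor of that mission-planning step, here is a minimal sketch of shortest-path routing over a toy road network, the same kind of search a navigation system runs at vastly greater scale. The road names and distances below are invented for the example.

```python
import heapq

# Toy road network: node -> list of (neighbor, distance_km).
# The map data is invented; real planners search millions of road segments.
roads = {
    "Home":     [("Main St", 2.0), ("Ring Rd", 5.0)],
    "Main St":  [("Downtown", 3.0)],
    "Ring Rd":  [("Downtown", 1.5)],
    "Downtown": [],
}

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: returns (total_distance, path), or (inf, []) if unreachable."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, cost in graph[node]:
            if neighbor not in visited:
                heapq.heappush(queue, (dist + cost, neighbor, path + [neighbor]))
    return float("inf"), []

print(shortest_path(roads, "Home", "Downtown"))  # -> (5.0, ['Home', 'Main St', 'Downtown'])
```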
But GPS alone is not enough to make a smart car. Its maps never (or rarely) change, while the reality of the road includes dynamics like detours, traffic, and other obstacles. “Autonomous driving requires a second level of intelligence with the ability to fill in additional details in the map,” says Lakshmanan. This system, which Lakshmanan calls a “differential GPS,” uses an array of technologies, such as radar and cameras, to detect the ever-changing variables that surround the vehicle.
“If you think of the map as having a static view of the world, the sensor system is providing a dynamic fill-in to that map,” he says. “These two, together, provide what is called a ‘world model’ for that autonomous vehicle.”
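A rough way to picture that “world model” in code: start from the static map layer and overlay whatever the sensors currently report. The data structures below are deliberately simplified assumptions for illustration.

```python
# Sketch of a "world model": a static map layer plus a dynamic sensor layer.
# Both layers here are toy data; real systems fuse HD maps with tracked detections.

static_map = {
    "lanes": 2,
    "speed_limit_kmh": 50,
    "construction_zone": False,  # what the pre-built map believes
}

sensor_observations = {
    "construction_zone": True,   # a detour the map doesn't know about
    "vehicles_ahead": [{"range_m": 35.0, "speed_kmh": 30.0}],
}

def build_world_model(static_layer, dynamic_layer):
    """Dynamic observations override or extend the static map view."""
    world = dict(static_layer)
    world.update(dynamic_layer)
    return world

world_model = build_world_model(static_map, sensor_observations)
print(world_model["construction_zone"])  # True: the sensors corrected the stale map
```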
Among the sensors feeding information into the differential GPS are cameras, radar, and lasers. Cameras, obviously, let the car’s computers see what’s around it. Radar, meanwhile, allows the vehicle to see up to 100 meters away in the dark, rain, snow, or other vision-impairing circumstances. (Interestingly, “adaptive” cruise control systems in newer vehicles already use radar technology.) And the lasers, which look like a spinning siren light, continuously scan the world around the car, providing a three-dimensional, omnidirectional view of its surroundings.
“These sensors are providing you with raw information of the world. You need very sophisticated algorithms to process all that information, just like a human would,” says Lakshmanan.
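To give a sense of what “processing raw information” means, here is a minimal sketch of one early step for the spinning laser (lidar): converting raw angle-and-distance returns into 3-D points the car can reason about. The sample returns are invented for the example.

```python
import math

# Invented raw lidar returns: (horizontal_angle_deg, vertical_angle_deg, range_m).
# A real spinning lidar produces hundreds of thousands of such returns per second.
raw_returns = [
    (0.0,   -2.0, 12.5),
    (90.0,  -2.0,  8.0),
    (180.0,  0.0, 30.1),
]

def to_cartesian(h_deg, v_deg, r):
    """Convert one polar lidar return to an (x, y, z) point in meters."""
    h, v = math.radians(h_deg), math.radians(v_deg)
    return (r * math.cos(v) * math.cos(h),
            r * math.cos(v) * math.sin(h),
            r * math.sin(v))

point_cloud = [to_cartesian(*ret) for ret in raw_returns]
for point in point_cloud:
    print(tuple(round(coord, 2) for coord in point))
```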
Of course, these sensors are necessary because autonomous cars are adapting to a human-driven world. There is hope that, in the future, all cars will be able to talk to each other in a connected vehicle environment. Your car would know precisely where other vehicles are, where they’re going, and where they will turn, so the computers can navigate smoothly. But we’re not there yet, says Lakshmanan, though the framework for that kind of vehicle-to-vehicle communication is already in the experimental stage.
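For a sense of what that connected-vehicle future implies, here is a hypothetical sketch of the kind of position-and-intent message cars might broadcast to one another. The field names are assumptions made up for this illustration, not any real standard’s wire format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class VehicleStatusMessage:
    """Hypothetical vehicle-to-vehicle broadcast: where a car is, how it's moving,
    and what it intends. Field names are illustrative, not from any real standard."""
    vehicle_id: str
    latitude: float
    longitude: float
    speed_kmh: float
    heading_deg: float   # 0 = north, 90 = east
    turn_signal: str     # "left", "right", or "none"

msg = VehicleStatusMessage("car-42", 42.3223, -83.2331, 48.0, 90.0, "left")
print(json.dumps(asdict(msg)))  # what a nearby car might receive and decode
```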
And lastly, the autonomous vehicle needs to be equipped to take the GPS and sensor information and turn it into actions, like steering, accelerating, or hitting the brakes. This is typically done through what’s called the “CAN bus” (which stands for controller area network). This in-vehicle electronic network has been in cars for decades, which means that autonomous vehicles of the future aren’t much different, mechanically, from the dumb-mobiles we’re driving today.
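As a hedged illustration, the snippet below shows what sending a single command frame on a CAN bus can look like using the open-source python-can library. The arbitration ID and payload encoding are invented for the example; real command IDs and layouts are manufacturer-specific and not public.

```python
import can  # open-source python-can library: pip install python-can

# Hypothetical frame: arbitration ID 0x200 carrying a one-byte brake command.
# Real IDs and payload layouts vary by manufacturer; these are made up.
def make_brake_frame(brake_fraction: float) -> can.Message:
    """Encode a 0.0-1.0 brake request as a single byte (0-255)."""
    level = max(0, min(255, int(brake_fraction * 255)))
    return can.Message(arbitration_id=0x200, data=[level], is_extended_id=False)

# 'vcan0' is a Linux virtual CAN interface (must be set up beforehand),
# handy for experimenting without touching real vehicle hardware.
with can.interface.Bus(channel="vcan0", interface="socketcan") as bus:
    bus.send(make_brake_frame(0.5))  # request half braking
```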
Feel free to contact E-SPIN for the various technology solutions that can facilitate your infrastructure availability and security monitoring.