There’s nothing especially remarkable about getting a ticket, even if it’s for the rare offense of driving too far below the speed limit. If you’re in one of Google’s self-driving cars, however, getting pulled over for the first time is not only noteworthy but also raises a plethora of questions about what can and will happen when more autonomous vehicles are on the road.

Can we trust self-driving cars to make their driving decisions based on programming rather than the instinct and experience on which we human drivers rely? Perhaps the biggest self-driving question to be answered is whether it’s truly possible to program a self-driving car to respond quickly and appropriately to every eventuality we might face on the road.

Credit: Travis Wise

A Less Than Spotless Driving Record?
After causing a traffic jam by traveling below its capped 25 mph speed in a 35 mph speed zone, one of Google’s self-driving cars came close to receiving its first ticket from a motorcycle officer in Mountain View, California.

When the officer, seeing traffic seriously backed up, pulled over the small vehicle causing the jam, he found no driver and thus issued no official citation. This means Google’s 1.2-million-mile driving record remains technically spotless, but the company’s self-driving car project still has a lot of important questions to answer.

Can Defensive Driving Decisions Be Programmed? 
Google’s autonomous cars can be programmed to follow traffic laws to the letter and to respond to a number of different situations on the road, from people in crosswalks to the sight of flashing lights in the rearview mirror. But anyone who’s had even a little experience driving knows that it takes a lot more than observing traffic laws to drive safely and, most importantly, defensively.

Google’s ability to account for the more unpredictable realities of driving is improving, but will it ever be on par with a human’s ability to make quick decisions behind the wheel? Consider what the mind of a human driver must go through when a deer dashes into the road or when he or she is in the midst of an unavoidable collision.

While the scanning systems incorporated into autonomous cars are meant to process information and respond accordingly, is there a way to program a response that’s appropriate every time and for events that no one can truly anticipate?

Credit: Alena Nesterova

For all the hype surrounding a future filled with self-driving cars, we may not really be able to hit the road in them, or among them, until we get better at understanding everything that happens and can happen behind the wheel, especially when all you see behind it is an empty seat. What are your thoughts on the logistical and ethical questions that remain regarding self-driving cars? Tell us in the comments!
