Ever since hearing the news of the introduction of driverless cars, I have been wondering whether technology has really gone too far. The first thing to pop into my mind was Stephen King’s ‘Christine’. But hey, don’t let that put you off!
It took a little while to wrap my head around the technology behind driverless cars, so for everyone else who is struggling with it like me, here’s the simple explanation: driverless cars have a range finder installed on the roof, which uses a laser to generate a detailed 3D map of the surroundings. This, combined with detailed maps of the world, is supposed to enable the car to drive itself. It’s not too difficult once you’ve sifted through the endless technology nuts using their fancy words and found an explanation in simple layman’s terms!
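If you like to see things in code, here’s a toy sketch of the basic idea: the laser scan gives the car a cloud of obstacle positions, and the software decides whether anything in its path is close enough to brake for. This is purely illustrative — real systems fuse laser, camera and map data in far more sophisticated ways, and all the names and numbers below are made up for the example.

```python
import math

# Toy illustration only. Each "point" is an (x, y) obstacle position in
# metres from a simulated laser scan, with the car at the origin facing
# along the positive x-axis.

def nearest_obstacle_ahead(points, corridor_half_width=1.5):
    """Distance to the closest point inside the car's forward corridor,
    or None if the corridor is clear."""
    ahead = [(x, y) for x, y in points if x > 0 and abs(y) <= corridor_half_width]
    if not ahead:
        return None
    return min(math.hypot(x, y) for x, y in ahead)

def should_brake(points, stopping_distance=20.0):
    """Brake if any obstacle in the corridor is within stopping distance."""
    d = nearest_obstacle_ahead(points)
    return d is not None and d < stopping_distance

# A simulated scan: something 35 m ahead, something 12 m ahead,
# and something off to the side.
scan = [(35.0, 0.5), (12.0, -1.0), (8.0, 4.0)]
print(should_brake(scan))  # the obstacle 12 m ahead triggers braking
```

Of course, the hard part — and the part this post worries about — is everything this sketch leaves out: working out what each point *is*, and what it might do next.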
But this laser – clearly it manages to pick up the surroundings, and can somehow interpret what colour the traffic lights are, but is it as good at picking up pedestrians, cyclists and motorcyclists, and more importantly, can it predict human nature? Most of us instinctively slow down if we see a child running around on the pavement, just in case that child were to run out, or perhaps fall into the road. Can a car do this?
This led me to the biggest issue I have with driverless cars, which is that when behind the wheel, you are left to make life or death decisions. No exaggeration involved. An article I found during my research by Millar (2014) highlights the decisions that we as humans make that cannot be made by a robot with no conscience or true thought processes. If the choice is your life or the life of another road user or pedestrian, you’d have a split second to make your decision. A robot will just do what it was programmed to do, without ever having to consider the consequences. This incredibly interesting article can be found here: http://qz.com/245142/should-your-driverless-car-kill-you-to-save-a-child/
So essentially, the point that is persistently ringing alarm bells in the back of my mind is whether robots should be given a “licence to kill”. When it comes to a decision like that, are you happy to let a robot choose for you?
Some of you might be reading this thinking ‘what’s the big deal, you can take control if you want, it’s just for when you might want a rest’. That’s not entirely correct. Yes, the current cars have brakes, a steering wheel and a gas pedal – everything a standard car has, allowing you the option to regain control of your car. However, Google’s new prototype has none of this. There is no option of control at all, no chance to take over if you see a hazard that the car hasn’t picked up, or if the car just so happens to malfunction – which, let’s be honest, is likely to happen. Can you name one piece of technology that has never malfunctioned? No, I can’t either.
One of the most basic and obvious points raised is that some people simply enjoy driving; they don’t want to hand over control to a robot. Of course this currently isn’t an issue – if you don’t want a driverless car, don’t have one, pretty simple. But what happens when standard cars start to get phased out and there is little choice but to use driverless cars? Of course that isn’t an imminent change, but maybe in 50-60 years the idea won’t be quite so far-fetched.
Finishing off my list of ‘technology is getting scary’: surely, with all this ‘let’s be more environmentally friendly’ business, introducing cars that even those who can’t drive would be able to use is a rather large step backwards. We should be encouraging people to use public transport, not giving them more alternatives to it!
Putting all these negative points aside, driverless cars do of course give some people a quality of life they could never have otherwise. Many people with various disabilities rely on others to be able to go out and do the things they want. These cars will give so many people the opportunity to get out of their houses and lead a much more independent life.
There’s also the frequently made point that driverless cars are safer. It’s believed that most crashes on the roads are caused by human error, so a robot that can make a choice in milliseconds, compared to an average human response of half a second, would be a great deal safer. Reaction times are quicker, and ultimately, without ethical considerations or a conscience getting in the way, the decision these cars make would most likely be the most logical action.
Personally, I’m still a little cautious about these driverless cars. We all know that there are some absolute idiots on the roads, and I’d be happy for them to hand control to a computer, as it would undoubtedly make their driving far safer. But to ride in one myself? I’d be more on edge letting a computer drive my car while I watched its every movement than I would be just driving myself.
For people whose independence is limited by a disability or a similar reason, I think driverless cars should be offered as an option, but for everyone else, I don’t believe that driverless cars should even be a choice.
I’d love to hear some opinions on this. Will you be one of the first in a driverless car when they become available to the public?
www.mybiggreenfleet.com Call us on 0845 1634 141