Autonomous cars are slowly but surely making their way onto the world’s roads, with several car manufacturers testing and recalibrating self-driving car technologies. The manufacturers claim that road accidents can be reduced if human error is removed from the equation, but how do the cars ‘know’ what to do in particular driving situations?

There are three functions that autonomous cars must perform to replace the need for a driver. First is the ability to know where the car is located in relation to the vehicles and objects around it; second is the ability to plan the most convenient (and safest) route to its destination or next location; and third is the capability to move there. These may be basic tasks for a human driver, but each function poses unique challenges for the automotive and software engineers who have to train autonomous cars to mimic or simulate the way humans perceive physical space, track their position, and decide on the next course of action. Physically steering the car, changing gears, or hitting the brakes is easy to motorise, but how and when to perform each of these actions remains a complex question.
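The three functions can be pictured as a simple loop: estimate where the car is, pick the next waypoint, then move. The sketch below is purely illustrative – the function names, the one-dimensional positions, and the averaging of sensor readings are all invented for this example, not taken from any real vehicle's software.

```python
# Illustrative three-step control loop: localise -> plan -> act.
# All names and numbers are hypothetical.

def localise(sensor_readings):
    """Estimate the car's position as the average of noisy sensor estimates."""
    return sum(sensor_readings) / len(sensor_readings)

def plan_route(position, destination):
    """Choose the next waypoint: one unit step toward the destination."""
    step = 1 if destination > position else -1
    return position + step

def act(position, waypoint):
    """'Move' the car by adopting the planned waypoint as the new position."""
    return waypoint

position = localise([9.8, 10.1, 10.1])            # noisy estimates -> 10.0
waypoint = plan_route(position, destination=15)   # next step -> 11.0
position = act(position, waypoint)
```

A real system repeats this loop many times per second, with each stage far more sophisticated than shown here.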


Autonomous cars being tested today make use of a combination of lasers, radar, sonar, cameras, and sensors, along with 3D road maps called prior maps. Google has mounted a Velodyne 64-beam laser on each of its cars, which spins like the flashing light of a police car and scans the surrounding area to gather information. This laser-based sensing is known as LIDAR (Light Detection and Ranging). LIDAR is used to build a three-dimensional map that enables the car to ‘see’ its surroundings and spot potential hazards. LIDAR and cameras, together with digital maps and GPS, let the car perceive its surroundings and determine its position.
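A LIDAR scan arrives as a cloud of distance measurements around the car, and spotting hazards can be as simple as asking which returns fall too close. The toy example below assumes a 2D point cloud and an invented safety radius; real scans are 3D and contain hundreds of thousands of points per rotation.

```python
import math

# Hypothetical sketch: treat a LIDAR scan as (x, y) points relative to the
# car and flag any return inside a safety radius as a potential hazard.

def hazards(point_cloud, safety_radius=5.0):
    """Return the points that fall within the safety radius of the car."""
    return [p for p in point_cloud
            if math.hypot(p[0], p[1]) < safety_radius]

scan = [(12.0, 0.5), (3.0, -1.0), (0.8, 4.2), (30.0, 2.0)]
close = hazards(scan)   # the two points within 5 m of the car
```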

Autonomous cars also make use of sophisticated software that processes this data in real time. Algorithms interpret the information gathered by the sensors and enable the car to react appropriately to variables such as traffic, road construction, detours, weather conditions such as rain and snow, and the actions of human drivers on the road.
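One simple way to picture ‘reacting to variables’ is a rule that scales a target speed down as conditions worsen. The factors and condition names below are invented for illustration – real systems weigh far richer models than a lookup table.

```python
# Hypothetical rule-based sketch: reduce a target speed for adverse
# conditions. All scaling factors are illustrative, not real calibrations.

SPEED_FACTORS = {"rain": 0.8, "snow": 0.5, "construction": 0.6}

def target_speed(limit_kmh, conditions):
    """Scale the speed limit down once for each active condition."""
    speed = limit_kmh
    for condition in conditions:
        speed *= SPEED_FACTORS.get(condition, 1.0)
    return round(speed, 1)

target_speed(100, ["rain"])                   # slows to 80.0 km/h
target_speed(100, ["snow", "construction"])   # slows to 30.0 km/h
```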

A connected future

As autonomous cars become more commonplace in the coming years, they will be able to connect and communicate with each other to share information about their surroundings, enabling them to move more efficiently and in sync with one another. Through this communication and information system, autonomous cars will know exactly where other cars are and where they are going, so they can anticipate movements and navigate the roads as smoothly as possible. The ‘Internet of Things’ is opening up many areas in which our devices already talk to each other – making this the case for cars is an important part of the move to driverless vehicles and safer roads.
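The idea of anticipating other cars' movements can be sketched as each vehicle broadcasting its position and velocity, with receivers extrapolating where their neighbours will be a moment later. The message format and car names below are invented for illustration and do not reflect any real vehicle-to-vehicle protocol.

```python
# Hypothetical vehicle-to-vehicle sketch: broadcasts are (x, y, vx, vy)
# tuples, and a receiver projects each neighbour one second ahead.

def predict(message, seconds=1.0):
    """Extrapolate a broadcast position forward by constant velocity."""
    x, y, vx, vy = message
    return (x + vx * seconds, y + vy * seconds)

broadcasts = {"car_42": (0.0, 0.0, 10.0, 0.0),
              "car_7":  (50.0, 5.0, -8.0, 0.0)}
predicted = {car: predict(msg) for car, msg in broadcasts.items()}
# car_42 is expected at (10.0, 0.0); car_7 at (42.0, 5.0)
```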

Currently, most autonomous cars still require a driver who is ready to take control in case the car encounters a scenario for which it has not been programmed. However, in Google’s recent tests of its self-driving vehicles, the crashes the cars were involved in were all caused by other drivers hitting them. Driverless cars may be coming soon, but it appears that practical and theory tests will remain important for human drivers for the time being.
