Tesla Inc. has removed radar sensors from its semi-autonomous driving system, known as Autopilot. The move has raised safety concerns over the camera-only version of the system, which Tesla calls Tesla Vision.
Tesla Inc.'s chief executive, Elon Musk, has always been known for surprising his company and the world. If he could prove that an electric-car maker would become the most profitable car company in the world, he can surely make people believe in the precision of Tesla Vision.
But it is still a long haul, and sometimes the most beaten road is the safest. Tesla wants to turn its driver-assist system into a fully autonomous one. Many players in the driverless-car world are raising red flags over the safety of a vision-only sensor suite, which struggles in poor conditions such as darkness, sun glare, dust and fog. A vision-only system needs clear conditions to function, which is not always possible in the real world.
What exactly is Tesla Autopilot and how does it work?
Tesla Autopilot is a combination of advanced driver-assistance systems and features amounting to Level 2 vehicle automation. There are six levels of driving automation in all; at Level 2, partial automation, the driver still partly controls the steering wheel, accelerator and brakes. Autopilot offers lane centering, self-parking, traffic-aware cruise control, automatic lane changes, Autosteer, Smart Summon, semi-autonomous navigation and the ability to summon the car from where it is parked: almost everything Iron Man's car has. Smart Summon helps the car navigate complex environments such as parking lots, manoeuvring to come and find us.
For example: suppose you face a long drive to a destination. After working all day you are tired and in no shape to drive. In this case, an autonomous car helps reduce the burden on the driver compared with a manually operated car.
How does Autopilot work?
Autonomous driving systems work with a combination of radar, lidar and camera sensors.
Radar – radar sensors are popular with drivers because they help a car judge both distance and speed: how far away an object is and how fast it is moving. A radar device emits radio waves, which travel at the speed of light. When the waves strike an object they bounce back to the device, and the round-trip time of the reflection tells us how far away the object is. If the object is moving, the frequency of the reflected waves shifts (the Doppler effect), and the size of that shift tells us the object's speed relative to us. Radar sensors also keep working in unfavourable weather.
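The distance and speed arithmetic described above can be sketched in a few lines of Python. This is an illustrative toy with made-up numbers, not Tesla's or any vendor's implementation:

```python
# Toy radar maths: distance from round-trip time, speed from Doppler shift.
# Illustrative only -- real radar signal processing is far more involved.

C = 3.0e8  # speed of light in m/s (radio waves travel at this speed)

def radar_distance(round_trip_time_s):
    """Distance to the object: the wave travels out and back, so halve it."""
    return C * round_trip_time_s / 2

def radar_speed(f_transmitted_hz, f_received_hz):
    """Relative speed from the Doppler shift of the reflected wave.
    Positive means the object is approaching."""
    doppler_shift = f_received_hz - f_transmitted_hz
    return doppler_shift * C / (2 * f_transmitted_hz)

# A reflection that returns after 1 microsecond -> object 150 m away.
print(radar_distance(1e-6))                        # 150.0
# A 77 GHz radar seeing a +5.13 kHz shift -> object closing at ~10 m/s.
print(round(radar_speed(77e9, 77e9 + 5133), 1))    # 10.0
```

The 77 GHz carrier frequency is a typical value for automotive radar, chosen here only to make the numbers concrete.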
Lidar – often called the eyes of driverless cars, lidar is mounted on the bonnet or the roof of the vehicle and provides a 360° view of the surroundings, helping the car navigate easily. A lidar unit rotates continuously, sending out thousands of laser beams. Just like radio waves, these beams strike nearby objects and bounce back to the sensor, and the returning beams are combined into a 3-D view of the surroundings. The computer on the car processes this information into an animated 3-D rendering, monitors the distance and speed of passing vehicles, and directs the car to adjust speed, brake or stop. But lidar may not work in unfavourable environmental conditions.
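The way thousands of returning beams become a 3-D view can be sketched as follows. Each return is an angle pair plus a measured range, converted to x, y, z coordinates; this is a simplified geometric sketch, not a real lidar driver (which also handles calibration, timing and motion compensation):

```python
# Toy lidar point cloud: each returned beam has a horizontal angle (azimuth),
# a vertical angle (elevation) and a measured range; convert them to x, y, z.
import math

def beam_to_point(azimuth_deg, elevation_deg, range_m):
    """Spherical (angle, angle, range) -> Cartesian (x, y, z) in metres."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full 360-degree rotation at a single elevation, sampling every 10
# degrees; an object 20 m away in every direction comes back as a ring of
# points that the on-board computer can render as a 3-D scene.
cloud = [beam_to_point(az, 0.0, 20.0) for az in range(0, 360, 10)]
print(len(cloud))    # 36 points from one rotation
print(cloud[0])      # (20.0, 0.0, 0.0)
```

A real unit fires many beams at different elevations per rotation, so the full cloud contains thousands of such points per sweep.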
Camera-only version, or Tesla Vision – Tesla has recently dropped radar from its configuration. Eight cameras surrounding the car provide a 360° view and can spot objects at up to 250 metres. Twelve ultrasonic sensors complement the cameras, detecting both hard and soft objects close to the car. All the information gathered by the cameras is processed by an on-board computer running Tesla's neural net, so the system sees in directions the driver cannot. According to other players in the autonomous-car world, however, a camera-only system cannot cope with unfavourable weather.
What about the cost and efficiency of these systems?
Camera and radar sensors are cheap compared with lidar. A camera-only system is very hard to design well, but the hardware itself is inexpensive. Lidar is the most expensive navigation system because it relies on laser beams.
Camera and radar sensors are affordable, but they lack the high-definition resolution needed to map the objects moving around the vehicle; they often fail to determine an object's shape, which can lead to accidents. Lidar, on the other hand, is expensive but produces the best-resolution picture of the surroundings: the sharpest 3-D point cloud.
Lidar is relatively expensive and generates high-resolution images of the objects around it, but it fails in unfavourable conditions such as sun glare, fog, dust and darkness.
Radar systems are inexpensive and work fine in poor weather, but they create the problem of phantom braking, where the car brakes abruptly, for example when passing under an overbridge.
In May 2016 a driver was killed when his Tesla, driving on Autopilot, crashed into a semi-truck that the system failed to identify. Since then there have been a total of 24 Tesla crashes in which the autonomous driving system failed to identify semi-trucks, stationary police vehicles or fire trucks.
Citing the poor performance of radar sensors, Elon Musk is dropping them from the Model 3 and Model Y. He concludes that a camera-only system is safer because it suffers fewer disturbances and conflicting signals.
Waymo – another participant in the driverless-car race
Waymo, Alphabet Inc.'s self-driving car unit, uses all three types of sensors in its driverless cars: cameras, radar and lidar.
As discussed above, each of the three sensors has its pros and cons. Waymo believes that operating all three together improves the experience of autonomous cars and prevents accidents: if the weather is poor or one sensor's picture is low-resolution, the three different sensors cover each other's weaknesses.
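The redundancy idea can be sketched as a toy fusion rule: each sensor reports a distance estimate, but only sensors trustworthy in the current conditions are averaged. The condition rules below are simplified assumptions drawn from the weaknesses described in this article, not Waymo's actual fusion logic:

```python
# Toy sensor fusion: camera, radar and lidar each estimate the distance to an
# obstacle, but only some are trustworthy in the current conditions.
# The usability rules are simplified assumptions, not any vendor's real logic.

def usable(sensor, conditions):
    """Cameras and lidar degrade in fog, glare, dust and darkness;
    radar keeps working in poor weather."""
    if sensor in ("camera", "lidar") and conditions & {"fog", "glare", "dust", "darkness"}:
        return False
    return True

def fuse_distance(readings, conditions):
    """Average the readings from sensors that are usable right now."""
    valid = [dist for sensor, dist in readings.items() if usable(sensor, conditions)]
    if not valid:
        raise RuntimeError("no trustworthy sensor -- hand control back to the driver")
    return sum(valid) / len(valid)

readings = {"camera": 41.0, "radar": 40.0, "lidar": 39.0}
print(fuse_distance(readings, set()))      # clear weather: 40.0 (all three agree)
print(fuse_distance(readings, {"fog"}))    # fog: 40.0 (radar alone survives)
```

The point of the sketch is the fallback behaviour: when fog knocks out the camera and lidar, the radar reading still carries the system, which is exactly the cross-coverage Waymo is betting on.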
Let's say a person is standing near the road holding a stop sign. The sensors detect the sign and the computer cross-checks it against the map, asking: does our map already contain a stop sign here? If the map has no such sign and there is no sign of roadwork or construction nearby, there is no need to stop.
Waymo's cars are going to be costlier than Tesla's because they carry all three sensors. In the market, that gives Tesla a competitive edge on price; where safety is concerned, Waymo has the upper hand.
Future of autonomous cars in India
The future of autonomous cars in India is like getting blood out of a stone. As of 7 May 2020, self-driving cars will not be allowed on Indian roads, a statement given by Road Transport and Highways Minister Nitin Gadkari. Even so, autonomous vehicles have begun to appear in the form of self-driving trucks and tractors, with Mahindra and Mahindra, Flux Pluto and Escorts trying to penetrate the Indian market.