Self-Driving Cars: Disrupting the Future of Transportation

A self-driving car orienting in traffic with the help of sensors

Self-driving cars promise an efficient and sustainable mode of transportation, preventing accidents and making commuting convenient for everyone. Hardware and, above all, AI software development are what’s driving the industry forward.

Fully autonomous cars are no longer science fiction and will soon become part of everyday life. We live in a new era of transportation in which vehicles driven by humans are being replaced by computer-aided driverless cars. Experts anticipate that the global autonomous vehicle market will reach $556.67 billion by 2026, growing at a 39.47% rate over the next five years. Imagine how reality might look in the near future:

Early in the morning, your car wakes you up for an important meeting. As you finish your cup of coffee, you see the garage door open and your car pull out in smooth coordination with it. The car parks at your front door, waiting for you, and sends an alert to your smartphone about a potential traffic jam: you need to leave in a couple of minutes to get to your meeting on time.

You are amazed as you sit in your self-driving car. It drives steadily, adjusting its speed to stop signs and barely coming to a halt, aiming for the optimized fuel consumption you set earlier. Finally, you hear, “You have arrived at your destination. Have a nice day,” as it drops you off in front of the entrance before proceeding to the parking lot.

About 1.2 million people die and up to 50 million are injured in traffic collisions each year, and 94% of these crashes involve human error. AI-powered autonomous cars are the technology that can put a stop to this; however, many aspects still need to be incorporated into these systems before they can fully replace humans at the wheel.

Inside the Brain of a Driverless Car

Several self-driving cars moving and analyzing each other's position

An autonomous car needs to perceive its environment, which includes potholes, road signs, pedestrians, and other vehicles. Moreover, the car has to predict changes in that environment to navigate safely. For example, it must approximate the trajectory and position of a nearby vehicle based on its estimated speed.
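
To make this concrete, here is a minimal sketch of the simplest possible prediction: a constant-velocity model that extrapolates a neighboring vehicle’s future positions from its estimated speed. Production systems use far richer motion models; the numbers below are purely illustrative.

```python
import numpy as np

def predict_trajectory(position, velocity, horizon_s=3.0, dt=0.1):
    """Constant-velocity prediction of a nearby vehicle's future positions.

    position: np.array([x, y]) in meters; velocity: np.array([vx, vy]) in m/s.
    Returns an array of predicted (x, y) points over the prediction horizon.
    """
    steps = int(horizon_s / dt)
    times = np.arange(1, steps + 1) * dt
    return position + np.outer(times, velocity)

# Illustrative numbers: a car 20 m ahead moving at 10 m/s in the same lane
future = predict_trajectory(np.array([20.0, 0.0]), np.array([10.0, 0.0]))
print(future[-1])  # expected position ~3 s from now: [50.  0.]
```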

The car also needs a controller to make decisions about steering, accelerating, decelerating, and braking. Further, it needs to plan its motion using real-time location updates to reach its destination. Smoothly switching lanes and choosing the best route are tasks that only sophisticated driverless cars can accomplish. This forms the basis of “informed decisions” in driverless cars powered by AI and machine learning solutions.

Levels of Autonomy in Self-Driving Cars

A self-driving car moving in the mid lane

A self-driving car is a vehicle that drives itself at one of five levels of autonomy. The Society of Automotive Engineers (SAE) has set the standard that defines the following levels:

  • Hands on (Level 1). Chrysler’s Imperial was the first car to deploy cruise control, regulating vehicle speed based on a ground-speed calculation. However, the driver still needs to steer.
  • Hands off (Level 2). Nissan’s ProPILOT Assist system is designed to provide both lateral (steering) and longitudinal (speed) control without driver intervention. However, the driver must remain behind the wheel. This level of automation is deployed by several car brands, including but not limited to Tesla, Mercedes, and General Motors.
  • Eyes off (Level 3). The Audi A8 sedan was the first car to implement lateral and longitudinal control combined with object and event detection and response. However, the driver must intervene in case of an emergency.
  • Mind off (Level 4). Waymo’s vehicles were the first known to handle emergencies autonomously within a limited operational design domain (ODD). In October 2018, Waymo reached 10 million miles of fully autonomous testing. However, testing was conducted under good weather and road conditions.
  • Full driving automation (Level 5). A fully autonomous car can navigate without any driver intervention in an unlimited operational design domain. Such autonomous cars can drive you anywhere, in any weather conditions.

The fifth level of autonomy promises a future in which our daily commute is far more pleasant: the car gives us a safe ride to our destination while we prepare for an important meeting rather than driving. In short, self-driving cars will be safer than manually driven cars.

Self-Driving Car Sensors

A self-driving car stopped before a crosswalk to give way to pedestrians

Sensors form the integral components of self-driving cars, allowing the car to perceive its environment through different measurements. They can be exteroceptive, measuring external properties of the car’s surroundings, or proprioceptive, tracking changes inside the vehicle.

The widely used exteroceptive sensors in self-driving cars are:

  • Camera provides high-resolution data and correctly perceives the environment. Although the camera is a cost-effective solution, it offers substandard performance in harsh weather conditions and poor depth estimation. Alternatively, stereo cameras enable depth estimation from synchronized image data (see the short disparity-to-depth sketch after this list).
  • Radar is mainly used for object detection and tracking. It is a low-cost solution offering good performance in extreme weather; however, radar only achieves low-resolution output.
  • LIDAR or Light Detection and Ranging sensor provides highly accurate depth information with better resolution than Radar. It provides 360 degrees of visibility and can output a three-dimensional scene geometry.
  • Sonar or the ultrasonic sensor is used for measuring short distances with sound waves. It’s very effective, regardless of the lighting conditions, and is mainly applied in parking scenarios.
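
As promised above, here is a minimal sketch of how a stereo camera pair turns pixel disparity into depth using the classic pinhole relation Z = f · B / d. The focal length, baseline, and disparity values are illustrative only.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo relation: depth Z = f * B / d.

    disparity_px: horizontal pixel shift between the left and right images;
    focal_length_px: focal length in pixels; baseline_m: distance between
    the two camera centers in meters.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0,
                        focal_length_px * baseline_m / disparity_px,
                        np.inf)  # zero disparity means the point is at infinity

# Illustrative values: a 12-pixel disparity, 700 px focal length, 0.3 m baseline
print(depth_from_disparity(12, 700, 0.3))  # ~17.5 m
```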

The widely used proprioceptive sensors in self-driving cars are listed below:

  • GNSS or Global Navigation Satellite Systems are worldwide position and time determination sensors. They are used in high-precision localization of the self-driving car.
  • IMU or the Inertial Measurement Unit incorporates multi-axis precision gyroscopes, magnetometers, pressure sensors, and accelerometers. These components provide accurate, direct measurements of acceleration and angular rotation rate, from which the car’s position and orientation can be estimated. An IMU can operate under extreme and complex motion dynamics.
  • Wheel odometer is used to accurately measure wheel speed and to estimate the orientation of the self-driving car (see the dead-reckoning sketch after this list).
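
Here is a minimal sketch of how proprioceptive readings combine in practice: wheel-odometer speed and IMU yaw rate are integrated step by step (dead reckoning). A real stack would fuse these with GNSS fixes, typically in a Kalman filter, to correct drift; that correction step is omitted here.

```python
import numpy as np

def dead_reckon(x, y, heading, wheel_speed, yaw_rate, dt):
    """One planar dead-reckoning step from proprioceptive readings.

    wheel_speed (m/s) comes from the wheel odometer and yaw_rate (rad/s) from
    the IMU gyroscope. Real systems periodically correct the accumulated drift
    with GNSS fixes; that part is left out of this sketch.
    """
    heading += yaw_rate * dt
    x += wheel_speed * np.cos(heading) * dt
    y += wheel_speed * np.sin(heading) * dt
    return x, y, heading

# Drive straight at 10 m/s for one second (100 steps of 10 ms)
state = (0.0, 0.0, 0.0)
for _ in range(100):
    state = dead_reckon(*state, wheel_speed=10.0, yaw_rate=0.0, dt=0.01)
print(state)  # approximately (10.0, 0.0, 0.0)
```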

Self-Driving Car Software

The vector icon of a self-driving car surrounded by the icons of Diagnostics, IT Security, Location Services, Voice Control, GPS, and AI

AI and machine learning-based software form the brain of a self-driving car. Its architecture is responsible for using all of the measured data to perform the following tasks:

Environmental Perception

Environmental Perception software informs the self-driving car of its current location and helps it distinguish between objects in the surrounding environment. These include, but are not limited to, road signs, traffic lights, bicycles, vehicles, lane marks, and pedestrians.

This module takes a combination of the measured signals from the wheel odometer, GPS, and IMU to locate the self-driving car. The data from the LIDAR, cameras, and radars are fed into this system to enable dynamic object detection.
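
For illustration, here is a minimal sketch of the camera side of dynamic object detection using an off-the-shelf detector pretrained on COCO (torchvision’s Faster R-CNN). The frame path and the 0.6 confidence threshold are placeholders, and production perception stacks use models trained specifically for driving scenes.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Off-the-shelf detector pretrained on COCO, which covers classes a perception
# module cares about (person, bicycle, car, bus, truck, traffic light, ...)
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

frame = Image.open("camera_frame.jpg").convert("RGB")  # placeholder frame path
with torch.no_grad():
    pred = model([to_tensor(frame)])[0]

# Keep confident detections only; boxes are (x1, y1, x2, y2) in pixels
keep = pred["scores"] > 0.6
print(pred["labels"][keep])
print(pred["boxes"][keep])
```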

Environmental Mapping

The software responsible for environmental mapping creates three different types of maps of the surrounding environment, listed below:

  • Occupancy grid map, used to locate all static objects, based on the data provided by the LIDAR sensor (a minimal grid-building sketch follows this list).
  • Localization map based on LIDAR and camera outputs and used to determine the motion of the autonomous vehicle with respect to other components of the environment.
  • Detailed road map merged from the grid and localization maps and combined with real-time measured data from the sensors to render a 360-degree overview of the world and the surrounding objects.
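
Here is a toy sketch of the occupancy-grid idea: LIDAR returns are binned into a 2D grid centered on the car, and any cell containing a return is marked occupied. Real occupancy grids also model free space and use probabilistic updates; the cell size and grid dimensions below are arbitrary.

```python
import numpy as np

def build_occupancy_grid(lidar_points, cell_size=0.5, grid_size=100):
    """Mark grid cells that contain at least one LIDAR return as occupied.

    lidar_points: (N, 2) array of x, y returns in meters, expressed in the car
    frame with the car at the center of the grid.
    """
    grid = np.zeros((grid_size, grid_size), dtype=bool)
    half = grid_size * cell_size / 2.0
    idx = np.floor((lidar_points + half) / cell_size).astype(int)
    in_bounds = np.all((idx >= 0) & (idx < grid_size), axis=1)
    grid[idx[in_bounds, 1], idx[in_bounds, 0]] = True  # row = y cell, col = x cell
    return grid

# Two returns: one 5 m ahead of the car, one 10 m to its left
points = np.array([[5.0, 0.0], [0.0, 10.0]])
print(build_occupancy_grid(points).sum())  # 2 occupied cells
```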

Motion Planning

Motion Planning software builds on the environmental mapping and perception modules. Both long-term and short-term plans are needed: the former configures the optimal route to the destination, while the latter handles the maneuvering tasks along the way.
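
The long-term part of the plan is essentially a shortest-path search over a road graph. Below is a minimal Dijkstra sketch over a made-up road network; the node names and travel times are hypothetical.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over a road graph {node: [(neighbor, cost), ...]}."""
    queue, visited = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Hypothetical road network; edge weights are travel times in minutes
roads = {
    "home": [("junction_a", 4), ("junction_b", 7)],
    "junction_a": [("office", 6)],
    "junction_b": [("office", 2)],
}
print(shortest_route(roads, "home", "office"))  # (9.0, ['home', 'junction_b', 'office'])
```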

Steering Control

The controller takes the previously planned trajectory and converts it into lateral control (steering angle) and longitudinal control (speed) applied to the steering, throttle, and braking elements of the car. These calculations are used to find the best steering angle and brake application at every moment.
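
One common way to do this, sketched below, is pure-pursuit steering toward a look-ahead point on the trajectory, paired with a simple proportional speed controller. The wheelbase, gain, and target values are illustrative, not taken from any particular vehicle.

```python
import numpy as np

def pure_pursuit_steering(target_x, target_y, wheelbase=2.8):
    """Pure-pursuit lateral control: steering angle that drives the car toward a
    look-ahead point (target_x meters ahead, target_y meters to the left) given
    in the car's own frame. The 2.8 m wheelbase is an illustrative value.
    """
    lookahead = np.hypot(target_x, target_y)
    alpha = np.arctan2(target_y, target_x)  # angle to the look-ahead point
    return np.arctan2(2.0 * wheelbase * np.sin(alpha), lookahead)

def speed_control(target_speed, current_speed, k_p=0.5):
    """Proportional longitudinal control: positive output = throttle, negative = brake."""
    return k_p * (target_speed - current_speed)

# A point 10 m ahead and 1 m to the left calls for a small left steering angle
print(pure_pursuit_steering(10.0, 1.0))  # ~0.055 rad
print(speed_control(target_speed=15.0, current_speed=12.0))  # 1.5 -> accelerate
```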

System Supervisor

A system supervisor ensures that all components of the self-driving car are functioning correctly and adhering to safety requirements. This software continuously monitors the state of hardware components for faults in operation and provides end-to-end management of all the other modules in the system.
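
Conceptually, the supervisor is a watchdog: every module reports a heartbeat, and a stale heartbeat is flagged as a fault. The sketch below illustrates the idea; the module names and the 0.5-second timeout are hypothetical.

```python
import time

class SystemSupervisor:
    """Minimal watchdog: each module reports a heartbeat, and a heartbeat older
    than the timeout is flagged as a fault."""

    def __init__(self, modules, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_seen = {name: time.monotonic() for name in modules}

    def heartbeat(self, name):
        self.last_seen[name] = time.monotonic()

    def faulty_modules(self):
        now = time.monotonic()
        return [name for name, seen in self.last_seen.items()
                if now - seen > self.timeout_s]

supervisor = SystemSupervisor(["perception", "mapping", "planning", "control"])
time.sleep(0.6)                      # simulate three modules going silent
supervisor.heartbeat("control")
print(supervisor.faulty_modules())   # ['perception', 'mapping', 'planning']
```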

Top Automakers’ Thoughts and Projections

Don’t just take our word for it. To give you an insider’s perspective, here’s what the leaders of top automotive brands expect from autonomous vehicle development in the near future.

Mary Barra, the chairwoman and CEO of General Motors Company:

We expect to be the first high-volume auto manufacturer to build fully autonomous vehicles in a mass-production assembly plant.

Mark Fields, CEO of Ford Motor Company:

We expect to have a level 4 vehicle in 2021, no gas pedal, no steering wheel, and the passenger will never need to take control of the vehicle in a predefined area.

Rodney Brooks, an Australian roboticist, predicts that the majority of American cities will ban manually driven cars by 2045.

Elon Musk, CEO of Tesla and SpaceX:

The upcoming autonomous coast-to-coast drive will showcase a major leap forward for our self-driving technology. Additionally, an extensive overhaul of the underlying architecture of our software has now been completed, which has enabled a step-change improvement in the collection and analysis of data and fundamentally enhanced its machine-learning capabilities.

Mo Elshenawy, Vice President at General Motors’ Cruise:

Unlike other autonomous vehicle companies, being deeply integrated with one of the world’s largest automakers like General Motors positions Cruise to manufacture self-driving cars on an assembly line in Orion, Michigan, which is capable of producing hundreds of thousands of vehicles per year.

Robin Li, CEO of Baidu:

Major developing countries are employing a host of measures to support research into, and testing of autonomous driving.

Nick Twork, senior communications counsel at Argo AI:

In a very short period of time, we’ve been able to basically put the system at a level of maturity far beyond what other companies of our age have been able to do.

Conclusion

The transition from self-driving cars with varying levels of autonomy to fully autonomous vehicles is yet to be made. However, modern AI technologies and machine learning development are making rapid leaps in this direction. Top automotive brands such as General Motors, Ford, and Tesla are in the final stages of testing their driverless vehicles, which means we are on the verge of a revolutionary change in the way we commute.
