Embedded Advisor


How Autonomous Vehicles Perceive and Navigate their Surroundings

By Anand Gopalan, CTO, Velodyne LiDAR, Inc., and Sally Frykman, Director of Communications, Velodyne LiDAR, Inc.


Situational awareness is the key to good driving. To navigate a car to the desired destination, the driver needs to know its location and observe the surroundings in real time. These observations allow the driver to act instinctively: accelerating or braking, changing lanes, merging onto the highway, and maneuvering around obstacles and objects.

Fully autonomous vehicles (AVs) work in much the same way, except they use sensor and GPS technologies to perceive the environment and plan a path to the desired destination. These technologies work together to establish the location of the car and the correct route to take. They continuously determine what is going on around the car, locating the position of people and objects in close proximity to the vehicle, and assessing the speed and direction of their movements.

The constant flow of information into the car’s onboard computer system determines the safest way for the vehicle to navigate its surroundings. To better understand how sensor technologies in autonomous cars work, let’s examine how these vehicles perceive their location and environment to identify and avoid objects in their pathways.

"Due to the limitations in camera technology, we have yet to achieve complete autonomy or “self-driving” capability in the real world"

Precisely Measuring the Vehicle’s Location and Surroundings

Sensor technologies provide information about the surrounding environment to the vehicle’s computer system, allowing the car to move safely in our three-dimensional world. These sensors gather data that describe a car’s changes in position and orientation.

Autonomous vehicles utilize high-definition maps that guide the car’s navigation system. Recent developments in AV technology aim to generate and update these maps in real-time. While this is still a work in progress, it is necessary because the conditions of our roadways are not static. Congestion, accidents, and construction complicate real-life movement on our streets and highways. On-vehicle sensing technologies, such as lidar, cameras, and radar, perceive the environment in real-time to provide accurate data of these ever-changing roadway situations.

The real-time maps that these sensors produce are often highly detailed, including road lanes, pavement edges, shoulders, dividers, and other critical information. These maps include additional information, such as the locations of street lights, utility poles, and traffic signs. The vehicle must be aware of each of these features to navigate the roadway safely.

Detecting and Avoiding Objects

Sensor technologies provide onboard computers with the data they need to detect and identify objects such as vehicles, bicyclists, animals, and pedestrians. This data also allows the vehicle’s computer to measure these objects’ locations, speeds, and trajectories.
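The idea of measuring an object's speed and trajectory from successive sensor fixes can be illustrated with a minimal sketch. The function name, positions, and time step below are hypothetical, not taken from any particular AV stack:

```python
import math

def estimate_motion(p_prev, p_curr, dt):
    """Estimate an object's speed and heading from two successive
    position fixes (x, y in meters) taken dt seconds apart."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    speed = math.hypot(dx, dy) / dt             # meters per second
    heading = math.degrees(math.atan2(dy, dx))  # 0 deg = +x axis
    return speed, heading

# A pedestrian tracked across two frames 0.1 s apart:
speed, heading = estimate_motion((4.0, 2.0), (4.0, 2.15), 0.1)
# speed = 1.5 m/s, heading = 90 degrees (moving straight along +y)
```

Real systems smooth such estimates over many frames with a tracking filter, but the underlying measurement is the same: position differences over time.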

An example of object detection and avoidance in autonomous vehicle testing is a dangerous tire fragment on the freeway. Tire fragments are not usually large enough to spot easily from a long distance, and they are often the same color as the road surface. AV sensor technology must have high enough resolution to detect the fragment’s location on the roadway accurately. This requires distinguishing the tire from the asphalt and determining that it is a stationary object (rather than something like a small, moving animal).

In this situation, the vehicle not only needs to detect the object but also classify it as a tire fragment that must be avoided. The car must then determine the right course of action, such as changing lanes to avoid the tire fragment and any oncoming vehicles or objects. For the car to have enough time to change its path and speed, these steps must all happen in less than a second. Again, these decisions made by the vehicle’s onboard computer depend on accurate data provided by the vehicle’s sensors.
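The detect–classify–decide sequence described above can be sketched as a toy decision rule based on time-to-collision. The thresholds and names here are illustrative assumptions, not actual planning logic from any vendor:

```python
def plan_avoidance(distance_m, ego_speed_mps, obstacle_speed_mps,
                   adjacent_lane_clear):
    """Toy avoidance rule: compute time-to-collision (TTC) from the
    closing speed, then pick a maneuver."""
    closing_speed = ego_speed_mps - obstacle_speed_mps
    if closing_speed <= 0:
        return "maintain"            # not closing in on the object
    ttc = distance_m / closing_speed
    if ttc < 1.0:
        return "emergency_brake"     # too late for a smooth maneuver
    return "change_lane" if adjacent_lane_clear else "brake"

# A stationary tire fragment 40 m ahead at highway speed (25 m/s):
maneuver = plan_avoidance(40.0, 25.0, 0.0, adjacent_lane_clear=True)
# → "change_lane" (TTC = 1.6 s, enough time to move over)
```

The one-second cutoff reflects the article's point that the whole chain must complete in well under a second for the maneuver to remain feasible.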


A Closer Look at Sensor Technologies

To be categorized as “fully autonomous,” a car must be able to navigate between destinations without any intervention from a human driver. Self-driving cars aim to increase safety by eliminating human errors from driving situations, such as cell phone distractions or drowsy inattention.

Sensor technologies perceive a car’s environment and provide the onboard map with information about current roadway conditions. To build redundancy into self-driving systems, automakers utilize an array of sensors, including cameras, radar, and lidar.

Camera-centric sensor suites can monitor the environment and enable limited driving automation. Cameras can identify colors and fonts, so they are capable of reading traffic signals, road signs, and lane markings. However, images produced by cameras – even stereo cameras – do not always provide the level of accurate depth perception necessary for full autonomy. Due to the limitations in camera technology, we have yet to achieve complete autonomy or “self-driving” capability in the real world. Radar systems complement cameras very well. They typically offer better range and horizontal field of view, and radars are unhampered by inclement weather or lack of light.

Additionally, radars provide accurate information on the speeds of other vehicles. That said, radars have poor resolution (>10 cm), so a radar’s 3D image is unacceptably fuzzy. Radars also have difficulty detecting stationary objects. Consequently, for accurate object detection and classification, radars must be combined with cameras.
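A minimal illustration of why the two sensors complement each other: a radar return carries range and speed but no object class, so one simple (hypothetical) fusion step pairs each radar detection with the camera detection nearest in bearing. All names and thresholds below are assumptions for the sketch:

```python
def fuse(radar_dets, camera_dets, max_bearing_diff_deg=2.0):
    """Attach a camera class label to each radar detection whose
    bearing matches a camera detection within a small tolerance."""
    fused = []
    for r in radar_dets:
        best = min(camera_dets,
                   key=lambda c: abs(c["bearing"] - r["bearing"]),
                   default=None)
        if best and abs(best["bearing"] - r["bearing"]) <= max_bearing_diff_deg:
            fused.append({**r, "label": best["label"]})
    return fused

radar = [{"bearing": 10.0, "range_m": 30.0, "speed_mps": -5.0}]
camera = [{"bearing": 10.8, "label": "car"}]
tracks = fuse(radar, camera)
# → one fused detection carrying range, speed, and the "car" label
```

Production fusion is far more involved (track association, uncertainty modeling), but the division of labor is the same: radar supplies where and how fast, the camera supplies what.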

Lidar provides high-resolution, three-dimensional information about the surrounding environment. Unlike radar, lidar offers much higher-resolution computer perception data, enabling accurate object detection. Unlike cameras, lidar provides accurate depth perception, with distance accuracy of a few centimeters, making it possible to precisely localize the position of the vehicle on the road and detect available free space for the car to navigate. Lidar also offers a 360-degree horizontal field of view and up to a 40-degree vertical field of view, giving the vehicle the ability to generate dense, high-resolution 3D maps of the environment.
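As a rough sketch of how a single lidar return becomes a point in such a 3D map, each measured range with its beam's azimuth and elevation angles converts to Cartesian coordinates in the sensor frame. This is the textbook spherical-to-Cartesian conversion, not Velodyne-specific code:

```python
import math

def lidar_point_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range plus beam angles) into a
    Cartesian (x, y, z) point in the sensor's own frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return x, y, z

# A return 10 m away, 90 degrees to the left, level with the sensor:
x, y, z = lidar_point_to_xyz(10.0, 90.0, 0.0)
# x ≈ 0, y ≈ 10, z ≈ 0
```

Repeating this over every beam and every rotation yields the dense point cloud from which free space and obstacles are extracted.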

Autonomous vehicles depend on the data provided by their sensors to perceive and navigate the environment. AVs will be equipped with lidar, cameras, and radar to enable safe, reliable full autonomy.


Copyright © 2021 Embedded Advisor. All rights reserved. Registration on or use of this site constitutes acceptance of our Terms of Use and Privacy Policy.
This content is copyright protected

However, if you would like to share the information in this article, you may use the link below:

ip-design.embeddedadvisor.com/cxoinsights/how-autonomous-vehicles-perceive-and-navigate-their-surroundings-nid-304.html