
Is it possible to build accident-free cars?

07 Nov 2014  | Junko Yoshida



5. Vision vs. radar

Forward-facing ADAS

Forward-facing, vision-based ADAS outperformed "vision-less" systems in AEB accident avoidance. (Source: Autonomous Emergency Braking Test Results, Thatcham Research, 2013)

When we talk about ADAS, the first feature most consumers think of is a forward-facing driver assistance system for collision avoidance. The extra pair of eyes feels somehow safer, especially if it can perceive objects before the driver sees them.

Radar and lidar are both commonly used in a variety of ADAS applications, especially for distance determination, said Cognivue's Wilson. Vision technology, meanwhile, has traditionally been deployed to classify what has been detected. That division of labour, however, may be blurring fast.

Citing 2013 European New Car Assessment Programme (Euro NCAP) test results, Wilson said the data revealed significantly better detection performance from the Subaru Outback's vision-based system than from radar and lidar. Subaru's "EyeSight" system uses a stereo camera arrangement for depth sensing. The vision-only system saw objects 80m ahead and braked sooner.

Euro NCAP autonomous emergency braking (AEB) testing is focused primarily on avoiding collisions with other vehicles, Wilson said. However, AEB for pedestrian avoidance is on the organisation's evaluation roadmap. Subaru's EyeSight system, based on its own algorithms and FPGA-based hardware, has already implemented pedestrian detection support.

Wilson noted that vision technology, traditionally used for classification, is now increasingly being used for distance determination as well.
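To see how a stereo pair yields distance, the depth of a matched point follows from the disparity between the left and right views: Z = f x B / d, where f is the focal length in pixels, B is the camera baseline and d is the disparity. The sketch below is a minimal illustration with assumed numbers; the focal length and baseline are placeholders, not Subaru's actual EyeSight parameters.

# Minimal sketch: depth from stereo disparity (illustrative values, not Subaru's).
# Z = f * B / d: f is the focal length in pixels, B is the camera baseline in
# metres, and d is the disparity in pixels between the left and right images.
def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.35):
    """Return the distance in metres for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Distant objects produce small disparities: about 6 px here works out to roughly 80m.
print(depth_from_disparity(6.1))   # ~80.3m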


6. Detect pedestrians

Subaru EyeSight System

Subaru's EyeSight system can detect obstacles ahead.

Automakers must keep up with Euro NCAP's roadmaps in hopes of getting higher ratings. But some automakers are moving ahead of the rating system, said Cognivue's Wilson, by building autonomous emergency braking systems that recognise pedestrians.

Pedestrian detection, however, is no cakewalk. Making such vision-based ADAS implementations harder are "strict real-time requirements, highly variable environmental conditions and thermal dissipation that needs to be minimised," explained Wilson.

First, the vision technology uses stereo vision to discern distance: vision processors must run disparity mapping to generate a 3D depth map. Second, it must run a range of classifiers, using techniques such as the Histogram of Oriented Gradients (HOG) with a linear Support Vector Machine (SVM) classifier. The problem is that new classifiers keep emerging, each requiring a separate evaluation.
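To make those two steps concrete, the sketch below strings together OpenCV's stock stereo block matcher and its built-in HOG pedestrian detector. It illustrates the general pipeline Wilson describes, not Cognivue's or Subaru's implementation; the image file names and tuning parameters are assumptions.

import cv2

# Step 1: disparity mapping over a rectified stereo pair (illustrative file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)   # depth follows from f * B / disparity

# Step 2: classification with a HOG descriptor feeding a linear SVM.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
boxes, weights = hog.detectMultiScale(left, winStride=(8, 8), scale=1.05)
print("pedestrian candidates:", len(boxes))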

Detect pedestrians

(Source: Cognivue)

For example, HOG alone is computationally challenging, said Wilson; it requires high-performance processors that dissipate power. Making matters more complicated, multiple classifiers need to run in parallel, he said, such as one for cars, one for pedestrians and one for bicycles.

Long latency often becomes an issue with serial processing, explained Wilson, as the number of operations involved in the algorithm is huge.
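The latency pressure is easy to see in a sketch: running several detectors one after another adds their costs, while running them side by side (or on dedicated vision hardware) keeps the per-frame latency close to that of a single detector. The stub detectors and timings below are placeholders, not any vendor's API.

from concurrent.futures import ThreadPoolExecutor
import time

# Stub per-class detector; in a real pipeline each would be a HOG/SVM-style
# classifier scanning the same frame for its own object class.
def detect(object_class, frame):
    time.sleep(0.030)          # pretend each classifier costs ~30ms on this hardware
    return object_class, []    # no detections in this stub

frame = object()               # stand-in for a camera frame
classes = ["car", "pedestrian", "bicycle"]

# Serial: the latencies add up (about 90ms for three classifiers).
start = time.perf_counter()
serial_results = [detect(c, frame) for c in classes]
print("serial:   %.3f s" % (time.perf_counter() - start))

# Parallel: the classifiers overlap, so frame latency stays near 30ms.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(classes)) as pool:
    parallel_results = list(pool.map(lambda c: detect(c, frame), classes))
print("parallel: %.3f s" % (time.perf_counter() - start))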

