EDN Asia | Automotive

Can Ethernet take on drive-by-wire?

09 Dec 2014  | Junko Yoshida


"This conservation of bandwidth allows substantially more data to flow on an AVB network vs. a MOST network even at equivalent network bitrates," according to the AVnu Alliance whitepaper. Topologies such as stars and trees are easily supported.

Roadmap—from A/V to autonomous

The AVnu Alliance is trying to answer key questions posed by the automotive industry:

Does automotive Ethernet have enough bandwidth for various sensors coming into a car? Can it react fast enough for safety-critical applications? How resilient can it be? Can it protect cars from hackers?


The AVnu Alliance's vision calls for a fully networked car that allows access to all sensors and cross-domain communication. Control messages can flow with minimal latency, and network redundancy underpins a critical vehicle backbone. (Source: AVnu Alliance)

In hopes of "accelerating the road from A/V to autonomous," Kreifeldt explains, the AVnu Alliance is tackling each of these issues "in phases, year by year." The initial profiles and features have already been developed for in-vehicle infotainment and viewable cameras for non-safety-critical features. Specific features that would allow automotive Ethernet to serve safety-critical applications are scheduled for release in 2016 in phase 2. Phase 3, planned for 2017, will introduce redundancy in automotive communication networks, he says.

Much of the AVnu Alliance's work follows progress made in IEEE standards working groups. "Initially, we were playing catch-up. But now that we are working hand-in-hand with them, we are getting ready with a certification process" for each of the new profiles. The certification tests are critical for "carmakers and Tier 1s," as they need to use the components "with assurances that they work."

Precise timing and synchronisation

Precise timing and synchronisation are critical, not only for the A/V experience, but also for aligning images coming from surround-view cameras, for example. Once every sensor node inside a car is precisely time-synchronised, it is easier for a DSP (digital signal processor) to stitch views together, according to Kreifeldt.
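To illustrate the stitching point, here is a minimal sketch (all names hypothetical, not drawn from any actual automotive stack) of how a DSP-side stitcher might group frames from time-synchronised cameras: once every camera stamps frames against the same clock, frames belonging to one stitched view are simply those whose timestamps fall within a small tolerance window.

```python
# Hypothetical sketch: group frames from clock-synchronised cameras by
# capture timestamp, so a stitcher can combine them into one surround view.

def group_frames_for_stitching(frames_by_camera, tolerance_ns=1_000_000):
    """frames_by_camera: dict camera_id -> list of (timestamp_ns, frame).
    Returns one group per reference frame, containing the closest-in-time
    frame from every camera, provided each is within tolerance_ns."""
    cameras = sorted(frames_by_camera)
    reference = frames_by_camera[cameras[0]]
    groups = []
    for ref_ts, ref_frame in reference:
        group = {cameras[0]: ref_frame}
        for cam in cameras[1:]:
            # pick this camera's frame closest in time to the reference
            ts, frame = min(frames_by_camera[cam],
                            key=lambda f: abs(f[0] - ref_ts))
            if abs(ts - ref_ts) <= tolerance_ns:
                group[cam] = frame
        if len(group) == len(cameras):  # keep only complete groups
            groups.append(group)
    return groups

# Two cameras, frames ~33 ms apart (30 fps), offsets of ~0.1 ms:
frames = {
    "front": [(0, "F0"), (33_000_000, "F1")],
    "rear":  [(100_000, "R0"), (33_100_000, "R1")],
}
print(group_frames_for_stitching(frames))
# → [{'front': 'F0', 'rear': 'R0'}, {'front': 'F1', 'rear': 'R1'}]
```

Without a shared time base, the same matching would have to estimate and track per-camera clock drift, which is exactly the complexity the common wall clock removes.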


BMW has a wide-aspect ratio centre stack display showing the surround view and a proximity graphic when it detects objects. (Source: BMW)

Technically, synchronisation has two purposes. First, it provides a common time base for sampling data at a source device and presenting that data at one or more destination devices with the same relative timing. Second, it synchronises multiple streams (for example, front and rear audio), according to the AVnu Alliance whitepaper.

AVB achieves this via the IEEE 802.1AS Precision Time Protocol (PTP). It provides a common time-reference base to all nodes on the network, called the "wall clock." The IEEE 1722 AV Transport Protocol then introduces the concept of "presentation time," derived from the common wall clock, allowing the sending node to specify when (in network time) a packet should be presented at the receiving end.
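As a rough illustration of the presentation-time idea (names and numbers are hypothetical; this is not the actual IEEE 1722 packet format), the sender adds a worst-case transit budget to the shared wall clock when stamping a packet, and every receiver buffers the packet until its own copy of the wall clock reaches that instant, so all destinations present it at the same network time:

```python
# Hypothetical sketch of "presentation time": sender stamps each packet
# with a wall-clock instant; receivers hold the packet until then.

TRANSIT_BUDGET_NS = 2_000_000  # assumed worst-case network transit allowance

def stamp_packet(payload, wall_clock_now_ns):
    # presentation time = current shared network time + transit budget
    return {"payload": payload,
            "presentation_time_ns": wall_clock_now_ns + TRANSIT_BUDGET_NS}

def ready_to_present(packet, wall_clock_now_ns):
    # a receiver presents only once the shared clock reaches the
    # sender-specified presentation time
    return wall_clock_now_ns >= packet["presentation_time_ns"]

pkt = stamp_packet("audio-sample", wall_clock_now_ns=1_000_000)
print(ready_to_present(pkt, 2_500_000))  # → False (still in transit window)
print(ready_to_present(pkt, 3_000_000))  # → True  (presentation instant)
```

Because the wall clock is common to every node, front and rear speakers given the same presentation time play the sample simultaneously regardless of how their individual packets traversed the network.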

Data acquisition

All camera and sensor data acquisition can be precisely synchronised to minimise stitching efforts. The ADAS units and head unit can equally access the camera and sensor data. (Source: AVnu Alliance)

