EDN Asia | Automotive

Self-driving Ford Fusion hybrid can navigate in winter

30 Mar 2016


Driving in winter is a challenge faced by millions in the United States. Now that self-driving cars are making their way onto public roads, they must be able to navigate snow-covered streets. Ford has unveiled six facts about the technology that lets its autonomous vehicles navigate in snow.

1. Mapping the way: Ford first creates high-resolution 3D maps, using LiDAR to scan the area its autonomous vehicles will later drive through in the snow.

To operate in snow, Ford Fusion Hybrid autonomous vehicles first need to scan the environment to create high-resolution 3D digital maps. By driving the test route in ideal weather, the Ford autonomous vehicle creates accurate digital models of the road and surrounding infrastructure using four LiDAR scanners that generate a total of 2.8 million laser points a second. The resulting map then serves as a baseline that's used to identify the car's position when driving in autonomous mode. Using the LiDAR sensors to scan the environment in real time, the car can locate itself within the mapped area later, when the road is covered in snow.
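Localizing against a prebuilt map can be illustrated with a toy correlative scan matcher: slide the live LiDAR scan over the stored map and pick the offset with the best overlap. This is only a sketch of the general idea; the `localize_scan` helper, the 2D occupancy-grid representation and the search radius are illustrative assumptions, not Ford's actual method:

```python
import numpy as np

def localize_scan(prior_map, scan, search=3):
    """Brute-force correlative scan matching: shift the live scan
    (a 2D occupancy grid) over the prior map and return the
    (dy, dx) offset with the highest overlap score."""
    best_score, best_offset = -1.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(scan, dy, axis=0), dx, axis=1)
            score = float(np.sum(prior_map * shifted))
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return best_offset

# Toy example: a landmark block in the prior map, seen shifted in the
# live scan; the matcher recovers the shift that re-aligns them.
prior_map = np.zeros((20, 20))
prior_map[5:8, 5:8] = 1.0
scan = np.roll(np.roll(prior_map, -2, axis=0), 1, axis=1)
print(localize_scan(prior_map, scan))  # the recovered (dy, dx) offset
```

A production system works with millions of 3D laser points and a far more efficient search, but the principle is the same: the car's pose is whatever alignment makes the live scan agree best with the stored map.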

2. Better have an unlimited data plan: Ford's autonomous vehicles collect and process significantly more mapping data in an hour than the average person uses in mobile-phone data in 10 years.

While mapping their environment, Ford autonomous vehicles collect and process a diverse set of data about the road and surrounding landmarks, signs, buildings, trees and other features. All told, the car collects up to 600GB per hour, which it uses to create a high-resolution 3D map of the landscape. In the United States, the average subscriber of a cellular data plan uses about 21.6GB per year, for a 10-year total of 216GB.
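As a sanity check on those figures, a few lines of arithmetic (using the 600GB-per-hour and 21.6GB-per-year rates quoted above) show the mapping car outpaces a decade of phone use in well under half an hour:

```python
MAP_RATE_GB_PER_HOUR = 600       # data the car collects while mapping
PHONE_GB_PER_YEAR = 21.6         # average US cellular data use

decade_of_phone_data = PHONE_GB_PER_YEAR * 10            # 216 GB
hours_to_match_decade = decade_of_phone_data / MAP_RATE_GB_PER_HOUR

print(hours_to_match_decade)  # about 0.36 hours, i.e. roughly 22 minutes
```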

3. Super smart sensors: Ford uses LiDAR sensors that can even identify falling snowflakes and raindrops.

Ford's autonomous vehicles generate laser points from the LiDAR sensors that bounce off falling snowflakes or raindrops, returning the false impression that there's an object in the way. Of course, there's no need to steer around precipitation, so Ford, working with University of Michigan researchers, created an algorithm that recognizes snow and rain, filtering them out of the car's vision so it can continue along its path.
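The filtering step can be sketched as a density test: a snowflake or raindrop produces an isolated laser return, while a real obstacle returns a dense cluster of points. This is a minimal illustration of that idea, not the algorithm Ford and the University of Michigan developed; the `filter_precipitation` helper and its thresholds are hypothetical:

```python
import numpy as np

def filter_precipitation(points, radius=0.3, min_neighbors=3):
    """Drop LiDAR returns with too few nearby neighbors: precipitation
    yields sparse, isolated hits, while solid obstacles return dense
    clusters. `points` is an (N, 3) array of x, y, z coordinates."""
    points = np.asarray(points, dtype=float)
    # Pairwise distances; fine for a toy-sized point cloud.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbor_counts = np.sum(d < radius, axis=1) - 1  # exclude self
    return points[neighbor_counts >= min_neighbors]
```

With a tight cluster of returns (an obstacle) plus a couple of stray points (falling snow), only the cluster survives the filter, so the planner never tries to steer around a snowflake.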

4. Not your average navigation: The way Ford's autonomous vehicles identify their location is more accurate than GPS.

When you think about vehicle navigation, GPS usually comes to mind. But where current GPS is accurate to just more than 10 yards, autonomous operation requires precise vehicle location. By scanning their environment for landmarks, then comparing that information to the 3D digital maps stored in their data banks, Ford's autonomous vehicles can locate themselves to within a centimetre.

Figure 1: Sensor fusion (the combination of data from multiple sensors) plus smart monitoring of sensor health help keep Ford's autonomous vehicles out of the blind.

5. No need for glasses: Sensor fusion (the combination of data from multiple sensors) plus smart monitoring of sensor health help keep Ford's autonomous vehicles out of the blind.

In addition to LiDAR sensors, Ford uses cameras and radar to monitor the environment around the vehicle, with the data generated from all of those sensors fused together in a process known as sensor fusion. This process results in 360-degree situational awareness. Sensor fusion means that one inactive sensor (perhaps disabled by ice, snow, grime or debris build-up on a lens) does not necessarily hinder autonomous driving. Still, Ford autonomous vehicles monitor all LiDAR, camera and radar systems to detect any deterioration in sensor performance, which helps keep the sensors in working order. Eventually, the cars might be able to handle ice and grime build-up themselves through self-cleaning or defogging measures.
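The fall-back behavior described above can be sketched as a weighted combination that simply skips any sensor flagged unhealthy. The function name, the weights and the health flags here are illustrative assumptions, not Ford's implementation:

```python
def fuse_range_estimates(readings):
    """Toy sensor fusion: combine range-to-obstacle estimates from
    several sensors, skipping any flagged unhealthy (e.g. a camera
    lens blocked by ice). `readings` maps a sensor name to a tuple
    (estimate_metres, weight, healthy)."""
    num = den = 0.0
    for name, (estimate, weight, healthy) in readings.items():
        if not healthy:
            continue  # one degraded sensor does not halt driving
        num += weight * estimate
        den += weight
    if den == 0.0:
        raise RuntimeError("no healthy sensors: fall back to a safe stop")
    return num / den

# An iced-over camera is ignored; LiDAR and radar still agree on range.
readings = {"lidar": (10.0, 0.6, True),
            "radar": (10.4, 0.3, True),
            "camera": (12.0, 0.1, False)}
print(fuse_range_estimates(readings))
```

Real systems fuse full object tracks rather than single numbers, but the design point is the same: the fused estimate degrades gracefully as individual sensors drop out, and only a total loss forces the vehicle to stop.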

6. Look Mom, no hands: The first person behind the wheel of a demonstrated autonomy test in snow is an astrophysics major who never dreamed he'd be in a self-driving car.

Before Wayne Williams joined Ford's autonomy team, he worked on remote sensing technology on behalf of the federal government. A self-described "geek," Williams was intrigued by autonomous vehicles. But he never envisioned one day being part of a team working to bring them to reality, let alone being behind the wheel of the auto industry's first publicly demonstrated autonomous snow test. The mood in the car that day was all business, he recalls, with a coworker monitoring the computing system from the back seat. "Because of the extensive development work, we were confident the car would do exactly what we asked of it," says Williams. But it wasn't until after the test that the achievement began to sink in.

Ford is the first automaker to publicly demonstrate autonomous vehicle operation in the snow. The company's winter weather road testing takes place in Michigan, including at Mcity, a 32-acre, real-world driving environment at the University of Michigan. Ford's testing on this full-scale simulated urban campus is aimed at supporting the company's mission to learn about and advance the emerging field of autonomous driving.
