


What Does Ford Use for Controls Development

Ford introduces its next generation of Fusion Hybrid autonomous development vehicles.

By Chris Brewer, Chief Program Engineer, Ford Autonomous Vehicle Development

I'm proud to introduce the next-generation Ford Fusion Hybrid autonomous development vehicle.

It's been three years since we hit the streets with our first Fusion Hybrid autonomous research vehicle, and this latest version takes everything we learned and builds on it.

This new car uses the current Ford autonomous vehicle platform, but ups the processing power with new computer hardware. Electrical controls are closer to production-ready, and adjustments to the sensor technology, including placement, allow the car to better see what's around it. New LiDAR sensors have a sleeker design and a more targeted field of vision, which enables the car to now use just two sensors rather than four, while still getting just as much data.

As we've discussed before, there are two main elements to creating an autonomous vehicle — the autonomous vehicle platform, which is an upgraded version of the car itself, and the virtual driver system. This new vehicle advances both, particularly with regard to the development and testing of the virtual driver system, which represents a large leap in sensing and computing power.

What do we mean by virtual driver system? Well, to make fully autonomous SAE-defined Level 4-capable vehicles, which do not need a driver to take control, the car must be able to perform what a human can perform behind the wheel. Our virtual driver system is designed to do just that. It is made up of the following (a rough sketch of how these pieces fit together appears after the list):

  • Sensors — LiDAR, cameras and radar
  • Algorithms for localization and path planning
  • Computer vision and machine learning
  • Highly detailed 3D maps
  • Computational and electronics horsepower to make it all work
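
To make that list concrete, here is a minimal sketch, in Python, of one way those pieces could be composed in software. Every class and method name here is a hypothetical placeholder; Ford has not published its actual component interfaces.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical placeholder types; Ford's real component names are not public.
@dataclass
class SensorFrame:
    lidar_points: List[Tuple[float, float, float]] = field(default_factory=list)  # (x, y, z) returns
    camera_images: List[object] = field(default_factory=list)
    radar_tracks: List[Tuple[float, float]] = field(default_factory=list)          # (range m, closing speed m/s)

@dataclass
class VirtualDriver:
    """One illustrative way the pieces listed above could be composed."""
    hd_map: dict  # the highly detailed 3D map of static features

    def localize(self, frame: SensorFrame) -> Tuple[float, float, float]:
        """Match live sensing against the 3D map to estimate the car's pose."""
        return (0.0, 0.0, 0.0)  # placeholder: x, y, heading

    def perceive(self, frame: SensorFrame) -> list:
        """Computer vision / machine learning turns raw data into labeled objects."""
        return []  # placeholder: detected objects

    def plan(self, pose, objects) -> str:
        """Path planning picks the next maneuver from the pose and surroundings."""
        return "keep_lane"  # placeholder decision
```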

Building a car that will not be controlled by a human driver is completely different from designing a conventional vehicle, and this raises a whole new set of questions for our autonomous vehicle engineering team: How do you replicate everything a human driver does behind the wheel in a vehicle that drives itself? A simple errand to the store requires a human driver to make decisions continuously en route. Does she have the right of way? What happens if an accident or construction blocks her route?

Just as we have confidence in ourselves and other drivers, we need to develop a robust virtual driver system with the same level of dependability to make decisions, and then carry them out appropriately on the go. We're doing that at Ford by taking a unique approach to help make our autonomous cars see, sense, think and perform like a human being — in fact, better, in some cases.

How the car "sees"

I'm going to get a little technical here, so please stay with me. Based on current and anticipated technology, our engineers are working to build two methods of perception into the virtual driver system of an autonomous vehicle: mediated perception and direct perception.

Mediated perception requires the creation of high-resolution 3D maps of the environment where the autonomous car will be driving. These maps encompass everything the virtual driver system knows about the road before the car even starts driving — locations of stop signs, crosswalks, traffic signals and other static things. When out on the road, the virtual driver uses its LiDAR, radar and camera sensors to continuously scan the area around the car and compare — or mediate — what it sees against the 3D map. This allows it to precisely locate the vehicle's position on the road, and to identify and understand what's around it. Mediated perception also includes the system that knows the rules of the road, so it can prepare for and abide by those rules.
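
As a rough illustration of that compare-against-the-map idea, the toy sketch below scores a handful of candidate position offsets by how well live LiDAR points line up with landmarks stored in the prior map, then picks the best one. Real localization uses far more sophisticated registration and full 3D data; the function names, numbers and two-landmark map are invented for illustration.

```python
import math

def alignment_error(scan, map_points, dx, dy):
    """Total distance from shifted scan points to their nearest map landmarks."""
    return sum(
        min(math.hypot(sx + dx - mx, sy + dy - my) for mx, my in map_points)
        for sx, sy in scan
    )

def localize(scan, map_points, candidate_offsets):
    """Pick the (dx, dy) offset that best aligns the live scan with the prior map."""
    return min(candidate_offsets, key=lambda c: alignment_error(scan, map_points, *c))

# Toy data: two static landmarks in the prior map; the live scan is offset about 1 m in x.
prior_map = [(10.0, 5.0), (20.0, -3.0)]
live_scan = [(9.0, 5.1), (19.0, -2.9)]
candidates = [(dx * 0.5, dy * 0.5) for dx in range(-4, 5) for dy in range(-4, 5)]
print(localize(live_scan, prior_map, candidates))  # -> (1.0, 0.0)
```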

Direct perception complements mediated perception by using the sensors to see the vehicle's positioning on the road, as well as dynamic entities — like pedestrians, cyclists and other cars. The sensors can even help interpret hand signals, such as a police officer in the road directing traffic. Naturally, the capacity for direct perception requires even more sophisticated software and computing power to identify and classify various entities, particularly pedestrians who are on the move.
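
To give a flavor of what identifying and classifying dynamic entities means in code, here is a deliberately crude heuristic that buckets a tracked object by its size and speed. A production system would use trained machine-learning models rather than hand-picked thresholds; the numbers below are assumptions made purely for illustration.

```python
def classify_track(length_m: float, speed_mps: float) -> str:
    """Very rough heuristic for labeling a moving object (illustrative thresholds only)."""
    if length_m < 1.0:
        return "pedestrian" if speed_mps < 3.0 else "cyclist"
    if length_m < 2.5:
        return "cyclist"
    return "vehicle"

print(classify_track(0.5, 1.4))   # pedestrian
print(classify_track(1.8, 6.0))   # cyclist
print(classify_track(4.6, 13.0))  # vehicle
```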

This hybrid approach, incorporating both mediated and direct perception, will enable our virtual driver system to perform tasks equal to what a human driver could, and potentially, even better.

Now, let's explore what goes into transforming a human-driven Ford Fusion Hybrid into a fully autonomous car. To keep it simple, we'll break the virtual driver's responsibilities into three tasks — sensing the surrounding environment, using that perception to make decisions on the route, and controlling the car.
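
Those three tasks map onto the classic sense-plan-act loop from robotics. The sketch below shows that generic pattern only; it is not Ford's actual control loop, and the function names and 20 Hz rate are assumptions.

```python
import time

def sense():
    """Collect the latest LiDAR, camera and radar data (placeholder)."""
    return {"objects": [], "pose": (0.0, 0.0, 0.0)}

def decide(world):
    """Turn the perceived world into a driving decision (placeholder)."""
    return {"steer_deg": 0.0, "accel_mps2": 0.0}

def act(command):
    """Send the decision to steering, braking and throttle (placeholder)."""
    pass

def drive_loop(hz=20, cycles=100):
    """Run the sense -> decide -> act cycle at a fixed rate."""
    for _ in range(cycles):
        act(decide(sense()))
        time.sleep(1.0 / hz)

drive_loop(cycles=3)
```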

Sensing around the car

A technician installs one of two new LiDAR sensors, each generating millions of beams to provide a 360-degree view around the car.

From the exterior, our autonomous research vehicle's sensors are the most noticeable differentiator from a conventional Fusion Hybrid. Think of them like a human's eyes and ears.

Two hockey-puck-sized LiDAR sensors, each generating millions of beams, jut from the car's front pillars, providing a 360-degree view. These new sensors possess a sensing range roughly the length of two football fields in every direction. High-definition LiDAR is especially well-suited for distinguishing where an object is, how big it is, and what it looks like.

Three cameras mounted on two racks are installed atop the roof. A forward-facing camera is mounted under the windshield. These cameras work to identify objects and read traffic lights on the road.

Short- and long-range radar sensors — good at seeing through rain, fog and heavy snow — add another level of vision, helping to determine how an object is moving relative to the car.

Data from all three sensor types feeds into the autonomous vehicle's brain, where the data is compared to the 3D map of the environment and other computer vision processing takes place.

Thinking and making decisions

The computer, located in the trunk, acts as the brain of the autonomous development vehicle.

Ford's autonomous vehicle brain is located in the trunk. There, the equivalent of several high-end computers generates 1 terabyte of data an hour — more than the average person would use in mobile-phone data in 45 years.
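
A quick back-of-the-envelope check on that comparison, assuming an average mobile-phone user consumes about 2 GB per month (an assumed figure, since the article doesn't state one):

```python
# Rough sanity check of the "45 years of mobile data" comparison.
tb_per_hour = 1.0
gb_generated = tb_per_hour * 1000          # roughly 1,000 GB generated in one hour
gb_per_month_assumed = 2.0                 # assumed average mobile-phone usage
years = gb_generated / (gb_per_month_assumed * 12)
print(round(years, 1))                     # ~41.7 years, in line with "about 45 years"
```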

But what really brings the computing platform to life is Ford's virtual driver software, developed in-house.

There are a lot of considerations an autonomous car has to process on the fly: What's around it? What are other drivers doing? Where is it going? What's the best route? If merging into another lane, does it speed up, or slow down? What does that decision mean for other vehicles on the route?

The sophisticated algorithms our engineers write process millions of pieces of data every second, helping the autonomous vehicle to react just as it is programmed to do.
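
As a taste of the merge question above (speed up or slow down?), here is a toy gap-acceptance rule. The distances and logic are invented for illustration and bear no resemblance to a production motion planner.

```python
def merge_decision(gap_ahead_m: float, gap_behind_m: float,
                   rel_speed_ahead_mps: float) -> str:
    """Decide whether to speed up, slow down, or wait when merging (illustrative thresholds)."""
    if gap_ahead_m > 30.0 and rel_speed_ahead_mps >= 0.0:
        return "speed_up"        # plenty of room and the lead car is pulling away
    if gap_behind_m > 30.0:
        return "slow_down"       # drop into the gap behind instead
    return "hold_and_wait"       # no safe gap yet; keep scanning

print(merge_decision(40.0, 10.0, 1.5))   # speed_up
print(merge_decision(12.0, 45.0, -2.0))  # slow_down
```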

Controlling the car
Just as our brain tells the muscles in our hands and feet what to do, decisions are relayed to the autonomous vehicle controls via a network of electrical signals. This means tweaking the Fusion Hybrid's software and, in some cases, the hardware, so that electrical commands can be sent to the steering, braking, throttle and shifting systems. Ensuring all of the mechanical systems perform as they are instructed requires a sophisticated network, similar to a human body's nervous system.
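
In drive-by-wire terms, sending electrical commands to steering, braking, throttle and shifting usually means publishing small, frequently refreshed command messages on the vehicle's network. The structure below is a generic, assumed sketch of such a message, not Ford's interface.

```python
from dataclasses import dataclass

@dataclass
class ActuationCommand:
    """One cycle's worth of drive-by-wire commands (illustrative fields only)."""
    steering_angle_deg: float   # requested road-wheel angle
    throttle_pct: float         # 0.0 to 100.0
    brake_pct: float            # 0.0 to 100.0
    gear: str                   # "P", "R", "N", "D"

def send(cmd: ActuationCommand) -> None:
    """Placeholder for writing the command onto the vehicle's electrical network."""
    assert 0.0 <= cmd.throttle_pct <= 100.0 and 0.0 <= cmd.brake_pct <= 100.0
    # In a real vehicle this would be a CAN or Ethernet message to the actuators.
    print(cmd)

send(ActuationCommand(steering_angle_deg=-2.5, throttle_pct=12.0, brake_pct=0.0, gear="D"))
```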

Of course, additional functions require additional power — a lot of it. A standard gas-powered car doesn't have enough electrical power for an autonomous vehicle, so we've had to tap into the Fusion Hybrid's high-voltage battery pack by adding a second, independent power converter to create two sources of power and maintain robustness.

This new development vehicle brings Ford a step closer in its commitment to offer a fully autonomous vehicle in 2021 for ride-sharing and ride-hailing services. For now, the car still comes with a steering wheel and pedals — equipment our ride-sharing vehicles ultimately won't include.

Looking ahead, we have a lot more to do. An expanded fleet is accelerating our real-world testing already taking place on the roads in Michigan, California and Arizona. We plan to grow the fleet even more, tripling its size to nearly 90 cars in the new year.

And you'll start hearing more about how we're thinking through the user experience of an autonomous ride-sharing or ride-hailing fleet vehicle. For instance, we're working on what to do if a passenger accidentally leaves items in the vehicle, or fails to shut a door after exiting.

Our engineers are unrelenting in their mission to develop a robust, capable and trustworthy virtual driver system. And our next-generation autonomous vehicle is a clear step forward — one that takes us closer to the self-driving car we envision our customers will one day ride around town.

The future is coming. And we can't wait.

Ford's expanded fleet will accelerate road testing already taking place in Michigan, California and Arizona.


Source: https://medium.com/self-driven/building-fords-next-generation-autonomous-development-vehicle-82a6160a7965
