Echodyne steers its high-tech radar beam on autonomous cars with EchoDrive

Echodyne set the radar industry on its ear when it debuted its pocket-sized yet hyper-capable radar unit for drones and aircraft. But these days all the action is in autonomous vehicles — so the company reinvented its technology to make a unique sensor that doesn’t just see things but can communicate intelligently with the AI behind the wheel.

EchoDrive, the company’s new product, is aimed squarely at AVs, looking to complement lidar and cameras with automotive radar that’s as smart as you need it to be.

The chief innovation at Echodyne is the use of metamaterials, or highly engineered surfaces, to create a radar unit that can direct its beam quickly and efficiently anywhere in its field of view. That means that it can scan the whole horizon quickly, or repeatedly play the beam over a single object to collect more detail, or anything in between, or all three at once for that matter, with no moving parts and little power.

But the device Echodyne created for release in 2017 was intended for aerospace purposes, where radar is more widely used, and its capabilities were suited for that field: a range of kilometers but a slow refresh rate. That’s great for detecting and reacting to distant aircraft, but not at all what’s needed for autonomous vehicles, which are more concerned with painting a detailed picture of the scene within a hundred meters or so.

“They said they wanted high resolution, automotive bands [i.e. radiation wavelengths], high refresh rates, wide field of view, and still have that beam-steering capability — can you build a radar like that?” recalled Echodyne co-founder and CEO Eben Frankenberg. “And while it’s taken a little longer than I thought it would, the answer is yes, we can!”

The EchoDrive system meets all the requirements set out by the company’s automotive partners and testers, with refresh rates of up to 60 Hz, higher resolution than any other automotive radar, and all the other goodies.

An example of some raw data – note that Doppler information lets the system tell which objects are moving in which direction.

The company is focused specifically on level 4-5 autonomy, meaning their radar isn’t intended for basic features like intelligent cruise control or collision detection. But radar units on cars today are intended for that, and efforts to juice them up into more serious sensors are dubious, Frankenberg said.

“Most ADAS [advanced driver assist system] radars have relatively low resolution in a raw sense, and do a whole lot of processing of the data to make it clearer and make it more accurate as far as the position of an object,” he explained. “The level 4-5 folks say, we don’t want all that processing because we don’t know what they’re doing. They want to know you’re not doing something in the processing that’s throwing away real information.”

More raw data, and less processing — but Echodyne’s tech offers something more. Because the device can change the target of its beam on the fly, it can do so in concert with the needs of the vehicle’s AI.

Say an autonomous vehicle’s brain has integrated the information from its suite of sensors and can’t be sure whether an object it sees a hundred meters out is a moving or stationary bicycle. It can’t tell its regular camera to get a better image, or its lidar to send more lasers. But it can tell Echodyne’s radar to focus its beam on that object for a bit longer or more frequently.

The two-way conversation between sensor and brain, which Echodyne calls cognitive radar or knowledge-aided measurement, isn’t really an option yet — but it will have to be if AVs are going to be as perceptive as we’d like them to be.
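Echodyne hasn’t published an API for this, but the two-way conversation described above can be sketched, purely hypothetically, as a scheduler that interleaves a baseline full-horizon sweep with dwell requests from the vehicle’s perception stack. All of the names and parameters below are illustrative assumptions, not Echodyne’s actual interface:

```python
from dataclasses import dataclass

@dataclass
class DwellRequest:
    azimuth_deg: float    # beam center, degrees from boresight
    width_deg: float      # angular width to cover
    revisits_per_s: int   # how often the brain wants fresh returns

class CognitiveRadar:
    """Toy model of a steerable radar that accepts dwell requests."""

    def __init__(self, default_rate_hz: int = 30):
        self.default_rate_hz = default_rate_hz
        self.requests: list[DwellRequest] = []

    def request_dwell(self, azimuth_deg, width_deg, revisits_per_s):
        # The AV's brain calls this when a track is ambiguous,
        # e.g. the possibly-moving bicycle at 100 meters.
        self.requests.append(DwellRequest(azimuth_deg, width_deg, revisits_per_s))

    def schedule(self):
        # Requested regions get extra revisits layered on top of the
        # baseline sweep; beam steering has no moving parts, so the
        # plan can change every frame.
        plan = [("full_sweep", self.default_rate_hz)]
        for r in self.requests:
            plan.append((f"dwell@{r.azimuth_deg:.0f}deg", r.revisits_per_s))
        return plan

radar = CognitiveRadar()
radar.request_dwell(azimuth_deg=12.0, width_deg=2.0, revisits_per_s=60)
print(radar.schedule())
```

The key design point is that prioritization lives in the brain, not the sensor: the radar just executes whatever dwell plan it is handed each frame.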

Some companies, Frankenberg pointed out, are putting the responsibility for deciding what objects or regions need more attention on the sensors themselves — a camera may very well be able to decide where to look next in some circumstances. But on the scale of a fraction of a second, and involving the other resources available to an AV — only the brain can do that.

EchoDrive is currently being tested by Echodyne’s partner companies, which it would not name but which Frankenberg indicated are running level 4+ AVs on public roads. Given the growing number of companies that fit those once narrow criteria, it would be irresponsible to speculate on their identities, but it’s hard to imagine an automaker not getting excited by the advantages Echodyne claims.

Gadgets – TechCrunch

BMW’s magical gesture control finally makes sense as touchscreens take over cars

BMW has been equipping its cars with in-air gesture control for several years and I never paid attention to it. It seemed redundant. Why wave your hand in the air when there are dials, buttons, and touchscreens to do the same? Until this week, that is, when I took delivery of a BMW 850i loaner equipped with the tech. This is about the future.

I didn’t know the 850i used gesture control, because, frankly, I had forgotten BMW had this technology; I stumbled upon it. Just make a motion in the air to control the volume or tell the navigation to send you home. Now, in 2019, with giant touchscreens set to take over cars, I find BMW’s gesture control smart and a great solution to a future void of buttons.

It’s limited in use right now. There are only a few commands: volume, nav, recent calls, and turning on and off the center screen. It’s easy to see additional functions added in the future. It’s sorely missing the ability to step back a screen. I want that function the most.

Here’s how it works: to control the volume, take one finger and spin it in the air above the center stack. Anywhere. The range is impressive. A person can do this next to the screen or two feet away. A person’s arm could be resting on the center armrest and lift in the air and twirl their finger. Bam, it controls the volume. Put two fingers up – not spinning, like a flat peace sign – and the screen turns on or off. Make a fist and open it twice to load the navigation or phone (user picks the function).
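The command set above is small enough to lay out as a simple lookup. This is my own illustrative sketch of the mapping, not BMW’s implementation; the gesture names are invented:

```python
# Hypothetical gesture-to-action table for the commands described above.
GESTURE_ACTIONS = {
    "one_finger_spin_clockwise": "volume_up",
    "one_finger_spin_counterclockwise": "volume_down",
    "two_fingers_flat": "toggle_center_screen",
    "fist_open_twice": "user_assigned",  # navigation or phone; the driver picks
}

def dispatch(gesture: str) -> str:
    # Unrecognized motions are ignored rather than guessed at, which is
    # what a system with few false positives would have to do.
    return GESTURE_ACTIONS.get(gesture, "ignored")

print(dispatch("two_fingers_flat"))
```

A closed vocabulary like this is part of why the recognition feels reliable: the system only ever has to distinguish a handful of deliberately dissimilar motions.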

After using the system for several days, I never had a false positive. The volume control took about 10 minutes to master while the other gestures worked the first time.

In this car, these commands work in conjunction with physical buttons, dials, and a touchscreen. The gestures are optional. A user can turn off the function in the settings, too.

I found the in-air control a lovely addition to the buttons, though. At night, in the rain, they’re great as they do not require the driver to remove their focus from the road. Just twirl your fingers to turn down the volume.

I’m not convinced massive touchscreens are better for the driver. The lack of actual, tactile response along with burying options in menus can lead drivers to take their eyes off the road. For the automaker, using touchscreens is less expensive than developing, manufacturing, and installing physical buttons. Instead of having rows of plastic buttons and dials along with the mechanical bits behind them, automakers can use a touchscreen and program everything to be on screen. Tesla did it first; Ram, Volvo, and now Ford are following.

In-air gesture control could improve the user experience with touchscreens. When using BMW’s system, I didn’t have to take my eyes off the road to find the volume — something that I have to do occasionally, even in my car. Instead, I just made a circle in the air with my right hand. Likewise, BMW’s system lets the user call up the nav and navigate to a preset destination (like work or home) by just making another gesture.

BMW debuted this system in 2015. The automotive world was different. Vehicles were


Flying taxis could be more efficient than gas and electric cars on long-distance trips

Flying cars definitely sound cool, but whether they’re actually a good idea is up for debate. Fortunately they do seem to have some surefire benefits, among which you can now count improved efficiency — in theory, and on long trips. But it’s something!

Air travel takes an enormous amount of energy, since you have to lift something heavy into the air and keep it there for a good while. It’s often faster but rarely more efficient than ground transportation, where the road bears the vehicle’s weight and gravity isn’t constantly working against you.

Of course, once an aircraft gets up to altitude, it cruises at high speed with little friction to contend with, and whether you’re going 100 feet or 50 miles you only have to take off once. So University of Michigan researchers thought there might be a sweet spot where taking a flying car might actually save energy. Turns out there is… kind of. The team published their results today in Nature Communications.

The U-M engineers made an efficiency model for both ground transport and for electric vertical take-off and landing (VTOL) aircraft, based on specs from aerospace companies working on them.

“Our model represents general trends in the VTOL space and uses parameters from multiple studies and aircraft designs to specify weight, lift-to-drag ratio and battery-specific energy,” said study co-author Noah Furbush in a U-M news release.

They looked at how these theoretical vehicles performed when carrying different numbers of people over different distances, comparing the energy consumed.

As you might imagine, flying isn’t very practical for going a mile or two, since you use up all that energy getting to altitude and then have to come right back down. But at the 100-kilometer mark (about 62 miles) things look a little different.

For a 100 km trip, a single passenger in a flying car uses 35 percent less energy than a gas-powered car, but still 28 percent more than an electric vehicle. In fact, the flying car is better than the gas one starting at around 40 km. But it never really catches up with the EVs for efficiency, though it gets close. Do you like charts?

ICEV: Internal combustion engine vehicle; VTOL: Vertical takeoff and landing; BEV: Battery electric vehicle. The vertical axis is emissions.

To make it better, they had to juice the numbers a bit, making the assumption that flying taxis would be more likely to operate at full capacity, with a pilot and three passengers, while ground vehicles were unlikely to have their average occupancy of 1.5 people change much. With that in mind, they found that a 100 km trip with three passengers just barely beats the per-person efficiency of EVs.
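The occupancy adjustment is easy to check with back-of-envelope arithmetic using the article’s own figures. This sketch assumes the VTOL’s trip energy is roughly independent of passenger count, which overstates the margin — the actual U-M model accounts for the added weight, which is why the real result is only a narrow win:

```python
# Per-passenger energy at 100 km, normalized to the ground EV's trip energy.
ev_trip_energy = 1.0
vtol_trip_energy = 1.28 * ev_trip_energy  # 28% more than the EV, per the study

ev_per_person = ev_trip_energy / 1.5      # average car occupancy of 1.5
vtol_per_person = vtol_trip_energy / 3    # pilot plus three paying passengers

print(round(ev_per_person, 3), round(vtol_per_person, 3))
# The fully loaded VTOL comes out ahead per person in this simplified version.
```

Dividing by occupancy is doing all the work here: the flying taxi only wins the per-person comparison if it actually flies full.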

That may seem like a bit of a thin victory, but keep in mind that the flying car would be making the trip in likely a quarter of the time, unaffected by traffic and other issues. Plus there’s the view.

It’s all theoretical right now, naturally, but studies like this help companies looking to get into this business decide how their service will be organized and marketed. Reality might look a little different from theory, but I’ll take any reality with flying cars.


The best and worst of CES 2019: Monster displays, VR in cars and crazy personal gadgets

CES 2019 is here and a lot of technology has been announced at the show. From the latest autonomous vehicle technology to the coolest personal gadgets, here’s a roundup of the best from the show so far.

Autos

Smart Home

Personal Gadgets

The Worst
