On Tuesday—press day at CES, the major consumer electronics show held each year in Las Vegas—carmakers made a number of announcements related to connected-car technologies.
Ford Grows Autonomous Driving Program, Announces Drone Experiment
Mark Fields, Ford Motor Co.’s president and CEO, described a range of autonomous-vehicle initiatives and discussed new approaches to mobility that the firm is testing as part of its year-old Smart Mobility research program. These included one effort, involving drones, aimed at helping the United Nations Development Program respond to disasters.
Ford is collaborating with DJI, which develops software to operate unmanned aerial vehicles, or drones, and has issued an open call to developers to create a means of using Ford SYNC AppLink and OpenXC (a voice-activated interface with the Ford SYNC infotainment system, and an open-source hardware and software SYNC interface, respectively) to support drone-to-vehicle communications.
The goal is to enable United Nations first responders arriving at the scene of an earthquake or other disaster to quickly deploy a drone—from, say, the bed of a Ford F-150 pickup truck—in order to survey and map disaster zones. The drone would then redock in the truck bed autonomously, even if the F-150 had moved. The winning entrant will receive $100,000 to develop the proposal.
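The redocking idea above can be sketched in a few lines. This is a hypothetical illustration, not Ford's or DJI's actual software: it assumes the truck periodically broadcasts its GPS position as OpenXC-style JSON name/value messages, and that the drone simply retargets its docking approach to the most recently reported position. The class and message names are invented for the example.

```python
import json
import math

def parse_vehicle_message(raw):
    """Parse an OpenXC-style JSON message, e.g.
    '{"name": "latitude", "value": 36.1}', into a (name, value) pair."""
    msg = json.loads(raw)
    return msg["name"], msg["value"]

class RedockingDrone:
    """Toy model of a drone that docks to a truck that may have moved."""

    def __init__(self, lat, lon):
        self.target = (lat, lon)  # last known truck-bed position

    def on_vehicle_update(self, lat, lon):
        # The truck relocated; retarget the docking approach.
        self.target = (lat, lon)

    def distance_to_target(self, lat, lon):
        # Rough flat-earth approximation, adequate over a few kilometers.
        dlat = (self.target[0] - lat) * 111_000  # ~meters per degree latitude
        dlon = (self.target[1] - lon) * 111_000 * math.cos(math.radians(lat))
        return math.hypot(dlat, dlon)

drone = RedockingDrone(36.1000, -115.1700)
# While the drone surveys, the truck moves and broadcasts new coordinates.
_, lat = parse_vehicle_message('{"name": "latitude", "value": 36.1050}')
_, lon = parse_vehicle_message('{"name": "longitude", "value": -115.1700}')
drone.on_vehicle_update(lat, lon)
print(round(drone.distance_to_target(36.1000, -115.1700)))  # meters left to fly
```

The essential point the sketch captures is that the docking target is mutable state fed by a live vehicle data stream, rather than a fixed launch coordinate.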
Fields also announced that Ford is tripling its test fleet of Fusion Hybrid autonomous vehicles, and is testing its virtual-driver software in both urban and suburban environments. The fleet, which is growing to 30 vehicles, operates in Arizona, California and Michigan.
The fleet is being equipped with a new LiDAR device from Velodyne, a manufacturer with which Ford has been working since its first autonomous prototype vehicle was developed in 2007. The new LiDAR unit, known as the Solid-State Hybrid Ultra Puck Auto (so named for its hockey-puck shape), extends the sensor’s range to 200 meters (656 feet) and can be integrated into a car’s side-view mirror. In press materials, Ford claims that the Ultra Puck will “accelerate the development and validation of Ford’s virtual-driver software, which serves as the decision-making brain that directs vehicle systems.”
Ford’s corporate messaging has, for the past five years, focused on the changing transportation landscape: how technology, such as electrification and autonomous driving, will change the way in which we drive, and how the growth of megacities and the rise of alternative transportation systems, such as car-sharing, will change how people move and whether they will own cars. Tuesday’s press conference was no different.
“Beginning this year, you’ll see us change from an automotive company into a mobility company,” Fields remarked. Yet, rather than unveil any major new initiatives focused on new approaches to transportation, Fields and Ken Washington, Ford’s VP of research and advanced engineering, provided a short update on Ford’s Smart Mobility program, which it rolled out at last year’s CES conference.
The program is designed to test more than two dozen schemes aimed at everything from car-sharing to semi-autonomous driving to parking assistance. IOT Journal reported on the program last year.
“Some projects are finished, some have morphed [in new directions] and some continue,” Washington explained. “We’re now focusing and homing in on two areas: flexible use and ownership of vehicles, and multi-modal urban mobility solutions.”
Washington cited two specific efforts: Ford’s GoDrive, a car-sharing experiment using a fleet of Ford-owned vehicles and designated parking spaces that the company launched with 2,000 drivers in London last spring, and Peer-2-Peer Car Sharing, a pilot project launched last summer in six U.S. cities and London. The Peer-2-Peer program allows Ford owners to rent out their personal vehicles to other drivers, who pay a per-minute usage fee.
As one might expect, Washington said, Millennial drivers are showing the most interest in both car-sharing programs. Overall, the chief concerns among Peer-2-Peer program participants are damage to their cars, the cleanliness of their vehicles and the possibility that renters may fail to return them on time.
Toyota Gives a Glimpse Into Autonomous Vehicle Research Efforts
Last year at CES, Toyota announced that it was opening its portfolio of more than 5,600 patents related to its development of a car that uses hydrogen as fuel, in order to spur the propulsion system’s development. This year, the carmaker made no hydrogen-car announcement; instead, it focused on the work it is doing to develop autonomous vehicles.
In November of last year, Toyota announced that it had launched a new R&D-focused company, the Toyota Research Institute (TRI), into which it will invest $1 billion over five years to develop the artificial intelligence and machine-learning science needed to advance autonomous vehicles. TRI is opening twin laboratories near Stanford University and the Massachusetts Institute of Technology (MIT), and is working closely with academics at both institutions.
On Tuesday, Gill Pratt, TRI’s CEO, revealed two of the 30 research projects the two labs are planning—both with curious names that Pratt cheekily said “only a genius professor could dream up.” The first, called Uncertainty on Uncertainty, aims to crack one of the trickiest logic problems autonomous cars face: how to react to unexpected events. In describing this, he explained that a car, bicycle or other object veering into an autonomous car’s path is, to some degree, a foreseeable event, insofar as the vehicle can be programmed to respond to the predictable series of events that would come next. But what if, for example, a truck traveling in front of an autonomous car were to lose part of its load? The resulting debris could break into smaller pieces and occupy a compact or a very large area of the roadway. Teaching the vehicle to reason about such layered unknowns, in which even the scope of the hazard is unpredictable, is the problem that gives the project its name.
The second project, dubbed The Car Can Explain, is a bit more esoteric. “We can’t trust what we can’t understand,” Pratt said, by way of introducing the project’s goal, which is, essentially, to translate the software-based decisions that an autonomous vehicle makes into text that its driver can readily understand.
These projects might seem far removed from the sensors that control autonomous vehicles, but they reflect the importance of the millions of lines of code that are the true brains behind driverless cars.
Volkswagen Unveils BUDD-e Concept Microbus
Early on Tuesday, Panasonic announced that it is bringing back a long-beloved product: the Technics turntable. Tuesday night, Volkswagen rolled out its own blast from the past: the highly anticipated electric version of the VW microbus. Dubbed BUDD-e, this is only a concept vehicle, but it is being imagined with a long list of IoT features, including connectivity to smart-home devices. Such connectivity would essentially turn the van’s large digital infotainment screen into an extension of the smart-home interface. From the vehicle, a driver would be able to adjust the home’s temperature, view any alerts generated by the home’s connected security camera, or see who is ringing the connected doorbell—all without having to dig for his or her smartphone.
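The smart-home integration described above boils down to the head unit acting as a proxy for home devices. The following minimal sketch is illustrative only, not VW's actual software; all class and method names are invented for the example.

```python
class SmartHomeDevice:
    """A connected home device (thermostat, camera, doorbell...)."""

    def __init__(self, name):
        self.name = name
        self.state = {}

    def apply(self, command, value):
        # Record the command and confirm it, as a real device's API might.
        self.state[command] = value
        return f"{self.name}: {command} set to {value}"

class InfotainmentHomePanel:
    """The in-dash extension of the smart-home interface: it keeps a
    registry of home devices and forwards driver input to them."""

    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def send(self, device_name, command, value):
        if device_name not in self.devices:
            return f"unknown device: {device_name}"
        return self.devices[device_name].apply(command, value)

panel = InfotainmentHomePanel()
panel.register(SmartHomeDevice("thermostat"))
print(panel.send("thermostat", "target_temperature", 21))
# thermostat: target_temperature set to 21
```

The design point is that the vehicle never owns the devices' state; it merely relays commands and displays confirmations, which is why the same screen can front arbitrary smart-home hardware.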
It will take a pretty novel approach to infotainment as well, Herbert Diess, VW’s chairman of passenger cars, told CES attendees. As riders enter the vehicle, the bus will wirelessly connect with each person’s mobile device (perhaps via a Bluetooth connection, though he did not specify this). Then, using a special app, each rider will build a playlist for the trip based on the music in that person’s phone or tablet, while a large digital display in the back of the vehicle will present photos.
CES Car Computer News
VW and General Motors both announced partnerships with Mobileye, an Amsterdam-based company that uses input from car-mounted cameras to create a highly detailed map of the roadway—based not just on the lane markers that semi-autonomous cars currently rely on, but on actual imagery of the road. These images are uploaded to Mobileye’s servers via a cellular link inside the vehicle. With a critical mass of cars carrying these cameras and recording the roadway, Mobileye aims to create and continually update vision-based maps that should enable fully autonomous operation of vehicles.
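The crowd-sourcing logic behind such a system can be sketched simply. This is a hedged illustration of the general idea, not Mobileye's actual API or algorithms; the names and data structures are invented. Many cars report observations of the same landmark, and a server fuses the reports so that per-car sensor noise averages out as the fleet grows.

```python
from collections import defaultdict

class RoadMapServer:
    """Toy server that fuses landmark sightings from many cars."""

    def __init__(self):
        # landmark id -> list of (lat, lon) observations from different cars
        self.observations = defaultdict(list)

    def report(self, landmark_id, lat, lon):
        # Each camera-equipped car uploads its sighting over a cellular link.
        self.observations[landmark_id].append((lat, lon))

    def estimate(self, landmark_id):
        # Averaging many independent reports drives down per-car GPS and
        # vision error in the fused map position.
        obs = self.observations[landmark_id]
        n = len(obs)
        return (sum(p[0] for p in obs) / n, sum(p[1] for p in obs) / n)

server = RoadMapServer()
# Three cars observe the same stop sign with slightly different GPS error.
server.report("stop_sign_42", 36.1001, -115.1699)
server.report("stop_sign_42", 36.0999, -115.1701)
server.report("stop_sign_42", 36.1000, -115.1700)
lat, lon = server.estimate("stop_sign_42")
print(round(lat, 4), round(lon, 4))
```

This is why "a critical mass of cars" matters in the announcement: the map's accuracy and freshness improve with every additional reporting vehicle, rather than with survey trucks.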
Autonomous cars will also need especially robust computing power to operate safely, and on Monday night, Nvidia kicked off CES by announcing its latest automotive offering, the Drive PX 2. The liquid-cooled computer sports 12 CPU cores and four Nvidia-branded Pascal GPU chips. This computing power matches that of 150 MacBook Pro computers, Nvidia CEO Jen-Hsun Huang told attendees.