Reality Computing and the State of Image Sensing

The tools required to measure and sense remote places and objects, in three dimensions, are at your disposal.
Published: March 4, 2015

Last week, I attended REAL 2015, a summit for users of Autodesk's software that enables what the company calls "reality capture." With Autodesk's ReCap, for example, one can upload high-resolution images collected from a laser scanner and create 3D images with applications in building construction, product design and, of course, 3D printing.

Software from Autodesk and other companies can also create 3D images from other types of imaging devices—even the camera in your smartphone. In fact, Autodesk announced at the summit that it has released a beta version of Memento, a cloud-based software program that uses a process called photogrammetry to enable anyone to create 3D images by uploading multiple photographs of an item, taken from several perspectives. The software stitches together a 3D image from many overlapping 2D photos of a subject: it extracts the camera's location and orientation for each image, plots matching pixels as points on X, Y and Z coordinates, and then connects the resulting point cloud into polygons.
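The geometric core of that process—using known camera positions to recover a point's X, Y and Z coordinates from two overlapping photos—can be sketched in a few lines of Python. This is a toy two-camera triangulation with made-up numbers, not Memento's actual algorithm:

```python
import numpy as np

def projection_matrix(K, R, t):
    """Build the 3x4 camera matrix P = K [R | t] from intrinsics,
    rotation and translation."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def triangulate(P1, P2, uv1, uv2):
    """Recover one 3D point from its pixel coordinates in two views,
    using the standard linear (DLT) method: stack the projection
    constraints into a matrix and take its null space via SVD."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                   # null-space vector (homogeneous point)
    return X[:3] / X[3]          # homogeneous -> Cartesian

# Toy setup (illustrative numbers only): two unit-focal-length cameras,
# the second shifted one unit along the X axis, both looking down +Z.
K = np.eye(3)
P1 = projection_matrix(K, np.eye(3), np.array([0.0, 0.0, 0.0]))
P2 = projection_matrix(K, np.eye(3), np.array([-1.0, 0.0, 0.0]))

# The same surface point as seen in each image (normalized coordinates);
# the small horizontal shift between views is the parallax.
uv1 = np.array([0.0, 0.0])
uv2 = np.array([-0.2, 0.0])

print(triangulate(P1, P2, uv1, uv2))  # recovers a point at ~(0, 0, 5)
```

Photogrammetry software repeats this for millions of matched pixels across dozens of photos—after first estimating the camera poses themselves from the image overlaps—which is what produces the dense point clouds described above.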

So, what does this have to do with the Internet of Things? Quite a bit, when you consider the ways in which imaging places and things in three dimensions can quantify and qualify them—especially when using specialized cameras. A thermal camera can be used for planning the placement of solar panels, for example. Energy companies employ thermal or multispectral cameras to detect machinery that is running hot. Magnetic resonance imaging (MRI) and ground-penetrating radar are two other examples of powerful imaging technology. All of these tools become extensions of the IoT when they create images that are conveyed as part of a wider context, which includes where the thing or place being imaged is located, when it was photographed, and other factors that can simultaneously be sensed, such as temperature or vibrations.

When combined with unmanned aerial vehicles (UAVs, or drones), reality-capture technologies can be especially powerful, since UAVs can take the imaging devices to places where humans could not quickly, easily or cheaply go themselves.

BNSF Railway has hired 3DR, a Berkeley, Calif.-based UAV manufacturer, to help it search for safety hazards by flying UAVs with cameras aimed at sections of the 32,000 miles of track on which BNSF moves freight within the United States. BNSF hopes that images from those UAVs, combined with sonar images captured by train-mounted sensors, can help it pinpoint weaknesses or obstructions in the railway, and fix those problems before they cause accidents. Given the increasing amount of Bakken and other crude oil moving on rails today—as well as the recent derailment and spill of crude into West Virginia's Kanawha River—rail safety is an increasingly important issue to railway operators.

We’ve reported on how Skycatch, another Bay Area drone company, uses Autodesk reality-capture software to help construction firms and mining companies sense anything from how closely a construction project is sticking to its architectural plans to how much ore a mining firm extracts during a shift.

AECOM, a construction management firm that serves a wide range of industries, also uses UAVs and imaging software. Jon Amdur, the company’s VP of unmanned aerial systems, told REAL 2015 attendees that AECOM sometimes uses UAVs to fly over fixed sensors, such as motion sensors embedded in bridges or other infrastructure, to collect data from those sensors.

With the emergence of Memento and other imaging software that anyone can use without special training or experience, combined with the falling costs and growing capabilities of UAVs, the time is ripe for companies to consider ways in which they might benefit from having a set of eyes in the sky. The Federal Aviation Administration (FAA), meanwhile, is inching toward regulations that will clear up the fog around where and when UAVs can legally be used for commercial purposes.

The FAA's proposed rules would require a UAV operator to receive some special training, to keep the device within the operator's visual line of sight, and to operate the UAV only during daylight hours. There are a number of additional requirements, but those are the key factors.

Bill O’Connor, a co-chair of law firm Morrison & Foerster’s unmanned aerial systems practice group, said that in the future, the FAA may need to create some type of air traffic control to manage the proliferation of UAVs in low-altitude airspace. The agency has exercised discretion, he added, by not enforcing the rules regarding UAVs as commercial entities have used them to date. That, he said, is because they have, by and large, been used responsibly.

O’Connor’s advice to attendees, while the FAA works out its final rules, is that “If you don’t want to worry about the FAA, don’t do anything unsafe” with a drone. Operating the device while it is outside your visual line of sight would fall under unsafe behavior, for example.

Chris Anderson, 3DR’s co-founder, told me that he and his clients are pleased with the FAA’s proposed rules, as they are in line with what was expected and match the rules listed in the exemptions that the FAA has granted to some firms (including 3DR and Skycatch), allowing them to use UAVs while the FAA is developing its regulations for commercial UAV use.

Anderson also believes that the ability to sense and avoid objects in their path will make operating UAVs at night, or when they’re outside an operator’s visual line of sight, completely safe. Once this is proven over time, he predicts, the FAA will drop those restrictions.

Peter Blake, who leads fleet operations for Skycatch, concurs. “We see autonomous operation [of UAVs] as the way of the future,” he says. In fact, he adds, the technology needed for safe UAV autonomous operation is here today—within limits. Moving around a construction site? Sure, that’s possible. Sending your drone to the next town to make a 3D scan of an office building? Not quite. But someday—and probably sooner than we can imagine—it will be.

Mary Catherine O’Connor is the editor of Internet of Things Journal and a former staff reporter for RFID Journal. She also writes about technology, as it relates to business and the environment, for a range of consumer magazines and newspapers.