A few months back, I interviewed the CEO of a robotics company who was wearing Google Glass. From the beginning of our talk, the glasses distracted me… because they distracted him. He stopped mid-sentence and looked over my shoulder, moving his head slightly. Apparently, he was reading some text or e-mail.
Sure, he’s a busy man and clearly wants to stay on top of his communications, but it was difficult not to feel like I was a low priority. Given this experience, I was not surprised to learn last month that Google is putting the brakes on its $1,500 Google Glass for consumers, and is instead grouping all of its smart glasses activities into its enterprise-focused Glass at Work program.
As I reported last month, a number of companies are developing or promoting sensor-packed smart glasses for commercial uses. Applications range from letting surgeons access a patient’s medical history without leaving the operating room, to enabling customer-service technicians to see what a customer is seeing (via video streamed into their glasses) and then verbally walk that customer through a repair or diagnose a failure.
Last week I spoke with Ed English, the chief marketing and product officer of APX Labs, which makes Skylight, smart-glasses operating software designed to be hardware-agnostic and to run on devices made by Sony, Epson, Google and Vuzix. Boeing recently posted a video describing how employees are trying out Google Glass to guide them through the steps for assembling a wire harness. This lets them rely on voice commands and visual instructions displayed through the glasses, rather than on their existing paper-based directions.
Smart glasses seem to have tremendous potential for making manual assembly work more efficient, helping employees troubleshoot repair issues, or providing step-by-step job training. And, of course, the biggest benefit is being able to interact with others or one’s work environment while remaining hands-free (or, more accurately, “hands busy at work”).
Yet, I think smart glasses represent an interesting opportunity for employers as well. The glasses can not only empower workers on, say, a production line to perform tasks more efficiently, but they can also put employees in direct sync with back-end systems—enterprise resource planning (ERP) or manufacturing software—that form the digital backbone of whatever products the company makes. In that way, smart glasses could serve as a powerful employee engagement tool.
English concurs. “The hands-on workforce has not benefited from IT the way knowledge workers have,” he told me. That’s a shame, because people who actually assemble products, perform quality tests or repair machines probably have some great ideas about ways to improve the systems or infrastructure supporting that work. If these workers started using smart glasses in their workplace—especially during beta tests, when feedback is so key—they might come to see the glasses as a conduit for discussing with management how the new technology could best improve workflow, for example. In this scenario, the workforce becomes an active participant in a new technology deployment that directly impacts its own jobs.
Skylight offers a software development kit and application programming interfaces for integrating its smart-glasses operating system with back-end software and creating custom workflows. According to English, enterprise software provider SAP is hopeful that smart glasses can help better connect the hands-on workforce with its software products. “[SAP executives] love what we’re doing,” he said of SAP’s interest in smart glass applications, “because they have great [software] systems but they don’t reach the last mile [of the workplace] to the guys who are doing the hands-on work.”
You may be thinking that this is not the first technology to extend from back-end IT systems to the hands-on workforce. Voice-directed picking systems guide order pickers in warehouses, for example. Who’s to say smart glasses will engage the hands-on workforce, instead of the opposite—making workers feel like cyborgs, or convincing them that management doesn’t really understand how things work on the assembly line, and that the smart glasses are just another sorry attempt to bridge the gap?
Sure, that might happen. But unlike technologies such as voice-guided order picking, smart glasses offer interfaces that feel much like the ones workers already use on their cell phones—with the obvious exception that screen taps are replaced by voice commands and head or hand movements—so workers will come to the technology with a certain level of familiarity.
It’s ironic to think about: Smart glasses, which I found so disengaging while interviewing that Glass-wearing executive, could be transformed into a tool for workforce engagement.
Mary Catherine O’Connor is the editor of Internet of Things Journal and a former staff reporter for RFID Journal. She also writes about technology, as it relates to business and the environment, for a range of consumer magazines and newspapers.