LightAnchors array: LEDs in routers, power strips, and more can sneakily ship data to this smartphone app
Video A pentad of bit boffins have devised a way to integrate electronic objects into augmented reality applications using their existing visible light sources, like power lights and signal strength indicators, to transmit data.
In a recent research paper, “LightAnchors: Appropriating Point Lights for Spatially-Anchored Augmented Reality Interfaces,” Carnegie Mellon computer scientists Karan Ahuja, Sujeath Pareddy, Robert Xiao, Mayank Goel, and Chris Harrison describe a technique for fetching data from device LEDs and then using those lights as anchor points for overlaid augmented reality graphics.
As depicted in a video published earlier this week on YouTube, LightAnchors allow an augmented reality scene, displayed on a mobile phone, to incorporate data derived from an LED embedded in the real-world object being shown on screen.
Unlike visual tagging schemes previously employed for this purpose, such as stickers or QR codes that hold static information, LightAnchors rely on features objects already have (device LEDs) and can be dynamic, reading live information from LED modulations.
The appeal of the approach is that a device LED can serve not only as a point to which AR interface elements are affixed, but also as an output port for binary data, which the on-screen UI translates into human-readable form.
“Many devices such as routers, thermostats, security cameras already have LEDs that are addressable,” Karan Ahuja, a doctoral student at the Human-Computer Interaction Institute in the School of Computer Science at Carnegie Mellon University, told The Register.
“For devices such as glue guns and power strips, their LED can be co-opted with a very cheap micro-controller (less than US$1) to blink it at high frame rates.”
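For a flavour of the transmit side, here's a minimal MicroPython-style sketch of what such a microcontroller might run. The pin assignment and the preamble, postamble, and parity patterns are illustrative placeholders, not the published ones, though the framing follows the v0.1 packet layout described later in this article.

```python
# A minimal MicroPython-style transmitter sketch. Assumptions: a board with
# an addressable LED on pin 2, and placeholder preamble/postamble/parity
# choices that mirror the v0.1 framing (6-bit preamble, 8-bit payload,
# 4-bit parity, 8-bit postamble) rather than the actual published patterns.
from machine import Pin
import time

LED = Pin(2, Pin.OUT)
BIT_US = 16667  # one bit per 60Hz frame, in microseconds

def packet(payload):
    """Frame a byte as preamble + payload + parity + postamble."""
    preamble = [1, 0, 1, 0, 1, 0]         # placeholder pattern
    postamble = [0, 1, 0, 1, 0, 1, 0, 1]  # placeholder pattern
    data = [(payload >> i) & 1 for i in range(7, -1, -1)]
    parity = sum(data) & 0xF              # assumed 4-bit checksum
    par = [(parity >> i) & 1 for i in range(3, -1, -1)]
    return preamble + data + par + postamble

while True:
    for bit in packet(0x2A):  # repeatedly transmit an example byte
        LED.value(bit)
        time.sleep_us(BIT_US)
```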
The system relies on an algorithm that builds an image pyramid of five layers, each at half the resolution of the one before, to ensure that at least one level captures each LED in the scene within a single pixel. The algorithm then searches for possible light anchor points by looking for bright pixels surrounded by dark ones.
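In Python terms, that detection step might look roughly like the sketch below, built on OpenCV and NumPy; the researchers' actual implementation may well differ, so take this as an approximation.

```python
# A sketch of candidate detection: build a five-level half-scale pyramid and
# flag pixels that are much brighter than their immediate neighbourhood.
# The margin value is an assumption for illustration.
import cv2
import numpy as np

def candidate_anchors(gray, levels=5, margin=60):
    """Return (x, y, scale) triples for bright points surrounded by dark pixels."""
    candidates = []
    img = gray.astype(np.float32)
    for level in range(levels):
        # Mean brightness of the 3x3 ring around each pixel (centre excluded).
        kernel = np.ones((3, 3), np.float32) / 8.0
        kernel[1, 1] = 0.0
        surround = cv2.filter2D(img, -1, kernel)
        mask = img - surround > margin  # bright centre, dark surround
        ys, xs = np.nonzero(mask)
        scale = 2 ** level
        candidates += [(x * scale, y * scale, scale) for x, y in zip(xs, ys)]
        img = cv2.pyrDown(img)  # next pyramid level at half resolution
    return candidates

frame = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)  # example input frame
print(len(candidate_anchors(frame)))
```

A real pipeline would also de-duplicate adjacent detections of the same light before tracking candidates across frames.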
Candidate anchors are then tracked across frames to see whether they blink out the binary preamble pattern. Once the preamble is found, the rest of the signal can be decoded.
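Scanning a candidate's brightness history for that preamble might look like the following sketch; the bit pattern shown is a stand-in, since the real one is fixed by the spec.

```python
# A sketch of the preamble scan, assuming one brightness sample per camera
# frame for each candidate anchor. The pattern below is a placeholder; the
# real 6-bit preamble is defined by the spec on LightAnchors.org.
PREAMBLE = [1, 0, 1, 0, 1, 0]  # placeholder fixed pattern

def to_bits(samples, threshold):
    """Threshold raw brightness samples into a bit stream.

    At 120Hz capture of a 60Hz signal there are two samples per bit, so a
    real decoder would first collapse each pair into a single sample.
    """
    return [1 if s > threshold else 0 for s in samples]

def find_preamble(bits):
    """Return the index of the first bit after the preamble, or -1 if absent."""
    n = len(PREAMBLE)
    for i in range(len(bits) - n + 1):
        if bits[i:i + n] == PREAMBLE:
            return i + n
    return -1
```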
Example applications contemplated by the researchers include a smoke alarm that transmits real-time battery and alarm status through its LED, a power strip that transmits its power usage, and a Wi-Fi router that presents its SSID and guest password when viewed through a mobile AR app.
Ahuja said the scheme works across different lighting conditions, though he allowed that in bright outdoor lighting, device LEDs may get missed. “But usually the LED appears to be the brightest point,” he said.
The initial version of the specification (v0.1) has been published on the LightAnchors.org website. It requires a camera capable of capturing video at 120Hz, to read light sources blinking at 60Hz. The data transmission protocol consists of a fixed 6-bit preamble, an 8-bit payload, 4-bit parity, and a fixed 8-bit postamble. Mobile devices that support faster video frame rates and contain faster processors could allow faster data transmission.
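Putting the framing together, decoding a single packet might look like the sketch below; again, the parity scheme and postamble pattern are assumptions, chosen to match the transmitter sketch above rather than taken from the published spec.

```python
# A sketch of decoding one packet under the v0.1 framing described above:
# 6-bit preamble + 8-bit payload + 4-bit parity + 8-bit postamble. The
# parity check and postamble pattern are placeholder assumptions.
POSTAMBLE = [0, 1, 0, 1, 0, 1, 0, 1]  # placeholder fixed pattern

def parse_packet(bits):
    """Given the 20 bits following the preamble, return the payload byte or None."""
    if len(bits) < 20:
        return None
    data, par, post = bits[:8], bits[8:12], bits[12:20]
    payload = 0
    for b in data:
        payload = (payload << 1) | b
    parity = 0
    for b in par:
        parity = (parity << 1) | b
    if post != POSTAMBLE or parity != (sum(data) & 0xF):
        return None  # corrupted frame; wait for the next preamble
    return payload
```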
Future versions of the specification may incorporate security measures against potential malicious use, such as a temporary token to ensure that users have line-of-sight to devices. Sample demo code for Linux, macOS, and Windows laptops, along with Arduino devices, can be found on the project website.
The boffins have also created a sample iOS app, and interested developers can sign up on the website to receive an invitation to try it out through Apple’s TestFlight service. ®