Paper gets 'smart' with drawn-on, stenciled sensor tags
- Date: May 11, 2016
- Source: University of Washington
- Summary: Researchers have created ways to give a piece of paper sensing capabilities that allow it to respond to gesture commands and connect to the digital world.
A piece of paper is one of the most common, versatile daily items. Children use it to draw their favorite animals and practice writing the A-B-Cs, and adults print reports or scribble a hasty grocery list.
Now, connecting real-world items such as a paper airplane or a classroom survey form to the larger Internet of Things environment is possible using off-the-shelf technology and a pen, sticker or stencil pattern.
Researchers from the University of Washington, Disney Research and Carnegie Mellon University have created ways to give a piece of paper sensing capabilities that allow it to respond to gesture commands and connect to the digital world. The method relies on small radio frequency identification (RFID) tags that are stuck on, printed or drawn onto the paper to create interactive, lightweight interfaces that can do anything from controlling music with a paper baton to live polling in a classroom.
"Paper is our inspiration for this technology," said lead author Hanchuan Li, a UW doctoral student in computer science and engineering. "A piece of paper is still by far one of the most ubiquitous mediums. If RFID tags can make interfaces as simple, flexible and cheap as paper, it makes good sense to deploy those tags anywhere."
The researchers will present their work May 12 at the Association for Computing Machinery's CHI 2016 conference in San Jose, California.
The technology -- PaperID -- leverages inexpensive, off-the-shelf RFID tags, which function without batteries but can be detected by a reader device placed in the same room as the tags. Each tag has a unique identification, so a reader's antenna can pick out an individual tag among many. The tags cost only about 10 cents each and can be stuck onto paper. Alternatively, the simple pattern of a tag's antenna can be drawn on paper with conductive ink.
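To illustrate how unique tag identifiers could be tied to elements drawn on a page, here is a minimal Python sketch. The reader interface, EPC values and element labels are hypothetical stand-ins rather than the PaperID implementation; a real setup would pull reads from an off-the-shelf UHF RFID reader's SDK.

```python
from dataclasses import dataclass

@dataclass
class TagRead:
    epc: str     # unique tag identification reported by the reader
    rssi: float  # received signal strength for this read, in dBm

# Each sticker or ink-drawn tag is registered against the interface element
# it represents on the page. The EPC values here are made up.
INTERFACE_ELEMENTS = {
    "E200-0001": "play/pause button",
    "E200-0002": "volume slider, segment 1",
    "E200-0003": "classroom poll: option A",
}

def label_reads(reads):
    """Pick out individual tags among many by their unique EPC."""
    return [(INTERFACE_ELEMENTS.get(r.epc, "unregistered tag"), r.rssi)
            for r in reads]

# Example: one inventory cycle reported by the (hypothetical) reader.
cycle = [TagRead("E200-0001", -52.0), TagRead("E200-0003", -61.5)]
print(label_reads(cycle))
```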
When a person's hand waves over, touches, swipes or covers a tag, it disturbs the signal path between that tag and its reader. Algorithms can recognize the specific movement and classify the signal interruption as a specific command. For example, swiping a hand over a tag placed on a pop-up book might cause the book to play a specific, programmed sound.
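As a rough illustration of turning a signal interruption into a command, the sketch below thresholds a short window of per-tag signal-strength (RSSI) samples. The paper's actual pipeline relies on signal processing and machine-learning classifiers; the baseline, threshold and window values here are invented for the example.

```python
BASELINE_RSSI = -50.0   # typical strength when the tag is unobstructed (dBm)
DROP_THRESHOLD = 10.0   # how far the signal must fall to count as "covered"

def classify(rssi_window):
    """Map a short window of per-tag RSSI samples to a gesture label."""
    covered = [r < BASELINE_RSSI - DROP_THRESHOLD for r in rssi_window]
    if all(covered):
        return "cover"      # hand held over the tag for the whole window
    if any(covered) and not covered[-1]:
        return "swipe"      # brief interruption that has already passed
    return "none"

# A swipe briefly disturbs the tag-to-reader path, then the signal recovers.
print(classify([-51, -50, -66, -68, -52, -50]))   # -> "swipe"
```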
"These little tags, by applying our signal processing and machine learning algorithms, can be turned into a multi-gesture sensor," Li said. "Our research is pushing the boundaries of using commodity hardware to do something it wasn't able to do before."
The researchers developed different interaction methods to adapt RFID tags depending on the type of interaction that the user wants to achieve. For example, a simple sticker tag works well for an on/off button command, while multiple tags drawn side-by-side on paper in an array or circle can serve as sliders and knobs.
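The slider idea can be sketched in a few lines: whichever tag in a left-to-right row is currently covered determines the slider's value. The tag IDs and five-tag layout below are assumptions for illustration, not the authors' design.

```python
SLIDER_TAGS = ["T1", "T2", "T3", "T4", "T5"]   # left-to-right order on paper

def slider_value(covered_tag_ids):
    """Return a 0.0-1.0 value from the rightmost covered tag, or None."""
    positions = [i for i, tag in enumerate(SLIDER_TAGS) if tag in covered_tag_ids]
    if not positions:
        return None
    return max(positions) / (len(SLIDER_TAGS) - 1)

print(slider_value({"T3"}))        # finger on the middle tag -> 0.5
print(slider_value({"T4", "T5"}))  # finger covering the right end -> 1.0
```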
"The interesting aspect of PaperID is that it leverages commodity RFID technology thereby expanding the use cases for RFID in general and allowing researchers to prototype these kind of interactive systems without having to build custom hardware," said Shwetak Patel, the Washington Research Foundation Entrepreneurship Endowed Professor in Computer Science & Engineering and Electrical Engineering.
The system can also track the velocity of moving objects, for example following the motion of a tagged paper conductor's wand and adjusting the pace of the music to the tempo of the wand in midair.
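A simplified way to picture the conductor's-wand example: if each downbeat of the tagged wand can be timestamped (the motion detection itself, e.g. from phase or signal-strength changes, is omitted here), the spacing of those timestamps gives a tempo in beats per minute. This is an illustrative sketch, not the paper's velocity-tracking method.

```python
def tempo_bpm(beat_times):
    """Estimate beats per minute from wand downbeat timestamps (seconds)."""
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return 60.0 / avg_interval

# Wand downbeats spaced 0.5 s apart correspond to a 120 BPM piece.
print(tempo_bpm([0.0, 0.5, 1.0, 1.5]))   # -> 120.0
```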
This technique can be used on other mediums besides paper to enable gesture-based sensing capabilities. The researchers chose to demonstrate on paper in part because it's ubiquitous, flexible and recyclable, fitting the intended goal of creating simple, cost-effective interfaces that can be made quickly on demand for small tasks.
"Ultimately, these techniques can be extended beyond paper to a wide range of materials and usage scenarios," said Alanson Sample, research scientist at Disney Research. "What's exciting is that PaperID provides a new way to link the real and virtual worlds through low cost and ubiquitous gesture interfaces."
Story Source:
Materials provided by University of Washington. Original written by Michelle Ma. Note: Content may be edited for style and length.
Journal Reference:
- Hanchuan Li, Eric Brockmeyer, Elizabeth J. Carter, Josh Fromm, Scott E. Hudson, Shwetak N. Patel, Alanson Sample. PaperID. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, May 2016. DOI: 10.1145/2858036.2858249