Thursday, September 22, 2011

Paper Reading #12: Enabling Beyond-Surface Interactions for Interactive Surface with An Invisible Projection

Reference Information
Enabling Beyond-Surface Interactions for Interactive Surface with An Invisible Projection
Li-Wei Chan, Hsiang-Tao Wu, Hui-Shan Kao, Ju-Chun Ko, Home-Ru Lin, Mike Y. Chen, Jane Hsu, Yi-Ping Hung
Presented at UIST'10, October 3-6, 2010, New York, New York, USA

Author Bios
  • Li-Wei Chan is a PhD student in the Image and Vision Lab at National Taiwan University. He is interested in computer vision and tangible user interfaces.
  • Hsiang-Tao Wu is a student at National Taiwan University with four papers relating to tabletop computing.
  • Hui-Shan Kao is a student at National Taiwan University with four papers relating to tabletop computing and other forms of display.
  • Ju-Chun Ko is a student at National Taiwan University with six papers, most of which relate to interface design.
  • Home-Ru Lin is a student at National Taiwan University with four papers relating to tabletop computing.
  • Mike Y. Chen is a professor at National Taiwan University and previously worked at Intel Research Seattle. He is interested in mobile computing, human-computer interaction, and social networks.
  • Jane Hsu is a professor of Computer Science and Information Engineering at National Taiwan University. She is interested in data mining and service-oriented computing.
  • Yi-Ping Hung is a professor in the Graduate Institute of Networking and Multimedia and in the Department of Computer Science and Information Engineering of National Taiwan University. He holds a PhD from Brown University.

Summary
Hypothesis
How effective is a system that combines infrared markers and multi-touch controls? What possible uses does it have?

Methods
Users were presented with the three prototypes the authors developed and were asked to navigate through landmarks on a map. The first two prototypes let users view the photo represented by a pin on the map; the third showed a perspective view. User feedback was collected.

Results
The 3D building viewer could show only a portion of what users wanted to see; because phone orientation and zooming were not handled, users' attempts to correct the limited visibility failed. It was also considered too immersive. The flashlight encountered severe focus problems because users moved it rapidly, and users also wanted to use it as a mouse. The lamp was moved on occasion but mostly remained in a single spot.

Contents
The authors developed a programmable infrared tabletop system that lets mobile devices interact with the displayed surface through invisible markers, supporting both on-surface and above-surface interaction. The system is based on a direct-illumination setup: two IR cameras detect finger touches, a standard DLP projector was converted to project in IR, and touch glass sits under the surface's diffuser layer. Augmented-reality markers change in size to match the mobile camera, which communicates with the system to update the marker layout; calibration requires at least four markers at a time, and priority goes to the closest camera. Kalman filtering reduces jitter in the tracked pose.

The usual approach of detecting touches by subtracting a static background is ineffective when the IR background itself changes, so the system instead simulates the background at each frame and projects the expected white areas, with a maximum delay of one frame. The simulation stitches together the known marker locations, and foregrounds are extracted through thresholds. Rendering the markers as pure black pixels provided too little illumination, so the authors adjusted the intensity of the black level. Camera and projector synchronization keeps several previous frames in memory for the subtraction step. An additional camera is used only for calibration; the four calibration points allow additional devices to join without extra input. Content is generated in Flash and warped through DirectX.
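The paper's use of Kalman filtering to reduce jitter can be illustrated with a minimal sketch. This is not the authors' code; it is a hypothetical constant-position Kalman filter smoothing one noisy marker coordinate, with the process and measurement variances (`process_var`, `meas_var`) chosen arbitrarily for illustration.

```python
def kalman_smooth(measurements, process_var=1e-3, meas_var=0.5):
    """Smooth a sequence of noisy 1-D marker readings with a
    constant-position Kalman filter (illustrative values only)."""
    x = measurements[0]   # initial state estimate
    p = 1.0               # initial estimate uncertainty
    estimates = []
    for z in measurements:
        # Predict: state assumed unchanged; uncertainty grows by process noise.
        p += process_var
        # Update: blend the prediction with the new measurement
        # by the Kalman gain k (0 = ignore measurement, 1 = trust it fully).
        k = p / (p + meas_var)
        x += k * (z - x)
        p *= (1 - k)
        estimates.append(x)
    return estimates

# A jittery marker coordinate hovering around 10 comes out noticeably flatter.
noisy = [10.0, 10.4, 9.6, 10.3, 9.7, 10.1]
smoothed = kalman_smooth(noisy)
```

In the real system the state would be the full marker pose rather than one scalar, but the predict/update structure is the same.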

One proposed use resembles a desk lamp that projects onto the surface; where the lamp's and tabletop's projections intersect, the tabletop's projection is masked to reduce blur. A portable version, used like a flashlight, was also proposed: it acts as a pointer to indicate information but can also manipulate content through a button, and it uses an integrated laser to resolve focusing issues. The third concept uses a tablet to display 3D geographical content, with a table boundary drawn around 3D objects to remind users of reality.
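The lamp's masking step can be sketched as computing which tabletop pixels fall inside the lamp's projection footprint and blacking them out. This is a hypothetical simplification, assuming a circular footprint with a known center and radius; the paper does not specify the actual geometry.

```python
def lamp_mask(width, height, cx, cy, radius):
    """Return a 2-D boolean grid for a (width x height) tabletop frame:
    True where the lamp's assumed circular footprint lands and the
    tabletop projector should therefore render black."""
    mask = []
    for y in range(height):
        row = []
        for x in range(width):
            inside = (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
            row.append(inside)
        mask.append(row)
    return mask

# An 8x8 frame with the lamp centered at (4, 4), radius 2:
mask = lamp_mask(8, 8, 4, 4, 2)
```

In practice the footprint would be derived from the lamp's tracked pose each frame, but the idea is the same: the two projectors never draw on the same pixels, avoiding the blur of overlapping images.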

Discussion
The authors were trying to create a basic framework and a few applications of a system that combined IR and a touch surface. My concern is that they did not perform a lot of user testing and apparently omitted several extremely useful features. However, as a proof of concept, I could see the potential of the technology, so the authors convinced me that this is a possibly viable area of research.

The combination of augmented reality with a touchscreen was particularly interesting to me. The authors combined two technologies that are not commonly used together to produce a new form of interaction. However, their claim that the 3D viewer was too immersive concerns me. Since the goal of the system is to combine two disparate elements into a harmonious new technology, the fact that users could only focus on the mobile device necessarily limits the system's effectiveness. I was pleased that the authors intend to address this issue, but I am not sure how quickly, if at all, that goal can be achieved.
