Thursday, May 04, 2006

James Fogarty faculty talk: HCI from Carnegie Mellon University

I'm attending the faculty talk by James Fogarty, a candidate for CS faculty, on Constructing and Evaluating Sensor-Based Statistical Models of Human Interruptibility.

Ubiquitous computing is here, but the devices lack the capability to adapt to the human. The computer interrupts office workers, and the user has to adapt to the machine. If systems knew whether "now is a good time", non-urgent notifications could be delayed until an appropriate moment to deliver them. To approach this problem, he presented sensor-based models of human interruptibility, built by extracting correlations between observations of people's interruptibility and the output of sensors.
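As a rough illustration of that correlation-mining step (my own sketch, with made-up numbers rather than anything from the talk), here is how one simulated sensor could be checked against self-reported interruptibility ratings:

```python
# A minimal sketch: correlate one sensor's output with interruptibility
# self-reports. The data below are invented for illustration only.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# 1 = talking detected, 0 = silent; ratings from 1 (highly non-interruptible)
# to 5 (highly interruptible), as in experience-sampling self-reports.
talking = [1, 1, 0, 0, 1, 0, 0, 1]
rating  = [1, 2, 4, 5, 1, 4, 3, 2]

print(pearson(talking, rating))  # strongly negative: talking ~ "do not interrupt"
```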

The traditional approach is bottom-up, which is prone to failure; he instead takes a top-down approach using HCI techniques. He began with a Wizard of Oz feasibility study, recording people's work on video and collecting that data in order to determine what sensors would be useful. He then simulated sensor output such as office worker activities (sitting, standing, speaking), guest activities (number present, time present), and the environment (time of day, door open or closed). The simulated sensor output was then fed into a Bayesian model to identify non-interruptible and interruptible situations.
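Here is a small sketch of what such a model might look like, assuming simple binary features and a naive Bayes formulation; the feature names and tiny training set are invented for illustration, not his actual data:

```python
# Sketch of a naive Bayes classifier over binary simulated-sensor features.
from collections import defaultdict

FEATURES = ["speaking", "guest_present", "door_closed", "sitting"]

def train(examples):
    """examples: list of (feature_dict, label), label in {'interruptible', 'non-interruptible'}."""
    prior = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))
    for feats, label in examples:
        prior[label] += 1
        for f in FEATURES:
            counts[label][(f, feats[f])] += 1
    return prior, counts

def classify(prior, counts, feats):
    total = sum(prior.values())
    best, best_p = None, -1.0
    for label, n in prior.items():
        p = n / total
        for f in FEATURES:
            # Laplace smoothing so unseen feature values do not zero out the product.
            p *= (counts[label][(f, feats[f])] + 1) / (n + 2)
        if p > best_p:
            best, best_p = label, p
    return best

data = [
    ({"speaking": 1, "guest_present": 1, "door_closed": 1, "sitting": 1}, "non-interruptible"),
    ({"speaking": 1, "guest_present": 0, "door_closed": 1, "sitting": 1}, "non-interruptible"),
    ({"speaking": 0, "guest_present": 0, "door_closed": 0, "sitting": 1}, "interruptible"),
    ({"speaking": 0, "guest_present": 0, "door_closed": 0, "sitting": 0}, "interruptible"),
]
prior, counts = train(data)
# speaking with a guest present -> classified non-interruptible
print(classify(prior, counts, {"speaking": 1, "guest_present": 1, "door_closed": 0, "sitting": 1}))
```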

The next study asked human observers to rate interruptibility from the data obtained in the Wizard of Oz feasibility study. The simulated sensors support models that identify "highly non-interruptible" situations better than the human observers do.

After that, he performed a third study that measured robustness by deploying real, implemented sensors to study a group of office workers with more diverse responsibilities. With all sensors combined, the models detected interruptibility more reliably than human observers did. Even a typical laptop computer has sufficient sensing to support models that identify "highly non-interruptible" situations better than human observers.

The next study dealt with automatic recognition of interruptibility from task engagement. At random intervals, programmers were notified that an interruption was pending and then received a mental arithmetic interruption. He built software with a model based on automatically-generated features to identify interruptible situations, and the study showed that automatically-generated features can capture indications of task engagement.
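The probing protocol could look something like the following; this is just my sketch of the idea (random-interval warnings followed by a timed arithmetic task), not the study's actual software:

```python
# Toy experience-sampling probe: warn that an interruption is pending,
# then deliver a mental arithmetic task and record correctness and time.
import random
import time

def arithmetic_probe():
    a, b = random.randint(10, 99), random.randint(10, 99)
    start = time.time()
    answer = input(f"Interruption: what is {a} + {b}? ")
    elapsed = time.time() - start
    return answer.strip() == str(a + b), elapsed

def run_session(num_probes=3, min_gap=5, max_gap=15):
    log = []
    for _ in range(num_probes):
        time.sleep(random.uniform(min_gap, max_gap))  # random interval
        print("An interruption is pending...")
        time.sleep(2)                                 # advance warning
        log.append(arithmetic_probe())
    return log

if __name__ == "__main__":
    print(run_session())
```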

The top-down Wizard of Oz method shows a fruitful direction for detecting interruptibility. Manually-specified task models can add some additional value, but not much compared to the general models. He built a toolkit called Subtle to take advantage of these results and make them available to other HCI researchers. Subtle collects sensor data from desktop analyses (using the laptop to detect opening and closing, mouse clicks, etc.), audio analyses (using the microphone), and WiFi sensing. Subtle runs on a client; events and states are sent and received as XML-encoded messages, stored in a database, and can be collected on a server. A model learner is then used to identify interruptible situations. Subtle also dynamically examines existing features and creates new features by applying operators based on a feature's type and history of values. He gave the example of detecting the MAC address of the Wi-Fi access point.
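I imagine the feature-generation idea working roughly like this; the operator names and types below are my guess at the flavor of it, not Subtle's actual API:

```python
# Speculative sketch: derive new features from a raw feature's history of
# values, choosing operators appropriate to the feature's type.
from collections import Counter

def numeric_operators(history):
    return {
        "last": history[-1],
        "mean": sum(history) / len(history),
        "max": max(history),
        "changed_recently": int(len(history) > 1 and history[-1] != history[-2]),
    }

def categorical_operators(history):
    return {
        "last": history[-1],
        "most_common": Counter(history).most_common(1)[0][0],
        "num_distinct": len(set(history)),
    }

def generate_features(name, history):
    ops = numeric_operators if isinstance(history[-1], (int, float)) else categorical_operators
    return {f"{name}.{op}": value for op, value in ops(history).items()}

# e.g. the Wi-Fi access point MAC address seen by the client over time
print(generate_features("wifi_ap_mac", ["aa:bb:cc:01", "aa:bb:cc:01", "aa:bb:cc:02"]))
print(generate_features("keystrokes_per_min", [42, 55, 0, 0]))
```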

One application he built on top of Subtle was the AmIBusy prompter, a smarter instant-messaging-style client.
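Presumably such a client holds non-urgent messages while the model reports a busy state, something along these lines (all names here are hypothetical, not AmIBusy's actual design):

```python
# Minimal sketch of a notification-deferring IM client driven by an
# interruptibility model.
from collections import deque

class DeferringMessenger:
    def __init__(self, is_busy):
        self.is_busy = is_busy            # callable returning True when non-interruptible
        self.pending = deque()

    def receive(self, message, urgent=False):
        if urgent or not self.is_busy():
            self.deliver(message)
        else:
            self.pending.append(message)  # hold until a better time

    def on_state_change(self):
        # called whenever the interruptibility model updates
        while self.pending and not self.is_busy():
            self.deliver(self.pending.popleft())

    def deliver(self, message):
        print("NOTIFY:", message)
```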

Another area of research he is interested in is home activity recognition, especially for elder care applications. Typical approaches embed sensors in the environment or use a wearable device. His approach is not to intrude on the existing home environment, but rather to use its existing infrastructure. For his study, he attached microphone sensors to water pipes and matched audio features to activities; for example, a particular audio signal pattern indicates a toilet flush. The study seemed interesting, because many home studies involve deploying sensors throughout the home, which becomes intrusive and runs into privacy concerns.
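A toy version of matching an audio pattern to an activity might look like this; the energy-envelope feature and thresholds are my own simplification, not his actual signal processing:

```python
# Toy sketch: a toilet flush shows up on a pipe microphone as a long,
# sustained burst of energy, so look for a sustained run of loud frames.

def energy_envelope(samples, frame=256):
    return [sum(s * s for s in samples[i:i + frame]) / frame
            for i in range(0, len(samples) - frame + 1, frame)]

def looks_like_flush(envelope, level=0.1, min_frames=20):
    run = best = 0
    for e in envelope:
        run = run + 1 if e > level else 0
        best = max(best, run)
    return best >= min_frames
```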

jfogarty AT cs.cmu.edu
