Information inside the cap

Travelling on the train from Crailsheim to Nürnberg I saw several police officers on their way back from an assignment at the Stuttgarter Volksfest. When we got off the train they collected their caps from the luggage rack and I observed an interesting (traditional) information display.

Inside the cap they carried a schedule and a description of the location they had to go to. The size of the paper display was about 15 x 15 cm. It seems an interesting place to display and access information – perhaps we will do a digital version of the cap as an assignment in our courses.

Watching movies on the train

At the moment I am travelling a lot on the train and it seems that there is an increase in people using their mobile devices (e.g. Sony PSP, mobile phones) to watch cinema movies and episodes of TV shows. Some watch individually and others even share the experience. Over the last years it became popular to watch DVDs on a notebook computer on the train – but it seems the real mobile age is moving on.

Even though the screen is very small it shows again that one needs little to create the illusion of a movie. In the end it always comes back to the story…

Tico Ballagas defended his PhD in Aachen – new insights on Fitts' law

Today I finally got around to visiting Jan Borchers (Media Computing Group at RWTH Aachen). Tico Ballagas gave a public talk as part of his PhD defence and I took the chance to go there.

There were new parts in the talk on the impact of the selection space resolution on Fitts' law that I had not seen in his work before. It was published in 2006 as a technical report (Rafael Ballagas and Jan Borchers. Selexels: a Conceptual Framework for Pointing Devices with Low Expressiveness. Technical Report AIB-2006-16, RWTH Aachen, Dec 2006), which is worthwhile to have a look at. This could be very interesting and relevant for the work Heiko Drewes does on eye-gaze interaction. Discriminating between input and output space for the index of difficulty could help to better understand the impact of the errors that we see in eye-gaze interaction.
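As a reminder of the basic quantity involved, here is a minimal Java sketch of the Shannon formulation of the index of difficulty, with a crude illustration of what a coarse input resolution does to small targets. This is my own simplification for illustration, not the formulation from the Selexels report, and the numbers (display width, number of addressable positions) are made up.

```java
// Minimal sketch (my own simplification, not the formulation from the
// Selexels technical report): Shannon form of Fitts' index of difficulty,
// ID = log2(D / W + 1), computed in output space (pixels) and in a
// hypothetical input space where the device can only address a limited
// number of discrete positions ("selexels"), so targets cannot be hit
// more precisely than one selexel.
public class IndexOfDifficulty {

    // Shannon formulation: ID = log2(distance / width + 1).
    static double id(double distance, double width) {
        return Math.log(distance / width + 1) / Math.log(2);
    }

    public static void main(String[] args) {
        double distancePx = 400;  // distance to the target in output pixels
        double widthPx = 10;      // target width in output pixels

        // Hypothetical device that addresses only 64 positions across a
        // 1024-pixel-wide display, i.e. one selexel covers 16 pixels.
        double pixelsPerSelexel = 1024.0 / 64.0;
        double distanceSel = distancePx / pixelsPerSelexel;
        // A target smaller than one selexel effectively becomes one selexel wide.
        double widthSel = Math.max(1.0, widthPx / pixelsPerSelexel);

        System.out.println("ID in output space: " + id(distancePx, widthPx));
        System.out.println("ID in input space:  " + id(distanceSel, widthSel));
    }
}
```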

One part of Tico's research was concerned with a definition of a design space for input devices. This is partly described in a paper in IEEE Pervasive Computing magazine, see: Ballagas, R., Borchers, J., Rohs, M., and Sheridan, J. G. 2006. The Smart Phone: A Ubiquitous Input Device. IEEE Pervasive Computing 5(1), 70-77.

Museum Audio Guides – is there a way to make this a good experience?

We visited the archaeology and Stone Age museum in Bad Buchau (http://www.federseemuseum.de/). For our visit we rented their audio guide system – they had one version for kids and one for adults. The audio guides were done very well and the information was well presented.

Nevertheless such devices break the joint experience of visiting a museum! We had three devices – and we stood next to each other listening but not talking to each other. Even though it may convey more information than the written signs, it makes for a poorer experience than reading and discussing. I wonder how one would design a good museum guide… There are plenty of projects but so far I have not seen a great solution.

Observation at FRA, Terminal 1 B

The number of power plugs available to the public seems to be very close to zero at Frankfurt airport. If a person sits somewhere on the floor in an odd corner, it is likely because that is where there is power for the laptop or phone.

The number of Bluetooth IDs visible when scanning is amazing. It seems that many people have it switched on continuously now (quite different from two years ago). The friendly names used by people seem fairly boring: mainly the preset model name of the phone, a combination of first name and phone model, initials, a full name, and the occasional “hi there”. These observations are quite encouraging for one of our projects.
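Such a scan is straightforward to do in Java ME with the Bluetooth API (JSR 82). A minimal sketch – error handling kept to a minimum and the result simply printed:

```java
import java.io.IOException;
import javax.bluetooth.DeviceClass;
import javax.bluetooth.DiscoveryAgent;
import javax.bluetooth.DiscoveryListener;
import javax.bluetooth.LocalDevice;
import javax.bluetooth.RemoteDevice;
import javax.bluetooth.ServiceRecord;

// Minimal sketch of a Bluetooth device scan with JSR 82 (Java ME).
public class FriendlyNameScanner implements DiscoveryListener {

    public void startScan() throws Exception {
        DiscoveryAgent agent = LocalDevice.getLocalDevice().getDiscoveryAgent();
        // General inquiry: find all discoverable devices in range.
        agent.startInquiry(DiscoveryAgent.GIAC, this);
    }

    public void deviceDiscovered(RemoteDevice device, DeviceClass deviceClass) {
        try {
            // false = use a cached name if available instead of asking again
            String name = device.getFriendlyName(false);
            System.out.println(device.getBluetoothAddress() + " : " + name);
        } catch (IOException e) {
            System.out.println(device.getBluetoothAddress() + " : <no name>");
        }
    }

    public void inquiryCompleted(int discType) {
        System.out.println("Scan finished.");
    }

    // Not needed for a plain device scan.
    public void servicesDiscovered(int transID, ServiceRecord[] records) { }
    public void serviceSearchCompleted(int transID, int respCode) { }
}
```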

Looking at the scans I wonder if it is likely that people who travel together have similar phones (e.g. same manufacturer).

Nothing Matches Real Experience

During our strategy meeting there was time for a canoe adventure and some late night reflections. To get the full experience we chose the rainy afternoon for our trip on the river Lahn. Even though we were pretty wet after the trip (some more, some less 😉) it was a great experience.

When comparing real experience vs. virtual experience (e.g. Second Life) it becomes clear that the central issue is that the virtual is risk-free with regard to our immediate physical well-being. This sounds great at first. But what does it lead to when we live in a risk-free environment in the long term? How will it shape our perception in the future?

Enjoying the real experience inspired some ideas for mobile adventure games that take place in the real world with real experience but include virtual aspects. A central design goal would be to create a game where the technology becomes invisible and the user only perceives his or her activity in the real world.

Our Presentations at CHI’07 in San Jose

At this year's CHI, the Conference on Human Factors in Computing Systems, we presented three contributions: a full paper, a CHI Note, and a work-in-progress paper. Have a look at them!

Holleis, P., Otto, F., Hussmann, H., and Schmidt, A. 2007. Keystroke-level model for advanced mobile phone interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (San Jose, California, USA, April 28 – May 03, 2007). CHI ’07. ACM Press, New York, NY, 1505-1514. DOI= http://doi.acm.org/10.1145/1240624.1240851

Atterer, R. and Schmidt, A. 2007. Tracking the interaction of users with AJAX applications for usability testing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (San Jose, California, USA, April 28 – May 03, 2007). CHI ’07. ACM Press, New York, NY, 1347-1350. DOI= http://doi.acm.org/10.1145/1240624.1240828

Holleis, P., Kern, D., and Schmidt, A. 2007. Integrating user performance time models in the design of tangible UIs. In CHI ’07 Extended Abstracts on Human Factors in Computing Systems (San Jose, CA, USA, April 28 – May 03, 2007). CHI ’07. ACM Press, New York, NY, 2423-2428. DOI= http://doi.acm.org/10.1145/1240866.1241018

Broadcasting your Heart Beat

This morning the first three students completed the exercise part of our lab class. For the first team the basics are done and we can start with the exciting part 😉

We have different sensors that can be connected via Bluetooth to the phone (e.g. heart rate, pulse oximeter, GPS) and the task for the project is to invent a new application that makes use of sensors and creates a new user experience.
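To give an idea of what the teams start from, here is a minimal sketch of reading raw data from such a sensor over a Bluetooth serial connection in Java ME. The btspp address is a placeholder (it would normally come from service discovery), and the actual protocols of the heart rate belt, pulse oximeter or GPS still have to be parsed on top of this byte stream.

```java
import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.StreamConnection;

// Minimal sketch (Java ME, Generic Connection Framework) of reading raw
// bytes from a Bluetooth sensor over the Serial Port Profile. The btspp://
// URL is a placeholder; in a real application it is obtained via service
// discovery, and a protocol-specific parser sits on top of the byte stream.
public class SensorReader {

    public void readSensor() throws Exception {
        // Placeholder address and channel.
        String url = "btspp://0123456789AB:1;authenticate=false;encrypt=false";
        StreamConnection conn = (StreamConnection) Connector.open(url);
        InputStream in = conn.openInputStream();

        byte[] buffer = new byte[128];
        int read;
        while ((read = in.read(buffer)) != -1) {
            // Hand the raw bytes to a sensor-specific parser here.
            System.out.println("received " + read + " bytes");
        }

        in.close();
        conn.close();
    }
}
```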

In the brainstorming session some were tearing out their hair – but it was rewarding. Some of the ideas that came out are really novel – and perhaps a bit too crazy to implement. One example of such an idea is to create a web radio station that broadcasts the live heart beat of celebrities as an audio stream (not sure if this is the right path to fame). Other ideas centered on support for sport, exercise and mobile health.

Alexander De Luca (a former colleague from LMU Munich) spent the last few days with us here. He helped greatly in writing some code to parse the Alivetec ECG data.

Lab started: Developing Interactive Mobile Applications

Our lab course on “Developing Interactive Mobile Applications” started today with 11 students (the limit was 8 – but three more are OK as we got some additional phones from Nokia).

Shortly after lunch everyone had their first application written and deployed to the phone. In the afternoon we looked into sending an SMS from Java using the messaging API… and this is fairly easy.
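For reference, this is roughly what it looks like with the Wireless Messaging API (JSR 120) – the phone number is a placeholder, and on a real phone the user typically has to confirm before the message goes out:

```java
import javax.microedition.io.Connector;
import javax.wireless.messaging.MessageConnection;
import javax.wireless.messaging.TextMessage;

// Minimal sketch of sending an SMS with the Wireless Messaging API (JSR 120).
public class SmsSender {

    public void sendSms(String number, String text) throws Exception {
        // Client connection to a phone number, e.g. "sms://+491701234567"
        MessageConnection conn =
            (MessageConnection) Connector.open("sms://" + number);

        TextMessage message =
            (TextMessage) conn.newMessage(MessageConnection.TEXT_MESSAGE);
        message.setPayloadText(text);

        conn.send(message);
        conn.close();
    }
}
```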

For more details on the course see: http://uie.bit.uni-bonn.de/developing.php

Seeing the possibilities of Java ME and how quickly people get applications running made me wonder how long it will take until we have massive malware, viruses and spam on the phone. Security on phones may be one of the upcoming challenges.