Tag: papers
Privacy – will our understanding change radically?
One issue we came across this morning is privacy. In particular, it seems that social network analysis based on behavior in the real world (e.g. the reality mining project [1]) is creating serious interest beyond the technology community. Beyond measuring the frequency of encounters, qualifying the way people interact (dominance, emotion, …) will reveal even more about social networks…
[1] Eagle, N. and (Sandy) Pentland, A. 2006. Reality mining: sensing complex social systems. Personal Ubiquitous Comput. 10, 4 (Mar. 2006), 255-268. DOI= http://dx.doi.org/10.1007/s00779-005-0046-3
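To make the "frequency of encounters" part concrete, here is a minimal sketch of the kind of analysis behind such work. The input format (sets of device IDs seen together per scan interval, e.g. from periodic Bluetooth proximity scans as in the reality mining project) and the noise threshold are my assumptions, not the authors' pipeline:

```python
from collections import Counter
from itertools import combinations

# Hypothetical input: one set of device IDs seen together per scan interval,
# e.g. derived from periodic Bluetooth proximity scans.
scans = [
    {"alice", "bob"},
    {"alice", "bob", "carol"},
    {"bob", "carol"},
    {"alice", "bob"},
]

# Count how often each pair of people was co-present.
encounters = Counter()
for present in scans:
    for pair in combinations(sorted(present), 2):
        encounters[pair] += 1

# Keep only pairs that met more often than an (assumed) noise threshold;
# the result is a weighted social graph: edge weight = encounter frequency.
THRESHOLD = 1
graph = {pair: count for pair, count in encounters.items() if count > THRESHOLD}
print(graph)  # {('alice', 'bob'): 3, ('bob', 'carol'): 2}
```

Qualifying *how* people interact (dominance, emotion, …) would then add labels or weights to these edges, which is exactly where the more sensitive privacy questions start.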
Trip to Dublin, Aaron’s Display Project
http://eniac.hopto.org/~whazlewo/downloads/AIS08_Full_Proceedings.pdf (~8MB)
My Random Papers Selection from Ubicomp 2008
Over the last days a number of interesting papers were presented, so it is not easy to pick a selection… Here is my random paper selection from Ubicomp 2008 that links to our work (the conference papers link into the Ubicomp 2008 proceedings in the ACM DL; our references are below):
Don Patterson presented a survey on using IM. One of the findings surprised me: people seem to ignore "busy" settings. In some work we did in 2000 on mobile availability and sharing context, users indicated that they would respect such a setting, or at least explain themselves when interrupting someone who is busy [1,2] – perhaps it is a cultural difference, or people have changed. It may be interesting to run a similar study in Germany.
[2] Albrecht Schmidt, Tanjev Stuhr, Hans Gellersen. Context-Phonebook – Extending Mobile Phone Applications with Context. Proceedings of Third Mobile HCI Workshop, September 2001, Lille, France.
[3] Heiko Drewes, Albrecht Schmidt. Interacting with the Computer using Gaze Gestures. Proceedings of INTERACT 2007.
[4] Albrecht Schmidt. Implicit Human Computer Interaction Through Context. Personal Technologies, Vol 4(2), June 2000
Some random papers from Mobile HCI 2008
During Mobile HCI I came across many interesting things (that is why one goes to conferences 😉). Here is a selection of papers to look at – if you have more time it is worthwhile to look at the whole proceedings of Mobile HCI 2008 in the ACM DL.
Gauntlet: a wearable interface for ubiquitous gaming – exploring a new gesture-based gaming UI.
Mobile phones as artifacts children use in their games are discussed – it shows again how creative children are 😉
An investigation into round touch screen wristwatch interaction – an interesting topic and a good example of how to do a small study. Ideas to create a tactile rim, e.g. two parts moving to provide different tactile cues, were brought up in the discussion.
Programming with children – taking programming into the environment, away from the computer; relates to Tangible User Interfaces.
Projector phone: a study of using mobile phones with an integrated projector for interaction with maps.
Interaction based on speech seems possible – even in noisy environments – the paper reports interesting preliminary results in the context of a fishing boat. Interesting in-situ tests (e.g. on a platform in a wave tank).
Wearable computing user interfaces: where should we put the controls and what functions do users expect?
Learning-oriented vehicle navigation systems: a preliminary investigation in a driving simulator
Enrico Rukzio followed up on the work from Munich, pushing the idea of touch interaction with NFC devices further.
Color matching using a mobile phone. The idea is to use a color chart: take a photo of the face together with the color chart, send it by MMS to a server; the server processes the image, looks up the color match, and replies by SMS. No software installation is needed – only MMS and SMS are used. Applications in cosmetics are discussed (a sketch of the server-side matching step follows this list).
Using Second Life to demonstrate a concept automobile heads-up display (A-HUD).
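Purely to illustrate the server-side lookup step in the color-matching paper above, here is a minimal sketch. The reference shades and the nearest-neighbor matching in RGB space are my assumptions, not the authors' algorithm:

```python
import math

# Hypothetical reference database: product name -> average RGB of the shade.
SHADES = {
    "ivory":  (235, 215, 195),
    "beige":  (220, 190, 160),
    "almond": (200, 165, 130),
    "bronze": (160, 120, 90),
}

def match_shade(sample_rgb):
    """Return the product whose shade is closest to the sampled skin color.

    In the real system the sample would come from the MMS photo, normalized
    against the known colors of the chart held next to the face."""
    def distance(shade_rgb):
        return math.dist(sample_rgb, shade_rgb)  # Euclidean distance in RGB
    return min(SHADES, key=lambda name: distance(SHADES[name]))

# Example: a sampled skin patch; the SMS reply would carry the matched name.
print(match_shade((210, 180, 150)))  # -> "beige"
```

The charm of the approach is that everything above runs on the server; the phone only needs MMS and SMS, which is why no client software has to be installed.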
Paul Holleis presented our paper on Wearable Controls
Last year Paul did an internship at Nokia in Finland. He worked there on the integration of capacitive sensors into phones and clothing. After Paul was back we jointly followed up on the topic, which resulted in an interesting set of guidelines for placing wearable controls [1].
The paper gives a good overview of wearable computing and interaction with wearable computers. In the work we focused on integrating touch-sensitive controls into garments and accessories for operating the music player integrated in a phone. The study showed that there are prime locations for placing controls on the body: the right hip and above the right knee (for more details see the paper [1]). It furthermore showed that users have no clear expectations of how functions (e.g. forward, backward, volume up/down) map to controls laid out on the clothes.
During his internship he also did research on integrating touch into buttons, which was published at Tangible and Embedded Interaction 2008 [2].
[1] Holleis, P., Schmidt, A., Paasovaara, S., Puikkonen, A., and Häkkilä, J. 2008. Evaluating capacitive touch input on clothes. In Proceedings of the 10th international Conference on Human Computer interaction with Mobile Devices and Services (Amsterdam, The Netherlands, September 02 – 05, 2008). MobileHCI ’08. ACM, New York, NY, 81-90. DOI= http://doi.acm.org/10.1145/1409240.1409250
[2] Paul Holleis, Jonna Häkkilä, Jussi Huhtala. Studying Applications for Touch-Enabled Mobile Phone Keypads. Proceedings of the 2nd Tangible and Embedded Interaction Conference TEI’08. February 2008.
Andrew Greaves presents a study on photo browsing using projector phones
Since Enrico Rukzio (my first PhD student) went to Lancaster he has discovered and advanced a very exciting topic for mobile interaction: mobile projectors/projector phones. His group has a great presence at this year's Mobile HCI (3 demonstrations, 2 short papers, 2 full papers, a workshop). In time for the conference the first projector phone appeared on the market (Cking Epoq EGP-PP01) – as if to highlight the timeliness of the work.
The mobile projector study [1] revealed several interesting aspects: 1) it is faster to browse on the phone screen than using a projector, 2) users do a lot of context switches between projection and device – even when nothing is displayed on the screen, 3) users see great value in it (even if they may be slower). I am really looking forward to further results in this area. It may significantly change the way we use mobile phones!
PS: Seeing Enrico watch his student present, I remember how exciting it is for a supervisor to just watch…
[1] Andrew Greaves, Enrico Rukzio. Evaluation of Picture Browsing using a Projector Phone. 10th International Conference on Human-Computer Interaction with Mobile Devices and Services (Mobile HCI 2008). Amsterdam, Netherlands. 2-5 September 2008.
GIST, Gwangju, Korea
Yesterday I arrived in Gwangju for ISUVR-2008. It is my first time in Korea and it is an amazing place. Together with some of the other invited speakers and PhD students we went for a Korean-style dinner (photos from the dinner). The campus (photos from the campus) is large and very new.
This morning we had the opportunity to see several demos from Woontack’s students in the U-VR lab. There is a lot of work on haptics and mobile augmented reality going on. See the pictures of the open lab demo for yourself…
In the afternoon we had some time for culture and sightseeing – the countryside parks are very different from those in Europe. Here are some of the photos of the trip around Gwangju; see also http://www.damyang.go.kr/
In 2005 Yoosoo Oh, a PhD student with Woontack Woo at GIST, was a visiting student in our lab in Munich. We worked together on issues related to context awareness and published a joint paper discussing the whole design cycle and in particular the evaluation (based on a heuristic approach) of context-aware systems [1].
[1] Yoosoo Oh, Albrecht Schmidt, Woontack Woo: Designing, Developing, and Evaluating Context-Aware Systems. MUE 2007: 1158-1163
Visual aid for navigation – using human image processing
While browsing the Equator website I came across an interesting publication again – I had seen it two years ago at MobileHCI – in the domain of pedestrian navigation [1]. The basic idea is to use a collection of geo-tagged photos to provide visual cues telling people in what direction they should go, e.g. “walk towards this building”. This is an interesting application linking two concepts we discussed in the part on location in my lecture on pervasive computing. It follows the approach of augmenting the user in such a way that the user does what he does well (e.g. matching visual images) and the computer does what it does well (e.g. acquiring GPS location, finding pictures related to a location in a DB).
[1] Beeharee, A. K. and Steed, A. 2006. A natural wayfinding exploiting photos in pedestrian navigation systems. In Proceedings of the 8th Conference on Human-Computer interaction with Mobile Devices and Services (Helsinki, Finland, September 12 – 15, 2006). MobileHCI ’06, vol. 159. ACM, New York, NY, 81-88. DOI= http://doi.acm.org/10.1145/1152215.1152233
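A minimal sketch of the “computer does what it does well” half, i.e. picking the geo-tagged photo nearest the user's current GPS fix. The photo records and the plain nearest-photo selection are my assumptions for illustration, not the system described in [1]:

```python
import math

# Hypothetical photo database: (caption, latitude, longitude).
PHOTOS = [
    ("red brick church tower", 51.0505, 13.7372),
    ("glass office building",  51.0489, 13.7330),
    ("fountain on the square", 51.0520, 13.7410),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def next_visual_cue(lat, lon):
    """Pick the photo closest to the user's position; the user then simply
    walks towards whatever the photo shows (the human image-matching part)."""
    return min(PHOTOS, key=lambda p: haversine_m(lat, lon, p[1], p[2]))

print(next_visual_cue(51.0500, 13.7360))  # -> ('red brick church tower', ...)
```

The division of labor is the interesting point: the code only has to be roughly right about location, because the human closes the loop by visually matching the photo to the environment.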
Impressions from Pervasive 2008
- Using electrodes to detect eye movement and to detect reading [1] – relates to Heiko's work but uses a different sensing technique. If the system can really be implemented in goggles, this would be a great technology for eye gestures as suggested in [2] (see the sketch after this list).
- Communicate while you sleep? Air pillow communication… Vivien loves the idea [4].
- A camera with additional sensors [5] – really interesting! In Munich we had a student project that looked at something similar [6].
- A cool vision video of the future is S-ROOM – everything gets a digital counterpart. It communicates the idea of ubicomp in a great and fun way [7] – I am not sure if the video is online; it is on the conference DVD.
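To illustrate the eye-gesture idea from [2] mentioned in the first item: eye movements can be quantized into discrete stroke directions and matched against predefined direction strings, much like mouse gestures. This is a sketch under my own simplified four-direction encoding, not the paper's exact implementation:

```python
# Hypothetical sketch: recognize gaze gestures by quantizing successive
# fixation points into 4 stroke directions and matching direction strings.
GESTURES = {
    "RDLU": "close window",  # a rectangle traced clockwise (assumed mapping)
    "LR":   "page back",
}

def strokes(fixations, min_len=80):
    """Turn a list of (x, y) fixation points into a direction string.
    Movements shorter than min_len pixels are ignored as jitter."""
    out = []
    for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
        dx, dy = x1 - x0, y1 - y0
        if max(abs(dx), abs(dy)) < min_len:
            continue  # too small to be a deliberate stroke
        if abs(dx) > abs(dy):
            out.append("R" if dx > 0 else "L")
        else:
            out.append("D" if dy > 0 else "U")  # screen y grows downward
    return "".join(out)

path = [(100, 100), (300, 100), (300, 300), (100, 300), (100, 100)]
print(strokes(path), "->", GESTURES.get(strokes(path), "no gesture"))
# RDLU -> close window
```

The same matching would work regardless of whether the fixation points come from a video-based eye tracker or from wearable electrooculography as in [1].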
[1] Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography. Andreas Bulling, Jamie A. Ward, Hans-W. Gellersen and Gerhard Tröster. Proc. of the 6th International Conference on Pervasive Computing (Pervasive 2008), pp. 19-37, Sydney, Australia, May 2008. http://dx.doi.org/10.1007/978-3-540-79576-6_2
[2] Heiko Drewes, Albrecht Schmidt. Interacting with the Computer using Gaze Gestures. Proceedings of INTERACT 2007. http://murx.medien.ifi.lmu.de/~albrecht/pdf/interact2007-gazegestures.pdf
[3] Shwetak N. Patel, Matthew S. Reynolds, Gregory D. Abowd: Detecting Human Movement by Differential Air Pressure Sensing in HVAC System Ductwork: An Exploration in Infrastructure Mediated Sensing. Proc. of the 6th International Conference on Pervasive Computing (Pervasive 2008), pp. 1-18, Sydney, Australia, May 2008. http://shwetak.com/papers/air_ims_pervasive2008.pdf
[4] Satoshi Iwaki et al. Air-pillow telephone: A pillow-shaped haptic device using a pneumatic actuator (Poster). Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/LBR/lbr11.pdf
[5] Katsuya Hashizume, Kazunori Takashio, Hideyuki Tokuda. exPhoto: a Novel Digital Photo Media for Conveying Experiences and Emotions. Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/Demo/d4.pdf
[6] P. Holleis, M. Kranz, M. Gall, A. Schmidt. Adding Context Information to Digital Photos. IWSAWC 2005. http://www.hcilab.org/documents/AddingContextInformationtoDigitalPhotos-HolleisKranzGallSchmidt-IWSAWC2005.pdf
[7] S-ROOM: Real-time content creation about the physical world using sensor network. Takeshi Okadome, Yasue Kishino, Takuya Maekawa, Kouji Kamei, Yutaka Yanagisawa, and Yasushi Sakurai. Advances in Pervasive Computing. Adjunct proceedings of the 6th International Conference on Pervasive Computing (Pervasive 2008). http://www.pervasive2008.org/Papers/Video/v2.pdf