Awards

We are especially proud of the ISNM students who have received awards for their scientific and artistic work.

2007

Nokia NSeries Mobile Film Award 2007
Ranjan Shetty (ISNM student of generation 2003) received the Nokia Mobile Film Award on May 2nd, 2007 at Whistling Woods Filmcity, Goregaon, Mumbai, India. Ranjan won the award for Best Mobile Film Maker, presented by Nokia in association with Whistling Woods International, the renowned film training institute, and Mediamatics Entertainment Private Ltd. His film Solitude Dreams is based on the film "Autophobia", which was produced in the "Digital Film and Video Production" course at the ISNM in 2004 (see also the Mobile Cinema project). The film is about a man who hallucinates that someone is trying to kill him, and it was shot in the ISNM library at the McLuhan Documentation Center.
Solitude Dreams
Watch the movie on YouTube.
Nordic Film Festival 2007
The movie was also presented, and nominated as best short film, at the Nordic Film Festival in Lübeck in November 2007.

2006

Europrix Top Talent Nomination
Wendy Ann Mansilla (ISNM student of generation 2004) was nominated in March 2006 for the 2005 Europrix Top Talent Thesis Award for her master thesis on Emotion and Acousmêtre for Suspense in an Interactive Virtual Storytelling Environment. The awards are organised by the International Center for New Media (ICNM) with the support of the Austrian Ministry of Economic Affairs, the graphics technology company ATI, and 80 partner organisations across Europe.
Real-world social interaction scenarios that are highly emotional and tense are often hard to simulate. Such situations can be represented in cinema; however, traditional cinema cannot let viewers step into the drama, play a role as an actor, and influence the story. Borrowing from the techniques of cinema, the thesis describes a virtual storytelling application developed by the author to simulate the emotionally demanding social role of the story's protagonist, going beyond the mere display of visual and audio information to the generation of emotion and suspense. To generate such sensations, the thesis investigates two emotional stimulants as potential variables for effectively eliciting emotion and suspense in the application: (1) the virtual character's emotion, and (2) acousmatic presence. The effect of these two variables in the interactive virtual storytelling environment was evaluated by associating the occurrence of emotional sensations with the users' autonomic nervous system (ANS) activity, and was further validated through emotion report questionnaires. From this analysis, the author infers that fusing these two elements in the story world effectively elicits emotion and suspense, resulting in a rich interactive narrative experience and increased viewer engagement.

2005

Eurographics 2005
Ken Brodlie Prize for Best Paper
T. Pfeiffer, M. Weber, and Bernhard Jung received the Ken Brodlie Prize for Best Paper, awarded on the basis of both the written paper and its presentation, at the Theory and Practice of Computer Graphics conference, Eurographics 2005, at the University of Kent, Canterbury, UK, on June 17th, 2005, for their paper Ubiquitous Virtual Reality - Accessing Shared Virtual Environments through Videoconferencing Technology.
This paper presents an alternative to existing methods for remotely accessing Virtual Reality (VR) systems. Common solutions are based on specialised software and/or hardware capable of rendering 3D content, which not only restricts accessibility to specific platforms but also raises the barrier for non-expert users. Our approach addresses new audiences by making existing Virtual Environments (VEs) ubiquitously accessible. Its appeal is that a large variety of clients, such as desktop PCs and handhelds, can connect to VEs out of the box. We achieve this by combining established videoconferencing protocol standards with server-based interaction handling. Currently, interaction is based on natural speech, typed textual input, and visual feedback, but extensions to support natural gestures are possible and planned. The paper presents the conceptual framework enabling videoconferencing with collaborative VEs as well as an example application for a virtual prototyping system.
Multimedia Transfer Karlsruhe
Bernhard Jung, Abdul Ahad, and Matthias Weber were among the top 20 finalists of the Multimedia Transfer 2005 in Karlsruhe for their work on the Affective Virtual Patient.
The Multimedia Transfer (MMT) has been organized since 1995 by the computing center of the University of Karlsruhe. Prof. Andreas Schrader of the ISNM is also a member of the committee. It offers young talents from academia the chance to present their final thesis work to the public. The Affective Virtual Patient project was realized in cooperation with Prof. Dr. Lucas Wessel and Frank Albrecht from the Children's Surgery Department of the University of Lübeck. It reached the top-20 list out of 115 contributions from across Europe.

Training of social interaction skills, particularly for emotionally charged situations, is becoming increasingly important in many professions. In some cases, such as the medical field, conventional real-world or role-playing training is extremely limited or even impossible; one example is the treatment of injured children. We have therefore developed an e-learning system that uses virtual agent technology to simulate the treatment of, and human interaction with, a hurt, upset boy and his emotional, easily agitated mother. Inconsiderate behavior, such as asking for insurance cards at the wrong moment, inappropriate joking, or addressing certain questions to the wrong person, may contribute to the deterioration of the situation, hinder procedural progress, and lead to complaints about the medical personnel. The PC-based system employs Haptek virtual character technology, allowing for highly emotional facial expressions and lip-synchronized speech.
Special thanks go to the ISNM students Panayiotis Petropoulos, Alma Salim, and Wendy Ann Mansilla for their support of this project.