
Museum Interaction Projects

[nggallery id=25]

The project is located within the growing field of experience design and takes advantage of the exponential growth in the application of ICT and media technologies in cultural contexts. This has been manifest in recent years through the strategies employed by museum curators and managers coming to terms with the challenges new technologies bring to the effective exhibition of historical and cultural artefacts. While heritage centres and institutions are a natural target venue for non-resident tourists, the museum environment has been challenged by an ongoing need to attract visitors back in order to justify reinvestment in new exhibits. The process of developing and rotating static exhibits is both expensive and time intensive, so the possibilities offered by new media technologies are naturally attractive to curators invested with the responsibility of preserving and exhibiting historical artefacts. A related challenge concerns public engagement with fragile works, and how this can be made experientially rewarding without causing erosion or damage to them. These kinds of issues, partnered with the emergence of digital media and the interdisciplinary interests of multimedia practitioners and user-viewer experience researchers, have given rise to a specific subfield of HCI/experience design that focuses on museum interaction.

Such research in recent years has spawned a variety of museum technologies that are mobile, interactive, playful and reusable. The “Rethinking Technologies in Museums” initiative has recently demonstrated the widespread appeal that technology-augmented experience holds for the heritage sector. Research projects dedicated to developing multimodal approaches for improving visitor experience have produced multi-touch interfaces, VR interfaces, augmented and mixed reality objects, smartphone apps, eye-tracking information assistants and AI agents.

This research project pulls together the interests of four museums in the Cork City area and overlaps these with research priorities established in Interdisciplinary Arts and Informatics at Cork Institute of Technology.

The museums involved are Cork City Gaol, Cork Public Museum, Cork Butter Museum, and Blackrock Castle Museum and Observatory.


Prototype 1 – Multi-Camera Tracking and Remote Server

Having tested in public, a decision was taken to rebuild the system in Processing for speed; Processing proved more efficient for the current work. While the Trailer class in AS3 was a particle system (PS) written from scratch, in Processing I took an existing PS and adapted it to my own purposes [cite this later]. Having used the PS to reveal an image, the next step was to pass the camera tracking data from AS3 to Processing. This was achieved using an XML socket, which tested well on a localhost running Apache.
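The hand-off above can be sketched in miniature. Flash's XMLSocket terminates each message with a null byte, so the receiving side reads until it sees that delimiter and then parses the XML. The sketch below, in Python rather than AS3/Processing, stands in for both ends on localhost; the port number and the `<tracking>` element are illustrative assumptions, not the project's actual protocol.

```python
import socket
import threading
import xml.etree.ElementTree as ET

HOST, PORT = "127.0.0.1", 10420   # hypothetical localhost port
ready = threading.Event()
results = []

def serve_one():
    """Stand in for the Processing side: accept one connection and
    parse a single null-terminated XML message."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                        # server is now listening
        conn, _ = srv.accept()
        with conn:
            buf = b""
            while not buf.endswith(b"\x00"):   # XMLSocket delimiter
                chunk = conn.recv(1024)
                if not chunk:
                    break
                buf += chunk
        node = ET.fromstring(buf.rstrip(b"\x00").decode("utf-8"))
        results.append((float(node.get("x")), float(node.get("y"))))

def send_tracking(x, y):
    """Stand in for the AS3 side: send one tracked position as XML."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(f'<tracking x="{x}" y="{y}"/>\x00'.encode("utf-8"))

t = threading.Thread(target=serve_one)
t.start()
ready.wait()
send_tracking(0.42, 0.87)
t.join()
print(results[0])   # (0.42, 0.87)
```

The same pattern scales to a stream of positions by looping on the delimiter; in the prototype the receiving sketch fed each parsed position straight into the particle system.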

[nggallery id=9]

Interactive Narrative Experiment

This system was developed in an early phase of experimentation to provide the viewer some agency in activating content in a non-linear presentation. The system involves a number of objects tied together on a local machine, developed in Director with embedded Flash movies, which communicate with a remote server using PHP to access a mySQL database. Attached to one of the three local machines was a webcam, routed through Director’s webcam xtra, which sets a motion variable to true when someone walks into a specific region of the exhibition area (in the 3D visualisation below this is marked by a spotlight before the three screens). An earlier version was developed with ultrasonic sensors, but these proved quite noisy, particularly when nothing else was happening in the space.
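The region trigger can be sketched with simple frame differencing. The internals of Director's webcam xtra aren't reproduced here; this pure-Python sketch only mirrors the idea: the motion variable flips to true when enough pixels change inside the watched region. The region coordinates and thresholds are illustrative assumptions.

```python
def motion_in_region(prev, frame, roi, threshold=25, min_changed=0.02):
    """Return True when the fraction of ROI pixels whose brightness
    changed by more than `threshold` exceeds `min_changed`."""
    r0, r1, c0, c1 = roi
    total = (r1 - r0) * (c1 - c0)
    changed = sum(
        1
        for r in range(r0, r1)
        for c in range(c0, c1)
        if abs(frame[r][c] - prev[r][c]) > threshold
    )
    return changed / total > min_changed

# Synthetic greyscale frames: an empty scene, then a bright shape
# (the "visitor") entering the watched region.
W, H = 160, 120
empty = [[0] * W for _ in range(H)]
visitor = [row[:] for row in empty]
for r in range(50, 70):
    for c in range(70, 100):
        visitor[r][c] = 200

ROI = (40, 80, 60, 120)   # hypothetical spotlight area: rows, cols
print(motion_in_region(empty, visitor, ROI))   # True  -> motion variable set
print(motion_in_region(empty, empty, ROI))     # False -> no visitor
```

A per-pixel threshold plus a minimum changed-fraction is what makes this kind of trigger robust against the sensor noise that sank the ultrasonic version: isolated flickering pixels never clear the fraction test.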

When the position of the viewer was picked up by the webcam, a message was sent to the database, which in turn records the motion variable. This variable is then communicated back down to two machines that hold a local store of video files. These video files were paired so that the left-hand screen and right-hand screen matched. The images demonstrated below were from a stop-motion sequence shot in the Derrynasaggart mountains in West Cork. The central screen held a short sequence of words, read out by the text-to-speech engine in Director. When a specific word or character index was reached, an event was triggered on the local machine displaying the text and communicated across the network; this enabled the video files to be switched while being kept in sync across the network. In this case there was a slight delay of approximately one second, a long time for a computer, but it did not inhibit the reading of the imagery since the user’s eye was flicking between three screens.
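The cueing scheme above can be reduced to a small sketch: the central machine advances a shared cue index as the text-to-speech engine reports word positions, and each display machine polls that shared record to decide which of its paired clips should be playing. Here a plain dict stands in for the mySQL row; the clip names and words-per-cue mapping are illustrative, not the project's actual assets or schema.

```python
shared = {"cue_index": 0}   # stand-in for the shared mySQL record

LEFT_CLIPS = ["left_01.mov", "left_02.mov", "left_03.mov"]
RIGHT_CLIPS = ["right_01.mov", "right_02.mov", "right_03.mov"]  # paired

def on_word_spoken(word_index, words_per_cue=5):
    """Central machine: the text-to-speech callback reports the word
    index reached; every Nth word advances the shared cue."""
    shared["cue_index"] = word_index // words_per_cue

def poll_display(clips):
    """Display machine: poll the shared record and return the clip
    that should currently be playing on this screen."""
    return clips[shared["cue_index"] % len(clips)]

on_word_spoken(7)   # TTS has reached word 7 -> cue 1
print(poll_display(LEFT_CLIPS), poll_display(RIGHT_CLIPS))
# left_02.mov right_02.mov -- both screens stay in step
```

Because both displays derive their clip from the same shared index rather than from each other, a polling lag (the roughly one-second delay noted above) shifts both screens by the same amount and the pairing survives.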

This system was tested with content generated by the MM2 students at CIT and may be further developed for students in the 09-10 year group. All content displayed in the clip above was produced by the author.