This system was developed in an early phase of experimentation to give the viewer some agency in activating content in a non-linear presentation. It ties together a number of objects on local machines, built in Director with embedded Flash movies, which communicate with a remote server running PHP to access a MySQL database. One of the three local machines had a webcam attached, routed through Director's webcam xtra, which sets a motion variable to true when someone walks into a specific region of the exhibition area (in the 3D visualisation below this region is marked by a spotlight in front of the three screens). An earlier version used ultrasonic sensors, but these proved quite noisy, particularly when nothing else was happening in the space.
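The shared motion flag at the heart of this setup can be sketched as a tiny state store: one machine writes the flag when motion is detected, the others read it. This is a minimal illustration in Python, using sqlite3 in place of the original PHP/MySQL pair; the table and column names are assumptions, not the original schema.

```python
import sqlite3

def open_store(path=":memory:"):
    # One-row table holding the shared "motion" flag.
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS state (id INTEGER PRIMARY KEY, motion INTEGER)"
    )
    conn.execute("INSERT OR IGNORE INTO state (id, motion) VALUES (1, 0)")
    conn.commit()
    return conn

def set_motion(conn, detected):
    # Called by the webcam machine when a viewer enters the watched region.
    conn.execute("UPDATE state SET motion = ? WHERE id = 1", (1 if detected else 0,))
    conn.commit()

def get_motion(conn):
    # Polled by the video-playing machines.
    return bool(conn.execute("SELECT motion FROM state WHERE id = 1").fetchone()[0])
```

In the installation the equivalent reads and writes went over the network through PHP scripts; the point is only that all machines coordinate through one remotely stored variable rather than talking to each other directly.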
When the webcam picked up the position of a viewer, a message was sent to the database, which recorded the motion variable. This variable was then communicated back down to the two machines holding a local store of video files. These files were paired so that the left-hand and right-hand screens matched; the images shown below are from a stop-motion sequence in the Derrynasaggart mountains in West Cork. The central screen held a short sequence of words read out by Director's text-to-speech engine. When a specific word or character index was reached, an event was triggered on the machine displaying the text and communicated across the network; this allowed the video files to be switched while staying in sync across the network. There was a slight delay of roughly one second, a long time for a computer, but it did not inhibit the reading of the imagery since the viewer's eye was flicking between three screens.
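The word-index trigger and paired-playlist idea above can be sketched as follows. This is a hedged illustration, not the original Lingo: the trigger indices and clip names are invented, and the network hop is omitted.

```python
def switch_events(words, trigger_indices):
    # As the text-to-speech engine reads each word, emit a switch event
    # whenever the word index is one of the configured triggers.
    for i, word in enumerate(words):
        if i in trigger_indices:
            yield i, word

class PairedScreens:
    # Left and right playlists advance together, keeping the two
    # screens showing matched clips.
    def __init__(self, left_clips, right_clips):
        assert len(left_clips) == len(right_clips)
        self.left, self.right = left_clips, right_clips
        self.index = 0

    def on_switch(self):
        self.index = min(self.index + 1, len(self.left) - 1)

    def current(self):
        return self.left[self.index], self.right[self.index]
```

Driving both screens from the same event stream is what kept them matched despite the roughly one-second network delay: each switch moves both playlists by one step, so they can lag slightly but never diverge.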
This system was tested with content generated by the MM2 students at CIT and may be further developed for students in the 09-10 year group. All content displayed in the clip above was produced by the author.