Dan Hume's Blog

Production Begins
October 26, 2010, 10:24 am
Filed under: Future Cinema

On Thursday night, we began filming the scenes for our project. We went back to the tunnel walkways near Asda, as we really liked them as an environment for an audience to explore.

We didn’t know how well the footage would come out with our custom-built camera rig, so we experimented freely in the hope of getting some really interesting footage to use.

We also used the car park at Asda as a different location.


Projection Materials
October 25, 2010, 2:00 pm
Filed under: Future Cinema

Last Thursday, we started the production side of this project. During the day we researched materials to project the visuals onto. We found that lycra is the best material to use, because the visuals appear clear and you can stretch the material to create a completely flat surface with no creases.

Here are some pictures of our small set-up for our testing.

Projecting onto tracing paper

The visuals looked a bit dull when projected onto tracing paper.

Projecting onto Screen Boards

The visuals on the screen board showed much more detail than on the tracing paper; however, the surface wasn’t flat, as there were a few creases in certain places.

Projecting onto Lycra

Lycra proved to be the best for displaying visuals in great detail and with a smooth surface.

Camera Rig
October 21, 2010, 6:01 pm
Filed under: Future Cinema

Today we spent the day designing and making our very own camera rig, which will accommodate three DV cameras. Below is my design of roughly how the rig should look once completed.


We want to project video footage onto three screens arranged in a cube shape. One video camera isn’t going to give a wide enough view to capture the surroundings of an environment, which is what led us to using three cameras, so we are able to fill three large screens. The two end cameras will be set at a 45-degree angle. We’re unsure whether the end cameras should face inwards or outwards to best fill the three screens, so we’ll film footage with the cameras both inwards and outwards and then decide which angle is the most effective.
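As a rough sanity check on the rig geometry, here is a small sketch of how much of the scene three cameras can cover. The ~45° horizontal field of view per DV camera is an assumption for illustration, not a measured value for our cameras:

```python
# Sketch of the rig geometry: a centre camera facing forward and two end
# cameras rotated outward by `angle_deg`. Per-camera FOV is an assumption.

def coverage(fov_deg: float, angle_deg: float) -> tuple[float, float]:
    """Return (total horizontal coverage, gap between adjacent views), in degrees.

    A positive gap means the side views don't meet the centre view;
    a negative gap means they overlap.
    """
    gap = angle_deg - fov_deg          # gap between centre view and each side view
    total = fov_deg + 2 * angle_deg    # span from leftmost edge to rightmost edge
    return total, gap

total, gap = coverage(fov_deg=45.0, angle_deg=45.0)
# With 45-degree lenses on 45-degree mounts the three views just meet
# (gap of zero) and cover a 135-degree panorama.
```

If the lenses turn out wider than 45°, the views will overlap slightly, which is probably safer for lining the three screens up than leaving gaps.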


We were fortunate enough to have one of the teachers at the workshop studio help us make our camera rig from scratch.

This is how we started: we put all three cameras together on a blank sheet of paper and lined them up in the position we felt gave the best panoramic view.


After that we took our design to the workshop studio to get the materials and actually put the rig together. It was a bit tricky getting the holes in exactly the right place, as this determined the cameras’ positions, but we overcame that problem. Below are some photos of one of the tech teachers who helped us make it.

The finished product was perfect. We haven’t attached a weight to the rig, as it felt pretty steady without one. Tonight, when we come to film, we’ll see how effective it is.

Street View (Google Maps)
October 21, 2010, 5:06 pm
Filed under: Future Cinema, General

I feel our idea for this project relates to Street View on Google Maps, which now gives you a virtual view of streets online. The main reason I relate to this piece is the 360° environment captured through photographs, and the interactivity of scrolling with the mouse. You can literally explore a town or village online by moving your mouse forward on the screen. The same principle applies in our project: the user moves towards the screen, triggering the sensors, which makes the image in front move as well.

Google Street View is a technology featured in Google Maps and Google Earth that provides panoramic views from various positions along many streets in the world. It was launched on May 25, 2007, originally only in several cities in the United States, and has since gradually expanded to include more cities and rural areas worldwide.

Google Street View displays images taken from a fleet of specially adapted cars. Areas not accessible by car, like pedestrian areas, narrow streets, alleys and ski resorts, are sometimes covered by Google Trikes (tricycles) or a snowmobile. On each of these vehicles there are nine directional cameras for 360° views at a height of about 2.5 meters, GPS units for positioning, and three laser range scanners measuring up to 50 meters across 180° in front of the vehicle. There are also 3G/GSM/Wi-Fi antennas for scanning 3G/GSM and Wi-Fi hotspots. Recently, ‘high quality’ images have been based on open source hardware cameras from Elphel.

Where available, Street View images appear after zooming in beyond the highest zoom level in maps and satellite images, and also by dragging a “pegman” icon onto a location on the map. Using the keyboard or mouse, the horizontal and vertical viewing direction and the zoom level can be selected. A solid or broken line in the photo shows the approximate path followed by the camera car, and arrows link to the next photo in each direction. At junctions and crossings of camera car routes, more arrows are shown.

On November 21, 2008, Street View was added to the Maps application installed on the Apple iPhone. On December 10, 2008, Street View was added to the Maps application for S60 3rd Edition. Street View has since also been added to the Windows Mobile and BlackBerry versions of Google Maps. All versions of Google Maps for the Android OS feature Street View, and the digital compass can be used to look around the locations.

Cameras Used

Google has used three types of car-mounted cameras in the past to take Street View photographs. Generations 1-3 were used to take photographs in the United States. The first generation was quickly superseded and images were replaced with images taken with 2nd and 3rd generation cameras. Second generation cameras were used to take photographs in Australia. The shadows caused by the 1st, 2nd and 4th generation cameras are occasionally viewable in images taken in mornings and evenings. The new 4th generation cameras will be used to completely replace all images taken with earlier generation cameras. 4th generation cameras take near-HD images and deliver much better quality than earlier cameras.

In October 2009, Google introduced the Street View Trike, a pedal tricycle with a 4th generation camera mounted to take images where cars cannot reach. All Street View images taken now are taken with the 4th generation Street View cameras.

In February 2010, Google introduced the Street View Snowmobile, a snowmobile with a 4th generation camera mounted to take images on the Whistler Blackcomb ski slopes in preparation for the Winter Olympics in Vancouver, Canada.

Virtual Multi-screen
October 19, 2010, 11:20 am
Filed under: Future Cinema

I went into After Effects and imported the three videos I shot of the tunnels. I put them into one composition and made a rough version of what we hope to achieve for the final product of this project.

The images don’t match up well because this is just a test, but we hope to make the shots match up nicely when it comes to the final shoot. We’ve also decided that we’re going to film the scene instead of taking still images, and then convert the footage into TIFF format, which will allow us to extract frames from the video.
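As a sketch of that video-to-TIFF step, something along these lines could pull individual frames out of a clip. This uses OpenCV rather than whatever tool we actually end up with, and the file names and step size are just placeholders:

```python
# Hypothetical sketch: save every `step`-th frame of a clip as a TIFF.
# Paths, prefixes and the choice of OpenCV are assumptions, not our final workflow.
import os

def tiff_name(index: int, prefix: str = "tunnel") -> str:
    """Zero-padded TIFF filename, e.g. tunnel_0007.tiff."""
    return f"{prefix}_{index:04d}.tiff"

def extract_frames(video_path: str, out_dir: str, step: int = 1) -> int:
    """Write every `step`-th frame of the clip as a TIFF; return how many were saved."""
    import cv2  # imported here so the filename helper works without OpenCV installed
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved = grabbed = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of clip (or unreadable frame)
            break
        if grabbed % step == 0:
            cv2.imwrite(os.path.join(out_dir, tiff_name(saved)), frame)
            saved += 1
        grabbed += 1
    cap.release()
    return saved
```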

Future Cinema Project – Test Shoot
October 18, 2010, 6:53 pm
Filed under: Future Cinema

Last night I did some test shots with Kavi at the four tunnel crossway. The idea behind this is to create a sense of depth in the image. To do this I took a photograph after each step as I made my way through the tunnel.

I’ve selected three of the 48 shots I took to show on my blog. Below is a stop motion video consisting of all the photos I took. This shoot was to give a clear idea of what our group wants to achieve for this project. I managed to compress the images I took and make a short stop motion animation at 15fps.
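A quick back-of-the-envelope check on the timing: 48 photos played back at 15fps gives just over three seconds of animation.

```python
# Stop-motion timing: how long 48 photos last at 15 frames per second.
frames = 48
fps = 15
duration_s = frames / fps  # 48 / 15 = 3.2 seconds of animation
```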

Eventually this video will be controlled in Max/MSP, which is an interactive programming environment. This allows you to create your own software using a visual toolkit of objects, and connect them together with patch cords. The basic environment that includes MIDI, control, user interface, and timing objects is called Max. MSP is a set of audio processing objects that do everything from interactive filter design to hard disk recording.
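At its core, the patch will have to map a sensor reading to a playback position, so that walking towards the screen advances through the tunnel footage. Here is a rough sketch of that logic, written in Python rather than as a Max patch; the sensor range and frame count are made-up placeholders:

```python
# Hypothetical mapping from a proximity-sensor distance to a frame index.
# near_m, far_m and n_frames are placeholders for whatever the real
# sensors and footage give us.

def frame_for_distance(distance_m: float, near_m: float = 0.5,
                       far_m: float = 4.0, n_frames: int = 48) -> int:
    """Walking towards the screen (smaller distance) advances through the frames."""
    # Clamp the reading to the sensor's useful range.
    d = max(near_m, min(far_m, distance_m))
    # 0.0 at the far end of the range, 1.0 right at the screen.
    progress = (far_m - d) / (far_m - near_m)
    return min(n_frames - 1, int(progress * n_frames))
```

In Max/MSP the same idea would just be a scaling object between the sensor input and the movie playback position.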

@ Bristol Science centre & Arnolfini Exhibition
October 17, 2010, 7:40 pm
Filed under: Future Cinema, General

On Friday we went to Bristol to check out the @ Bristol Science Centre for some inspiration for our Future Cinema projects. The ground floor had a lot of basic scientific installations to educate young children, which didn’t really interest me. The second floor had an animation exhibition, specifically focused on work produced by Aardman. There were some other interesting installations on the second floor which caught my attention.

Here is a collection of pictures and videos I took at the exhibition:

Here is a photo I took of a spinning wheel which you stare at for about 30 seconds to a minute; when you then stare into the palms of your hands, it creates the illusion that the lines on your skin are moving. Really cool effect!

This model of an eye is obviously there to demonstrate to children how the eye works. However, when I stared inside the eye using the built-in goggles, I saw the image from outside being reflected into the eye in an obscure way.

Here is the inside of the eye model, which shows the image from outside reflected upside down. This is exactly what a camera obscura does to an image when projecting it onto a surface.

On the second floor, there were lots of animation type pieces, which I really liked.

This is a circular table with a glass dome. In the centre is a mirror which stays stationary; around it is a series of models of the same character in different positions. When you spin the table fast, these models are quickly reflected in the mirror, which creates the illusion of motion. Below is a short video I took of this process.

This is inside the planetarium. Going back to my original idea of having a 180-degree curving screen in a cinema, the giant curved screen they used in the planetarium was slightly similar to what I originally had in mind. The difference was that their screen was more dome-shaped, as it covered the roof area. It was an interesting experience being surrounded by screens, but after a while it does hurt your neck if you’re looking up constantly.

After checking out the planetarium, we went back to look at other installations in the centre.


This was a really nice and simple piece of interactivity: a projection of a pond with animated fish that react when someone walks across it. The techniques used for this installation are similar to those our group is going to use for the Future Cinema project. Just after looking at this, I was talking to Jason about how we could avoid casting shadows in our environment, and he said we should use infra-red lighting when projecting the imagery to avoid shadows being cast.

Some Nice Visuals

I don’t know what this piece of work was called, but it was my favourite bit in the centre. It looks like small chunks of ice reacting in some concentrated liquid that makes them fizz around and create these really nice abstract trails of smoke. I used my camera to capture this piece on video. There was a glass surface onto which I was able to place my lens; I left it stationary until the particles started moving out, then tried to follow them as best I could. I wish I could have spent a bit more time capturing footage. This is something I’d be interested in doing in future projects.

Arnolfini Exhibition

I had a quick look around the Arnolfini exhibition. It was essentially three exhibitions shown as part of its Old Media season, focusing from various perspectives on the history of software art and its playfulness, alongside the impact of technology in relation to ‘progress’: consumerism and globalisation.

Open Circuit

Open Circuit is an interactive sound installation resembling a giant circuit board. Copper tracks trail across the floor carrying signals of ‘everyday’ sounds, which become audible when visitors place portable speakers directly on top of them.

Different positionings form different soundscapes. The copper wires reveal the visual aesthetic of hardware and amateur electronic engineering, and the dynamics with which it is able to engage with signals across users, space and time.
