Filed under: Uncategorized
As I’ve finished uni, I’ve decided to start fresh with a new blog where I’ll be continuing to post and share things I love!
I’m now on Tumblr:
I have also set up a website:
Short Project Evaluation
For the extended major project, I wanted to follow on from the Specialist Project, which was to shoot another short documentary on an iPhone. I wanted to carry on exploring this idea of being discreet when filming because I feel it’s something unique to filmmaking. On the last project, I didn’t stand out from the crowd; I was able to be a ‘post-filmmaker’, blend in with the people on location and capture the rawness of people’s state of mind. In short, I felt invisible because I was using a mobile phone, which to the general public doesn’t really suggest they’re being filmed or photographed. This led me to call the overall project The Invisible Camera.
My approach to the EMP has been relatively similar to the Specialist Project; however, this time I’ve also been looking at how video content can be delivered in alternative ways in the 21st century. To take this further, I wanted to deliver this documentary not only as a video but also as an iPad publication. This new medium also meant that I could incorporate other types of media into the project, such as photography.
I had a series of tutorials with storyboard artist Tony Chance, who guided me through my development process. I was initially looking at creating a documentary with an environmental focus, but this approach didn’t really ignite any creativity for me and I felt quite restricted by it. I then found myself exploring themes such as connection and disconnection, which led me to develop the idea of technology withdrawing people from interacting with each other face to face. I decided to name the documentary iSolation, echoing Apple’s signature naming for its products (iPod, iPhone, etc.) to correspond with the tool I was filming with.
Although this idea emerged quite late in the project, I was really pleased that I came up with a topic I could relate to from my own personal experience. I linked this idea to the philosophy of existentialism, which Steven Earnshaw describes as…
…a philosophy that takes as its starting point the individual’s existence. Everything that it has to say, and everything that it believes can be said of significance – about the world we inhabit, our feelings, thoughts, knowledge, ethics – stems from this central, founding idea. (Earnshaw, 2007, p. 1)
I wanted the documentary to explore the idea of isolation in a densely populated place. Our dependence on technology has, in a way, enabled us to cut ourselves off from interacting with other people and the environment we’re in. Over a period of ten years there has been a radical change in mobile phone technology, and phones have become a huge attachment in people’s lives. Even if we are aware of what’s going on around us, we’re not always 100% focused on one thing; for example, talking to someone face to face whilst texting. Listening to music on the move is another example of isolation amongst a crowd of people. People may be facing forward, but their minds could be immersed in the music. I’ve experienced this form of isolation myself, so I’m aware that I’m not paying attention to what’s going on around me. It appears we now spend a lot more time living in our imagination, because I believe we all want to avoid hearing about all the negative events happening in the world.
For the editing, I was heavily influenced by the music. As in Koyaanisqatsi, the music dictates how the documentary flows. In my documentary, the opening begins with slow ambient music, so the editing was subtler. When the music became more upbeat, I wanted the shots to be fast and tightly cut. I began including a lot of short transition cuts, roughly half a second long, which are basically extracts of quick movements from the footage I’d shot. I was often holding the camera in a position where I wasn’t able to see what was being shot, so I recorded a lot of unwanted unsteady footage.
As an observational piece, I felt that a musical score would be an appropriate accompaniment to the visuals. Depending on the genre of the music, I could evoke almost any emotion I wanted in the overall piece. My initial documentary, Life through an iPhone, had a Moby track called Wait For Me playing throughout. This is where I took inspiration from Koyaanisqatsi, where the music emphasises how the audience is meant to feel. This is something I wanted to achieve in both my projects. This time I used another Moby track, ironically called Isolate, which I felt perfectly represented the mood I wanted to create in my piece.
Filed under: Extended Major Project
I’ve finally managed to upload the final version of my publication to the digital publishing suite online and have it downloaded onto the iPad. I decided to take some screenshots of all the pages in both orientations to show the layout of the article as you swipe vertically.
Swiping down takes you to the next page, which has a short extract of a poem. The rest of the poem continues as you work your way to the end of the publication.
The next page is the documentary page, which is the main attraction of this publication. The documentary video is where the whole project derived from and has been my main focus. Underneath the video there is a small synopsis, which the user can scroll through while watching. There is also the option to view the video in full screen: tapping the video brings up the controls menu, which includes a full screen button, and pressing it switches the video into full screen mode.
After watching the video, the user can then carry on exploring by swiping down. This will take you to the next page, which shows the second extract from the poem.
When you swipe to the next page you will come to the photography page. On this page there is the option to tap the image so the slideshow plays automatically, or the user can swipe through the images manually. As an additional feature, I embedded some audio on the page, a small excerpt from the audio I used in the documentary. This can be heard when the user taps the speaker icon in the bottom left-hand corner. To stop the music, you just tap the icon again.
Once the user has finished viewing the photography gallery, they can then carry on scrolling to the next page, which shows the final excerpt of the poem.
After swiping down again, you arrive at the final page of the publication. I called this the video montage page because I created another short video here, inspired by David Hockney.
I made a short video demonstrating this publication fully working on the iPad. The only issue I have with it is the switch between landscape and portrait, as the two orientations don’t seem to sync properly. In the video you will see the page go blank when switching to a different orientation; you have to swipe around quickly to bring in the other orientation. I’m unsure if this is something that can’t be avoided, as the Digital Publishing Suite is still not 100% efficient in its workflow. However, I’m still pleased that both orientations can be viewed on the iPad, despite this annoying glitch during the transition.
I’ve finally managed to overcome the orientation issue on my digital publication. I discovered that I had been creating two separate articles, which didn’t link to each other.
Filed under: Extended Major Project
I’ve finally managed to obtain an iPad to test out my first digital publication uploaded to the device. I’d tested it out using the Folio Builder in InDesign. It’s been uploaded in both portrait and landscape view, although it does appear to show some glitches when changing orientation.
Filed under: Extended Major Project
For the colour grading I used Magic Bullet Looks. I saw some really great examples of what you can produce using Looks, which is why I wanted to use it myself. I had often used Magic Bullet Mojo on previous projects, but I felt the results all looked the same, so I wanted to try something different with more control. I think colour grading is a crucial part of putting the final touches to a film because it can dramatically change the perception of the whole thing. Here is an example of what people have produced using this plugin.
The second video shows the contrast between the ungraded and graded footage.
Technicolor is a color motion picture process invented in 1916 and improved over several decades. It was the second major process, after Britain’s Kinemacolor, and the most widely used color process in Hollywood from 1922 to 1952. Technicolor originally existed in a two-color (red and green) system. The frames exposed behind the green filter were printed on one strip of black-and-white film, and the frames exposed behind the red filter were printed on another strip. After development, each strip was toned to a colour complementary to that of the filter—red for the green-filtered images, green for the red-filtered.
Here is an example of a recent film that appears to have been graded in a similar style: Madonna’s film W.E. It’s definitely a solid colour grade to add to a film that has a rich sense of emotion and history within its context.
After I had edited all my clips accordingly, I opened up Magic Bullet Looks. Opening Looks takes you to a separate window from Premiere Pro, as I’ve shown in the screenshot below.
I’ve screen-captured some shots from my documentary to show the graded and ungraded footage side by side. I first strengthened the contrast of the shots, as I felt all the dark colours looked a bit weak. I then applied the 2-Strip Technicolor filter from the Camera section at the bottom of the screen. I also adjusted the saturation, as some shots needed dulling down after becoming increasingly vibrant with the 2-Strip filter. I can see a big difference between the video colour graded in this way and the ungraded version.
There is a sense of irony in the colour grading: I’m aiming to make the video look quite old-fashioned, yet I’m combining that with my exploration of new practical filmmaking techniques.
Colour Graded / Original Footage
One aspect I’ve tried to embed in my documentary is emotion, and I feel colour grading the overall piece in this way will help emphasise it.
After much criticism of my 3D composite of the title ‘iSolation’, I’ve replaced the shot completely with a 2D version in a different scene. The only negative feedback I received after people had watched the documentary concerned the 3D composite. Even after watching the whole piece, people were still irritated by this really short scene at the beginning. Tony Chance, who guided me through the development of this project, said that it looked out of place and unnecessary.
I initially chose to create a 3D title because I’d seen a really impressive tutorial (mentioned in a previous entry) that showed me how to composite 3D text into a live scene. What impressed me most was that you could light a 3D object using the pixel data from the shot you want to composite into. I was inclined to do this as I wanted to learn the basic principles of compositing 3D elements into video. It’s something I’d been struggling to do for the last couple of years, but I think I’ve now managed to grasp the concept behind compositing really well. Despite all this effort, the outcome didn’t work in the final piece.
However, I didn’t want to replace the 3D composited shot with something really basic, such as 2D text on a black background. I decided it may be better to composite a 2D version of the title into a slightly different scene that conveys the idea of isolation a little better. I found a shot I had recorded at Maida Vale tube station when it was completely empty, and I thought this would be a good scene for the title to appear in. I really wanted to make the text become an actual object within the scene, so it looks as if it has isolated itself.
This led me to reflect back on an initial tutorial I had looked at, which was about compositing text into a still image. If you look at the header image at the top of the blog page, you’ll see I added some text into the scene using a 2D text object. This was all created in After Effects using its 3D workflow.
Here is a video showing the new shot without the composite, and then the shot with the 2D title composited in.
It was initially easy to set the text up in the scene. The first thing I did was create a new white solid and turn it into a 3D layer. As in many 3D programs, you create a plane to help you get the perspective right and align the object you want to fix into the scene.
I then turned the plane into a grid by applying the Grid effect from the effects menu. This gave me more flexibility to see how well the plane was positioned at roughly the same angle as the ground in the shot. I also created a new camera, which I used to help make the 3D environment replicate the perspective of the actual scene. The camera is a way of controlling the plane and anything else you add to the scene, without having to adjust the position of each object individually.
The next step was to create the 2D title text, ‘iSolation’. I then switched on 3D for the text layer, and it snapped to the perspective of the virtual camera I’d just created. I then added a virtual spot light to the scene, which would shine onto the 2D text and cast a shadow.
It was then a case of adjusting the text’s position to make it look like it was actually at the tube station when I shot the footage. I also positioned the light where there were actual lights, thereby replicating how a shadow would look if an object or person were standing there.
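I did all of the above by hand in the After Effects interface, but for anyone who prefers scripting, the same setup can be sketched in After Effects ExtendScript. This is only a rough, hypothetical sketch (the layer names are mine, and it assumes the station footage is already in the active comp), not the process I actually followed:

```javascript
// Hypothetical ExtendScript sketch of the scene setup described above.
// Assumes the active item is the comp containing the station footage.
var comp = app.project.activeItem;

// 1. A white solid turned into a 3D layer, used as the ground plane.
var plane = comp.layers.addSolid([1, 1, 1], "Ground Plane",
                                 comp.width, comp.height, 1);
plane.threeDLayer = true;

// 2. A Grid effect on the plane makes it easier to judge whether it
//    sits at roughly the same angle as the ground in the shot.
plane.property("ADBE Effect Parade").addProperty("ADBE Grid");

// 3. A camera used to match the virtual perspective to the real scene.
var cam = comp.layers.addCamera("Scene Camera",
                                [comp.width / 2, comp.height / 2]);

// 4. The 2D title, switched to 3D so it sits in the camera's space.
var title = comp.layers.addText("iSolation");
title.threeDLayer = true;

// 5. A spot light, positioned over the real lights so the text casts
//    a believable shadow.
var light = comp.layers.addLight("Key Spot",
                                 [comp.width / 2, comp.height / 4]);
light.lightType = LightType.SPOT;
```

From here, the text and light positions would still need to be tweaked by eye, exactly as described above.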
I had to motion track the footage so that I could fix the text into the scene. I was having trouble tracking in After Effects because the footage was shaky, so I turned to Mocha, a planar tracking and rotoscoping utility that ships with After Effects. With it, I was able to track the entire surface of the station platform, as opposed to individual tracking points. I then exported the tracking data to After Effects and applied it to a null object, which I parented to the pre-comp containing all the attributes of the 2D text. Although it tracked well, I still didn’t like the shakiness of the footage, so I ended up stabilising the whole video to get smoother playback. Notice in the video example at the top that the original footage is shaky, while the final outcome is much more stable.
I do feel the outcome of this technique looks more realistic compared to the 3D version. As I said before, I wanted the text to truly represent the title iSolation by placing it in a completely empty scene, so that it looks isolated.
Filed under: Extended Major Project
Even though I’m reaching the end of my project, I’m still continually researching the impact of the iPhone as a tool for filmmaking. I’ve just been reading about an anonymous Al Jazeera journalist who filmed a documentary on an iPhone 4 and 4S: a compelling first-person account of a country in turmoil and a revolution in progress, shown in an intriguing episode of People and Power. One interesting point the writer makes is that the ‘Footage from the iPhone 4 and iPhone 4S is good enough to make television with.’
The film premiered on the programme People and Power and lasted approximately 25 minutes. According to Al Jazeera, the iPhone’s “tiny camera, filming secretly on street corners, through car windows, and behind closed doors, [let the correspondent] gather images that reveal ordinary people showing extraordinary courage.” It also made it easier to hide from the Syrian government, which is actively targeting journalists trying to cover the conflict. This is exactly the same concept as what I’ve been doing on my last two projects, except I’ve not been filming in a war zone or trying to hide from a government. If it hadn’t been for the small camera in the iPhone, journalists would have seriously struggled to capture any material to share with the media.
This is the first time an iPhone-filmed documentary has been aired on television, and it goes to show that footage from the iPhone 4 is good enough to make television with. Mobile phone cameras have been responsible for a series of negative social changes; just think of the way phones are now commonly lifted into the air at concerts, ruining the view of the band. But around the world, from Occupy Wall Street to the revolutions in Syria and Egypt, camera phones have allowed citizens and activists to discreetly videotape events of genuine importance. And now they have become a tool for intrepid undercover journalists as well.
“The tiny camera, lets the correspondent gather images that reveal ordinary people showing extraordinary courage.”
However, it’s been reported that Syria has banned iPhone imports and served mobile phone-wielding activists with notices from the customs department of its Finance Ministry.
The full documentary can be viewed on YouTube below:
After watching this short documentary, I feel incredibly happy that I’ve created a body of work this year generated through an iPhone. I’ve come to realise it’s the mere concept of a camera being built into a phone that is the most powerful aspect of both my projects this year. The iPhone is just one brand of mobile device, but its camera capabilities are what make it stand out from other mobile phones. Stopping iPhone imports isn’t going to stop journalists doing the same thing, as there are plenty of other phones with cameras available. The Syrian government would have to stop imports of all mobile phones with built-in cameras to prevent people from exposing the reality.