Hey everyone, Matt from soundrolling.com here with Dillon Bennett. He is a sound effects editor, supervising sound editor, and sound designer on films such as Captain Phillips (2013), The Theory of Everything (2014), and Sherlock Holmes: A Game of Shadows (2011). Thanks so much for joining me, Dillon, and thanks to everyone reading and supporting this resource.
I would love to start with Captain Phillips. I have already had a chat with Rachael Tate, who worked as the dialogue editor on Captain Phillips, and would love to know your considerations when it comes to effects and dialogue:
Effects and dialogue are two very different disciplines, and there tends not to be a huge amount of interaction in the editing process. Obviously there are considerations in the areas where they overlap – such as PFX (production effects) and foley/FX, and also crowd and backgrounds. If I’m working on an area where there are audible effects in the production sound, I’ll sometimes check whether the production sound is the sync take for that slate or a cheated alt take. If I’m working on backgrounds and editing library crowd sounds, I’ll perhaps check how much crowd ADR is going to be shot for the sequence.
The two really come together when there are temp mixes for screenings and in the final mix. By the time of the final mix both camps will have premixed their work and have them in the best state possible and decisions are then made, moment to moment, about the harmony between effects and dialogue. Some sequences will benefit from the dialogue in the forefront with the effects played subtly, others will need to lead with effects just keeping enough of the dialogue there to keep you attached to the characters, and sometimes they both need to be mixed in a staccato way, taking it in turns to appropriately mark each moment on screen.
Of course, then the music comes along and obliterates everything!
You recently finished work on this year’s Steve Jobs movie. As a sound effects editor, can you name some examples of sound effects you edited? I’m interested in the relationship between FX and foley.
One of my roles on Steve Jobs was the machines. The film is set over three acts, each being a product launch, and I needed to track down, record and edit the sounds of these original computers – not just the sound they make when they run, but the very different keyboards and mice, the beeps and tones, and the actual sound of them being handled.
The computer with the most airtime was the 1984 Macintosh 128K, and it’s the one whose recording and editing would traditionally fall into both effects and foley categories. You could say that a good way to distinguish between the categories is that the effects are the sounds the machine makes itself, and the foley is the sounds of humans interacting with the machine.
This Macintosh doesn’t have a hard drive, instead using a floppy drive and disks to run everything from the operating system and programs to actual files. So unlike more modern computers, which have quite subtle noises of fans and hard drives spinning, this machine has a noticeably mechanical sound. It whirs, stutters and grinds when doing the most basic tasks, such as copying a file. It also beeps, and it has only one type of beep. But it’s a distinctive beep because it’s played through a tiny old speaker hidden behind two layers of plastic. These were the effects for the Macintosh, and editing them was simply a task of placing the appropriate floppy disk drive sounds and beeps at the times when the machine would be making them.
The more foley-like elements involved the distinctive chunky clacking of the keyboard keys and the block-like mouse, as well as the sound of the thick, coiled cables as the machine was moved about. When you are recording foley you try to capture it as closely in sync with the picture as possible, so editing is more a case of fine-tuning.
Team dynamics must play an important role in making the post-production process efficient and as creative as possible. How is the responsibility of your role decided at the start of a project like Everest, where there are multiple editors and sound designers?
It really depends on the job. Factors such as the schedule, size of the crew, and the supervisor’s preferences will play their part. The editors/designers might each work on whole reels – covering all of the spot effects, backgrounds and design or sometimes in specific areas or on specific content.
On Everest, I was doing specific areas and content – mainly recording and editing bespoke sounds. From day one we had access to the production props (the climbing equipment, tents and so on), and I recorded as much of this as I could. Later I had the chance to travel to Nepal to shoot crowd and ADR with the Nepalese actors and record effects in Kathmandu. Back in the UK the recording continued with a mammoth effects shoot of every conceivable type of snow on every possible surface, and various specifics such as ladders and ropes.
I’m not sure how much of my role was decided at the beginning of the job in this case. Nowadays it’s common for a large-budget film to have several temp mixes scheduled before the final, so you might find yourself working on one area in the build-up to a temp only to be doing something else entirely afterwards. You often hear people moaning about the need to do these temp mixes – they certainly take time away from editing – but they are actually a great way to listen to everything at once, take stock, and see more clearly which areas need the most work.
Some of the films you have worked on are in 3D, does this have any effect on the sound effects editing?
In a sense, film sound has been 3D since 5.1 was introduced. The rear speakers allow editors to surround the audience with sound, and with newer formats such as 7.1, 9.1 and Dolby Atmos becoming more commonplace, we can be even more detailed and specific with these surrounding sounds. On the whole I tend to leave the sound of anything happening on the screen in the LCR (the Left/Centre/Right speakers behind the screen), as anything that you see in front of you but hear in the surround speakers can be jarring. I think also that the days of 3D films having objects appear to fly into the audience’s face are pretty much over, and now it’s more about adding a realistic depth behind the screen as opposed to in front of it.
So – to answer the question – not really. We work with 2D picture both in the cutting room while editing and in the theatre while mixing. There are times when the film is run in the mixing theatre in 3D to check that it translates, but I’ve not heard of any mixes being done in 3D throughout.
You also worked on Frankenweenie as a sound effects editor. Do you find you have to edit more effects for animation than you would for, say, The Theory of Everything?
The short answer is no.
It’s expected that as an effects editor you cover everything. This became a necessity when foreign versions and M&E mixes became standard, but it’s also expected by filmmakers and audiences – even if only on an unconscious level in the latter case. Most people I speak to outside the industry don’t understand what we do. There is an assumption that a large amount of the sound is recorded on set, and general bafflement at the lengths we go to in terms of detail. But play those people a film with just the original production sound and I guarantee they’ll say it sounds terrible, because they are accustomed to hearing film sound that has had a huge amount of effects work done on it. Filmmakers obviously know how much work goes into the effects, and they expect this level of detail so they can tell their story properly.
The thing that really determines how much editing needs to be done is the busyness of a film. A film that takes place in one room with two people talking, whether animation or live action, is going to require less editing work than an action film.