
Sound Behind the Scenes: Interstellar

Interstellar is one of Christopher Nolan's most adventurous and creative pieces of work, and when it comes to sound, the approach for such an experimental film was no exception.

During an interview with The Hollywood Reporter back in 2014, the director described how he prefers to approach sound for his films. Nolan chose to treat sound in a highly impressionistic way, which is an unorthodox approach for a mainstream blockbuster like Interstellar; however, five years after its debut, we can assert that it was the right approach for an experimental film.

Nolan himself described the approach as creative and audacious. And if we take a closer look at how the film's sound was developed, we can assert that, compared to other filmmakers who have approached sound boldly, Nolan did a great job.


In previous articles we've mentioned the importance of sound in storytelling: many people, especially sound professionals and directors with a deep knowledge of sound, distance themselves from the idea that clarity, especially the clarity of emotions and stories, can only be achieved through dialogue. And that is a really important takeaway.

If directors really were to make the most out of sound and sound elements, they would end up working in a more holistic, layered way, using all the different elements at their disposal: moving images and sound.

You will probably remember some viewers complaining about the movie's sound after its premiere on November 5, 2014, claiming that they were unable to properly hear some key dialogue lines. That led to a myriad of conversations about whether the fault lay with the sound mix or with the sound systems in some of the theaters where the film was played. Nolan addressed these questions directly: he said the movie's sound was exactly as he had initially envisioned it, and he even praised theaters for presenting it properly.

Aside from his tremendous body of work, Nolan is renowned for being a passionate believer that sound is as important as the moving images, which is why he is fond of hearing how his projects sound in actual theaters. During the same interview, Nolan said he traditionally visits up to seven different theaters around the world just to hear how a movie's sound is performing.

As mentioned earlier, Interstellar caused people to question whether the film's sound was right. Essentially, with films like these, it is possible to mix sound in an unconventional way, as the team did here. That can catch some viewers off guard, but audiences in general come to appreciate the experience, which is what happened with Interstellar in the weeks after it premiered.

The Team Behind the Sounds

The movie's sound is attributed to tight collaboration among German composer Hans Zimmer, mixers Gary Rizzo and Gregg Landaker, and sound designer Richard King. According to the director himself, they made carefully considered creative decisions, and the movie is full of surprises sound-wise. In fact, there are several moments throughout the film where Nolan decided to use dialogue as a sound effect, which is why it is sometimes mixed slightly beneath the other sound effect tracks and elements to emphasize how loud the surrounding sound actually is.

As an example, recall the scene in which Matthew McConaughey's character drives through the cornfield, which is extremely loud and, to some extent, frightening (Nolan himself was riding in the back of the car while the point-of-view shots were filmed). Nolan wanted the audience to experience firsthand how chaotic the situation was by making them feel all of that turbulence through sound.

Another example is the cockpit scenes, where you hear the creaking of the spacecraft. That's actually a very scary sound, and it was loud enough for people to get immersed in the story and actually feel what space travel might be like. It was all about emphasizing intimate elements.

The movie is a case study on its own. Nolan also explained that sound designer Richard King managed to capture high-quality sounds inside the truck during the scene mentioned above, and then chose to echo them later in the film, in one of the key spacecraft scenes, in hopes of making it more truthful to what astronauts actually experience and hear.


Nolan also resorted to other elements to characterize the different planets the protagonists visit throughout the film, not just with moving images but with sound as well. He stayed away from the traditional layering of sound elements and chose to delineate the planets through recognizable sounds: the water planet is all splashing, in contrast to the ice planet, with its crunchy sound of glaciers.


Oscar for Best Sound Mixing and Editing Explained

In this article, we're going to look at perhaps the two most confusing Oscar categories: Sound Mixing and Sound Editing. If you're not familiar with the sound and audio post-production landscape, these categories might seem like exactly the same thing; however, there are real differences, and that's why we often see a movie nominated for both.

The big thing to understand about sound editing versus sound mixing is that sound editing refers to the recording and creation of all audio except for music. And what does that include? Dialogue between characters, the sound picked up in whatever location a scene was shot, and sound recorded in the studio: ADR, extra lines of dialogue, all those sounds created to mimic animals, vehicles, or environmental noises, the Foley, and so on.

Sound mixing, on the other hand, is balancing all the sound in the film. Imagine taking all of the music, all of the dialogue lines, all the sound effects, the ambient sounds, etc., and combining them so they are perceived as balanced, beautiful tracks.
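To make the balancing idea concrete, here is a minimal sketch (in Python with NumPy; the track data and gain values are made up for illustration) of what a mix bus does at its core: each track is scaled by its own fader gain, expressed in decibels, and the results are summed into one output.

```python
import numpy as np

def db_to_gain(db):
    """Convert a fader gain in decibels to a linear multiplier."""
    return 10 ** (db / 20)

def mix_tracks(tracks, gains_db):
    """Sum several equal-length mono tracks, each scaled by its own gain."""
    mix = np.zeros_like(tracks[0], dtype=float)
    for track, gain_db in zip(tracks, gains_db):
        mix += np.asarray(track, dtype=float) * db_to_gain(gain_db)
    return mix

# Dialogue kept at unity, effects pulled down 6 dB, music down 12 dB.
dialogue = np.array([0.5, -0.5, 0.5])
effects = np.array([0.8, 0.8, -0.8])
music = np.array([0.2, 0.2, 0.2])
mix = mix_tracks([dialogue, effects, music], [0.0, -6.0, -12.0])
```

A real dub stage does this across hundreds of tracks with automation over time, but the underlying operation, gain plus summation, is the same.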

Some people refer to this last category as an 'audio tiramisu', as there are layers upon layers of sound that, in the end, compose a beautifully orchestrated whole: layers of what's happening in a film's particular scene, in the real realm, and layers of what's happening around it, as if in a spiritual realm.

If you recall The Revenant, the semi-biographical epic western directed by Alejandro G. Iñárritu and nominated for several Academy Awards, including both sound editing and sound mixing, the idea of a film's sound as a total 'audio tiramisu' becomes more noticeable. In The Revenant, the sound was so perfectly crafted that it was as if two different stories were taking place side by side, and you could only distinguish between them by listening.

When it comes to sound editing, take another movie, Mad Max: Fury Road, the 2015 post-apocalyptic action film co-written, produced, and directed by George Miller. The movie contains all of these amazing recordings of cars, fire, and explosions alongside really subtle dialogue, which ultimately creates enormous contrast between the action and what the characters are actually saying. Max, played by Tom Hardy, was really quiet, whereas Imperator Furiosa, played by Charlize Theron, was screaming at the top of her lungs, and all of that happened in the middle of the most frenetic action possible. All the audio was used and mixed at the same time.

Using and mixing all that audio at the same time was, in reality, a huge achievement. Rumor has it they used up to 2,000 different channels, meaning 2,000 different audio pieces at one time, which is clearly noticeable in the opening car chase sequence, where you can perceive just how much sound was being used. The movie, in the end, managed to mix all the dialogue, the quiet dialogue, the effects, the action, the environmental sounds, etc., and to make it all work together.

The Process Deconstructed

The relationship between sound editing, sound mixing, and storytelling, however, is perhaps the cornerstone of the whole audio post-production process. How sound design and sound mixing can be used to help storytelling, specifically in film, is the main question that audio technicians strive to answer.


First, they approach both practices thinking about how they can make the tracks sound better, and then about how they can add to the story: make the audio tell the story, even when you don't specifically see what's going on. In terms of sound design, the whole idea behind this creative process is coming up with key takeaways regarding the purpose of the scene, and whether there are specific things that don't appear in the moving images but are still 'there' and need to be told.

After having analyzed the scenes in terms of what can be done to improve the overall storytelling, audio technicians start to balance the dialogue track by track, which is, of course, a process that takes several hours. Is it necessary to add room tone? Is it necessary to remove it? Those types of questions normally arise during this part of the process. Afterward, the EQ stage starts.

EQ is normally the part of the process where audio technicians do a bit of cleanup by adjusting the frequency content of the sounds the audience will hear, so that they come through clearer and better defined. This matters for storytelling because, by using an equalizer, audio technicians can add texture to the voices and sounds people will hear, which is, of course, what the whole storytelling effort is about.
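As a rough illustration of what "adjusting the frequency content" means, here is a toy one-pole low-pass filter in Python with NumPy (the cutoff, sample rate, and test signals are invented for the example; real EQ work happens in a DAW with far more sophisticated filters). It attenuates content above the cutoff, the kind of move an engineer might make to tame hiss on a dialogue track.

```python
import numpy as np

def one_pole_lowpass(signal, cutoff_hz, sample_rate):
    """Toy one-pole low-pass filter: passes low frequencies, attenuates highs."""
    # Smoothing coefficient derived from the desired cutoff frequency.
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sample_rate)
    out = np.zeros(len(signal))
    acc = 0.0
    for i, x in enumerate(signal):
        acc += alpha * (x - acc)  # exponential smoothing step
        out[i] = acc
    return out

sample_rate = 48_000
t = np.arange(4_800) / sample_rate
voice_body = np.sin(2 * np.pi * 200 * t)  # low-frequency "body" of a voice
hiss = np.sin(2 * np.pi * 8_000 * t)      # high-frequency hiss stand-in

filtered_voice = one_pole_lowpass(voice_body, 1_000, sample_rate)
filtered_hiss = one_pole_lowpass(hiss, 1_000, sample_rate)
```

Measuring the levels before and after shows the 8 kHz content dropping sharply while the 200 Hz content passes almost untouched, which is exactly the kind of selective reshaping an equalizer performs.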


The Sound of An Oscar Nominee: A Star Is Born

Have you ever wondered what it takes to craft a compelling sound? What techniques and technologies sound professionals have used to hit the spotlight and be recognized by the industry? Now that the Oscars are around the corner, a lot of conversations start to arise, especially about the nominees.

In this installment, we're going to go through the sound of A Star Is Born, as the movie has been nominated for Best Sound Mixing. Steve Morrow, who offered some behind-the-scenes insights into recording Lady Gaga and Bradley Cooper, was responsible, alongside Tom Ozanich, Dean Zupancic, and Jason Ruder, for this part of the audio post-production process.

In a recent interview, sound mixer Steve Morrow said that both Gaga and Cooper wanted the film to have a particular style of sound: they wanted it to sound like a live concert, which makes sense given Morrow's experience shooting at live venues like the Glastonbury Festival. The request, however, ended up posing a real challenge: “At Glastonbury, we all went in there believing we had almost eight minutes to shoot, but we later found out the festival was actually running late, so they only gave us like three minutes,” Morrow said.

The sound mixing crew later explained that the idea was to film three songs, but given those circumstances, they decided to play 30 seconds of each. As for the sound mixing process, Morrow mentioned that the idea from the very beginning was to capture all sounds live, all the performances, all the singing, etc., which ultimately turned into a Lady Gaga mini show, as the music wasn't amplified in the recording room.

Such conditions led Morrow to note that his role on A Star Is Born differed a bit from a more typical production. On a normal set, it is the production sound crew's responsibility to record lines of dialogue while capturing whatever environmental sounds or effects happen during filming. On A Star Is Born, Morrow and the rest of the sound mixing crew had to do all of that while also recording the band and the live singing, making sure they had captured all the tracks.

After that, the team would hand those tracks to the editorial and the post-production crew. Sound people would then take all that information, mix it down accordingly, and that’s practically what you hear in the film. Nothing else.

As for the trickiest part of the film, filming the live concerts, Morrow took a rather unconventional approach to getting those tracks. The crew had to film at two real concerts: Stagecoach and Glastonbury. They had to take advantage of the time between acts: while Willie Nelson was waiting for his curtain call to come on stage, Morrow and the crew made the most of the eight minutes they initially had to get the tracks.


What they would do, according to the mixing crew, and what was ultimately different from all the other recordings they carried out in controlled spaces, was to approach the monitor engineer with some equipment and take a feed from the monitor desk alongside the mic Bradley Cooper was going to use.

Most of the time, they would play back the band through the wedge, the small speaker a performer stands in front of during live shows. Morrow and the rest of the mixing crew would then put those playback tracks through so that Bradley Cooper could hear them, but the crowd couldn't, as they were standing far enough away from the speakers. So, in a nutshell, what they did to record the live concert scenes was to have Bradley Cooper sing live while hearing a playback of the instruments through the wedges.

An additional challenge was making sure not to amplify any of those tracks and performances, as Warner Bros. didn't want the music to be heard by the crowd, so as not to risk losing its impact. Such demands forced the mixing crew to mute practically everything as much as they could, which was also different from the way film producers work in controlled locations.

Having a big crowd in front makes the process far more challenging: the whole crew, film, picture, sound, etc., only has a few minutes to shoot, which increases the chances of not getting a clean sound. In controlled scenarios, sound crews normally record up to ten different tracks, whereas in front of a live audience, they not only need to keep the playback from being heard but also have to record the live audience for the desired effect.

Dialogue Editing and ADR With Gwen Whittle

If you recall the movies Tron: Legacy and Avatar, they both, aside from having received Oscar nominations, have one name in common: Gwen Whittle. Gwen is perhaps one of the top supervising sound editors working today, which is why a lot can be learned from her work.

Gwen also supervised the sound for both Tomorrowland (starring George Clooney and Hugh Laurie) and Jurassic World (starring Chris Pratt), and although she's known for overseeing the whole sound editing process, she has mentioned in several interviews that she pays special attention to both dialogue editing and ADR sessions, topics we've covered in previous articles on the Enhanced Media blog.

Dialogue editing, as George Lucas noted back in 1999, just before Star Wars: Episode I hit theaters, is a crucial part of the whole sound editing landscape, and, apparently, even within the industry, nobody pays enough attention to it. In fact, dialogue editing is the most important part of the process.

So, what’s dialogue editing?

Dialogue editing, if it's done really well, is, according to Gwen Whittle, unnoticeable: it's completely invisible, it should not take you out of the movie, and you should pay no attention to it. Imagine going through all the sound from the set, take by take, to look much more closely at the dialogue captured for a specific scene.

Of course, not all dialogue recorded on set sounds the same. Maybe the take was great, the acting was great, the light was great, but suddenly a truck pulled over or an airplane happened to fly over the crew. It's practically impossible to recreate that take, as there are many variables involved: air changes, foreign sounds, etc., and no matter how hard you try to remove all those background noises, sometimes you need to resort to the ADR stage. In an ADR session, it all comes down to recreating the conditions that apply to that particular scene.

Cutting dialogue often poses several challenges to sound editors, and it depends heavily on the picture department. A dialogue editor receives all the production audio from the picture department, everything that was originally shot on set, with each mic on its own track. It's the responsibility of the picture department to isolate each mic on its own track so dialogue editors can do their magic.

On set, the production sound mixer usually records anywhere from one microphone up to eight, sometimes more, but the idea is for each actor to have their own mic, plus at least one or two booms. All of this is passed on to the dialogue editing crew with each track isolated, matched to the moving images just as the movie is supposed to play.

Once the dialogue editing crew has received the tracks, they listen to them and assess which parts can be used and which need to be recreated, organizing which tracks will make it to the next stage. Since dialogue is often recorded with two different microphones, the boom and the talent's personal mic, sound editors can play with both tracks to make the most of them while spotting which parts require an ADR session.

If there's a noticeable sound, like a beep, behind someone's voice, a dialogue editor can often get rid of it if they need to; however, that's not always possible, which is why ADR sessions are a familiar part of the sound editing process. In films with a smaller budget, the dialogue process gets a bit trickier, since the tracks usually aren't passed on isolated to the dialogue editing crew, so they need to tackle whatever hurdles are in their tracks. Low-budget films also normally rely more on dialogue, as they don't have the resources for either lavish sets or fancy visual and sound effects.

Do directors hate ADR?

Well, according to Gwen Whittle, not many directors are fond of ADR. David Fincher, for example, is. ADR is a tool, a powerful one, and if you're not afraid to use it, you can really elevate your film, because it takes away the things that distract you from what's going on.


Actors and actresses like Meryl Streep love ADR sessions because they are another chance to perform what they just did on set. They see ADR as an opportunity to go in there and put a different color on it, another way to approach what the picture crew got in a couple of takes on set. Many things can be fixed, and several lines can even be altered. You can add a different twist to something. In fact, even by adding a breath, you can change the nature of a performance. It's the opportunity for both the talent and the directors to hear what they really want to hear.


Sound For Documentary

With the emergence of a sheer array of affordable camera recorders, the rising prevalence of mobile phones with decent video cameras, and the ubiquity of social channels such as YouTube as one of today's major media diffusion channels, it has never been this easy to produce and subsequently share documentary videos. If we take a much closer look at the whole production process, though, it is easy to assert that sound is the weakest part of many of these videos. Although it is relatively easy to shoot and record with a camera regardless of its quality, the art of placing a microphone, monitoring, and taking care of volume levels still remains an ambiguous puzzle compared to the other components of shooting a video documentary.

In today's post, we're going to go through a general outline of practical techniques and an end-to-end guide to the primary tools for recording, editing, and mixing sound for documentary audiovisual projects. Whether you are using a mobile phone, a regular video camera, a DSLR, or a prosumer or professional camcorder for shooting your project, sound will always be an important part of the storytelling.

There are many circumstances in which tremendously good results can be achieved with consumer gear; nonetheless, professional gear comes with extra possibilities. Here are some fundamental concepts directors and documentary producers need to bear in mind every time they take on one of these projects.

Sound as a conveyor of emotions - Picture as a conveyor of information


Think of the scene in Psycho of a woman taking a shower in silence. Now add the famous dissonant violin notes, and you get a whole new experience. That leads us to consider the emotional impact of a project, in this case of one scene in particular. Sound conveys the emotional aspects of your documentary; it's practically the soul of the picture. Paying special attention to sound, both during shooting and afterward in the studio, can make a real difference. Whether you're planning a simple interview or a richer-sounding piece with plenty of dialogue, the human voice is the differentiating factor between an amateur and a professional project.

Microphone placement and noise management are key

The main issue with the vast majority of amateur sound recordings is the excessive presence of ambient and environmental noise from all kinds of sources, and a low sound level relative to that noise. As a result, we've all heard how difficult it is to understand the dialogue, which is ultimately detrimental to the intended emotional impact. This common situation is one of the consequences of poor microphone placement. Directors and producers need to learn to listen to the recording and experiment with different microphones and different placement options. It all boils down to getting the microphone as close as practical to the intended sound, and as far away as possible from the extra noise that interacts negatively with the whole recording.

Additionally, if the documentary takes place outdoors, the chances of picking up unwanted wind noise are high, which is why the use of a windjammer to control wind noise is always a good idea. Regardless of whether you're a professional or an amateur taking on a documentary project, with a little bit of practice and research you can craft outstanding sound recordings, whether you're recording with professional gear or your mobile phone.

Monitor your recording

In order to craft a compelling and professional recording, you need to set recording levels properly first: not so soft that the sound gets lost in the overall noise, and not so loud that you risk distortion. When recording, always monitor the sound you're getting with professional headphones in order to avoid surprises in the edit. With digital recording devices, it's impossible to record anything beyond full scale, so refrain from crossing this limit; otherwise the recording will sound hideous, unless your camera or recording device has an automatic gain control to adjust recording levels.
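The headroom logic above can be sketched in a few lines of Python with NumPy (the safety threshold and the signals are illustrative, not a standard): digital full scale sits at 0 dBFS, and any sample that would exceed it is simply flattened by the recorder.

```python
import numpy as np

def peak_dbfs(samples):
    """Peak level in dB relative to digital full scale (0 dBFS = clipping)."""
    peak = np.max(np.abs(samples))
    return 20 * np.log10(peak) if peak > 0 else float("-inf")

def would_clip(samples, threshold_dbfs=-0.1):
    """Flag a take whose peaks ride at or above the chosen safety threshold."""
    return peak_dbfs(samples) >= threshold_dbfs

healthy_take = 0.25 * np.sin(np.linspace(0, 2 * np.pi, 1000))  # peaks near -12 dBFS
hot_take = 1.2 * np.sin(np.linspace(0, 2 * np.pi, 1000))       # exceeds full scale
recorded_hot = np.clip(hot_take, -1.0, 1.0)                    # what the device stores
```

A take peaking around -12 dBFS leaves comfortable headroom above the noise floor; the hot take comes back flattened at full scale, which is the hideous-sounding distortion the paragraph warns about.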

The shotgun myth

There seems to be a myth regarding microphones: some people firmly believe that a shotgun microphone reaches farther than other devices. This is not true. A shotgun microphone simply does not work like a telephoto lens; sound, unlike light, travels in all directions. Of course, shotgun microphones work; they have their place, and they really come in handy in somewhat noisy environments, especially when you cannot be as close to the person talking as you'd like in an ideal scenario. That being said, shotgun microphones are far from performing magic. What they really do is respond to sound from different directions differently, in terms of reduced level, null points, and coloration. Although they look impressive, plenty of sound professionals and directors choose different types of microphones for their documentary projects.


Mixing Audio For Beginners - Part 3

Here is the third installment of Mixing Audio For Beginners. If you've been following this compilation of the basics and intricacies of sound and audio post-production, we'll be picking up right where we left off in the last post about mixing; otherwise, we suggest you start from the very beginning. So, without further ado, let's continue.


We mentioned last time that when editing dialogue in a studio through ADR, it is no less than pivotal to create the right environment for recording new lines. Every time a sound professional is tasked with re-recording lines and additional dialogue in a studio, they have to pay special attention to several aspects that, if overlooked, could ruin the pace of the scene. Each dialogue edit inevitably comes with challenges, like gaps in the background environmental sound.

There's nothing more unpleasant than listening to a soundtrack where the background ambiance doesn't match the action from one scene to the next. This phenomenon is highly common during ADR sessions, which is why, aside from helping the talent match the intensity each shot requires, sound professionals also need to edit the background sounds to fill any possible hole so the scene feels homogeneous.

The problem arises when the production sound crew fails to capture room tone at a specific location and then, once production is finished, the audio post-production crew needs to replace dialogue and fill the holes with room tone. Of course, there are tools to recreate room tone based on noise samples taken from existing dialogue recordings; in fact, this is one of the most common tasks under the umbrella of audio post-production.
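As a simplified sketch of that gap-filling task (Python with NumPy; real tools crossfade loop points and match spectra, which this deliberately skips), a captured room-tone sample can be tiled to cover a hole left by a dialogue edit:

```python
import numpy as np

def fill_gap_with_room_tone(gap_length, room_tone):
    """Repeat a short room-tone sample until it covers a gap of
    gap_length samples, then trim to the exact length needed."""
    repetitions = int(np.ceil(gap_length / len(room_tone)))
    return np.tile(room_tone, repetitions)[:gap_length]

# A made-up handful of low-level room-tone samples captured on set.
room_tone = np.array([0.010, -0.008, 0.012, -0.011])
patch = fill_gap_with_room_tone(10, room_tone)
```

The patch is then laid under the re-recorded line so the ear never notices the splice; the hard part in practice is hiding the loop seam, not the tiling itself.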

Sound Effects (SFX)


Whether coming across the perfect train collision sound in a library, creating dog footsteps on a Foley session, using synthesizers to craft a compelling spaceship pursuit, or just getting outside with the proper gear to record the sounds of nature, a sound effects session is the perfect opportunity for sound and audio professionals to get creative.

Sound effects libraries are a great source for small, even low-budget, audiovisual projects; however, you should avoid leaning on them in professional films. Some sounds are simply too recognizable, like the same dolphin call you hear every single time a movie, ad, or TV show shows a dolphin. Major film and TV productions have dedicated teams to craft their own sound effects, which ultimately become as important as the music itself. Think about the lightsaber sounds in any Star Wars movie.

Additional sounds can then be created during a Foley session. Foley, as discussed in other articles, is the art of crafting sounds in a special room full of, well, junk. This incredible assortment of materials allows Foley artists to generate all kinds of sounds: slamming doors, footsteps on different types of surfaces, breaking glass, water splashes, etc. Moreover, Foley artists recreate these sounds in real time, which is why it is normal to record several takes of the same sound in order to find the one that best fits the scene. They are shown the action on a large screen and then use the materials at hand to give the action realistic sounds. Need the sound of an arm breaking? Twist some celery. Walking in the desert? Use your fists and a bowl of corn starch.


Music

Just like with sound effects libraries, when it comes to music, sound professionals have two choices based on their talks with production: they can either use a royalty-free music library, or they can, alongside music composers, create a score for the film entirely from scratch. Be that as it may, the director and producers are the ones who have the final say over what type of music they want in the project and, perhaps more importantly, where and when music is present throughout the moving images.

Sometimes video editors resort to creating music edits to make a scene more compelling. Other times, it's up to sound professionals to make sure the music truly fits the beat and goes in accordance with what is happening. The trick is to make the accents coincide with the pace of the on-screen images as the director instructed, and to make sure the music starts and ends where and when it's supposed to.


The Final Mix

Assembling all the elements mentioned in the first two parts of this mini guide and in this article into a DAW timeline, and balancing each track and group of sounds into a homogeneous soundtrack, is perhaps where this fine art reaches its pinnacle. Depending on the size of the studio, it is possible to use more than one workstation, with different teams working together simultaneously to balance the sheer array of sounds they've got to put in place.


Mixing Audio For Beginners - Part 2

In the previous article, we mentioned the importance of establishing an intelligent workflow in your audio production process. As defined by the dictionary, the word workflow means "the sequence of processes through which a piece of work passes from its initial phase to total completion." That definition, of course, can be mapped onto the audio post-production workflow phases in order to see how they work in different types of productions.


The Pre-production Meeting

A pre-production meeting gets you together with the production officials, whether the production company, the director, or the advertising agency, before production starts. If you happen to be invited to this meeting, you can, of course, express your opinions to the production team, which might save them hours of effort. If they are open to additional creative input, you could help develop the soundtrack at the concept phase. It also means your insights on the project can have an impact on the audio budget, which is always a positive thing. Remember: an hour of proper pre-production will spare you ten hours of possible setbacks.


Production

Makeup artists work their magic, services are consumed, lights are turned on, actors deliver their best performances, video is shot, audio is recorded, computers are used to animate action sequences, etc., and pretty much the whole budget is spent during this phase.

Video Editing

Once the visuals have been recorded and created, the director works with the video editor to pick the best footage and assemble the moving images in a way that tells a compelling story. Once the editing is done, the audio editor or sound engineer receives a finished version of the audiovisual project that, in theory, will not undergo further changes; that's known as "picture lock." This final version of the recorded footage can only be achieved once the deadlines have been met and the budget for those processes spent.

Creating The Audio Session - Importing Data

The video editor is responsible for passing on to audio professionals an AAF or OMF export compiling all the audio edits and additional media so they can re-create, or create from scratch, their own audio edits. Once sound editors and audio professionals import the files, they will have a much clearer idea of what they've got to do.

At this point, audio editors also import the moving images and the edited video, making sure they are in sync with the audio from the aforementioned exports (AAF and OMF).


The Spotting Session

During this phase, the director and/or the producer sit down with the audio professionals to tell them exactly what they want and, more importantly, where they want it. The entire film or video project is played so the audio team can take notes on the dialogue, the sound effects, the score, the music, and so on.


Dialogue

Dialogue is perhaps the most important part of the entire soundtrack. Experienced audio editors always separate dialogue edits into different tracks, one per actor. When audio is recorded on location, the production sound mixer often records two different tracks for each actor: a clip-on (lavalier) mic and the boom mic. Once in the studio, the audio professional assesses both tracks and chooses the one that sounds best and stays most consistent throughout the length of the moving images.

When they come across noise on the dialogue tracks, sound editors commonly reach for noise reduction tools or similar software to repair the audio without compromising the final mix.
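Dedicated repair tools work spectrally, but the basic idea behind cleaning a noisy dialogue track can be sketched with a toy amplitude gate. This is only an illustration; the function name, threshold, and window size below are our own arbitrary assumptions, not values from any real product:

```python
def noise_gate(samples, threshold=0.02, window=64):
    """Mute windows whose peak level stays below a noise-floor threshold,
    leaving louder (spoken) windows untouched."""
    out = []
    for start in range(0, len(samples), window):
        chunk = list(samples[start:start + window])
        if max(abs(s) for s in chunk) < threshold:
            chunk = [0.0] * len(chunk)  # noise-only window: silence it
        out.extend(chunk)
    return out
```

A real noise-reduction plug-in instead subtracts a learned noise profile per frequency band, which is why it can also clean noise that overlaps the dialogue itself.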


ADR

We’ve covered ADR in previous posts, in case you don’t know what the term means.


If, after applying the techniques mentioned above, the audio cannot be repaired with noise reduction software, audio professionals resort to performing ADR.

ADR means bringing the actors and talent into the studio to carry out several tasks, such as:

  • Replace missing audio lines

  • Replace dialogue that couldn’t be saved

  • Provide additional dialogue in case of further plot edits

Actors watch their scenes projected so they can recreate their lines. Normally, a cue is used to make sure they record in sync with what’s going on in the film. They also do four or five takes in a row, since the scenes are projected in a loop over and over (hence the term looping). The sound editor or audio professional then picks the best line and the best performance and replaces the original noisy or damaged take with the newer version. To match the intended ambiance, sound editors may use the same mic as the original take, but they will likely have to apply further equalization, compression, and reverb so the new performance matches the original timbre.

*The images used on this post are taken from

Mixing Audio For Beginners - Part 1

Mixing Audio For Beginners - Part 1

Have you ever wondered why your favorite films or TV shows sound so good? Or why TV ads and commercials are sometimes so much louder than the films and TV series around them? Or why that internet video you like sounds so bad?

In this mini-guide, we want to walk through the intricacies of creating sound, audio, and soundtracks for both video and film. Crafting and mixing audio for film and video is a deep subject; covering all the basics would take hundreds of pages, given the constantly changing nature of this business and the technology involved.

This first part covers basic aspects, a bit of background, some terms and terminology, and hopefully, will serve as a clear guide to understanding what mixing audio for video and moving images is about.

The World Of Audio For Video

Back in the last century, recording engineers often faced a daunting dichotomy: they had to choose a career path between producing music and producing sound for visuals and moving images, such as TV series, ads, and films. Since both career paths were considered specialized assignments, they demanded specialized tools to get everything done.

The arrival of computerized digital audio systems in the late ’80s made it possible, and much easier, to use the exact same recording tools to produce and edit both music and soundtracks. If you’ve had any experience with audio post-production, tools and systems such as AVID, the NED PostPro, and early Pro Tools might ring a bell. That era marked the beginning of a new dynamism in which terms such as convergence, where the worlds of audio and video production intertwine, became popular. As a result, many engineers learned to run audio post-production sessions during the day and music sessions at night.

Be that as it may, the process has undoubtedly evolved over the years, and modern audio post-production looks very different from those early days.

Types Of Audio Post Production

To discuss the types of audio post-production, we need to start with a necessary distinction between audio for picture and other kinds of soundtracks, such as radio commercials, audiobooks, or podcasts. Though a lot falls under the umbrella of audio post-production, the term commonly refers to audio crafted especially for a moving image or visual component. Here are the most traditional forms:


TV Shows

TV shows can be practically any length, but the vast majority of US TV programs are meant to run between 30 and 60 minutes. Many are produced by highly qualified and experienced TV studios in Los Angeles. As for reality shows, although these can be shot and recorded anywhere, they also require a good, experienced audio post-production team to mix audio and video in a professional fashion.


Films

Films vary in nature. Short films can span just a few minutes, whereas longer films can run several hours. This category includes today’s productions for Netflix, HBO, and Amazon, as well as the traditional major studios. When talking about film, the financial aspect also matters: independent filmmakers, known for producing low- to no-budget projects, still require a healthy dose of audio post-production. In fact, many sound engineers are fond of taking on these projects, since they serve as the perfect training opportunity before taking the big leap.


Commercials

Commercials include several types of visual projects. The term often covers TV commercials, infomercials, ads, promos, political ads, and so on. These are known for their rather short format; today you can come across commercials ranging from 5 to 60 seconds in length. There are, of course, much longer commercials; however, buying airtime for anything longer than sixty seconds is rather expensive.

Video games

Video games are extremely fun, and crafting audio for video games is even more fun. The vast majority of top-quality games, also known as AAA games, have a dedicated audio post-production team behind them, responsible for creating and capturing the sounds that go into the game. The work is unique to every single game and demands a daunting amount of effort, often requiring hundreds of audio files; since a game may need soundtracks in different languages, the number of files the audio team must manage grows even further.

Audio Workflow

The process through which a piece of audio work passes from initiation to completion is known as a workflow. Although we will get into more detail in a subsequent post, a traditional audio workflow comprises the following stages: pre-production, production, video editing, data import, spotting, dialogue, ADR, ambiance, sound effects, music, mixing, and delivery.


ADR: Tips And Tricks

ADR: Tips And Tricks

Automated Dialogue Replacement, or ADR, is an essential part of every audiovisual project, and knowing its intricacies is key to becoming a proper filmmaker. ADR is basically a method of adding dialogue to an already filmed scene. By superimposing dialogue recorded in a studio, or at least in an acoustically treated room or space, filmmakers can get past the challenges commonly associated with location dialogue. Location dialogue often turns out messy: environmental noise can be too loud and hard to remove, the equipment doesn’t always work the way it’s supposed to, or the crew can’t get the right background sound.

When it comes to films, almost every contemporary Hollywood production ends up with somewhere between 50% and 70% ADR dialogue. ADR is pivotal to the success of any film, and executed the right way it can salvage an entire picture.

The Basics Of ADR

Before we get into more detail, there are several elements of ADR that filmmakers must bear in mind to plan and set up their recordings properly. In looping, a repeating playback loop from the project is given to the talent while new voices and dialogue are recorded simultaneously. There are two types of looping: audio looping and visual looping. In visual looping, an actor listens to the location take several times to understand the nature of that particular scene and get a feel for the situation before recording the new dialogue. Once they’re ready to record, they no longer hear the location take but watch the scene to match lip sync. They always hear themselves over the monitors, so they can follow the lines they’re delivering in real time.

Audio looping, on the other hand, traditionally produces the most desirable outcome, though it is normally more time-consuming. The session is carried out the same way as visual looping, but with the video monitor cut and the actor hearing the original dialogue track instead. Most ADR engineers are fond of using both techniques simultaneously. They break the looped lines into much smaller parts so they don’t lose consistency and synchronization. For better sync when starting a line, ADR engineers record three beeps exactly one second apart, so actors know when the first word starts. This is known as an audio cue; like a metronome, it lets actors come in at the right moment and in the proper rhythm of the line being recorded.
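The three-beep count-in is easy to picture in code. The sketch below is only an illustration in plain Python, with an assumed 48 kHz sample rate and 1 kHz beep tone; it renders three short beeps one second apart, and the actor’s line starts where the silent fourth beep would fall:

```python
import math

def adr_cue(sample_rate=48000, beep_hz=1000.0, beep_len=0.1, spacing=1.0, beeps=3):
    """Render an ADR count-in: `beeps` short sine beeps, `spacing` seconds apart.
    The buffer ends exactly where the silent fourth beep (the line start) would begin."""
    out = [0.0] * int(sample_rate * spacing * beeps)
    for b in range(beeps):
        start = int(b * spacing * sample_rate)
        for n in range(int(beep_len * sample_rate)):
            out[start + n] = 0.5 * math.sin(2 * math.pi * beep_hz * n / sample_rate)
    return out
```

Playing this buffer ahead of the take gives the actor the metronome-like count described above.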

An ADR Recording Space

In audio post-production, filmmakers have far more control over sound than they do when recording on location. The basic goal of any audiovisual project is to give the audience a rich experience, and audio is no exception. When it comes to ADR, the main idea is to capture a really clear, clean recording in an acoustically treated space, so ADR engineers can then seat the dialogue back into the scene with proper equalization.

ADR Equipment And Gear


When recording ADR in an acoustically treated space such as an audio post-production studio, sound engineers and ADR professionals often try to use the same microphone the filmmaking unit used on location to capture the original dialogue. The goal of ADR is to match the new lines to the location recordings in both tonal character and frequency response. Since microphones differ in polar pattern and frequency response, yielding different tonal nuances, it’s important not only to use the exact same microphone, or at least a similar one, but also to place it properly so the acoustic character isn’t lost.

Several digital audio workstations, such as Pro Tools, Ableton Live, and Logic, can help ADR engineers loop their recordings according to their needs. Aside from microphones, ADR demands other recording gear and software. A basic ADR toolkit looks like this:

  • Microphones

  • Digital Audio Workstation

  • Headphones

  • Preamp or Interface

  • Video Monitor

Microphone Placement And Delivery

Mic placement depends heavily on the type of microphone being used. It is key to maintain a certain distance between the mic and the actor to give the recording realism, and some ADR engineers like to use filters when deemed necessary. How an actor delivers the line is also pivotal to the success of the recording, as it affects both the delivery itself and the tone of the ADR take. Some actors replicate the same movements being projected in the moving images, as it helps them recreate the exact mood the filmmaker wants for that specific scene.


6 Tricks For Foley Sound Effects

6 Tricks For Foley Sound Effects

Foley artists are pivotal to any audiovisual project once it has been shot and edited, as they’re responsible for covering any missing sounds. As described in a previous article, a crucial step in the audio post-production process is precisely what foley artists do: perform and create sound effects to match the moving images projected on the screen.

Common sound effects we hear in movies, for example footsteps, chewing, drinking, clothing movement, doors opening, keys jingling, and so on, are created through a set of different recording techniques and materials. Foley is more than simply editing sounds manually; it is also more time-efficient, and it gives audiovisual projects a much richer character and realism than library sounds alone. Whenever a foley artist can’t create a sound in the studio, sound designers and sound editors are always up for the task.

That being said, have you ever wondered what’s the best way to mimic or recreate the sound of a fight? The sound of fists going back and forth and landing on another body? How can you recreate the sound of footsteps on a snowy road in a recording studio? What’s the best way to mimic a sword fight? Here are some tips for coming up with foley sound effects:


Wooden Creaks And Floors

People stepping on creaking wood and squeaky floors appear in practically every film you’ve seen. Footsteps on old floors or across an old house porch are perhaps some of the most common scenes in film. Foley artists have a sheer array of floors and props at their disposal to recreate these sounds, and the advantage of using them is that the creak or squeak can be controlled to some extent. Once foley artists have developed a proper technique, performing these creaks saves the picture a lot of time, as sound editors won’t need to build every sound in Pro Tools.


Fire

Fire is another sound that appears in the vast majority of films. Foley artists often resort to props such as cellophane, potato chip bags, and even steel wool. The most common technique for recreating fire is to scrunch up the prop and then release it; the effect is, of course, rather subtle, but when recorded with the mic in close, a convincing low-level fire sound can be achieved.


Cash

Money and stacks of cash have their own sound as well. Traditionally, when foley artists have to create the sound of cash, they resort to an old deck of poker cards or book pages. The key to achieving this sound is to use paper sources with flexible, softer textures. In fact, most of the time foley artists add actual bills in the middle of the paper roll, or on top, or on the bottom, so their fingers actually brush the bills’ surface, creating the sound of cash.



Horse Hooves

Galloping horses are one of those sounds whose technique has remained practically untouched. Foley artists normally use coconut halves to recreate horse hooves, probably the most well-known foley prop thanks to Monty Python and the Holy Grail. Several foley artists suggest stuffing each coconut half with material such as fabric to get a more realistic sound. Then, strike compacted dirt, or whatever surface the horse is running on, with the stuffed coconuts.

Bird Wings

Just as with horses, to achieve the sound of birds flapping their wings or taking off, foley artists normally resort to traditional, decidedly low-tech props such as a vintage feather duster or a pair of gloves. It’s also worth experimenting with different materials, and perhaps heavier textiles, to create a much thicker sound for larger species. An old feather duster can create a terrific effect if the foley artist finds a nice-sounding one and hits it against all kinds of surfaces and objects to create different sounds.


Inhaling A Cigarette


Ever wondered how films record the sound of a cigarette inhale? Foley artists often use saran wrap and other light materials to get it. Saran wrap produces a sound similar to the fire effect mentioned above, only more subtle, and it is produced the same way: compress and then release, in a controlled manner so you don’t overdo it. Make sure the mic is close enough to capture the desired level of subtlety; otherwise, you may end up with a totally different sound.


The Intricacies Of Mixing Sound For 360º Video

The Intricacies Of Mixing Sound For 360º Video

One of today’s most popular video formats is 360º video. The format, popularized by a plethora of influencers on YouTube (the platform where it gained traction), is seldom used outside social media channels, which is why there isn’t much information on how to edit sound for what is also called spatialized video. If you have a project of this nature in mind, this article shares the details and intricacies of mixing audio and sound for this kind of format.

The Visuals Will Determine Your Approach

When it comes to 360º video, we’re basically talking about videos that represent the projected images as a single flat still. Thus, it is normal for viewers to perceive ceilings and floors as curved. In fact, the rounded visuals show that a 360º video is a geometrical representation of a cylinder, which causes the seams to curve; these get flattened again when the footage runs through 360º video editing software. So, under these circumstances, how do you even start outlining a plan to properly add audio to such a complicated format? To begin with, just like any other video format, a 360º video can be split into quadrants.

Think of quadrants as small parts of the whole sequence. While it may sometimes look as if there are duplicated quadrants, that is just a visual representation of one quadrant split into two different, but equally long, parts. If you were to print the whole sequence as a linear strip, you could fold the print into a cylinder and see how the quadrants connect with each other, as they’re supposed to. Having said that, approach each quadrant as a mini video: if you separate the quadrants and add audio quadrant by quadrant, spatialization software can take it from there.

Organize Accordingly

Now that you’ve split the video into quadrants, you can start thinking about the specific audio for each section. For spot sounds you don’t need anything beyond a mono stem, simply because you want to pinpoint the sound. Some sound designers start by adjusting their mix template from the traditional 5.1 routing down to plain mono for both sound effects and dialogue. Music and score are an entirely different world; let’s leave them for later. Unlike typical dialogue editing, where a traditional session has one track per main character, spatialized video focuses on quadrants, which goes completely against a sound editor’s normal workflow.

The same approach goes for sound effects, although some effects cross quadrants. In that case, the best choice is to crossfade uniformly across the quadrants involved, matching the timing of what’s going on in the action sequences.
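As a sketch of that crossfade idea (the function name and linear ramp are our own illustration; a real session would use the DAW’s fade tools, likely with an equal-power curve):

```python
def crossfade_between_quadrants(effect, fade_start, fade_len):
    """Split a mono effect into two quadrant stems: a linear ramp fades the
    sound out of quadrant A and into quadrant B over fade_len samples."""
    stem_a, stem_b = [], []
    for i, s in enumerate(effect):
        if i < fade_start:
            g = 0.0
        elif i >= fade_start + fade_len:
            g = 1.0
        else:
            g = (i - fade_start) / fade_len
        stem_a.append(s * (1.0 - g))  # effect leaving quadrant A
        stem_b.append(s * g)          # effect entering quadrant B
    return stem_a, stem_b
```

At every sample the two stems sum back to the original signal, so the effect’s timing against the picture is preserved while its apparent position moves.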

The Music

When creating and editing sound for 360º videos, a sound editor comes across several complications, and music is no exception. Music presents its own challenges, and the most important thing is to always keep the spatialization in mind, both when the music is created and in the post mix. If musical sounds, especially those produced by people appearing on screen, like someone playing an instrument, cross different quadrants, it’s important to decide which sound you want to pinpoint and place that instrument on its own mono stem.

The Mix

Mixing here is much like any other mixing you’re probably familiar with. Once you have split the video into quadrants and worked on its unfolded format, the mix should aim for a balanced whole. Since we’re talking about a 360º format, some sounds will deliberately draw the viewer’s attention to specific quadrants, which means that, when leveling each sound, the ones that should be highlighted ought to sit a bit louder in the mix.
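In practice, “a bit louder in the mix” just means a few decibels of per-stem gain before the stems are summed. A minimal sketch (the function name and the example gain figures are our own assumptions, not from any particular mixer):

```python
def mix_stems(stems, gains_db):
    """Sum mono stems after applying a per-stem gain given in dB
    (for instance a few dB of boost for a quadrant that should draw attention)."""
    out = [0.0] * max(len(s) for s in stems)
    for stem, g_db in zip(stems, gains_db):
        gain = 10 ** (g_db / 20)  # convert dB to linear amplitude
        for i, s in enumerate(stem):
            out[i] += s * gain
    return out
```

No limiting is applied here; in a real mix the summed result would still be metered so it never exceeds full scale.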


The 360º format will certainly become more popular across other projects and platforms. Spatialization can differ from project to project, and the amount of audio crossing quadrants and overlapping other sections will differ too. As for the music, its treatment may entirely alter how sound editors approach these projects. The best advice is to plan ahead and study the project so you don’t stumble in the early stages. Bear in mind that mixing spatialized audio requires far more tracks than a traditional project; if the video requires split dialogue, multiple music tracks, and multiple sound effects, the mix session will likely be of massive proportions, so if you’re into this format, be sure your system is powerful enough for the total track count.


Is Music Important For Films And Ads?

Is Music Important For Films And Ads?

Something several of the decade’s most renowned advertisements have in common is that they all involve music, and not simply in the rather worn-out form of a jingle. Think of John Lewis, whose traditional Christmas adverts are as famous for their music as for their storytelling. Vodafone, for instance, set The Dandy Warhols’ song “Bohemian Like You” up for success, helping it enter the UK’s top five charts.

Since the era of advertising started, one thing was clear: music and TV go hand in hand, but why do musical elements fit so well in ads and other audiovisual projects? Let’s find out.

It’s All About The Emotional Impact

There are plenty of original soundtrack songs that simply stick in our minds. They remind us of a certain time, person, or place in our lives. As discussed in other articles, music and musical elements are pivotal to any audiovisual project because processing music uses the same parts of the brain that are responsible for triggering emotion and memory.

Because humans emotionally associate a piece of music with something positive or negative (depending on the context and nature of the sounds), the associated memory tends to match that emotion in strength. This applies not only to moments in our everyday lives but also to songs in films, music on the radio, and ads. As for the type of music that engages this particular area of the brain, it isn’t just any music. As shown in one study, a group of Australians reacted to a series of audio clips, and their reactions suggested that different types of music can produce strong, but very different, emotional responses.

Different melodies, key changes, chords, and so on can cause different responses. A string ensemble, for example, playing long, sharp notes in a major key, evoked feelings of happiness in almost 90% of the people assessed. On the other hand, a dramatic shift from major to minor tonality elicited the opposite feelings in the respondents: sadness and melancholy. An acoustic guitar was strongly associated with calm and sophistication by almost 83% of the respondents.

The examples above show how important it is for filmmakers and advertisers to deeply understand the emotion they want to convey, and more importantly, the emotion they want to evoke in the audience, as well as what type of music is most suitable for that purpose.

And It’s Also About Telling The Story

Although music and musical elements on their own are an unquestionably powerful tool, they acquire a far more authentic effect when they accompany a story within a solid narrative arc. One study, which analyzed more than 100 ads to identify which correlated most with long-term memory, found that music in TV ads becomes far more memorable when it drives the action of the moving images, for instance, when the lyrics match what is happening on screen. The visuals are sometimes eye-catching enough on their own, but when melodic music comes in, it creates an almost hypnotic effect on the audience, triggering the areas of the brain addressed earlier.


In a wider sense, music and musical elements can set the tone for a business’s or brand’s personality, as well as address a specific type of audience or slice of a demographic. Adidas and Puma, for instance, often target younger audiences with the music behind their activewear campaigns.

Creating From Scratch

Many filmmakers and advertisers choose existing tracks from renowned artists; however, especially in filmmaking, many directors rely on composers to create an original soundtrack for a film. And it definitely works: Hans Zimmer, John Williams, Howard Shore, Ennio Morricone, James Horner, and others are known for having created some of Hollywood’s best film music. Who doesn’t remember Jaws for its soundtrack? Or Star Wars? Or Indiana Jones? Or Interstellar? Or The Lord of the Rings? The list goes on, and the fact that movies serve as the perfect opportunity to craft a compelling, emotionally charged soundtrack answers the question this post started with: is music really important in films and ads? Of course it is, and it always will be. Without music, some of the action goes missing; it is far harder to engage an audience without an emotive soundtrack. Music helps tell the story; music is what people remember and what gets stuck in their minds.


Are Sound Effects Really Necessary?

Are Sound Effects Really Necessary?

The Golden Age of radio (the 1930s to the 1960s) taught us a lot about sound effects. Artists such as Orson Welles and Jack Benny left behind a great compilation of techniques and developments that today’s sound effects artists still use in their own productions. For live performances, live and studio recording, and even workshops, sound effects artists keep a diverse array of manual sound effects at hand, often favoring them over electronic sampler keyboards loaded with recordings. Of course, plenty of sound effects artists also use high-tech samplers and other backing-track devices depending on the nature of the project they’re working on; however, there seems to be a consensus about the unique character of manual sound effects.

Manual effects have never been the only option: some sounds, such as cars, planes, and nature ambiences, are easier to obtain from recordings. But when a sound is the product of manipulating an object, much of that sound lies in how you manipulate the object in question. That being said, a lot of experimentation is required to find the right technique for the desired sound, and experimentation also means testing microphones and hearing how the effect sounds through them. Always trust your ears if you’re just getting started.

Are sound effects really necessary?

Sound effects allow filmmakers and directors of audiovisual projects to tell a compelling story. Think of a drama: well-crafted sound, sound effects included, makes every story better, whether it leans on action, music, or dialogue. Sound effects matter, yes, but their weight varies by format: in radio drama, dialogue contributes practically 80% and music and sound effects about 10% each, so effects are less pivotal there. In a sci-fi film, though, it’s another story: sound effects add realism, and unlike radio, where a misplaced effect may go unnoticed, the slightest mistake on screen can cause a disaster.

Is sound really as key as video quality when producing visual projects?

As mentioned above, poor sound and poor sound effects can ruin any production regardless of its visual quality. Understanding sound, especially high-quality sound in movies and video games, is closely tied to understanding what makes a filmmaker or game developer successful. Think of all the times audio, or the lack thereof, has made you rate a project positively or negatively. Then think about how sound shapes the reactions an audience has to the moving images or frames they are shown. So, are sound and sound effects important in films? They certainly are.

Sound in Film

Films are typically produced using three types of sound: human voices, music, and, of course, sound effects. All three interact throughout the project and are crucial for giving audiences the realism they subconsciously expect to recognize. As mentioned in earlier articles, dialogue and sound must sync perfectly with the action being projected, avoiding delays and staying believable. If a sound doesn’t match the moving image on screen, the realistic effect is gone and the action is no longer credible.


One way to achieve high-quality, realistic sound is to record original sound clips rather than relying solely on sound libraries for the desired effect. Another way to give an audiovisual project realism is to incorporate so-called asynchronous sound effects, often used as background sounds in films. Unlike sounds matched to the picture, these are not directly tied to the action in a given shot, yet they help a film be as realistic as it can and should be.

As for music, if you have ever asked yourself how important music is to film and audiovisual projects, simply recall all the iconic scores you’ve come across in the past. Film music is perhaps one of the elements we remember most about a film, and one of the aspects that determines whether a film stands a real chance of being successful. Steven Spielberg’s Jaws, with its iconic two-note motif, still brings back memories of the big shark approaching its prey; and what about George Lucas’s Star Wars? Decades after the original trilogy was released, its score is still used in today’s installments and is basically what builds momentum during promotional campaigns. The list goes on; it’s practically impossible to overlook the importance of music in today’s filmmaking.

*The images used on this post are taken from

Mixing Tips For The Balanced Soundtrack

Mixing Tips For The Balanced Soundtrack

Since we specialize in crafting the best sound for any type of audiovisual project, it was about time we shared some tips on how to achieve a balanced soundtrack and elaborated a bit more on what we at Enhanced Media do. The topics discussed pertain, of course, to the vast universe of film sound, but we will try to avoid oversimplification while keeping things digestible, understandable and, why not, enticing. Be that as it may, this post is meant to be illustrative enough for you to build on your own knowledge, whether basic or advanced —learning something new is always worth it.

Volume and Loudness

Many people firmly believe that volume and loudness mean the same thing; there is, however, a crucial difference. Volume is the sound level that can be objectively measured, in decibels; loudness is the perceived amount of volume. The latter depends on several factors, such as frequency content and background noise. When it comes to crafting a balanced soundtrack —balanced also meaning homogeneous— both properties are no less than pivotal.

Simply put, when it comes to how the two terms interact, the rule is that the available physical headroom must not be exceeded. In the vast majority of video editing programs and most multimedia software, the master volume is displayed on a decibel scale (dBFS). It’s also important to mention that even though zero decibels can be reached, zero does not mean inaudible, as some folks may think. Instead, it represents the maximum level before digital clipping.
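As a rough sketch of how that scale works —assuming the common dBFS convention, where a full-scale sample is 1.0 and 0 dBFS is the ceiling rather than silence:

```python
import math

def dbfs(amplitude: float) -> float:
    """Convert a linear sample amplitude (0.0 to 1.0 full scale) to dBFS."""
    if amplitude <= 0:
        return float("-inf")  # digital silence
    return 20 * math.log10(amplitude)

# Full scale (1.0) sits at 0 dBFS -- the maximum, not the minimum.
print(dbfs(1.0))              # 0.0
# Half amplitude is about -6 dBFS, still clearly audible.
print(round(dbfs(0.5), 1))    # -6.0
```

Everything audible in a digital mix lives at negative dBFS values; 0 dBFS is simply the last level before the converter runs out of numbers.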

When mixing film sound —not only the musical score but also the dialogue, voice-overs and additional sounds— the master level must not exceed zero decibels; otherwise, the signal goes into what we call digital overdrive, cutting off the sine waves at their amplitude maxima —the highest and loudest peaks. This phenomenon is known as digital clipping, which, if you happen to work in the film or audiovisual industry, is certainly familiar to you. Digital clipping ought not to be confused with the popular term tape saturation, which is far older, dating back to the days of magnetic tape. Back then, and even today, tape saturation acted as a natural compression that sounded fuller and even warmer than what is typically achievable today with software.
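A minimal illustration of that overdrive effect (a hypothetical Python sketch, not a tool from any mixing suite): push a sine wave past full scale and hard-clip it, and the peaks come out sliced flat exactly as described above.

```python
import math

def clip(sample: float, ceiling: float = 1.0) -> float:
    """Hard-clip a sample to the digital full-scale ceiling."""
    return max(-ceiling, min(ceiling, sample))

# One cycle of a sine wave pushed 6 dB over full scale (amplitude 2.0):
overdriven = [2.0 * math.sin(2 * math.pi * n / 64) for n in range(64)]
clipped = [clip(s) for s in overdriven]

# The amplitude maxima are cut off flat at +/-1.0 ...
print(max(clipped), min(clipped))   # 1.0 -1.0
# ... while the quieter parts of the waveform pass through untouched.
print(all(abs(s) <= 1.0 for s in clipped))   # True
```

Those flattened peaks are what the ear registers as the harsh distortion of digital overdrive, quite unlike the gradual rounding of tape saturation.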

Traditionally, the film score arrives already mixed and, chances are, mastered as well —that is, delivered at its maximum usable level— which allows sound editors to integrate it directly into the film or audiovisual project. If that is the case, the only prerequisite for its integration is that no additional loudness-maximizing plugin is inserted on the master channel or the subsequent individual channels.

Music and its Sources

When it comes to choosing the music for your film, as a producer or sound editor you may well use music from various sources. This means resorting to different tracks from different composers, studios, etc.; all of these tracks should have clean levels, but they often turn out to be excessively loud —which takes us back to loudness and perceived volume.

music editing.jpeg

Lamentably, the vast majority of today’s music producers have taken part in the so-called loudness war, pumping up music according to the motto: the louder the better. In the end, let’s be honest, you just hear a shallow shriek. This has its roots in human perception, as listeners tend to judge louder music as better music. The practice has left music so compressed and so pumped up that nothing distinguishes it from the other layers of sound; bear in mind, however, that reality dictates otherwise: the louder a track is pushed, the higher the chances of it ending up utterly broken.

Most of today’s music —the vast majority of what we hear on the radio— has almost no dynamics; it’s annoying, to some extent, but definitely loud! In filmmaking, the score and the music are supposed to support the project, not the other way around. But what if you were working on a purely technical video? Under those conditions, the goal would still be to let the music support the storytelling of the images being projected. If the project begins quietly, the music should follow suit. If there’s a sudden increase in tension, the music may well accentuate that change. Thus you bring volume and loudness into balance.
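One simple way to put a number on "dynamics" is the crest factor —the peak-to-RMS ratio of a signal, in dB. Heavy limiting shrinks it, which is exactly what the loudness war does to a track. A hedged Python sketch (the signals and the crude limiter here are illustrations we made up, not a mastering tool):

```python
import math

def rms(samples):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB: a rough proxy for remaining dynamics."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak / rms(samples))

# An untouched sine cycle versus the same cycle driven hard into a limiter.
sine = [math.sin(2 * math.pi * n / 64) for n in range(64)]
squashed = [max(-1.0, min(1.0, 4.0 * s)) for s in sine]

print(round(crest_factor_db(sine), 1))   # 3.0  (a pure sine's crest factor)
print(crest_factor_db(squashed))         # much smaller: the dynamics are gone
```

The squashed version is "louder" at the same peak level, but its crest factor collapses toward 0 dB —the numeric signature of a track with almost no dynamics left.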

*The images used on this post are taken from

The Evolution Of Film Sound: Music

The Evolution Of Film Sound: Music

Understanding the current state of what is traditionally referred to as film sound requires some background. How did we get here? That’s a question all film sound editors ask themselves at some point in their careers. Here we have compiled several important points of reference for understanding how sound has evolved over the years.

A lot falls under the umbrella of film sound —music, the moving image, silence, foley, dialogue, etc. are some of the elements directly affected by sound as an abstract term. Music and film scoring operate under totally different circumstances than they did 50 years ago. Of course, this evolution has been driven by the constant development of technology and shaped by shifting social tastes.

If we were to fast-forward 100 years, it would be really challenging to tell where we will be, and equally hard to tell which composers will have made a name for themselves in the history of film scoring and be dubbed legends within the industry. Instruments, likewise, have evolved. Think of the Waterphone, for example —an acoustic instrument highly popular in older films, mostly used to underscore tense moments in the moving image. Today, there is a plethora of sound effects and sound effects libraries that can imitate that exact same sound, and even improve on it through synthesis. History has taught us that music, as a crucial element of film sound, has evolved a lot —not only in terms of sound effects but also in its own interpretation. Composers such as Hans Zimmer, and even Walt Disney’s composers, have completely changed the way we visualize and digest music.

Hans Zimmer, for example, is known for having taken part in many successful audiovisual projects, and the majority of his work has been a major game changer within the industry —Hans Zimmer is a true artist simply because of how his work blends with the images being projected. Writing music for a project and its moving images is something we see every day, but changing the entire mood of a film through its music is a complex thing only a true artist can achieve, especially while captivating today’s audiences.

Walt Disney’s composers, on the other hand, took something of a leap of faith when they decided to integrate music and musical sounds into their projects. They initially thought that including music would not be accepted by modern society and middle-aged audiences, but the experiment paid off, especially among the younger audiences of the time. By incorporating music into film, they helped alter the way films are made to this day. Had they not taken the risk of putting music in films, we probably wouldn’t be talking about composers and original soundtracks at all.

And although many seem to agree that, at some point, someone else would have done exactly the same, chances are Walt Disney managed to integrate music into films simply because of the name he had already made for himself in the industry; when the time came, no one even questioned what he was planning to do. Most importantly, without a big company behind the idea, who could have pulled it off?

The inclusion of music and musical sounds in films brought along new jobs and positions within the industry —foley artists, for example, and the recording of realistic sound effects for films and moving images. Once the idea of giving music and sound effects a key role in the conception and production of film took hold, other ideas followed. Most importantly, other industries started developing at the pace filmmaking set: software, musical instruments, technology, Foley techniques, etc. all leaned towards filmmaking —not to mention that such development also allowed filmmakers to explore other genres such as sci-fi, 3D animation, fantasy, etc.

walt disney sound effects.jpeg

It would be fair to assert that the evolution of sound was determined by social changes as well, not only by the pace at which technology allowed the industry to develop. Music, as a key element of film sound, will certainly reach new shores —new instruments, new technologies, new composers, new ways of recording and merging music with moving images, and, why not, maybe new genres. And although film is considered essentially a visual experience and a visual medium —that is, more sight than sound— the latter plays a pivotal role in the storytelling process of any film from beginning to end. It definitely changed the way filmmakers think about the nature of cinema.

*The images used on this post are taken from

What is Foley?

What is Foley?

When we speak about Foley, we refer to a technique for creating live, synchronous sound effects. It was named after Jack Foley, a famous sound editor at the renowned Universal Studios. Basically, foley artists strive to match all kinds of live sound effects with what is actually going on in the film. These sounds are performed manually by people using everyday, inanimate objects.

This technique is an excellent way of adding subtle sounds to a film —the sounds production often overlooks amid all the intricacies of the shoot. The creak of the saddle every time a rider mounts a horse, or the rustle of a character’s clothing, are just some examples of sounds that are not considered up front but that are necessary to give the film, or any other audiovisual project, that distinctive touch of realism. With other methods, it would be rather difficult to achieve the same level of authenticity.

A good foley artist often puts themselves in the place of the actor whose actions they are syncing effects to; otherwise, the sound ends up lacking the authenticity and realism needed to be convincing. Most successful foley artists are audiles —they can look at any inanimate object and picture what kind of sound they can get from it. A foley crew, in turn, includes several individuals: the walker, who performs the sounds, and one or two technicians responsible for recording and mixing them. The vast majority of foley recordings take place in what most people would call storage areas —rooms full of laundry, pieces of metal, rocks, stones, a sandbox, metal trays, empty cans, forks, knives, broken guns, anything.

Foley artists start by watching the film to determine which sounds need to be added or replaced, which ones can be enhanced, and which ones they can get rid of. At that point, the film’s sound consists mostly of the dialogue and sound effects captured during production, recorded on a guide track often dubbed the production track. Later on, technicians focus on the other sounds that play a supporting role in the film: crowd noises, the musical score, dialogue replacement (re-recorded through ADR), other sound effects and, last but not least, designed sound effects.

It is not rare for up to 80% of a film’s sound to be altered, edited or customized in some way once the movie has been shot. Some sound effects are easy to craft and can be added from pre-recorded audio libraries; however, many sounds remain unique to every project. Think of footsteps, for instance. As foley artists watch the film, they identify which sounds they need to create and imagine ways to pull them off. When it comes to noises, foley artists also have to consider other factors, such as the origin of the noise, who is making it and, most importantly, in what kind of environment. Some sounds are too complex for a single take, so foley artists must carefully combine different noises from different sources to represent exactly the sound they are after. In some cases, foley editors digitally alter these sounds afterwards to match what the film is projecting.

In the foley studio, you will find all sorts of surfaces for simulating all kinds of footsteps, a splash tank, chambers for simulating different variations of echo, and the mixing booth where foley engineers record and mix everything. The process itself is quite simple: foley artists spend hours around the microphones, watching a huge screen as they synchronize the noises they are producing with the action being projected.

But why is foley so important anyway? Well, the vast majority of a film’s soundtrack is added during the post-production stage for several reasons:

Some situations are not real during filming

Think of sword clashes in Viking battles, or punches that never actually make contact in fist fights. These sounds are added during post-production and have to truly embody what the action requires.

fighting sound effects

Some CGI simulations are not from this world

The vast majority of CGI elements are ultimately not from this world: big monsters, lightsabers, flying vehicles, etc.

Some sounds cannot be recorded on set

Imagine recording on set, in a single take, the sound of a bird’s wings as it takes flight, or a letter being pulled out of an envelope. These sounds, even though we never pay attention to them in real life, are needed to give the project realism.

*The images used on this post are taken from

What are some of the most common sound stereotypes and logic flaws?

What are some of the most common sound stereotypes and logic flaws?

Sound is full of stereotypes and, sometimes, common logic flaws as well. These are the product of the pursuit of what’s simple and easy; however, studios and sound mixers, even when conscious of what they’re doing, often seem fond of delivering what the audience subconsciously expects to hear. Here is a list of the most common logic flaws and stereotypes you often hear in films —even though they perhaps don’t belong there.


Animals

This is perhaps the most traditional category where logic flaws can be found. The most common stereotypes when it comes to animals in a scene are:

Think of dogs, for example —they’re rarely, if ever, silent. Dogs in films always appear either barking or whining. The same goes for cats —they always meow or purr. Cows? Yes: they always moo. And that same principle applies to basically all animals, even in scenes where most animals wouldn’t be making the slightest sound. Rats, squirrels and vermin in general are always making their characteristic noises whenever they appear on screen. Dolphins? The more-than-traditional dolphin chirp. In almost every film that involves the ocean, filmmakers include an unnecessary dolphin sound —always the same recording, sometimes just a segment of it. Snakes always rattle, and the list goes on. In fact, animal behavior is just as predictable —dogs, for example, always know who the bad guy is and bark at him constantly.

Bird noises are always the same. Hawks always make that traditional screeching sound —and the exact same cry is applied to hawks, eagles and other big birds. Whenever a dramatic moment in an adventure film is about to happen, the screech rings out. If there’s a mountain or a cliff in the background, either a hawk or an eagle can be heard screeching. Owls, on the other hand, always sound the same —like the Great Horned Owl. In horror films, whenever there is a full moon, there is always either an owl hooting or a wolf howling in the distance.

Bicycles, bombs, explosions and other objects

Yes, we know you’ve noticed it before: all bicycles have functional bells, and whenever they appear in a film, they ring. Bombs, on the other hand, always come with a fancy beeping timer display (the bigger the better, it seems), and whenever a bomb goes off, it takes one to two minutes for the explosion to fade away. Speaking of explosions, these, for some reason, always happen in slow motion —and you have to make sure you are running away from the detonation point so the blast can throw you through the air towards the screen (in slow motion, of course). Additionally, in an action film with a bombing scene, the bomb will always whistle while falling from the aircraft.

Car tires always screech, even on dirt roads; car brakes? They always squeak, and whenever a car turns, stops or pulls away, its tires must make that particular squealing noise. In bigger-budget films, whenever a car appears and makes any movement at all, we hear it accelerate hard, even if it was initially moving at under 25 mph. On long roads, whenever a truck appears, we hear the traditional truck horn (with Doppler effect, of course).

sound of cars on movies.jpeg

Now, what about computers? Every button a character presses on a computer or laptop makes some sort of beep; the text being displayed on screen must make some kind of typing or printer sound for some reason, even though we never hear that noise in daily life.


Environments

Easy: if there is a castle, there is thunder. Storms start almost immediately: there’s a castle and, subsequently, a crack of thunder, and heavy rain starts to fall amidst a plethora of lightning. Besides, thunder is always in perfect sync with the lightning, no matter how far away it struck —and the same applies to explosions, as discussed, fireworks, etc. The wind always sounds the same. An underwater scene? Let’s just add non-stop bubbles throughout. Doors always squeak. Phones? The universal telephone ring. A scene in San Francisco? Easy —cable car and foghorn sounds. Trains? The same old classic horn.

environment sound effects

And a fun fact: in U.S. films set in big cities, there’s always a police siren or horn in the background, whereas in films from other countries that never, ever happens. And if there’s a fight or some sort of commotion on the second floor of a house, the people on the first floor never notice what’s actually going on, supposedly because other sounds mask the chairs falling over, the yelling and the screams —phones ringing, the washing machine changing cycles, animals making their characteristic sounds, or the maid running the vacuum cleaner.

*The images used on this post are taken from

How Sound Helps You Tell Your Stories

How Sound Helps You Tell Your Stories

Every film is built upon sounds that serve as the framework within which filmmakers and directors create a specific atmosphere. When putting together an audiovisual project, regardless of its nature, sounds are always carefully introduced so that they represent what is happening in a particular scene: what kind of actions the performers and characters are engaged in, and where that situation takes place.

The importance of creating an enticing atmosphere

Sound design is full of all kinds of nuances, and these vary from project to project; however, the need for filmmakers to understand the material remains. A good way for producers, directors, filmmakers, etc. to start planning a production’s sound is to go through the whole script while figuring out the nature of every scene in it. Thus, aspects such as background sounds —as well as the actions taking place— come into much sharper focus. This is a good way to start elaborating on sound effects and the possible atmospheres that could suit a specific scene.

What about music and dialogue?

Despite the fact that both music and dialogue are pivotal to creating an enticing audiovisual project, these two traditionally come to mind before sound design is even considered. And while they are, as mentioned, unequivocally vital in guiding the plot, they remain perhaps the most obvious elements of a project’s sound design. Since every scene is different, each requires a subtle yet specific manipulation of sound and sound effects to feel real and complete. Dialogue and music by themselves are simply not enough to build the framework within which films are conceived and constructed.

Background and atmospheric sounds drive the plot and allow the audience to clearly understand where a particular scene is taking place. Normally, these sounds register subconsciously given their quiet, repetitive nature; however, they are essential to every single scene because of the way they orient the audience. If background and atmospheric noises were removed from audiovisual projects, scenes would feel unfamiliar and even unnatural to the audience due to the lack of realism. Scenes set on busy, crowded streets, for example, always include iconic noises like car horns, engines, indistinct chatter, etc., whereas scenes set in the woods include birds, wind, grass blowing, etc. Both examples feel familiar to the audience simply because of the atmosphere created through background noise. Otherwise, the scene would feel strange.

How does action sound?

Now that we’ve covered how sound works on a subconscious level, the next part of sound design is helping the audience understand what the characters are doing in a particular scene. This is done through the inclusion of action sounds. When we speak about action sounds, we mean a rather more subtle group of sounds —those we hear when characters handle an object or their clothing, or when characters interact with other characters or with inanimate objects. If a character is in an action-filled scene, like a physical confrontation or a fight, sound designers need to include the sounds of impacts, punches, clothing being moved, etc. Sound designers actually spend a lot of time on these kinds of scenes to make sure that all the sounds the audience would hear if the fight were real match what is being projected and played by the performers. During a fight, it is common to hear some hits louder than others, or even a combination of different types of impact sounds. Thus the audience perceives the scene in a more realistic way, which is why the vast majority of sound designers strive to use effects that resemble the real sound.

sound effects in movies.jpeg

The same principle applies to the interactions characters have with inanimate objects. If a character is handling something made of metal, like a gun or a hammer, the sound designer will need to add the sound of a person touching that object. The same happens when characters use computers or mobile phones and we hear keys being pressed and phone beeps. These are sounds that might go unnoticed in real life; however, by including them in a particular scene, the audience ends up carried along by the storytelling. A well-crafted atmosphere includes these as well as the sounds mentioned in the first part of this article. Although quiet and subtle, both atmospheric and action sounds are key to providing the audience with a compelling narrative. They are small elements whose tremendous value is reflected every time the audience remains engaged throughout an audiovisual project.

*The images used on this post are taken from

What Is Sound Design?

What Is Sound Design?

Despite the fact that sound design has always been a key component of the film and audiovisual industry, it still holds an air of mystery. In fact, the most common myth about sound design is that it is all about creating new sounds —which, of course, is not true. At least not entirely. One may easily assume that sound design is all about coming up with enticing, neat sound effects; however, assuming that would not be fair to those who coined the term on Star Wars and Apocalypse Now. When we discuss sound design as a term, we need to go back to those films, as Ben Burtt and Walter Murch —on Star Wars and Apocalypse Now, respectively— found themselves working alongside directors who were trying to do more than just add attractive, powerful sound effects on top of a structure they had already put in place.

It was by exploring the boundaries of sound —sound effects, music, dialogue, etc.— that sound began to play a pivotal role in storytelling, in many cases shaping the picture itself. These experiments resulted in something different from what directors and audiences were used to. Soundtracks have since changed the way people and directors understand film sound, yet there remains a rather unorthodox conception of what a well-crafted sound design is. For many people, it is about recording high-fidelity sounds and well-fabricated vocalizations, like explosions or alien creatures; however, that is far from doing the term justice. A well-orchestrated, well-recorded musical composition provides minimal to zero value if it does not interact seamlessly with the film; having performers deliver a myriad of dialogue lines in every shot does not necessarily serve the production either.

Sound, regardless of its nature, provides value and serves a film when it becomes part of the storytelling, when it resonates with what is being projected, when it changes dynamically over time, and when it makes the audience feel something sensorially. Filmmakers should pay special attention to sound from the moment they have an idea in mind. Instead of treating sound as a mere component, filmmakers should strive to craft sounds —either on set or in a studio with a talented sound designer or composer— and make them a pivotal contribution that influences their projects in different ways. Think of films like Citizen Kane, Star Wars or Once Upon A Time In The West. These films were thoroughly conceived and produced in many respects, sound being one of them, yet no sound designer appears in the credits.

sound design for films.jpg

That does not mean every film should strive to mimic what the aforementioned films did in terms of sound; however, many audiovisual projects can learn from them. Sound mixing varies from film to film, and there are films whose sound design is astonishing. Moreover, several sound practices actually take place long before production begins: directors often let their actors and actresses hear the sound world around their characters, making it possible for them to play their roles much better.

Other directors actually build their stories around the role sound plays within the storytelling framework, although many others still have a lot to learn about sound’s potential. There seems to be a paradigm around the role of sound —a generally accepted idea that good sound is only meant to enhance the images and visuals being projected. Such a paradigm suggests that sound is merely a slave within the project, and its contribution ends up less important and less complex than it would be if directors let sound act as a free element throughout the whole process.

Another misconception around sound and sound design is that directors and filmmakers should only start paying attention to sound —or at least thinking seriously about it— when the project is approaching its final stages, once the filmmaking process and the structure of the film are practically settled. Many would say: how is a composer supposed to come up with an idea unless he can catch a glimpse of the final product? Some may argue there is nothing wrong with this practice, and yes: sometimes it works uncannily well. But what’s the point of considering sound a collaborative, functional component of filmmaking if it is only addressed once every other process is over? For sound to reach its full potential, directors and filmmakers should embrace the whole project within a collaborative framework, allowing sound to work its wonders throughout the filmmaking process.

*The images used on this post are taken from

School: Dunkirk's Intensity Lies in Sound

School: Dunkirk's Intensity Lies in Sound

Good sound design adds to the emotion of a scene. From the creaking of a door in a horror movie to the epic score in a fantasy movie, sound matters. The score can elevate the mood of a scene, and there are many composing tricks that are more sound design than music —one of which is using scales that don’t resolve, adding tension to a scene and reflecting the environment it takes place in.

Vox discusses this technique in more detail in the video below: