Mixing Tips For The Balanced Soundtrack

Since we specialize in crafting the best sound for any type of audiovisual project, it was about time for us to share some tips on how to achieve a balanced soundtrack and elaborate a bit more on what we at Enhanced Media do. The topics discussed pertain, of course, to the vast universe of film sound, but we will try to avoid oversimplifications while keeping things digestible, understandable and, why not, enticing. Be that as it may, this post is meant to be illustrative enough for you to build on your own knowledge, whether basic or advanced: learning something new is always worth it.

Volume and Loudness

Many people firmly believe that volume and loudness mean the same thing; however, there is a crucial difference. Volume is the physical level of a sound, which can be measured in decibels; loudness is the perceived level, and it depends on several factors such as frequency content and background noise. When it comes to crafting a balanced soundtrack (balanced also meaning homogeneous), both properties are pivotal.

Simply put, when it comes to establishing how the two terms interact, we could assert that the maximum physical level must not be exceeded. In the vast majority of video editing programs and most multimedia software, the master volume is displayed on a decibel scale. It is also important to mention that even though zero decibels can be reached, zero does not mean inaudible, as some folks may think. Instead, it represents the maximum level before digital clipping.
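To make that scale concrete, here is a minimal sketch, assuming a normalized digital scale where full scale equals 1.0, of how a meter converts a sample's amplitude to dBFS (decibels relative to full scale):

```python
import math

def dbfs(sample: float) -> float:
    """Convert a linear sample amplitude (full scale = 1.0) to dBFS."""
    if sample == 0:
        return float("-inf")  # digital silence has no meaningful dB value
    return 20 * math.log10(abs(sample))

print(dbfs(1.0))  # full scale: 0 dBFS, the maximum level before clipping
print(dbfs(0.5))  # half amplitude: about -6 dBFS
```

Note that 0 dBFS is the ceiling, not silence: every quieter signal reads as a negative number on the meter.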

When mixing film sound, not only the musical score but also the dialogue, voice-overs and additional sounds, the master level must not exceed zero decibels; otherwise, the signal goes into what we call digital overdrive, cutting off the sine waves at their amplitude maxima, the highest and loudest peaks. This phenomenon is known as digital clipping, which, if you happen to work in the film or audiovisual industry, is certainly familiar to you. Digital clipping ought not to be confused with the popular term tape saturation, which is far older, as it dates back to the time when magnetic tape was the recording medium. Back then, and even today, tape saturation acted as a natural compression that sounds fuller and even warmer than anything clipping in software can achieve.
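A quick illustration of that overdrive, as a sketch assuming hard clipping at a normalized full scale of 1.0: a sine wave pushed past the ceiling gets its maxima sheared flat.

```python
import math

SAMPLE_RATE = 48000
CEILING = 1.0  # digital full scale (0 dBFS)

def clipped_sine(freq_hz: float, gain: float, n_samples: int):
    """Generate a sine wave, then hard-clip it at full scale."""
    out = []
    for n in range(n_samples):
        s = gain * math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
        out.append(max(-CEILING, min(CEILING, s)))  # flatten anything past the ceiling
    return out

# A 440 Hz sine pushed 6 dB over full scale (gain = 2.0): its peaks are cut off.
wave = clipped_sine(440.0, 2.0, 480)
print(max(wave), min(wave))  # 1.0 -1.0: the tops and bottoms are flattened
```

The flattened tops are exactly the distortion described above; analog tape, by contrast, rounds those peaks off gradually rather than slicing them.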

Traditionally, the film score and music arrive already taken care of in the mixing room and, chances are, mastered as well; that is, the soundtrack is delivered at maximum volume, which allows sound editors to integrate it directly into the film or audiovisual project. In that case, the only prerequisite for its integration is that no additional plugin for artificially inflating loudness is inserted on the master channel or on the subsequent individual channels.

Music and its Sources

When it comes to choosing the music for your film, as a producer or sound editor you may well use music from various sources. This means you will be resorting to different tracks from different composers, studios, etc.; all of these tracks should have a clean level, but they are often uncommonly and excessively loud, which takes us back to loudness and perceived volume.

[Image: music editing]

Lamentably, the vast majority of music producers today have taken part in the loudness war, pumping up their music according to the motto: the louder the better. In the end, let's be honest, you just hear a shallow shriek. This practice has its roots in the human mind, as we tend to perceive louder music, in this case film score or simply music, as better music. It has left music so compressed and so pumped up that little remains to differentiate it from the other layers of sound; bear in mind, however, that reality dictates otherwise: the louder the track, the higher the chances it is utterly broken.
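One rough way to see what that over-compression does is the crest factor, the peak-to-RMS ratio of a signal. The sketch below uses plain Python and illustrative values only: a clean sine versus a brick-walled "loud" version of itself.

```python
import math

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB: a rough proxy for how much dynamics survive."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

sine = [math.sin(2 * math.pi * n / 100) for n in range(100)]
squashed = [max(-0.2, min(0.2, s)) for s in sine]  # brick-walled "loud" version

print(round(crest_factor_db(sine), 1))      # ~3.0 dB for a pure sine
print(round(crest_factor_db(squashed), 1))  # much lower: compression eats dynamics
```

A smaller crest factor means the waveform spends almost all its time near its peak, which is exactly the "no dynamics, just loud" sound described above.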

Most of today's music, the vast majority of what we hear on the radio, has almost no dynamics; it is, to some extent, just annoying, but definitely loud! When it comes to filmmaking, the score and film music are supposed to support the project, not the other way around. But what if you were working on a purely technical video? Under those conditions, the goal would still be to help the music support the storytelling of the images being projected. If the project begins rather quietly, the music should follow the same course. If there is a sudden increase in tension, the music might well accentuate that change. Thus, you bring both volume and loudness into balance.

*The images used on this post are taken from Pexels.com

The Evolution Of Film Sound: Music

Understanding the current status of what is traditionally referred to as film sound demands a certain background. How did we get here? That is a question all film sound editors ask themselves at some point in their careers. Here we have compiled several important points of reference to understand how sound has evolved throughout the years.

A lot falls under the umbrella of film sound: music, the moving image, silence, Foley, dialogue, etc., are some of the elements directly affected by sound as an abstract term. The industry has learned that music and film scoring operate under totally different circumstances than they did 50 years ago. Of course, this evolution has been driven by the constant development of technology and shaped by shifting social acceptance.

If we were to fast forward 100 years, it would be really challenging to tell where we will be, and equally hard to tell which composers will have made a name for themselves in the history of film scoring, standing out as legends within the industry. Instruments, likewise, have evolved. Think of the Waterphone, for example, an acoustic instrument highly popular in older films and mostly used against the moving image. Today there is a plethora of sound effects and sound effects libraries that can imitate that exact sound, and even improve on it through synthesis. History has taught us that music, as a crucial element of film sound, has evolved a lot, not only in terms of sound effects but also in its own interpretation. Composers such as Hans Zimmer, and even Walt Disney's composers, have completely changed the way we visualize and digest music.

Hans Zimmer, for example, is known for having taken part in many successful audiovisual projects, and the majority of his work has been a major game changer within the industry. Hans Zimmer is a true artist simply because of how his work blends with the images being projected. Writing music for a project and for moving images is something we see every day, but changing the entire mood of a film and its moving images is a complex thing only a true artist can achieve, especially when it is done while captivating today's audiences.

Walt Disney's composers, on the other hand, took something of a leap of faith when they decided to integrate music and musical sounds into their projects. They initially thought that including music would not be accepted by society at large, particularly by middle-aged viewers, but the experiment turned out to be a success, especially with the younger audiences of the time. By incorporating music into film, they helped alter the way films are made to this day. Had they not taken the risk of using music in their films, we probably would not be hearing of composers and original soundtracks.

And although many seem to agree that, at some point, someone else would have done exactly the same, chances are Walt Disney managed to integrate music into films simply because of the name he had made for himself in the industry by that point; when the moment came, no one even questioned what he was planning to do. Most importantly, without a big company behind the idea, who could have pulled it off?

The inclusion of music and musical sounds in films brought along new jobs and positions within the industry, such as Foley artists, and the recording of realistic sound effects for films and moving images. Once the idea of giving music and sound effects a key role within the conception and production of film took hold, other ideas followed. Most importantly, other industries started to develop at the pace of filmmaking itself: software, musical instruments, recording technology, Foley techniques, etc., all leaned towards filmmaking, and that development also allowed filmmakers to explore other genres such as sci-fi, 3D animation and fantasy.

[Image: Walt Disney sound effects]

It would be fair to assert that the evolution of sound was also determined by social changes, not only by the pace at which technology allowed the industry to develop. Music, as a key element of film sound, will certainly reach new shores: new instruments, new technologies, new composers, new ways of recording and merging music with moving images and, why not, maybe new genres. And although films are considered essentially a visual experience and a visual medium, that is, more sight than sound, the fundamental importance of the latter as part of the storytelling process plays a pivotal role from beginning to end. It definitely changed the way filmmakers think about the nature of cinema.


What is Foley?

When we speak about Foley, we refer to a technique for creating live, synchronous sound effects. The technique was named after Jack Foley, a famous sound editor at the renowned Universal Studios. Basically, Foley artists strive to match all kinds of live sound effects with what is actually going on in the film. These sounds are performed manually, using props and people.

This technique is an excellent way of adding the subtle sounds that production often overlooks due to all the intricacies involved during the shoot. The noise of the saddle every time a rider mounts his horse, or the rustle of an individual's clothing, are just some examples of sounds that are not considered up front yet are necessary to give the film, or any other audiovisual project, that distinctive touch of realism. With other methods, it would be rather difficult to achieve the same level of authenticity.

A good Foley artist often takes the place of the actor with whom production is trying to sync effects; otherwise, the sound ends up lacking the authenticity and realism needed to be convincing. Most successful Foley artists are audiles: they can look at any inanimate object and picture what kind of sound they can get from it. A Foley crew includes several individuals: the walker, who makes the sounds, and one or two technicians responsible for recording and mixing them. The vast majority of Foley recordings take place in what other people would call storage areas, rooms full of laundry, pieces of metal, rocks, stones, a sandbox, metal trays, empty cans, forks, knives, broken guns, anything.

Foley artists start by watching the film to determine which sounds need to be added or replaced, which ones can be enhanced, and which ones they can get rid of. At that point, the sound of the film consists mostly of the dialogue and sound effects captured during production, recorded on a guide track, often called the production track. Later on, technicians focus on other sounds that play a supporting role in the film: crowd noises, the musical score, dialogue replacement (re-recorded through ADR), other sound effects and, last but not least, designed sound effects.

It is not rare for up to 80% of a film's sound to be altered, edited or simply customized in some way once the movie has been shot. Some sound effects are easy to craft and can be added from pre-recorded audio libraries; however, many sounds remain unique to every project. Think of footsteps, for instance. As Foley artists watch the film, they identify which sounds they need to create and imagine ways to pull them off. When it comes to noises, Foley artists also have to consider factors such as the origin of the noise, who is making that specific sound and, most importantly, in what kind of environment. Some sounds are too difficult for just one take, so Foley artists must carefully combine different noises from different sources to represent the sound they are looking for, and Foley editors can ultimately alter these sounds digitally to match what the film is projecting.

In the Foley studio, you will find all sorts of surfaces for simulating all kinds of footsteps, a splash tank, chambers for simulating different variations of echo, and the mixing booth where Foley engineers record and mix everything. The process is quite simple: Foley artists spend hours in front of microphones, watching a huge screen as they try to synchronize the noises they are producing with the action being projected.

But why is foley so important anyway? Well, the vast majority of a film’s soundtrack is added during the post-production stage for several reasons:

Some situations are not real during filming

Think of swords in Viking fights, or punches that never make contact with the skin in fist fights. These sounds are, of course, added during post-production and have to really embody what the action requires.

[Image: fighting sound effects]

Some CGI simulations are not from this world

The vast majority of CGI elements are ultimately not from this world: big monsters, lightsabers, flying vehicles, etc.

Some sounds cannot be recorded on set

Imagine recording on set, in just one take, the sound of a bird's wings as it takes flight, or a letter being pulled out of an envelope. These sounds, even though we never pay attention to them in real life, are needed to provide the project with realism.


What are some of the most common sound stereotypes and logic flaws?

Sound is full of stereotypes and, sometimes, of common logic flaws as well. These are the product of the pursuit of what is simple and easy; studios and sound mixers, even though they are conscious of what they are doing, often seem fond of integrating what the audience subconsciously expects to hear. Here is a list of the most common logic flaws and stereotypes you often hear in films, even though they rarely belong there.


Animals

This is perhaps the most traditional category where logic flaws can be found. The most common stereotypes when it comes to including animals in a scene are these:

Think of dogs, for example: they are rarely, if ever, silent. Dogs in films always appear either barking or whining. The same thing happens with cats, which always meow or purr. Cows? Yes: they always moo. And the exact same principle applies to basically all animals, even in scenes where most animals would not be making the slightest sound. Rats, squirrels and vermin in general always make their characteristic noises every time they appear on screen. Dolphins? The more than traditional dolphin chirp. In almost every film that involves the ocean, filmmakers include an unnecessary dolphin sound, and it is always the same recording, sometimes just a segment of it. Snakes always rattle, and the list goes on. In fact, animal behavior is also predictable: dogs, for example, always know who the bad guy is and bark at him all the time.

Bird noises are always the same. Hawks always make that traditional screeching sound; in fact, the exact same cry is used for hawks, eagles and other big birds. Whenever a dramatic part of an adventure film is about to happen, the screech comes out. If there is a mountain or a cliff in the background, either a hawk or an eagle can be heard screeching. Owls, on the other hand, always sound the same, like the Great Horned Owl. In horror films, whenever there is a full moon, there is always either an owl hooting or a wolf howling in the distance.

Bicycles, bombs, explosions and other objects

Yes, we know you have noticed it before: all bicycles have functional bells, and whenever they appear in a film they ring. Bombs, on the other hand, always come with a fancy beeping timer display (the bigger the better, it seems), and whenever a bomb goes off, it takes one to two minutes for the explosion to fade away. Speaking of explosions, these, for some reason, always happen in slow motion, and you have to make sure you are running away from the point of detonation so that the blast can throw you through the air towards the screen (in slow motion, of course). Additionally, in action films with a bombing scene, the bomb always whistles while falling from the aircraft.

Cars always screech, even on dirt roads. Car brakes? They always squeak, and whenever a car turns, stops or pulls away, its tires must make that particular squealing noise. In bigger-budget films, whenever a car appears and makes any particular movement, the engine must audibly accelerate, even if the car was moving under 25 mph. On long roads, whenever a truck passes, we always hear the traditional truck horn (with Doppler effect, of course).

[Image: sound of cars in movies]

Now, what about computers? Every button a character presses on a computer or laptop makes some sort of beep; the text appearing on screen must make some type of typing or printer sound for some reason, even though we never hear that noise in daily life.


Environments

Easy: if there is a castle, there is thunder. Storms start almost immediately: there is a castle and, subsequently, there is a crack of thunder and heavy rain starts to fall amidst a plethora of lightning. Besides, thunder is always in perfect sync with lightning, no matter how far away the lightning struck, and the same applies to explosions, as discussed, fireworks, etc. The wind always sounds the same. Oh, an underwater scene? Let's just add non-stop bubbles throughout the whole scene. Doors always squeak. Phones? The universal telephone ring. A scene in San Francisco? Easy: cable car and foghorn sounds. Trains? The same old classic horn.

[Image: environment sound effects]

And a fun fact: in U.S. films set in big cities, there is always a police siren or horn in the background, whereas in films from other countries that never, ever happens. And if there is a fight or some sort of commotion on the second floor of a house, the people on the first floor never become aware of what is going on, as if everyday sounds such as a phone ringing, the washing machine changing cycles, pets making their characteristic noises or the maid running the vacuum cleaner could really mask the chairs falling over, the yelling and the screams.


How Sound Helps You Tell Your Stories

Film sound is built from sounds that serve as the framework within which filmmakers and directors create a specific atmosphere. When coming up with an audiovisual project, regardless of its nature, sounds are always carefully introduced so that they represent what is happening during a particular scene: what kind of actions the performers and characters are engaged in, and where that particular situation takes place.

The importance of creating an enticing atmosphere

Sound design is full of nuances, and these vary from project to project; however, the need for filmmakers to understand the material remains. A good way for producers, directors, filmmakers, etc., to start planning sound for a production is by going through the whole script while figuring out the nature of every scene in it. Thus, aspects such as background sounds, as well as the actions taking place, come into much sharper focus. This is a good way to start elaborating on sound effects and on the possible atmospheres that could pertain to a specific scene.

What about music and dialogue?

Despite the fact that both music and dialogue are pivotal for creating an enticing audiovisual project, these two traditionally come to mind before sound design does. And while they are unequivocally vital in giving the plot guidance, they remain, perhaps, the most obvious elements of a project's sound design. Since every scene is different, each requires a subtle yet specific manipulation of sound and sound effects to feel real and complete. Dialogue and music by themselves are simply not enough to build the framework within which films are conceived and constructed.

Background and atmospheric sounds drive the plot and allow the audience to clearly understand where a particular scene is taking place. Normally, these sounds are perceived subconsciously, given their quiet and repetitive nature; however, they are essential to every single scene because of the way they guide the audience. If background and atmospheric noises were removed from audiovisual projects, scenes would end up feeling unfamiliar and even unnatural to the audience due to the lack of realism. Scenes taking place on busy, crowded streets, for example, always include iconic noises like car horns, engines and indistinct chatter, whereas scenes taking place in the woods include birds, wind, grass blowing, etc. Both examples are familiar to the audience simply because of the atmosphere created through background noise; without it, the scene would feel strange.

How does action sound?

Now that we have covered how sound interacts on a subconscious level, the next part of sound design is helping the audience understand what the characters are doing in a particular scene. This is done through the inclusion of action sounds. When we speak about action sounds, we are talking about a rather more subtle group of sounds, like those we hear when characters hold something, move in their clothing, or interact with other characters or with inanimate objects. If a character is in an action-filled scene, like a physical confrontation or a fight, sound designers need to include the sounds of impacts, punches, clothing being moved, etc. Sound designers spend a lot of time on these kinds of scenes to make sure that all the sounds the audience would hear if the fight were real match what is being played by the performers. During a fight, it is common to hear some hits louder than others, or even a combination of different types of impact sounds. Thus, the audience perceives the scene in a more realistic way, which is why the vast majority of sound designers strive to use effects that resemble the real sound.

[Image: sound effects in movies]

The same principle applies to the interactions characters have with inanimate objects. If a character is manipulating something made of metal, like a gun or a hammer, the sound designer needs to add the sound of a person handling that object. The same happens when characters use computers or mobile phones and we hear keys being pressed and phone beeps. These are sounds that may go unnoticed in real life; however, by including them in a particular scene, the audience ends up being carried by the storytelling. A well-crafted atmosphere includes these as well as the sounds mentioned in the first part of this article. Although quiet and subtle, both atmospheric and action sounds are key to providing the audience with a compelling narrative. They are small elements whose tremendous value is reflected every time the audience remains engaged throughout an audiovisual project.


What Is Sound Design?

Despite the fact that sound design has always been a key component of the film and audiovisual industry, it still holds an air of mystery. In fact, the most common myth about sound design is that it is all about creating new sounds, which of course is not true. At least not entirely. One may easily assume that sound design is all about coming up with enticing and neat sound effects; however, assuming that would not be fair to those who coined the term during the making of Star Wars and Apocalypse Now. When we refer to sound design as a term, we need to go back to those films, as Ben Burtt and Walter Murch (on Star Wars and Apocalypse Now, respectively) found themselves working alongside directors who were not just trying to include attractive and powerful sound effects as an additional element of a structure they had already put in place.

It was by exploring the boundaries of sound (sound effects, music, dialogue, etc.) that sound began to play a pivotal role in storytelling, shaping the picture in many cases. These experiments resulted in something different from what directors and audiences were used to. Nowadays, soundtracks have changed the way people and directors understand film sound, yet there remains a rather unorthodox conception of what well-crafted sound design is. For many people, it is about recording high-fidelity sounds and well-fabricated vocalizations, like explosions or alien creatures; however, that is far from doing justice to the term. A well-orchestrated and well-recorded musical composition provides minimal to zero value if it does not interact seamlessly with the film; having performers deliver a myriad of lines in every shot is not necessarily acting in the betterment of the production.

Sound, regardless of its nature, provides value and acts in the betterment of a film when it becomes part of the storytelling, when it resonates with what is being projected, when it changes dynamically over time, and when it makes the audience experience sensorial feelings. Filmmakers should pay special attention to sound from the moment they have an idea in mind. Instead of considering sound a mere component, filmmakers should strive to fabricate sounds, either on set or in a studio with a talented sound designer or composer, making them a pivotal contribution that influences their projects in different ways. Think of films like Citizen Kane, Star Wars or Once Upon a Time in the West. These films were thoroughly thought through and produced in many ways, sound being one of them, yet no sound designer appears in the credits.

[Image: sound design for films]

That does not mean every film should strive to mimic what the aforementioned films did in terms of sound; however, many audiovisual projects can learn from them. Sound mixing varies from film to film, and there are films whose sound design is astonishing. Moreover, several sound practices actually take place long before production begins: directors often have their actors and actresses listen to the sounds that will surround their characters, making it possible for them to play their roles in a much better way.

Other directors build their stories around the role that sound plays within the storytelling framework, although many others still have a lot to learn about the potential of sound. There seems to be a paradigm around the role of sound, a generally accepted idea that good sound is only meant to enhance the images being projected. Such a paradigm suggests that sound is merely a slave within the project, its implications less important and complex than they would be if directors let sound act as a free element throughout the whole process.

Another misconception around sound and sound design is that directors and filmmakers should only start to pay attention, or at least think seriously, about sound when the project is approaching its final stages and the filmmaking process and the structure of the film are practically over. Many would ask: how is a composer supposed to come up with an idea unless they can catch a glimpse of the final product? Some may argue there is nothing wrong with this practice, and yes, sometimes it works uncannily well. But what is the point of calling sound a collaborative, functional component of filmmaking if it is only addressed once every other process is over? For sound to reach its full potential, directors and filmmakers should understand the whole project within a collaborative framework, allowing sound to work its wonders throughout the filmmaking process.


The Art of Post-Production Sound: Dialogue and ADR

Sound editing is composed of many stages. Traditionally, the first stage is going through each second of the film alongside the sound editor in order to come up with a list of every single sound that needs to be included, edited, augmented or simply replaced. This stage, however, has been altered by the increasing demand for early previews, which has made the post-production schedule rather hectic.

The Dialogue

When we talk about dialogue editing, we basically talk about organizing and cleaning up production sound, and it can be as detailed as reusing fragments of words to complete other words foreign to a particular sequence, or removing a performer's mouth sounds. Often, the dialogue we hear in the final version of an audiovisual project was not actually recorded on location. In fact, many producers and directors prefer to shoot as if silent, since that is much easier than achieving the perfectly quiet environment a sequence requires: the crew is always noisy, and there are onlookers, birds, sirens, airplanes, car alarms, etc.

Even though some dialogue is recorded on location, it often gets discarded because the microphones also capture unwanted noises such as clothing rustle, people passing by and camera noise. Still, difficulties and all, many directors prefer to keep the production dialogue rather than loop it, since the original delivery is an integral part of an actor's performance. And even though there is a trend during looping sessions towards using the booms and microphones originally used during the shoot to mimic the situation on set, it is practically impossible to replicate all the conditions of the original recording.

Be that as it may, it is really tough for performers to match in a looping session the emotional level they achieved during the shoot. Ron Bochar, who supervised the sound on Philadelphia, describes the scene in which Tom Hanks reacts to a recording of an aria as a case in point. Under ideal circumstances, the dialogue and the aria would sit on two separate, independently manipulable tracks so the dialogue could be kept intelligible; however, Tom Hanks wanted to be able to move around and react to the aria as it played. Thus, the dialogue and the aria were recorded on the same track, which ultimately left the dialogue in this sequence less than pristine.

The team nonetheless agreed to keep both elements on a single track rather than loop the scene: according to Bochar, it would have been impossible to recreate the same environment, and they would have ended up ruining the scene had they tried.

Today, the first job of every dialogue editor is to split every spoken line onto its own separate track. By doing this, he or she makes the lines as controllable and independent as possible so they can afterward be merged again seamlessly. Dialogue is often edited to shape characterization. Some sequences simply portray a dominant figure, as in most hero-villain films: a mixer can raise the volume of one of the voices and then adjust its other tonal qualities to achieve the desired effect.
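As a rough illustration of that last point (this is not a tool or workflow from the post, and the function names are hypothetical), raising one voice by a few decibels before summing the dialogue tracks can be sketched in Python with NumPy:

```python
import numpy as np

def db_to_gain(db):
    """Convert a decibel change to a linear amplitude multiplier."""
    return 10.0 ** (db / 20.0)

def mix_dialogue(tracks, gains_db):
    """Sum several mono dialogue tracks after applying per-track gain.

    tracks:   list of float arrays in the range [-1.0, 1.0]
    gains_db: per-track gain offsets in dB (0.0 = unchanged)
    """
    mix = sum(t * db_to_gain(g) for t, g in zip(tracks, gains_db))
    # Guard against digital clipping: the sum must stay below 0 dBFS.
    if np.max(np.abs(mix)) > 1.0:
        raise ValueError("mix exceeds 0 dBFS; reduce gains to avoid clipping")
    return mix

# Hypothetical example: the "dominant" voice up 6 dB, the other down 3 dB.
hero = 0.1 * np.ones(4)
villain = 0.1 * np.ones(4)
mix = mix_dialogue([hero, villain], [6.0, -3.0])
```

The clipping check mirrors the point made earlier about zero decibels: it is not silence but the ceiling before digital distortion.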


Dialogue that cannot be salvaged from production must be re-recorded in a process we have already mentioned: looping, or simply ADR (automated, or automatic, dialogue replacement). In looping, an actor speaks lines in sync with the projected image; in ADR, an actor watches the sequence repeatedly while listening to each line, then tries to match the original wording and lip movement in a new recording session. The actor tries to recreate each line while conveying the same degree of sentiment, passion, and mood, and some performers are indeed remarkably good at recapturing the original delivery.


In summary, since directors and producers often have a location for only a couple of hours and, as mentioned, the perfectly quiet moment is nearly impossible to achieve, production continues in the hope that the sound mixer captures every line as cleanly as possible. ADR gives filmmakers the opportunity to recreate the moments they initially envisioned; in fact, ADR can even improve on them, as the re-recordings take place in a much quieter, more controlled environment, typically an audio post-production studio. Filmmakers no longer need to worry about a car horn blaring right over the perfect take. With the help of technology, sound and dialogue can be tailored to every scene and sequence, delivering to the audience an engaging and convincing environment, which is a film’s ultimate goal.

*The images used on this post are taken from Pexels.com

An Introduction To Film Sound

When it comes to film, we might think of it as an essentially visual experience; however, a film is much more than that. We cannot simply disregard the importance of sound within film: a well-crafted soundtrack is often as powerful —and sometimes as complicated— as the image on the screen. To understand the complexity of the modern film, we have to look at several aspects that traditionally go unnoticed by the audience but that are as important as the image being projected.

A soundtrack is an entirely different universe, and it involves three distinct aspects: the human voice, music, and sound effects. In fact, each component is a soundtrack on its own, and they need to coexist seamlessly. These three soundtracks must be balanced and mixed to produce the desired effect and emphasis throughout the film.

The Human Voice

When it comes to the human voice, we are basically talking about dialogue. Dialogue is used to authenticate the speaker as a real individual rather than a product of imaginary storytelling. In stage drama, for example, dialogue conveys the story and ultimately expresses the motivations and feelings of the characters in the play.

Oftentimes, within film, audiences perceive little or practically no difference between the character being portrayed and the actor portraying it. Think of Humphrey Bogart and Sam Spade, or Jim Carrey and Stanley Ipkiss: we could assert that the screen personality and the actor’s own personality merge to a considerable degree, since their voices suit both characters.

Additionally, when the texture of the voice fits the performer’s appearance, a wholly different yet realistic character, called a persona, is born. The audience no longer sees a performer working on a character, but another individual struggling through all kinds of situations. It is also worth mentioning that dialogue is introduced into each film in a unique way, and its use varies widely with the nature of the film. Some films include little to no dialogue and let the narrative depend heavily on the visuals, while others confront audiences with non-stop dialogue, bouncing from one conversation to another in a frenetic, comedic way.

Sound Effects

Sound effects have two major components, so to speak. First, there are synchronous sounds —those that match what the audience is watching. For example, if a character is playing a musical instrument, the sound of that instrument is heard. This type of sound contributes to the realism of a film and is also used to create a particular atmosphere. When a door is being opened and we hear the handle make its distinctive “click”, we are fully convinced that the image being portrayed is real.

However, if the door handle clicks during an action sequence such as a robbery, the sound mixer may emphasize the “click” with a totally different volume level to create suspense.

The other main component of sound effects is asynchronous sound —sounds that don’t match what the audience is watching on the screen. These are used to introduce emotional nuance and to add realism. Think of an ambulance siren in the background during a car chase: the noise adds to the realism of the film by elaborating on its city setting. Or think of birds, dogs, and bystanders while a couple argues in a park in autumn. Both scenarios feel real to us simply because we associate the background sounds we hear with what we are used to: we know ambulances move across city streets, and we know parks are often full of people with their pets.

Music

Background music plays a pivotal role in every visual project. Music is often used to add nuance, emotion, and rhythm to a film. Traditionally, music is not meant to be noticed by the audience; rather, it provides a specific tone or emotional shading to the story. Music also underlines all kinds of changes throughout a visual project, foretelling shifts in mood, in pace, in sequence, and so on.

Film sound comprises both innovations and conventions. During a car chase, for example, the audience subconsciously expects the music to accelerate; it is important to mention, however, that music and sound are, most of the time, brilliantly conceived and written. The effects of sound remain largely subtle and are noted only by our subconscious, yet they play a key role in our capacity to appreciate and understand the modern film.

Interview with Alfred Hitchcock

Alfred Hitchcock was a pioneering director, best known for Psycho. In this interview from 1972 (he died in 1980), he discusses his career, working with actors, storytelling, and his decision-making as a director.

Watch the full interview in the video below:

AES 2017 Gear Rundown

The AES convention is a time when many new products are released and shown off, and this year was no exception. If you were lucky enough to be on the floor of AES this year, then you probably already saw and heard these pieces of gear. Here is some of the new gear announced at AES this year:

Neve 1073SPX - This is a rack-unit version of the classic 1073 preamp/EQ. Previously, rack versions were only put out by other companies, and Neve itself would require you to buy a rack frame to house a single module. Read more about it here.

Transformizer Pro - This is a sound designer’s dream. The plugin takes one main sound and allows you to layer other sounds with it to fill it out and make it larger than life. The other sounds adapt to the original sound in volume, timbre, pitch, and length. Read more about it here.

Sound Radix Powair - This plugin is an adaptive compressor, a great tool for both music and film/TV mixing. It allows more flexibility than traditional limiters, letting certain elements cut through based on their transients. Read more about it here.

PACE iLok Cloud - Hate using pesky USB keys and dongles that can easily get lost or broken? Dislike keeping thousands of dollars’ worth of licenses on one of them? Well, good news: iLok will now offer a secure online system. Joining the ranks of software developers like Waves who use a proprietary online system, iLok will debut this new technology in early 2018. It is undetermined whether this will be free or subscription-based. Read more about it here.

Great new products this year that will be very helpful in post production and music.

Apple Hires Steven Spielberg For Original TV Content

Apple has been selling TV shows and movies for a long time now, and it has finally announced original content to take on Netflix, Hulu, YouTube, and Amazon. While a bit late to the game, they sure are coming out strong: Apple has signed a deal with Steven Spielberg to develop Amazing Stories, a reboot of the 1980s sci-fi show of the same name.

This really shakes up the industry. With lots of money at Apple’s disposal, they could be a game changer. Considering that most of the big players in video now have nothing to do with Hollywood, the next few years will be very interesting as streaming becomes the only way people watch content outside of reality TV, sports, and news. Read more about this deal at this location.

What Makes a Fight Scene Badass

Fight scenes are what make action movies awesome; making one work, though, is a different story. So what separates the good from the bad? Too much going on can leave the viewer confused by everything, while too little will bore them. Knowing the emotion you want the fight scene to convey is as important as the craft of the scene itself.

YouTuber Aaron Field takes a look at Kingsman and explains what works about its fight scenes.

Interview with Inside Out Writer Meg LeFauve

We all have our own journey and path to success. That road is often not a straight line; it is bumpy, with potholes, unpaved stretches, and, for some, dead ends. The YouTube channel RocketJump Film School talks with writer and director Meg LeFauve about her inspiring story of how she came to work on Pixar’s Inside Out.

Watch the video below:

Waves Releases Drum Tuner Torque

Ever record a drum set and then realize, after the fact, that the snare is out of tune with the song and the drums no longer gel with the music? Or ever get a mix where it’s very clear the drums weren’t tuned at all? Enter Waves Torque. It tunes the drum so it sits better in the mix, and it can be used for creative effects too.

Definitely a great tool for notching out unwanted resonances in drums and having them sit better in a mix overall. For more information about the plugin, go to Gear News’ website.

Spotify and Hulu Launch Student Bundle

For those of you who were in college or high school in the early days of piracy (the Napster/Kazaa days), you probably would have loved an offer like this. Hulu and Spotify have teamed up to offer their premium services to students for only $4.99 a month. Sounds like a steal, huh? Well, I’m sure that’s exactly what they’re thinking, since they know stealing is what most college students will do if they don’t sign up. What’s fascinating is that both of these services are partially owned by the major labels and TV studios, including NBCUniversal (now part of Comcast). It seems this was an easy deal to make, since the same companies share large stakes in both.

This has proven to be a smart way to combat piracy: schools that have adopted this and similar deals have reported a large drop in piracy rates and an increase in streaming rates.

Read more about the deal in this article by The Verge.

DP Henry Braham on Work for Guardians of the Galaxy Vol. 2

Guardians of the Galaxy Vol. 2 was a great film, filled with incredible special effects and great camera work. A big part of that camera work came from DP Henry Braham, who chose the RED Weapon 8K VistaVision as his camera. The camera was a prototype at the time, which made using it a big risk.

Read more about this and other techniques used for the movie in an interview with Henry Braham in Filmmaker Magazine.

Toontrack Announces Superior Drummer 3

Since the 80s, programmed drums have seen ever-increasing growth in quality; the sonic quality of the latest offerings goes beyond anything anyone could have fathomed in the early days of drum machines. Today’s music world is filled with programmed drums, used during writing stages and in major-label recordings. One of the biggest names in the drum world is Toontrack’s Superior Drummer 2, and Toontrack has now announced a huge upgrade with version 3.

With new features like 11.1 surround sound and audio-to-MIDI conversion, it is an all-in-one solution for getting drum sounds. In the past, most sample-replacement programs ended up with similar-sounding productions, but with Toontrack’s latest release allowing simple sample replacement within Superior Drummer itself, which ships with over 230 GB of samples, the possibilities for sound design seem limitless.

Read more about it on Pro Tools Expert's website.

Industry: Spotify Is Accused Of Creating Fake Artists

Spotify was recently accused of pushing ‘fake artists’ in its popular playlists, artists deemed fake because they appear nowhere else on the internet or in real life. Sometimes these are pen names for other songwriters, but what if Spotify has created AI that generates music for it? Is this ethical? Spotify would, in essence, be paying itself royalties on top of the ad revenue it already gets, taking time and money away from real artists who may be struggling.

It’s an interesting hypothetical, as it is not yet known what Spotify is doing or why these artists are unknown elsewhere. Read more about this topic at this location.

School: Dunkirk's Intensity Lies in Sound

Good sound design adds to the emotion of a scene. From the creaking of a door in a horror movie to the epic score of a fantasy film, sound matters; the score can elevate the mood of a scene. There are also many compositional tricks that are more sound design than music. One of them is using scales that never resolve, adding tension to a scene and reflecting the environment in which it takes place.
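Dunkirk’s score famously leans on one such device: the Shepard tone, an auditory illusion in which the pitch seems to climb forever without ever resolving. As a rough sketch only (the parameter values are arbitrary choices, not taken from the film), the illusion can be approximated in Python with NumPy:

```python
import numpy as np

def shepard_tone(duration=8.0, sr=44100, n_octaves=6, base_hz=27.5):
    """Generate a rising Shepard glissando: several sine components an
    octave apart sweep upward while a raised-cosine loudness curve fades
    the highest voice out and the lowest voice in, so the pitch seems to
    climb endlessly without resolving."""
    t = np.arange(int(duration * sr)) / sr
    pos = t / duration          # fractional position of the sweep, 0 -> 1
    out = np.zeros_like(t)
    for k in range(n_octaves):
        # Each component's octave position drifts upward and wraps around.
        octave = (k + pos) % n_octaves
        freq = base_hz * 2.0 ** octave
        # Envelope is near zero at the extremes, loudest in the middle,
        # hiding the moment each component wraps back down.
        amp = 0.5 - 0.5 * np.cos(2 * np.pi * octave / n_octaves)
        # Integrate the time-varying frequency to get a continuous phase.
        phase = 2 * np.pi * np.cumsum(freq) / sr
        out += amp * np.sin(phase)
    return out / np.max(np.abs(out))  # normalize to [-1, 1]

tone = shepard_tone(duration=2.0)
```

Writing `tone` to a WAV file and looping it gives an endlessly rising sweep; running `pos` backwards gives the falling version.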

Vox discusses this technique in more detail in the video below:

School: Harnessing Shadows

There’s so much that goes into crafting the perfect lighting, or the perfect lack thereof. Often the focus is on how to light a subject for a scene, but what about what’s not lit? What about the shadows? Ever wonder how that contrast is created? How some movies and TV shows can get so dark yet still retain so much detail? Or how silhouettes are created while the background remains well lit? The technique is actually similar to lighting a scene well: it’s all about the placement of the lights in relation to the subject.

Check out the video below from Rocketjump Film School to find out more: