Everything sounds bad, and there’s nothing we can do about it

From left to right: The Sandman (Photo: Netflix); Tenet (Photo: Warner Bros.); The Dark Knight Rises (Photo: Warner Bros.)
Graphic: Rebecca Fasola

TV today is better read than watched, and frankly, we don’t have much choice in the matter. Over the past decade, the rise of streaming technology has led to a boom in the use of subtitles. And before we start blaming aging millennials with wax in their ears, a study conducted earlier this year revealed that 50 percent of TV viewers use subtitles, and 55 percent of those surveyed find it difficult to hear dialogue on TV. The demographic most likely to use them: Gen Z.

Hollywood’s growing audio problems in the streaming age have been exacerbated by the endless variety of consumer audio products. Booming scores and explosive sound effects overpower the dialogue, and with mixers’ hands tied by streaming-platform specs and creative demands, viewers can do little to fix the problem except turn on subtitles. And who can blame them?

“It’s awful,” said Jackie Jones, senior vice president at Formosa Group, a leading audio post-production company. “There’s been a lot of time and client money spent to make it sound right. It’s not great to hear that.”

Formosa is one of many post-production houses struggling to maintain dialogue intelligibility amid the ever-fragmenting media landscape. “Every network has different levels and specifications for sound,” Jones told The A.V. Club over Zoom. “Whether it’s Hulu or HBO or CBS. You have to hit those certain levels to be within spec. But how it’s actually streamed and how it’s played back is out of our control.”

After it leaves a place like Formosa, the mix may go through an additional mix by the streaming service and another mix, so to speak, by the playback device. Of course, that’s the last thing anyone in the audio industry wants. “Dialogue is king,” sound editor Anthony Fanshore told us. “I want all the dialogue to be as clear as possible, so when you hear that people struggle to hear these things, you get frustrated.” And yet we still end up leaning on subtitles. If we’re only going to read the dialogue of The Sandman on Netflix, why bother mixing it at all?

The Oldest Game | The Sandman | Netflix Philippines

“Nobody is happy about that,” said David Bondelevitch, assistant professor of music and entertainment industry studies at the University of Colorado Denver. “We work very hard in the industry to make every bit of dialogue intelligible. If the audience doesn’t understand the dialogue, they won’t follow anything else.”

Streamers and devices make terrible music together

With all this technology at our fingertips, dialogue has never been harder to make out, and the proliferation of streaming services has made the landscape nearly impossible to navigate. Aside from the variety of devices people watch media on, no two streamers are alike. Each may hand the post-production house a different set of requirements.

As far as streamers go, editors say Netflix is the best when it comes to sound, and it has publicly posted its audio specifications. But the service is the exception. “They put in a huge amount of money to set their own standards, while some of the other streamers seem to have pulled theirs out of their asses,” Bondelevitch said. “With some of these streamers, editors get 200 pages of specs that [they] have to sit there and read to make sure they’re not violating anything.”

Not all streamers are so forgiving. “I was at lunch with a couple of friends recently, and they were answering emails at the table because they had done the mix, finished the mix, and everybody was happy,” Fanshore said. “And then the director got a screener, or was able to watch it at home, you know, on whatever streaming service it was. And he was like, ‘Hey, this sounds totally different.’”

Neil and Protagonist Meeting | Neil Scene Introduction In UHD IMAX 4K 60FPS X265 10BIT HDR

Today, sound designers typically create two mixes for a film. The first is for theaters, assuming the movie gets a theatrical release. The other is called a “near-field mix,” which has a lower dynamic range (the difference between the loudest and quietest parts of the mix), making it better suited to home speakers. But just because the mixes get made doesn’t mean we’ll be able to hear them as intended.
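For a concrete sense of what “lower dynamic range” means, here is a minimal Python sketch, not any studio’s actual tooling, that measures the gap between a toy signal’s loud and quiet passages and then squeezes it with a simple downward compressor. The threshold, ratio, and signal values are made up for illustration.

```python
import numpy as np

def dynamic_range_db(signal, frame=2048):
    """Rough dynamic range: the gap (in dB) between the loudest and
    quietest short-term RMS frames of a signal."""
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame, frame)]
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
    rms = rms[rms > 1e-6]                      # ignore near-silent frames
    return 20 * np.log10(rms.max() / rms.min())

def compress(signal, threshold_db=-20.0, ratio=4.0):
    """Very naive downward compressor: anything above the threshold
    gets pulled toward it, shrinking the loud/quiet gap."""
    level_db = 20 * np.log10(np.maximum(np.abs(signal), 1e-9))
    over = np.maximum(level_db - threshold_db, 0.0)
    gain_db = -over * (1.0 - 1.0 / ratio)      # reduce only the excess
    return signal * (10 ** (gain_db / 20.0))

# Toy "mix": quiet dialogue followed by a loud explosion
sr = 48_000
t = np.linspace(0, 1, sr, endpoint=False)
dialogue = 0.05 * np.sin(2 * np.pi * 220 * t)
explosion = 0.9 * np.sin(2 * np.pi * 60 * t)
mix = np.concatenate([dialogue, explosion])

print(f"theatrical-style range: {dynamic_range_db(mix):.1f} dB")
print(f"near-field-style range: {dynamic_range_db(compress(mix)):.1f} dB")
```

Run it and the compressed version reports a much smaller loud-to-quiet gap, which is the property that makes a near-field mix friendlier to living-room speakers.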

“Near field means that you are close to the speakers, as if you were in your living room,” said Brian Vessa, executive director of digital audio mastering at Sony Pictures. “It’s basically having a speaker close by so that what you perceive is pretty much what comes out of the speakers themselves and not what the room contributes. And you listen at a quieter level than you do in a cinema.”

“What the near-field mix is really about is bringing your content into a place where you can listen comfortably in the living room and get all the information you’re supposed to get, the stuff that was put into the program that might otherwise kind of go away.”

Vessa wrote the white paper on near-field mixes, establishing the industry standard. He believes a large part of the problem is psychoacoustics, meaning we simply don’t perceive sound the same way at home as we do in the theater, so if a good near-field mix isn’t the norm, audiences are left to fend for themselves.

Complicating matters, where things end up has never been more fluid. “In television, we anchor the dialogue so that it’s always even and clear, and we build everything else around that,” said Andy Hay, who delivered the first Dolby Atmos project to Netflix and helped develop the service’s standards. “In features, we let the story drive our decisions. A particularly dynamic theatrical mix can be quite challenging to wrestle into a near-field mix.” With so many devices out there once the movie is finished, sound mixers may not even know what format they’re mixing for.

Then there’s the home to deal with. Consumer electronics give users a number of proprietary options that “reduce loud sounds” or “enhance dialogue.” Sometimes they simply have goofy marketing names like “VRX” or “TruVol,” but they’re essentially the “motion smoothing” of sound. These options, which may or may not be turned on by default by the manufacturer, attempt to respond to volume spikes in real time, usually trying to catch and “turn down” loud noises, like explosions or a music cue, as they happen. Unfortunately, they usually react late and end up turning down whatever sound comes next.
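To see why those levelers lag, here’s a minimal Python sketch of a generic attack/release gain follower, not any manufacturer’s actual algorithm, with exaggerated, made-up settings so the lag is obvious: the gain only starts dropping after the spike arrives, and it recovers slowly, so the explosion slips through while the dialogue right after it gets ducked.

```python
import numpy as np

def auto_leveler(samples, threshold=0.5, attack=0.2, release=0.01):
    """Naive real-time volume leveler. `attack`/`release` control how
    quickly the gain reduction ramps in and out (per sample, exaggerated
    here so the lag is easy to see in a handful of values)."""
    gain = 1.0
    out = np.empty_like(samples)
    for i, x in enumerate(samples):
        target = min(1.0, threshold / max(abs(x), 1e-9))  # gain we wish we had right now
        if target < gain:
            gain += attack * (target - gain)    # spike: ramp the gain down (too slowly)
        else:
            gain += release * (target - gain)   # quiet again: ramp back up (even slower)
        out[i] = x * gain
    return out

# Toy signal: quiet dialogue, a sudden explosion, then quiet dialogue again
dialogue = np.full(10, 0.2)
explosion = np.full(5, 1.0)
signal = np.concatenate([dialogue, explosion, dialogue])
leveled = auto_leveler(signal)

# The explosion's first samples slip through nearly untouched, while the
# dialogue that follows is still being held down by the leftover gain reduction.
for x, y in zip(signal, leveled):
    print(f"in {x:.2f} -> out {y:.2f}")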

The problem isn’t only the speakers. Rooms, speaker placement, and white noise from fans and air conditioners can make dialogue harder to hear. The near-field mix has to account for that as well. “I listen very carefully and very quietly, because that way all these other factors, the air conditioner, the noise nearby, all the other things that could be rattling around, start to matter. And if you’re missing something, we have to bring that up.”

The long road to bad sound

The sound problems we face today are the result of decades of underestimating the importance of clear audio in production. Bondelevitch cites the move away from shooting on sound stages with stage-trained actors as the first nail in the coffin. Sound stages provide an isolated place to capture clean dialogue, usually with the standard boom mic “eight feet above the actors.” The popularity of location shooting made that impossible, leading to the standardization of radio mics in the 1990s and 2000s, which present their own problems. The rustle of clothing, for example, is difficult to edit around and leads to more ADR, which actors and directors alike hate because it loses the performance given on set.

In the early days of cinema, when most actors were theatrically trained for the stage, performers projected toward the microphone. Method acting, however, allowed for more whispering and mumbling in the name of realism. This can be managed if more time is devoted to rehearsal, since actors can practice the volume and clarity of their line readings, but very few productions have that luxury.

One name audio editors still bring up when discussing this shift is Christopher Nolan, who helped popularize the style through his Batman films. The problem persisted throughout the Dark Knight trilogy, with the voices of Batman and Bane drawing consistent complaints even from fans. When Bane’s voice was completely replaced with ADR after the film’s disastrous IMAX preview, it ended up overpowering the rest of the movie. “It was the worst mix,” he said of The Dark Knight Rises. “The studio realized that no one could understand it, so at the last moment they remixed it and literally made it painfully loud. But volume wasn’t the problem. [Tom Hardy] is speaking through a mask and has a British accent. Making it louder didn’t fix anything. It just made the movie less fun to watch.”

The Dark Knight Rises Bane Restored Undubbed Audio.

Volume is an ongoing war being fought not just by audio editors but by the government. In 2010, Congress passed the Commercial Advertisement Loudness Mitigation (CALM) Act to rein in the volume of TV commercials. Instead, networks simply raised the volume of TV shows and compressed their dynamic range, making dialogue harder to hear, said Clint Smith, associate professor of sound design at the University of North Carolina School of the Arts, who previously worked as a sound editor at Skywalker Ranch.
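A toy illustration of that squash-and-boost trick, with plain RMS standing in for the K-weighted loudness measurement regulators actually rely on (ITU-R BS.1770) and completely made-up signal values: compress the dynamic range, then apply makeup gain, and the program gets louder on average without its peaks getting any higher.

```python
import numpy as np

def rms_db(x):
    """Average level in dB; a crude stand-in for a real loudness meter."""
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

# Toy program: quiet dialogue, a loud music/effects swell, quiet dialogue again
t = np.linspace(0, 1, 48_000, endpoint=False)
dialogue = 0.1 * np.sin(2 * np.pi * 200 * t)
effects = 0.9 * np.sin(2 * np.pi * 80 * t)
program = np.concatenate([dialogue, effects, dialogue])

# The squash-and-boost trick: clamp the loud parts (compressing the dynamic
# range), then turn the whole show up until it hits the same peak as before.
limited = np.clip(program, -0.3, 0.3)
boosted = limited * (0.9 / 0.3)

print(f"original program level: {rms_db(program):5.1f} dB (peak {np.max(np.abs(program)):.2f})")
print(f"squashed and boosted:   {rms_db(boosted):5.1f} dB (peak {np.max(np.abs(boosted)):.2f})")
```

The boosted version is louder on average even though its loudest moments never exceed the original ceiling, which is roughly how shows got louder and flatter even as commercials were reined in.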

Smith has taught audio engineering for five years and encourages his students to embrace subtitles and work to integrate them into a film’s storytelling in more creative ways. “What does it look like ten years down the road, twenty years down the road, as subtitles become more widespread? Because I don’t see them going away,” Smith asks his students. “I was kind of curious about…how can we actually make subtitles part of the filmmaking process. Don’t try to run away from them.”

As incomprehensible dialogue becomes more common, we’ll have no choice but to embrace the subtitle. But at what point do studios and streamers stop bothering to mix the audio properly and just assume viewers will read the dialogue? With subtitles an option on every stream, “we’ll fix it in post” could soon become “we’ll fix it at home.”

A sound you can feel

There are some things we can do. There’s always buying a great sound system, for example. The most important thing is setting it up correctly. Most of the audio mixers interviewed recommended getting professional help, but they also noted that many receivers today come with microphones for automatic room calibration. Even so, no one seemed particularly sold on soundbars.

Bondelevitch said, “If you use loudspeakers, get the best speakers you can afford. And if you listen on earbuds or headphones, get good headphones. If the environment is noisy, get over-ear headphones. They isolate. They sound a lot better. And don’t use noise-canceling headphones, because those really spoil the sound quality.”

But more than anything, they emphasized how this is a selling point for movie theaters. If you want good sound, there’s a place with “sound you can feel.”

“It’s a problem, because you want that theater experience,” Fanshore said. “People don’t go out to theaters that much nowadays because everything just streams. And that’s how you want people to hear these things. You do this work so it can be heard loud and big.”
