Once Upon a Time In Hollywood DP Robert Richardson gets philosophical about the art of film projection, the beautiful failures of old lenses, and how we will all end up like Rick Dalton. Here’s my talk with the three-time Oscar winner for Filmmaker Magazine.
In my newest interview for Filmmaker Magazine, Panavision Senior Vice President of Optical Engineering Dan Sasaki talks being a second-generation member of the Panavision family, the storied history of the C Series anamorphics, and personalizing lenses for cinematographer Robert Richardson for use on Quentin Tarantino’s Once Upon a Time In Hollywood.
Here’s an excerpt:
Filmmaker: Do you have a favorite set of lenses that you wish got rented out more?
Sasaki: Oddly enough, I cannot say that I have encountered a situation in which any particular series has gone unnoticed or underutilized. The amount of content [being made now] has created a bit of a renaissance in which the art of cinematography has evolved into an adventure that I have not seen the likes of in my history at Panavision. Cinematographers are figuring out ways to maintain their authorship and carry their intent throughout the entire imaging chain. That includes experimenting with every type of lens we carry.
My talk with Midsommar cinematographer Pawel Pogorzelski is now up on Filmmaker Magazine. The folk horror tale re-teams Pogorzelski with his Hereditary director Ari Aster. The film was shot on the Panavision DXL2 with Panavision Primo Primes and Primo Artiste 70mm lenses.
Here’s Pogorzelski breaking down one of the film’s distinctive drone shots:
Filmmaker: There’s a shot when the Americans are first driving to Hårga where a drone flies from the front of the car to the back. As the drone moves, the camera rotates 180 degrees so that the image is upside down when it reaches the other side of the vehicle. I don’t remember ever seeing that shot before.
Pogorzelski: We found very good drone operators in Hungary, where the Sweden-set scenes were filmed, but at first they told us it was impossible to do that shot. I kept doing research and found a way, which was basically a custom-made drone with a gimbal head — I think it was a Ronin — that could hold an Alexa Mini. Then you had to bypass the drone’s software to tell it to tilt more than [the software] would normally allow. I asked the drone operators if they would try that for me and they were a bit reluctant because that drone is their baby. The first day we flew it, the drone died as it took off. So we had to do the shot again on a day that was a little bit too overcast, but it was the only day we had left to get the shot. It was very windy, but the operators were able to keep the drone flying straight, which was quite impressive. I think we did it four or five times and every time [the drone operators] were very nervous. I was always like, “One more, one more. We can do it better.” And they were always like, “Are you sure? I think we’ve got it.” (laughs)
The latest entry in my Shutter Angles column for Filmmaker Magazine is a chat with Us cinematographer Mike Gioulakis. The film was shot on Arri/Zeiss Master Primes with an emphasis on practical effects: the filmmakers storyboarded nearly the entire film so they could sell the movie's doppelganger effect mainly through clever blocking, framing, and editing rather than digital tricks like face replacements.
Gioulakis also breaks down a half dozen specific shots from the film, including the opening credits’ long pullback shot of a wall of caged bunnies.
“The Person You Put Up There Ain’t the Person That Comes Back”: Directors Kevin Kölsch and Dennis Widmyer on Pet Sematary
Here’s Kölsch on how the filmmakers began working together decades ago as teenagers on Long Island:
Kölsch: We’re from the same area and knew people in common, so we’d see each other at parties or on the basketball court. As far as working together, we were both already writing on our own [before we became friends]. I had just written a script for a screenwriting class when I ran into Dennis at a mutual friend’s house. Our friend was like “Dennis, you write scripts? Kevin wrote a script too.” I lived around the corner, so we walked over to my house and I showed Dennis some stuff. From there we started showing each other our work and giving each other feedback.
We decided that while our other friends were getting together and drinking on Friday nights, we’d try to be productive. So we’d get together, bring our word processors, get some beers and play some music to make it fun. We’d work on pages of our scripts and at the end of the night we’d show each other and give feedback. That turned into helping each other—like if one of us got stuck on a scene, he’d turn to the other and say “I’ve got a problem.” So slowly we started contributing to each other’s scripts and eventually it was like “Why aren’t we just writing these together?”
And here’s Widmyer on the decision for “Jud” actor John Lithgow not to attempt the Maine accent used by Fred Gwynne in the original film:
Widmyer: That was an ongoing back-and-forth with John. At first he was up for it, but then he read the book and saw that our interpretation in the script was different. In the book King leans more into the folksiness of Jud and the locality of him. He’s like the quintessential Maine character. But [the accent] is kind of a no-win situation. If you nail it, you’re going to sound like Fred Gwynne, and if you don’t nail it, then you don’t sound like Fred Gwynne, who did a pretty good job with it.
John actually knew Fred. They’d been in a play together and he’d always joke that Fred was the only actor that was ever taller than him, because Fred was 6’5″ and John is 6’4″. He has a lot of respect for Fred Gwynne and so he purposefully didn’t watch the first film. We talked about it a lot and John tried the accent in the read-through and we all thought it was great, but in the end we left the decision up to John. He decided to go his own way and we were actually really happy that he did.
Check out my interview with Captain Marvel colorist Doug Delaney for Filmmaker Magazine. Here’s an excerpt of Delaney talking about the abundance of deliverables required for a tentpole Marvel Cinematic Universe release.
Filmmaker: How different are all those deliverables from version to version?
Delaney: Because the suit color and the amount of detail in the glow and all these things are so important, that has to be maintained throughout all the different deliverables and across all the different light levels and color spaces and Nit values. So when you move into EDR or HDR, while you want to leverage the ability [of that technology], the big challenge is making all these versions feel the same. You don’t want to say, “Now we’re in HDR, let’s make everything bright and super crazy.” You want it to feel like the same movie [regardless of the viewing platform].
With a movie of this scale, you’re finishing the film literally weeks before it’s released. That release is international and standard projection, plus Dolby Vision projection, plus stereo 3D in various forms, plus IMAX. It’s a huge amount of work. The compression of time on these kinds of films is quite intense and it really is impossible to do without collaborative workflows. I had a second colorist helping me and two additional people helping on rotoscoping and tracking and doing the 3D stereo grade. Some of those people were also helping with the IMAX version, and there was additional help on the home video version, because they’re releasing online very close to the [theatrical] release date of the film.
And here’s Delaney on how he got his start in the industry as a “scanning and recording” technician in the late 1990s:
Filmmaker: Looking over your credits, you started in the late 1990s as a “scanning and recording” technician. What was that job?
Delaney: That was my entrance into the field, which was before digital intermediates even existed. Back then, [for shots incorporating visual effects] you had to scan the camera original negative on a scanner and digitize it. The visual effect would be executed, then you’d have to record that digital file back onto a piece of negative film and process it at the lab. That’s how I started in color timing, because in those days the scanners, which were essentially like digital cameras jury-rigged on an old optical printer, couldn’t capture the full dynamic range of film. So you had to do the exposure at the point of scan and try to accurately reproduce the director of photography’s intent, so that when it went to the visual effects artist it was already in a pretty good place. Then when you recorded that digital file back out you would compare your recorded-out version against the original version and try to get them as close as possible to each other so the round trip was seamless and the audience hopefully couldn’t tell the difference.
I started doing that as a technician and then got into supervising that role. Around 2002 or 2003 I was supervising that process on The Matrix [sequels] and we decided that [instead of just scanning in the scenes needed for VFX], we’d beta test this whole digital intermediate thing, since there were entire sequences in those movies that were visual effects. That’s how I officially got into doing DIs.
Filmmaker: How much different was the DI process in those early days?
Delaney: It was very bulky and difficult. Now we’re doing it on our laptops. (laughs) Back in those days that was certainly not the case. It would take 15 seconds a frame to scan a 2K piece of negative. Imagine doing that for the whole movie. In those days, [recording back out to film] would take you 20-something hours per reel. A typical movie is maybe six or seven reels. If you got to the end and there was an issue, you had to start all over again. It was a painful process for sure. I’m grateful for the experience and discipline I learned from it, but I don’t miss it.
Chris Teague (Obvious Child, Landline) talks going back to Red for Russian Doll, futzing with split diopters, and the difficulty of balancing a personal life with a TV series schedule in my latest interview for Filmmaker Magazine.
Here’s an excerpt, with Teague discussing his use of Leica Summilux lenses.
Filmmaker: You owned a set of Cooke Speed Panchros for years, but for Russian Doll you went with Leica Summilux lenses.
Teague: Yeah, I shot almost everything on vintage lenses before Russian Doll. They didn’t feel appropriate for this except for the flashback sequences in episode seven, which we shot on Super Baltars. This felt like a modern, contemporary, hyper-real landscape and I loved the idea of having super fast lenses that I could shoot wide open all the time. The concept in my head, which is maybe too literal, was that Nadia was out of step with her world, and if we used fast lenses with very shallow depth of field she’d always feel like she was popping out of her background. I really fell in love with how those lenses look. The wide lenses [have minimal distortion], so we could do some things on super wide lenses. I shot a couple of scenes on a 16mm lens and I never would’ve done something like that before. I love a wide lens where you have that incredible open field of view, but you’re not so distracted by the way it’s warping the space. The Leicas are fantastic lenses. They’re also small and light, so we could keep the camera’s [footprint] smaller. That was a plus when shooting in tiny New York locations.
Here’s my interview with Green Book director of photography Sean Porter (Green Room, 20th Century Women) from Filmmaker Magazine. This is actually my 100th interview piece for Filmmaker; you can find all of them here.
As for the Porter piece, here’s a little preview with Sean talking about his shift from old Cooke lenses to newer Leica glass.