I sat down the other day for my annual rewatch of Blade Runner 2049 and I noticed this. Is this pixelated? Is this color banding? I checked my internet connection. I checked my TV settings. I checked my HDMI cable. It was still there. So, here's the Netflix version. Still there. And here's the Apple version. This is one where I even own the damn movie. Still there. There has to be a version of this film that doesn't have the color banding, right? I pay extra money for my 4K Netflix plan. Like, there has to be a way. Where is it? So, over the last few weeks, I went down this streaming
and movie-watching rabbit hole, which I don't mind at all, honestly. I wanted to uncover some deep conspiracy, and I wanted to blame Netflix for it. But I ended up in the middle of this heated war of formats and discs and streaming services. And the ultimate answer is even simpler: like, 99% of the time, 4K is a lie. But for those of you who are willing to chase that other 1%, that 1% of perfect movie projection, man, there are some incredible hidden gems out there, if you know where to look. So, let me drag you into this rabbit hole with me. So, before we look at 4K movies, we need to understand where that image came from. Because for years, for decades, films were film. A movie would get shot on good old 35 millimeter film.
And the movie that you saw in the theater was just an edited copy of that film. It was never turned into pixels. And of course, I know what you're thinking. Old films, right? Classics. All right, Mr. DeMille, I'm ready for my close-up. [snorts] This guy in a black-and-white world cutting strips of film. Well, let me tell you, this was the process for 99% of the films released in the '90s. And the '90s were just, what, like 15 years ago. Every movie from our millennial childhoods was cut by hand. And to do that cutting from that original priceless negative that came out of the actual camera that was used to shoot the movie, the studio would make a copy.
Now, that copy was called the workprint, and that's what editors would cut, physically cut. Your favorite '90s film was just a Frankenstein cut of pieces of film, all done on this workprint copy using a machine like this. When the director finally approved the edits, the most pro, the master boss, the sigma job here, the negative cutter, they'd go back to the original negative and cut it to match the workprint cut. So, this workprint was really just a temporary file, a disposable version to avoid damaging that precious original negative. And then a color grading lab, not a computer, not Photoshop, not DaVinci, they would grade the film using real filters [snorts] and different film
stock and chemicals. You make this scene a little yellower, a little bluer, a little redder. You have certain, you know, controls, an overall kind of feel of the film: warm or cool, or a little bit of this, a little bit of that. Then this finalized copy would be duplicated thousands of times and it'd get sent to movie theaters for you to watch and chug on that popcorn. Now, those copies, they were reels of film. They were sent in a can and they were projected in the theater. With film projection in movie theaters, if you wanted that top 1% premium experience, it really mattered that you went to see the film early, because after a few weeks, the 35 mm film would start to get damaged and scratched. But what is the resolution of that film? Is
it 4K? Is it more? Well, resolution and pixels, they didn't matter back then. Like, I don't think people knew what a pixel was. Like, these guys, they were working from film to film. There is no 4K conversion here. There's just chemistry in film stock capturing an image, and then more chemicals in the copy. John McConnell still shoots on film. That, for me, results in a man crush. Now, obviously, editing like this is very artsy, but it's terribly slow and even a little dangerous. And it took almost 100 years of filmmaking to come up with a better solution. Arguably a better solution, but we'll get there. In the '90s,
the industry got digital editing tools, but they're not what you think. What tools like Avid or Lightworks did was replace the workprint, not the final edit. The original negative would get scanned into the computer so that it'd be easy to edit digitally, but that scan was 320×240. Like, that's all these early computers could deal with. The first film cut using Avid was Let's Kill All the Lawyers. And I looked in the darkest parts of the internet and I could not find that film for the life of me. But what I can guarantee you is that when this movie was released in movie theaters, it didn't look like 320×240. 320×240 is rough even by '90s standards. But it's the beginning of the end for film quality.
It's the start of our fake 4K. Now, back in the '90s, that low-res scan of the film was really just like a proxy, a preview. It was never intended to be used for the final film. Like, Avid would export a cut list, a little sheet that went back to that holy man, the negative cutter, who would just follow those specs to cut the negative by hand and go through the same process that had been used for decades. Now, we're still working from film to film. So, pixels in that Avid copy of the movie? It doesn't really matter. But fast forward to the 2000s, and this is where the cheating begins.
Hello, sir. By the 2000s, technology had come to the point where we could actually get 2K resolution scans of that 35 mm film: 2,000 pixels. And so the Coen brothers and Roger Deakins, the cinematographer, made one radical change that would transform movies forever. Now, they wanted to give this film this sepia, old faded postcard sort of look, and it's a look that was really hard, almost impossible, to accomplish using the old-school grading process at the lab. So what they did is they scanned the negatives of the production using a Philips Spirit DataCine. It's a device that could scan the original film at that 2,000-pixel resolution. And then they color graded that digital movie file in the computer. And so they
exported that computer-graded movie back to film. That was done using a Kodak Lightning film recorder, which kind of converts digital files into film. Sort of like a printer, I guess. Now, that printed copy would then be duplicated a thousand times and sent to all movie theaters around the world. But there's a huge problem with that process. Did you catch it? The point of this whole story so far is that because of this decision, there is no 4K version of O Brother, Where Art Thou? anywhere, ever. It's impossible to get one. It's gone. When that negative was scanned into digital, it was scanned at a width of 2048 pixels. And because this film is widescreen, the actual resolution of the movie might have been something like
2048 by 850. And that's it. That's the master file for this film. There's nothing else to it. The original negative, if it still exists, maybe has that detail, but you'd need to basically scan, edit, and grade the movie again to be able to make it 4K, not to mention deal with the damage on that 26-year-old negative. The movie that we got was shot on celluloid, yes, but then it was converted to pixels, and even if you convert it back to celluloid, you're always going to be limited by the pixels in that original scan. You've lost all that extra detail forever. How many movies do you think suffer from this? How many movies do you think are tricking you with this fake 4K? Just wait for it.
Now, this master file is actually called a digital intermediate, a DI. And after O Brother, Where Art Thou?, 2K DIs started to become a standard. And for the most part, this meant the end of negative cutters. Now, with the rise of digital projectors around the mid-2000s, it also meant that a lot of movie theaters switched from film rolls to just projecting a digital copy of that master. No more scratchy, jittery film, but a digital copy instead. But 2K? 2K is not the 4K that I'm paying for. Like, any film with a 2K digital intermediate master could never be converted into 4K, right? So, which films were limited by this?
I've been going through a list of my favorite movies. Like, some of these are big-budget Hollywood productions, like some of the biggest movies of the last couple decades. Some of these released on IMAX, and they're all 2K masters, like, at least all of these. So, why are they showing up as 4K films in my streaming services? Is this the explanation for the color banding? Now, Netflix knows exactly where to put that 4K logo so that it appeals to the nerds like me, and they have piles of data to prove it and to test it. But regular UI designers don't. But now they have Figma Make. Figma Make lets you prototype dozens of versions of your products, not in days, not in hours, but in minutes. You
can iterate on proposals by just chatting or even submitting reference images of what you like. And these are not static. These are clickable prototypes that have data and interconnections. Figma Make integrates with your existing design system libraries so that you can get credible, on-brand mockups. You can also connect Figma Make to Figma Sites to customize a website or test interactions and just publish it. Or you can even connect it to Supabase and turn your idea into a web app that is ready to ship. It doesn't replace the role of the designers or the product managers. It just empowers them to move a lot faster. You can get free access to Figma Make, along with AI credits, by using the
link in the description or just snatching this QR code over here. And of course, you also support our channel and our videos in the process. Okay, so let me get back to that list of fake 4K. Now, there are a couple of tiny exceptions on this list, and I'm going to get to those, but the truth is that most of these films have just taken the 2K digital intermediate master and upscaled it to 4K. Sometimes using some really bad AI and sometimes just lying to you about it. One of the most extreme cases is The Lord of the Rings. Is it secret? Is it safe? From around the 2000s as well.
Shot on film, but mastered on a 2K intermediate, which they then used for the theater versions, the DVDs, the extended versions, the Blu-rays, and the streaming versions. But how do they go from 2K to 4K? What Weta has done is they've gone to all the original camera negative of the live action stuff, and they've gone to all of that original film-out, that film-printed VFX shots. Wow. And they've scanned it all in native 4K. That was not true. Lord of the Rings nerds, or at least bigger nerds than I am, compared the old Blu-ray releases with the new 4K releases, and they found that they were the same film. They just
stretched out the 2K intermediate image, messed up the colors in the process, which is a story for another day, and they slapped a 4K sign on the box and on my streaming service. And this goes into the 2010s as well. Prometheus, The Social Network, Mad Max. Like, I remember, I distinctly remember paying top dollar to watch Mad Max exclusively in IMAX 3D. And now [snorts] you're telling me that it was just an upscaled 2K film that pretended to be IMAX. Why would they do that? I wasn't even close to the end of this rabbit hole. There is some debate on this, but the general consensus is that if you scan a 35 mm film into digital and you store that image at a resolution of 4K, you've captured all the detail in that 35
mm negative. Now, some people will argue that there is more detail in there, that there's detail enough for like a 6K scan from a 35mm negative, but honestly, not really. Now, the first movie to scan film in 4K was Spider-Man 2. They scanned all the negatives in 4K and they mastered the film in 4K. That means the DI was 4K, which I'm sure crippled all those 2004 computers. My back. Funny enough, not the special effects. Rendering those tentacles at 4K would have been impossible with 2004 computers. Not to mention crazy expensive. It's still crazy expensive today to render in 4K. So, the animation in Spider-Man 2 is actually 2K
resolution renders placed on top of that 4K footage. But anyway, aside from this being, you know, a good nerd curiosity, did you notice that difference between the 4K footage and the 2K render? [screaming] That is the catch. Despite everything you've been told, despite all the money that you've already spent on that 4K TV, do you even need 4K? Mad Max looked pretty great at 2K. So, is 4K even useful? Well, most of the time, it's not. Now, let me explain why. Here in my living room, the distance between my couch and my TV is 2.8 m, or 9.3 ft. That 75-inch TV takes up about 33° of my field of view when I'm sitting over there. And if I place my living room on this chart, I would land somewhere like
here. Now, that means that at this distance, my eyes should be able to tell the difference in pixels between 1080p and 4K. Now, if I had a slightly smaller TV, or if my couch was a little bit further away, 4K would not make any difference. It would be just as good as 2K. I'm going to link some materials on this below if you're interested in testing your living room distances. And the same problem happens in a movie theater. Most seats in a movie theater fall within this 1080p region of the chart. It's a huge screen, but the distance is much longer. And unless you're sitting way up front, which comes with its own set of problems, 4K versus 2K will not make a difference. That is the reason why most
movies even today are mastered in 2K. In 99% of cases, it's not worth the effort or the extra equipment cost. So, why did we get 4K shoved down our throats? Why was it sold to us as this golden standard in the movie experience? A big reason 4K is in our mind is because it's easier to market. In the mid-2010s, when 4K TVs were entering the mainstream, it was very easy to just go on and market four times the pixels versus that crappy 1080p TV you used to have. It was easy to get people to upgrade because 4K was the future. And this allowed TV companies to suddenly ramp up pricing on TVs and profit margins. And once you did upgrade, well, now you need 4K content, right? So film studios found the perfect excuse to resell that same movie
in a new 4K Blu-ray packaging or streaming services found the perfect excuse for a new, more premium plan. And again, most of the time what they were using were just upscaled 2K intermediates. [snorts] So it is true. Why pay for 4K then? There is some logic to these 4K plans being more expensive because in theory you're streaming a bigger file. It's four times larger than the previous one. There's more data in there which naturally will cost more to stream, right? But there's your answer. Bigger files are better. It's what we want, right? We want more information. We want more detail. We want less compression. And we want more picture quality.
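As an aside, the living-room viewing-distance claim from a minute ago is easy to check yourself with a few lines of trigonometry. This is a rough sketch: the 60-pixels-per-degree figure is the usual 20/20-vision rule of thumb, not a hard physiological limit.

```python
import math

def viewing_geometry(diagonal_in, distance_m, h_res):
    """Angular width of a 16:9 screen and the resulting pixels per degree."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    angle_deg = math.degrees(2 * math.atan(width_m / 2 / distance_m))
    return angle_deg, h_res / angle_deg

# The 75-inch TV at 2.8 m from the living room above
angle, ppd_1080 = viewing_geometry(75, 2.8, 1920)
_, ppd_4k = viewing_geometry(75, 2.8, 3840)
print(f"screen spans {angle:.0f} degrees of your field of view")
print(f"1080p: {ppd_1080:.0f} px/deg, 4K: {ppd_4k:.0f} px/deg")
# 20/20 vision resolves roughly 60 px/deg, so at this distance 1080p
# (~58 px/deg) sits right at the acuity limit -- 4K only barely helps.
```

Plug in your own diagonal and couch distance: if 1080p already lands above ~60 px/deg, a 4K stream buys you nothing you can actually see.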
So, let me go back to my Blade Runner 2049. Blade Runner 2049 has a 4K digital intermediate. So, there's real 4K there. And funny enough, it was shot by the same cinematographer as O Brother, Where Art Thou? Now, assuming that they used an industry-standard codec for that file, that would probably be something like ProRes. The DI file for Blade Runner might have been something like 1,800 gigabytes, like 1.7 terabytes. That's the purest form of the movie, the one that's stored on the studio servers. But they can't send a 1.7 TB file to every cinema in the world, cuz hardly any computer could play that thing. It needs to be compressed and still look good. So, how do they do that? So, to send this movie to theaters,
they might compress it to a DCP file of maybe 600 GB. That's for high-end projectors, like an IMAX digital release. Now, lower-end movie theaters, they might get a smaller copy, maybe a 300 GB copy. And yet, movies at the movie theater look great not because they're 4K or 2K. We already saw that at a certain distance, 2K or 4K doesn't even matter. No, movies in the theater look great because they get a hard drive copy of the film, a file that is hundreds of gigabytes in size, and it's as close as possible to that original DI. Now, here's every color that is visible to the human eye. Now, a DI master, a digital intermediate, will probably use the ACES encoding system, which covers 100% of these colors. It kind of lives like here. Theater distribution copies of the film, they're compressed. They're compressed in a standard called DCI-P3, which covers 53% of the colors that we can see. And it's kind of a triangle here. In other words, there are some greens and there are some magentas that we can see in the real world that we're never going to be able to see in a movie theater. Now, that detail may have been captured by the camera, because cameras can generally do that. That detail may still be in the digital intermediate, cuz the codec can handle it, but it's gone. It's gone even from the best copy of the film that you will ever be able to see in a digital format.
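You can get a feel for how much smaller each of those triangles is by comparing their areas on the CIE 1931 xy chromaticity diagram. The primaries below are the published Rec.709, DCI-P3, and Rec.2020 values; note this sketch only gives relative sizes, because the "percent of visible colors" figures quoted above also depend on the area of the whole visible horseshoe and on which chromaticity diagram you measure in.

```python
# CIE 1931 xy chromaticity coordinates of each gamut's R, G, B primaries
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a gamut triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: triangle_area(p) for name, p in PRIMARIES.items()}
for name, a in areas.items():
    print(f"{name}: {a / areas['Rec.709']:.2f}x the Rec.709 gamut")
# P3 comes out ~1.36x and Rec.2020 ~1.89x the area of Rec.709 --
# every step down the distribution chain throws away visible colors.
```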
Now, by the time that we get our streaming copy, the movie will be even more compressed. Normally, it's going to be compressed kind of at this range. This is the Rec 709 range. And we've lost even more colors. We've lost some of these greens and these yellows and these reds and magentas that the codec just can't handle. Rec 709 only covers about 35% of the colors that are visible to the eye, and a bunch of colors are gone. Now, this right here is the cause of my color banding. Like, the oranges between this orange and this orange, that information is just gone. It doesn't exist in the movie copy anymore.
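Banding like this is easy to reproduce in miniature: take a subtle gradient, like a dusk sky, and count how many distinct code values survive quantization at a given bit depth. This is a toy sketch that ignores dithering, compression, and chroma subsampling, which all also play a role in real streams.

```python
def distinct_steps(bit_depth, lo=0.50, hi=0.55, samples=2000):
    """Quantize a subtle 5%-wide brightness ramp; count surviving levels."""
    levels = 2 ** bit_depth - 1
    codes = {round((lo + (hi - lo) * i / samples) * levels)
             for i in range(samples + 1)}
    return len(codes)

print("8-bit steps in the gradient:", distinct_steps(8))
print("10-bit steps in the gradient:", distinct_steps(10))
# The 8-bit version has only ~13 levels to draw the whole ramp with,
# so each level becomes a visible band; 10-bit gets ~4x as many.
```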
Streaming services rely on different codecs. Codecs are like encoding algorithms that try to preserve as much detail as possible while shrinking the file size, but it is a lot of shrinkage. A study by 4K film DB found that most streaming services don't go over 40 megabits per second. Apple TV wins this one, followed by HBO, then Netflix, then Amazon. So, why won't they stream at a better quality? Like, home internet is already in the hundreds or thousands of megabits per second. Like, they could do it, right? Are the streaming services just cheating on us?
I wanted to blame Netflix's greed, but ah, that's not really the case. I mean, it would increase their broadband and their CDN costs like tenfold, but it's not just their fault. That's the thing. It's also the Ethernet ports or the Wi-Fi antennas in our devices. Many of them, they're just capped at 100 megabits per second. It's also the processing power that's needed to decode and play a file that is so large; most TVs just couldn't handle it. Now, because of those limitations, 4K prioritized more resolution, because it was easier to market. And in exchange, well, it made us sacrifice color, sometimes to include pixels that weren't even there in the first place.
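To put the whole chain of numbers side by side, a constant-bitrate file is just bitrate times runtime. The bitrates below are my own ballpark assumptions, not studio figures: a ProRes-class rate for the DI, the DCI spec's 250 Mbit/s ceiling for a standard theatrical DCP, and a typical premium 4K stream.

```python
def file_size_gb(bitrate_mbps, minutes):
    """Constant-bitrate file size: megabits/second -> gigabytes."""
    return bitrate_mbps * minutes * 60 / 8 / 1000

runtime = 164  # Blade Runner 2049's runtime in minutes
for label, mbps in [("DI (ProRes-class, ~1400 Mbit/s)", 1400),
                    ("DCP (DCI ceiling, 250 Mbit/s)", 250),
                    ("4K stream (~25 Mbit/s)", 25)]:
    print(f"{label}: {file_size_gb(mbps, runtime):.0f} GB")
# The DI lands in the terabyte range, the DCP in the hundreds of
# gigabytes, and the stream at a few dozen -- the same ladder of
# sizes described above, shrinking by an order of magnitude each step.
```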
Just look at an old 1080p Blu-ray. They often look better than a 4K movie from streaming, because the extra storage on that disc is being used for detail and for color, not for pixels that, again, you probably won't see if your TV is too far away. But if you're still with me, I'm sure that by now you want to know, well, how to get around this compression-codec fake-4K mess. How to get the closest possible experience to real film on your couch. And this is my favorite part of the rabbit hole: how to use all of this nerding to get movies at home that look like the original. So, looking like the original film, that's obviously a loaded sentence, cuz so much of this process has changed now.
Most movies, they aren't even shot on film anymore. Film is a dinosaur: it scratches, it breaks, it gets dirty. It's a nightmare. Like, some cameras can shoot footage in up to 12K, and that's the original file. And it's a file, not a negative. And then that file gets sampled down into that 4K, or sometimes even a 2K, digital intermediate. The Martian is a good example of this. The Martian was shot digital in 6K, and all that 6K information was there. It was there in the footage. But because The Martian had so many visual effects shots, the digital intermediate ended up being 2K, cuz nobody wanted to render all
of these special effects in 6K. Purists will argue that movies shot on digital are… To me, the magic of movies is connected to 35 mm. Real films are shot on film, right? That's a heated debate. I'm not going to get into it today. I'm sure there's going to be some shouting matches in the comments. But what I will talk about is a cheat code. Now, some cinematographers, they strive for that OG film look, but shooting on film is very expensive, and fewer people are willing to pay for it, or can afford it, or even know how to do it. What they do is they shoot digital, they edit the film digitally, and then they go through the trouble of printing
that cut of the movie onto film at a lab, and then scanning it again to generate their 2K or their 4K intermediate. Because it went through film, it catches some of that film texture. That's the trouble they go through just to try and get some of that original film feeling. But back to our home: how do we get close to that OG experience? First of all, you might need a bigger TV, or you might need to move your couch closer, to make sure that you're squeezing out all of that 4K resolution. And if you're not squeezing it, you can save yourself some money and downgrade your plan. Second, you have to make sure that your TV supports today's color standards. Movies encoded in HDR support a lot more colors than
movies in SDR. HDR as a standard is a mess. It's a mess of color and brightness. We used to make YouTube videos in HDR, and we had to stop making them because everybody got a different version of our video, and some of them looked rough. But let me walk you through what really matters about HDR, which is color. HDR files support colors in the Rec 2020 gamut, which is kind of like here, and it's huge. Rec 2020 has like 75% of all the colors that we can see. So many colors, in fact, that most TVs can't even display it. So the information is in the file; it's just that the TV can't display it. Now, high-end TVs, they probably operate something closer to the P3 gamut. So, kind of all the way up to here. Lower-end TVs in 2026, eh, they settle for around 75% of the P3
space. So, probably something closer to here. There are some top-tier screens and projectors that are starting to get closer to this Rec 2020 gamut. But before you go hunting for that, remember: distribution versions of the film often only have DCI-P3 data, so there aren't more colors in there. The fact that Rec 2020 HDR supports more colors than the movie actually has is great, but it doesn't make the movie look better. It's like putting a 12 oz soda inside a 20 oz bottle. Like, the bottle is bigger, but there's still only 12 oz of liquid inside. Now, here's where things can get really dangerous for your wallet. There are some beautiful, exceptional, incredible cases of movies that are pushing the boundaries of those home theater film
experiences. The 4K Blu-ray version of Spider-Man: Into the Spider-Verse was tested and confirmed to have more colors, more colors than P3. It uses all of the colors in that space between P3 and Rec 2020. It uses the rest of that soda bottle. This is purple. Now blue. Now, of course, you need a TV or a projector that can display those colors. And that level of detail at 4K is possible because 4K Blu-ray is incredible. Those discs can fit like 100 GB of data. So, you're looking at bit rates of up to 144 megabits per second. Look at that on the chart. My streaming version of Into the Spider-Verse is stuck at whatever bit rate Apple wants to give me, and it definitely doesn't have those colors. Now, in the deep rabbit
hole of film and home theater enthusiasts that I've been living in these past few weeks, 4K Blu-ray is like the deal. That's a big deal. It's not just because of quality; it's because you get to own the movie, and you don't depend on the goodwill of the streaming services, which might just remove it without warning. And that 4K is worth it for movies like Big Fish. He's here. We see that everybody is already there. Now, Big Fish was produced in 2003, where the digital intermediate was 2K, and like we said, in 99% of cases, the 2K intermediate is getting upscaled to generate the 4K Blu-ray, like they cheated with The Lord of the Rings.
And that would be the end of it. But Sony went one step further. They went back to the negatives, the original negatives, the ones that came out of the camera, that they still had in storage, and they scanned them again in 4K. They matched the cut, and they released a 4K Blu-ray that is a true remaster and the closest possible experience to the original Big Fish. They say when you meet the love of your life, time stops. And that's true. There are only a handful of films that have gone through that really expensive process, but they've become little treasures to the nerds that have been with me
over the past few weeks. Now, if you're not going to get into this physical media world and into buying discs, there are some good digital alternatives, too. Recently, Sony launched this service called Sony Pictures Core that runs on PS5 and on some Sony TVs. If you have the right combination of devices, it's usually going to be one of their latest TVs, and if you have the internet speed, of course, you can access their Pure Stream feature, which streams at 80 megabits per second. Now, Hollywood directors, and I guess millionaires who want insane movie quality at home, they're probably customers of this other service called Kaleidescape. Now, these guys let you download insane-quality movies, with an average bit rate of 65 megabits per second. That would put them around here on the chart. The catch of that service is that you need to buy their decoder, their server, which runs a few tens of thousands of dollars. It's really only for millionaires, but as I hear it, the quality is insane. So, if any of you have access to it, I guess invite me to watch a movie. And last, but not least, one that we can all access. And the reason why Chris Nolan keeps getting all this fuss around his IMAX films is that, well, he has the money to spare no expense on this stuff.
Christopher Nolan's films don't have a digital intermediate. IMAX film is 70 millimeters. That's four times the size of our usual 35 millimeters, and that means four times the detail. And so, in theory, we could scan this film into 16K and still have detail to look at. But once again, no computer could handle that thing, not even these days. So, just like in the old days, Nolan's films are shot on film. They're edited digitally to find the right cut. And then a negative cutter goes to the original IMAX negative, cuts it, and sends it to a lab to be graded the old way, with chemicals and labs. And for, like, the four movie theaters around the world that support IMAX 70 mm, that film still makes it there, film to film. Now, Raul, the DP that's behind the camera today, like, he thinks that IMAX is mostly a marketing gimmick. And that's kind of true, because except for those, like, four IMAX theaters that support 70 mm film, the rest of them are just digital anyway, and they're using a somewhat compressed version. But I think that Nolan using his unlimited budget to give negative cutters a job again? That's pretty cool. I'd love to hear what you guys think, so drop it in the comments. But there is one more limitation. There's one more problem with capturing images. And for that, I had to travel to the Atacama Desert to see the largest telescope in the world with my own eyes and geek out about an impossible problem that no film and no
transfer will ever be able to solve. You should check that video out. Catch you in the next one.