pro rendering question

My question is about pro movies.

All the time you see movies from, let's say, the '50s coming out on Blu-ray and being restored and all that stuff.

If I'm not mistaken, that is because it was shot on film, and film can be rendered into higher quality over time, right?

I wonder what's going to happen in the future? Can movies that are shot in 4K be rendered to higher quality in 35 years? (Whatever that will be, 25K?! OMG!)
 
I'm sure someone more knowledgeable than myself will provide you with a very detailed answer, but...

In short, no. The technology may be there to "upscale" a 4K image to a higher resolution (just as some systems upscale SD to HD), but it would never truly be of a higher resolution. That would require splitting pixels...

The reason a film shot on film can be released at higher resolutions is precisely because it was shot on film. I've seen figures that say film has approximately 20 million pixels' worth of detail. SD video has about 0.3 million pixels; Full HD has about 2 million.
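As a quick sanity check on those figures, the pixel counts work out like this (the frame sizes are common examples, and the ~20 million figure for film is a frequently quoted estimate, not a hard number):

```python
# Rough pixel counts for common formats, to put the figures above in context.
formats = {
    "SD (720x480)": 720 * 480,
    "Full HD (1920x1080)": 1920 * 1080,
    "UHD 4K (3840x2160)": 3840 * 2160,
    "35mm film (common estimate)": 20_000_000,
}

for name, pixels in formats.items():
    print(f"{name}: {pixels / 1e6:.1f} megapixels")
```

So SD really is around 0.3 MP and Full HD around 2 MP, as quoted above, and even 4K (about 8 MP) is well short of the film estimate.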

At least that's my understanding of the theory...
 
Film is scanned into a digital image and, as with document scanners, can be captured at whatever resolution is deemed necessary - there are practical limitations, though - I've heard that a Super 35mm film print has an effective resolution of 2K (go beyond that and all you're doing is scanning grain at higher resolutions).

Digital isn't great as an archival format since whatever resolution the footage was shot at is the resolution it'll always be - sure, you can blow up 1080p to 4K and with some refinement it might even look OK, but there'll never actually be any more detail in the 4K version vs. the HD one.

Ever wonder why shows such as Babylon 5 haven't been released on Blu-ray? It's because the VFX were rendered in SD and the assets don't exist any more to re-render them at HD. The live footage (which was shot on film) could be scanned at HD resolutions, but the VFX would look terrible in comparison.
 
The current thinking around the resolution of film is that Super 8 can scan up to about 1080p; S16mm, 2K-4K; and Super 35mm, anywhere from 4K to 6K-8K depending on who you talk to.

Kodak tend to adjust their numbers each time a new digital camera comes out.

Film is simply captured differently to digital, so it's difficult to pull a definitive resolution number from it. Also, some will tell you that even once the image itself hits a point where scanning at a higher resolution doesn't increase the quality, the finer detail of the grain can make it appear to be at a higher resolution.
It's all a bit subjective.

At the most basic level, the answer is yes: you can scan 35mm film at around 2K-4K resolution to create a Blu-ray release. One would imagine that most really old TV shows would require some amount of digital restoration, and also some careful re-framing to adjust from 4:3 to 16:9.
I started watching one of my favourite shows from the mid-00s, shot on S16, the other day and realised it was unlikely to come to Blu-ray because 85% of the framing would be impossible to adjust for 16:9.

If I remember correctly, the DP for Scrubs tended to frame 16:9-safe, even though the first seven seasons of the show were broadcast in 4:3 - I imagine many others have done the same thing, meaning they're easier to re-frame and bring to Blu-ray.
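To put numbers on the re-framing problem: cropping a 4:3 frame to 16:9 at full width throws away a quarter of the picture height. A minimal sketch (the 1440x1080 frame size is just an illustrative 4:3 example):

```python
def crop_for_16x9(width, height):
    """Crop a frame to 16:9 at full width.
    Returns (surviving height in pixels, fraction of the image lost)."""
    new_height = int(width * 9 / 16)
    lost = 1 - new_height / height
    return new_height, lost

# An example 4:3 frame, 1440x1080, cropped to 16:9:
new_h, lost = crop_for_16x9(1440, 1080)
print(new_h, f"{lost:.0%} of the frame discarded")  # 810, 25% discarded
```

Losing the top and/or bottom 25% of every shot is exactly why framing that wasn't protected for 16:9 can be impossible to rescue.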

Digital isn't great as an archival format since whatever resolution the footage was shot at is the resolution it'll always be - sure, you can blow up 1080p to 4K and with some refinement it might even look OK, but there'll never actually be any more detail in the 4K version vs. the HD one.

You could say the same about, say, Super 8 or Super 16mm. The real reason digital is a bad archival format is that it has no longevity.
Whereas with film, you can print something onto film, bury it in the ground for 100 years, and come back and scan it at full quality - with digital you need to constantly update your LTO tapes (as in, every 2-3 years migrate the tapes and systems) and make backups of your backups in case of data loss...
By Kodak's numbers, digital archival costs ~11 times more than film...
 
My question is about pro movies.

All the time you see movies from, let's say, the '50s coming out on Blu-ray and being restored and all that stuff.

If I'm not mistaken, that is because it was shot on film, and film can be rendered into higher quality over time, right?

I wonder what's going to happen in the future? Can movies that are shot in 4K be rendered to higher quality in 35 years? (Whatever that will be, 25K?! OMG!)

Film has never been limited by pixels - only the medium it was transferred to. Film is HD, period. Thus, when it's been "restored," they have merely cleaned up the film stock that the original was printed from, because over the years the original stock has taken on dust and lost its luster. Then they have taken that cleaned-up stock and transferred it to the new medium - which in this case is Blu-ray.

4K digital footage IS limited by pixels; thus, it will never truly upconvert to 6K, 8K, or greater. When it IS upconverted, an algorithm makes a best guess as to what color the new pixels between the existing pixels should be. Film doesn't suffer from this problem.
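That "best guess" is just interpolation. Here's a minimal sketch using simple linear interpolation on a single row of pixel values (real upscalers use fancier filters, but the principle is the same - every new sample is a weighted average of old ones, never new detail):

```python
def upscale_row(row, factor):
    """Linearly interpolate new samples between existing pixel values.
    The inserted values are guesses - weighted averages of their
    neighbours - so no genuinely new detail is created."""
    out = []
    for i in range(len(row) - 1):
        for step in range(factor):
            t = step / factor  # position between pixel i and pixel i+1
            out.append(round(row[i] * (1 - t) + row[i + 1] * t))
    out.append(row[-1])
    return out

print(upscale_row([10, 50, 30], 2))  # [10, 30, 50, 40, 30]
```

The inserted 30 and 40 are simply midpoints of their neighbours; run it on any row and you'll never see a value that wasn't implied by the originals.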

Look at it this way: film is analog, which means it organically exists at infinitesimally fine levels of "pixels," if you wish to look at it that way. It's the epitome of HD. "Digital" means, basically, that your resolution is determined in "digits" - more specifically, in 1s and 0s on a computer. The more 1s and 0s, the greater the resolution - the more "data," if you will - and the closer to the analog look you get.

This is the same as with sound. If you record your audio on camera at 16-bit, you limit what you can do to that audio later in the edit: as you add effects to the original 16-bit capture, you introduce rounding errors, and those remainders get discarded, so the resulting holes in your audio have to be masked with noise during the dither process. You never want to dither from 16-bit to 16-bit, because you've essentially put holes in your sound and are only filling them with noise. If you capture at 24-bit and do all of your editing there, you have a much bigger pool of "digits" to pull from when dithering down from 24-bit to 16-bit, so those holes get filled with real data from the excess that is being discarded in the render to 16-bit.
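The rounding-error idea can be shown numerically. This is a simplification - real dithering adds shaped noise rather than plain rounding - but it illustrates why working at 24-bit leaves far smaller errors to paper over than working at 16-bit:

```python
def quantize(sample, bits):
    """Round a sample in [-1.0, 1.0] to the nearest level an
    n-bit signed integer format can represent."""
    levels = 2 ** (bits - 1)
    return round(sample * levels) / levels

x = 0.123456789  # an arbitrary "analog" sample value
for bits in (16, 24):
    err = abs(x - quantize(x, bits))
    print(f"{bits}-bit rounding error: {err:.2e}")
```

The 24-bit error comes out roughly 256 times smaller, which is the extra headroom that makes the final dither down to 16-bit sound clean.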
 
Film is scanned into a digital image and, as with document scanners, can be captured at whatever resolution is deemed necessary - there are practical limitations, though - I've heard that a Super 35mm film print has an effective resolution of 2K (go beyond that and all you're doing is scanning grain at higher resolutions).

Yes, to tack onto what I said: you can effectively scan film at ever-higher resolutions, but beyond a certain point you're just capturing greater levels of graininess.
 
Another interesting side to this is what happens when you "upgrade" your film.

Older films can definitely look great when converted to a higher resolution, but one should be wary of falling into the trap of "more resolution is always better."

Another aspect to consider when thinking about resolution is human eyesight, viewing angle, viewing distance, and other fun stuff. If you're interested, Red has a good basic overview here: http://www.red.com/learn/red-101/eyesight-4k-resolution-viewing
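For a rough feel for those viewing-distance numbers, here's a back-of-the-envelope calculation (the screen width and distance are made-up example values). Around 60 pixels per degree roughly corresponds to 20/20 acuity, so anything well above that is detail the eye can't resolve anyway:

```python
import math

def pixels_per_degree(screen_width_m, horiz_pixels, distance_m):
    """How many horizontal pixels fall within one degree of the
    viewer's field of view at the given distance."""
    degrees = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return horiz_pixels / degrees

# Example: a 1.2 m-wide 4K screen viewed from 3 m away
ppd = pixels_per_degree(1.2, 3840, 3.0)
print(f"{ppd:.0f} px/degree")
```

At these example numbers the result lands around 170 px/degree - well past the ~60 px/degree acuity figure - which is exactly the "more resolution isn't always visible" point above.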
 