Topic: Bluray vs DVD at 1920x1080?

Bluray vs DVD at 1920x1080?

Reply #25
Are those the average bitrates listed for those movies?  I have the "Kingdom of Heaven" Director's Cut on Blu-Ray and on certain scenes the video bitrate spikes all the way up to 45 Mbps.
That depends on how you measure it. With any MPEG encoding, some analysers will show huge instantaneous peaks (e.g. for a single frame, which only lasts a short time, so it can imply a huge data rate that's meaningless in practice), but the decoder has a buffer, and it's the maximum data rate into this buffer that's normally implied. The MPEG encoder knows the prescribed size of the buffer for a given format and ensures it will never overflow.
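
If it helps, here's a very rough sketch of that buffer idea in Python (all the numbers are made up, and this isn't how a real encoder's rate control works, just the gist):

[code]
# Rough illustration (all numbers made up): a single large frame can imply a
# huge "instantaneous" bitrate, while the buffer-fed channel rate stays modest.

FPS = 24
# Fake stream: a big frame every 2 seconds, smaller frames in between.
frame_bits = [1_500_000 if i % 48 == 0 else 600_000 for i in range(240)]

# "Instantaneous" rate: one frame's size divided by its display time.
peak_instantaneous = max(frame_bits) * FPS
print(f"peak instantaneous rate: {peak_instantaneous / 1e6:.1f} Mbps")   # 36.0 Mbps

# Buffer model: bits arrive at a fixed channel rate, the decoder removes one
# coded frame per frame period, and delivery simply pauses when the buffer is full.
channel_rate = 20_000_000   # bits/s fed into the decoder buffer (made up)
buffer_size  = 30_000_000   # prescribed buffer size for the format (made up)
fullness = buffer_size // 2
for bits in frame_bits:
    fullness = min(fullness + channel_rate / FPS, buffer_size)
    fullness -= bits
    assert fullness >= 0, "underflow: the encoder would have to plan its rate differently"
print(f"whole stream decodes fine from a {channel_rate / 1e6:.0f} Mbps channel")
[/code]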

Cheers,
David.

Bluray vs DVD at 1920x1080?

Reply #26
Please read the post I was replying to. onkl was actually defending your position, but s/he was suggesting that the bandwidth saved by going 720p can be used for a higher frame rate.

Right. I wasn't suggesting using frame interpolation either. But for some content it's feasible to lower the resolution and use the bandwidth for something else. Animation, for example, could easily be rendered at more than 24 frames per second, as could future productions that want to stay within Blu-ray specs. So 720p60 could be superior to 1080p24 depending on the source material; that's what this discussion was about.

Yeah, but computer animation titles are actually the highest-resolution titles out there, and that's where 1080p really shines. The content you are talking about is probably old video-sourced material or old film movies.

Bluray vs DVD at 1920x1080?

Reply #27
There was a 3dfx demo showing the difference between 30 and 60 fps. It was obviously smoother. But I doubt the effect of anything above that is worth it.

Bluray vs DVD at 1920x1080?

Reply #28
Anyone who has played games can tell the difference between 30 fps and anything significantly higher. 50-60 fps looks very smooth to me, but I haven't run a scientific test to discern anything higher. Sometimes with video it's hard to notice above certain lower frame rates because of motion blur. The "360-degree shutter" in digital that I referenced above causes motion blur that makes the picture look smoother even though it's still filmed at 24fps.

Bluray vs DVD at 1920x1080?

Reply #29
Just figured out how to do this, at least with Theora, which has options for upscaling to 720p. I think the older pictures are not only "sometimes" remastered, but more or less up-sampled with bilinear or bicubic algorithms! Depending on the nature of the source and the time and space requirements, older copies upscaled to HD can look fantastic, or they can look "grainy" depending on the quality of the algorithms. The Theora upscaling algorithms in "Ptalarbvorm" are very good. I don't use H.264, and I'm pretty sure you can also do this with WebM? Any other thoughts or suggestions?

Bluray vs DVD at 1920x1080?

Reply #30
There are actually quite a few Blu-ray releases whose upscaled DVD counterparts produce results that are either the same as the Blu-ray release or better (even without the increased color gamut).

There is no increased color gamut.

-k

Bluray vs DVD at 1920x1080?

Reply #31
Me too. TVs/projectors usually double the frame rate anyway in 24fps mode; generally the stutter you get is from the conversion to 50 or 60fps, but I see your point.

If the scene is recorded in 24fps, then that limits the available information.

A projector may be able to present that information in more or less "pleasing" ways, but it will not (in general) be able to produce the same output as it could had the scene been recorded with a higher framerate in the first place.
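
(The stutter mentioned in the quote is a good example: fitting 24fps onto a 60Hz display means alternating 3 and 2 refreshes per film frame, so the on-screen cadence is uneven. A tiny sketch, with the timings made up purely for illustration:)

[code]
# 24 fps shown on a 60 Hz display ("3:2 pulldown"): each film frame is held for
# alternately 3 and 2 refreshes, so on-screen frame durations are uneven.
pattern = [3, 2] * 12                      # repeats per film frame; one second of film
assert sum(pattern) == 60                  # fills exactly 60 display refreshes
durations_ms = [n * 1000 / 60 for n in pattern]
print(durations_ms[:4])                    # roughly [50.0, 33.3, 50.0, 33.3]; uneven cadence
# 24 fps on 50 Hz displays is usually handled with a ~4% speed-up (2:2) instead.
[/code]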

CDs are recorded with a 44.1kHz samplerate. If perfect "upscaling" was possible, then the samplerate might as well have been set to 1 Hz, and the remaining 44099 samples generated by some alien prediction algorithm.

Video is in some ways even worse than audio, because "proper" lowpass filters are not the norm.
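
To make the analogy concrete, a toy demo (numpy only, deliberately crude resampling with no lowpass filter, and numbers picked purely for illustration):

[code]
# Toy demo: a 15 kHz tone is fine in a 44.1 kHz recording, but once it has been
# resampled to 8 kHz (crudely, with no lowpass) and "upscaled" back, the original
# content is gone; it has aliased to a completely different frequency.
import numpy as np

fs_hi, fs_lo = 44100, 8000
t_hi = np.arange(fs_hi) / fs_hi                  # one second at 44.1 kHz
tone = np.sin(2 * np.pi * 15000 * t_hi)          # 15 kHz tone

t_lo = np.arange(fs_lo) / fs_lo
low_rate = np.interp(t_lo, t_hi, tone)           # "record" at 8 kHz, no lowpass filter
upscaled = np.interp(t_hi, t_lo, low_rate)       # interpolate back up to 44.1 kHz

def peak_freq(x, fs):
    spectrum = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(len(x), 1 / fs)[np.argmax(spectrum)]

print("original peak:", peak_freq(tone, fs_hi), "Hz")       # 15000.0
print("upscaled peak:", peak_freq(upscaled, fs_hi), "Hz")   # ~1000 (aliased), not 15000
[/code]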

-k

Bluray vs DVD at 1920x1080?

Reply #32
There is no increased color gamut.

-k


Actually xvYCC has been adopted by a large portion of Blu-ray players (including some earlier Sony models and the PS3) and even some HDTVs but there currently aren't any Blu-ray releases (that I know of) that use it.  So the technology is there for them to use.  I was simply going off of what Greg had previously stated:
"The color gamut on BD is far superior to either of the DVD formats, and that's apparent even on a well-done DVD transfer using component outputs."

Bluray vs DVD at 1920x1080?

Reply #33
It's a little more complicated, actually... while the available gamuts for MPEG-2 and H.264 are technically the same, the problem is that using MPEG-2 encoding tends to reduce color saturation in practice (it wants to discard detail in dark areas), so even if you bump up the saturation or contrast on your playback device, the loss of information yields a "stretched" saturation curve.  Even though the extreme values are still present in the output stream, the gamut available in the source material is not displayed.  I guess the actual issue is that you get increased quantization or banding of the colors and it's difficult to compensate for this on the display in a way that doesn't make the image look even worse.

It's even worse if the lab which created the DVD used DV as a waypoint during the processing (most studios would have no reason to do this); this causes further loss in saturation/apparent gamut and it's fairly common in DVDs authored using Final Cut or other NLEs.
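
If it helps, here's a very rough sketch of the banding mechanism (not a real MPEG-2 encoder, just block DCT plus uniform quantization of a shallow dark gradient, with made-up step sizes):

[code]
# Rough sketch of the banding point above: quantizing the DCT coefficients of a
# shallow, dark gradient kills the small AC terms, so blocks go flat and visible
# steps appear at block edges.
import numpy as np
from scipy.fft import dct, idct

row = np.linspace(16, 24, 64)                 # smooth dark ramp across 8 blocks
blocks = row.reshape(8, 8)

for qstep in (0.5, 4.0):                      # fine vs coarse quantizer step (made up)
    coeffs = dct(blocks, norm='ortho', axis=1)
    recon = idct(np.round(coeffs / qstep) * qstep, norm='ortho', axis=1).ravel()
    max_step = np.max(np.abs(np.diff(recon)))
    print(f"qstep {qstep}: largest jump between neighbouring pixels = {max_step:.2f}")

# The original ramp changes by only ~0.13 per pixel; with the coarse quantizer
# each block becomes flat and steps of more than a full code value appear at
# block boundaries. Stretching saturation/contrast afterwards only makes those
# bands more obvious.
[/code]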

Bluray vs DVD at 1920x1080?

Reply #34
It's a little more complicated, actually... while the available gamuts for MPEG-2 and H.264 are technically the same, the problem is that using MPEG-2 encoding tends to reduce color saturation in practice (it wants to discard detail in dark areas), so even if you bump up the saturation or contrast on your playback device, the loss of information yields a "stretched" saturation curve.  Even though the extreme values are still present in the output stream, the gamut available in the source material is not displayed.  I guess the actual issue is that you get increased quantization or banding of the colors and it's difficult to compensate for this on the display in a way that doesn't make the image look even worse.

It's even worse if the lab which created the DVD used DV as a waypoint during the processing (most studios would have no reason to do this); this causes further loss in saturation/apparent gamut and it's fairly common in DVDs authored using Final Cut or other NLEs.

The main effect of lossy video coding is quantization of spatial frequency components. Are you suggesting that MPEG-2 leads to systematic errors that push significant frequency components towards zero? Do you have any references for this?

The fact that lossy encoding is... lossy is evident, and not something that I think post-processing can do a lot about. If encoding artifacts are too large, you probably are not using a high enough bitrate for the application.

-k

Bluray vs DVD at 1920x1080?

Reply #35
Actually, what Cameron proposed was raising the frame rate for smoother motion; 48fps was for regular "2D" movies. The "soap opera effect", though, is gonna be tough to get out of people's minds. It even looks bad to me when directors use a 360-degree shutter with regular 24fps movies shot digitally (like Public Enemies, and I've seen other Michael Mann movies criticized for this as well). Even if it's purely psychological, I don't think it's gonna be easy for people to swallow. On the other hand, those 120Hz TVs and their interpolation modes are pretty popular, so maybe I'm wrong and most people won't care.

I'd say 360-degree shutter and post-interpolation are two different thingies (the 1st has really low motion blur; with the 2nd, motion blur will still depend on how the movie was shot).

I think it's odd to pick 48fps though? Why not 50fps? At least we have 1080p50 as a standard already don't we?

to keep easy compatibility with a gazillion 24fps movies?

Bluray vs DVD at 1920x1080?

Reply #36
I'd say 360-degree shutter and post-interpolation are two different thingies (the 1st has really low motion blur; with the 2nd, motion blur will still depend on how the movie was shot).

They are, of course, but the effect is similar: motion that appears smoother than the 24fps we're used to, where only shutter angles up to 180 degrees are used.
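
For reference, the usual relation is exposure time = (shutter angle / 360) / frame rate. A quick sketch (numbers just for illustration):

[code]
# Exposure time implied by shutter angle: exposure = (angle / 360) / fps.
# A 360-degree shutter at 24 fps blurs over the full 1/24 s; the usual film
# 180 degrees only blurs over 1/48 s, which is the look we're used to.
def exposure_ms(shutter_angle_deg: float, fps: float) -> float:
    return (shutter_angle_deg / 360.0) / fps * 1000.0

for angle in (180, 360):
    for fps in (24, 48, 60):
        print(f"{angle:>3} deg shutter @ {fps:>2} fps -> {exposure_ms(angle, fps):.1f} ms exposure")
[/code]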