I recently started building a movie/show collection again on my home NAS.
I know that H.265 can generally hit the same or better quality as H.264 at 25-50% lower bitrate. But what’s the golden zone for both types? 10 Mbps for a 1080p H.264 movie? And would something like 5 Mbps for 1080p H.265 be on par with that? What about 4K?
For file size: would it be 25GB for a 2 hour 1080p movie to be near or at original Blu-Ray/digital quality?
Depends on the media. High-motion live action is going to require a higher bitrate than low-motion animation.
I know that this is a non-answer, but the best thing to do is reencode a few files at multiple bitrates and see where the line is for you.
Try to get a few dark scenes, since that’s where compression artifacts tend to be most noticeable.
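If you want a concrete starting point, that test-encode workflow might look something like this with ffmpeg and libx264 (the filenames, timestamp, and CRF values here are just placeholder assumptions, not recommendations):

```shell
# Cut a short test clip (ideally covering a dark scene) from the source.
# Filenames and the seek timestamp are hypothetical.
ffmpeg -ss 00:40:00 -i source.mkv -t 60 -c copy clip.mkv

# Re-encode the clip at several quality levels, then compare them by eye.
for crf in 18 20 22 24; do
  ffmpeg -i clip.mkv -c:v libx264 -preset slow -crf "$crf" -c:a copy "clip_crf${crf}.mkv"
done
```

Because the test clip is short, you can afford a slow preset and several passes, then apply whichever CRF wins to the full movie.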
People really get confused when they see claims like “50% reduction and exactly the same quality as X” and think they’ll just crank down the bitrate and be good. As others have said, you really should try a few options, although instead of a fixed bitrate you could try CRF values and find what works on your setup.
Also, while screenshots can help, it really is something you should look at in motion. Sometimes you think “this looks horrible” in a still, but in motion it’s fine and you never notice it. It works the other way too.
There’s a science and an art to getting good encodes. Hell, even the encoders themselves get updates that change little things behind the scenes, and that can affect the outcome as well.
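For reference, a CRF-based encode in ffmpeg is a one-liner. This is just a sketch with hypothetical filenames; with libx264, lower CRF means higher quality and bigger files, and somewhere around 18-23 is a common starting range for 1080p:

```shell
# Constant-quality encode: the encoder varies bitrate scene by scene
# to hold visual quality roughly steady, instead of you guessing a bitrate.
ffmpeg -i input.mkv -c:v libx264 -preset slow -crf 20 -c:a copy output.mkv
```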
I was wondering the same thing recently because my NVIDIA capture videos took up about 1 TB of space on my main PC. I wanted to compress them by switching to H.265. In FFmpeg there’s no simple option like “lossless compression”; you always have to enter the bitrate or quality manually. Rendering a bunch of videos with different bitrates and trying to compare them to see if there’s a significant difference is a really long and annoying process. I gave up and just burnt everything to Blu-ray instead.
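For what it’s worth, libx265 does expose a lossless mode (`-x265-params lossless=1`), but lossless H.265 saves very little space on captures; a constant-quality (CRF) re-encode is the usual compromise and skips bitrate guessing entirely. A minimal sketch with a hypothetical filename (roughly 24-28 is a common CRF starting range for libx265):

```shell
# Constant-quality H.265 re-encode of a screen capture.
# Audio is copied untouched; only the video stream is recompressed.
ffmpeg -i capture.mp4 -c:v libx265 -preset medium -crf 26 -c:a copy capture_hevc.mp4
```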
Like others have said, it depends on the source media. In general, grainy sources require more bitrate to achieve a given quality than a clean, digitally shot source does. You can choose a random bitrate and encode all your sources with it, but you might not like the results, or your encodes will be bloated for no reason.
Personally, having used both x264 and x265, I would stick with x264 for 1080p content. Yes, there are some space savings to be had with x265, but the time it takes really just isn’t worth it, in my opinion. This assumes you’re using software encoding and not NVENC or QuickSync. Hardware encoding is much faster but yields larger files at lower quality compared to software encoding; again, not really worth it in my opinion.
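For comparison, an NVENC encode looks like this; a sketch assuming an NVIDIA GPU and an ffmpeg build with NVENC support (filenames are hypothetical):

```shell
# Hardware H.264 encode: much faster than libx264, but expect larger files
# at comparable quality. -cq is NVENC's constant-quality mode, roughly
# analogous to CRF; presets run p1 (fastest) through p7 (best quality).
ffmpeg -i input.mkv -c:v h264_nvenc -preset p5 -cq 23 -c:a copy output_nvenc.mkv
```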