I have been doing a lot of testing for Rip tasks using both h264 and h265 profiles, and I have found a couple of very odd things.
For comparison:
the (MP4.H265) profile has a default Video Quality of Standard at a 1333 Kb/s Bit Rate; High Quality is set to a 2666 Kb/s Bit Rate.
the (MP4) General profile (which uses h264) has a default Video Quality of Standard at a 2031 Kb/s Bit Rate; High Quality is set to a 4061 Kb/s Bit Rate.
Note that for both Standard and High Quality, the Bit Rates set in the h264 profile are roughly 1.5 times those in the h265 profile. I understand that h264 and h265 are different compression methods, but why would the Bit Rate be set differently for the same Video Quality?
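Just to check my own numbers, here is a quick calculation of the ratios between the two profiles' default Bit Rates (the figures are the ones quoted above; the comparison itself is just arithmetic, not anything from the program):

```python
# Profile bitrates (Kb/s) as shown in Advanced Settings, per the post above.
h265 = {"Standard": 1333, "High": 2666}
h264 = {"Standard": 2031, "High": 4061}

for quality in ("Standard", "High"):
    ratio = h264[quality] / h265[quality]
    print(f"{quality}: h264/h265 bitrate ratio = {ratio:.2f}")
# Both ratios come out to about 1.52, so the h264 profile targets roughly
# 1.5x the bitrate of the h265 profile at the same Video Quality setting.
```

Interestingly, within each profile the High Quality bitrate is exactly double the Standard one (2666 = 2 × 1333, and 4061 ≈ 2 × 2031), so the quality tiers scale the same way in both codecs.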
But here is the really strange thing:
When I run a Rip task using the h265 profile, with either Software encoding or the hardware encoding for my AMD GPU, the output files of BOTH encoding types have a Bit Rate consistent with the Bit Rate shown in the Advanced Settings for that profile.
HOWEVER, when I run a Rip task using the h264 profile, the results from the Software encoding and the AMD hardware encoding are totally different! If I use Software encoding, the output file has a Bit Rate consistent with what is shown in the Advanced Settings, but if I use the AMD hardware encoding, the output file has a much lower Bit Rate, closer to what is shown in the h265 profile.
For example, I ran one Rip task using the (MP4) profile at High Quality, with Software encoding. The output file shows a Bit Rate of 4061 Kb/s. I then repeated the identical Rip task, still using (MP4) and High Quality, but this time with the AMD APP hardware encoding, and the output file shows a Bit Rate of 1901 Kb/s (totally different from what the Advanced Settings said it was supposed to use).
This does not happen when I use the h265 profile. With it, the output Bit Rates for either Software or Hardware encoding are very similar.
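To put the h264 discrepancy in perspective, here is a rough estimate of what it means for output file size. This is just back-of-the-envelope arithmetic assuming a hypothetical 2-hour title and ignoring audio and container overhead; the two bitrates are the ones I measured above:

```python
# Rough file-size estimate from video bitrate alone.
# Assumes a 2-hour runtime; audio and container overhead are ignored.
def estimated_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    # Kb/s * seconds = kilobits; / 8 -> kilobytes; / 1000 -> megabytes
    return bitrate_kbps * duration_s / 8 / 1000

two_hours = 2 * 60 * 60  # 7200 seconds
for label, kbps in [("h264 Software", 4061), ("h264 AMD hardware", 1901)]:
    print(f"{label}: ~{estimated_size_mb(kbps, two_hours):.0f} MB")
# h264 Software: ~3655 MB vs h264 AMD hardware: ~1711 MB --
# the hardware-encoded file is less than half the size.
```

So the difference is not subtle: just comparing the sizes of the two output files makes it obvious the hardware encoder is not honoring the profile's Bit Rate.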
Is there a bug here, or is this really what we should expect with the h264 codec?