For a study I am working on, I have been comparing GPU compression efficiency on several types of GPUs: a GTX460, a GTX570, and a GTS250, and I noticed that occasionally the GPU is not used, and I was wondering why. For example, the following image shows a BD-R compression using a GTX460. 152.m2ts was compressed using the GPU, and so was 167.m2ts, but it seems that 148.m2ts was re-encoded using the CPU only (note the 12-minute interval). This can be observed in the temperature graph, which shows the GPU in orange and the 4 CPUs below it at lower temperatures. The GPU's temperature drops for about 12 minutes, indicating it is not being used, then picks up again after 148 is processed.
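For what it's worth, temperature is only an indirect signal of GPU activity. A more direct check is to poll the driver's utilization counter while the disc is being processed. Below is a minimal sketch of how one could do that; it assumes nvidia-smi is on the PATH, and very old cards (the GTS250 in particular) may report N/A for utilization:

```python
#!/usr/bin/env python3
"""Poll GPU utilization during a DVDFab run to confirm whether the GPU
is actually encoding, rather than inferring it from temperature alone."""
import subprocess
import time

def gpu_status() -> str:
    # Query the driver directly; requires nvidia-smi on the PATH.
    # Note: very old cards (e.g. a GTS250) may report "N/A" here.
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    # Sample once per second; a sustained 0% reading while a title
    # such as 148.m2ts is processed would confirm a CPU-only re-encode.
    while True:
        print(time.strftime("%H:%M:%S"), gpu_status())
        time.sleep(1)
```

Logging this alongside the compression run would show exactly when the GPU goes idle, instead of relying on the temperature curve.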
Could it be that this section of the movie did not need any compression, or were the CPUs used for re-encoding during this time? If the CPU was actually doing the re-encoding, is there any way I can force DVDFab to use the GPU all the time?
What is the difference between encoding and re-encoding?
Thanks for looking!