I recently got a new system set up with a GTX 470 (448 CUDA cores, with the latest drivers installed) and an i7-2600K, and I was eager to see the faster rip times with CUDA enabled on the video card. I was very impressed with around 80 fps and a rip time of around 50 minutes (2-pass). However, I've come to realize that the quality with CUDA is not as good as with software decoding/encoding.
I'm basically using the default MKV profile (H.264/AC3), ripping at around 13,000 kbps (13 Mbps) at 1920x1080, 2-pass. I noticed that with CUDA enabled for both decoding and encoding, I see more artifacts, like pixelation in certain scenes, and overall some scenes just look grainy (displayed on a 52" LCD). Switching to software for both decoding and encoding, on the same movie scenes, removes the artifacts, pixelation, and graininess. The movie just looks better, but at the cost of almost twice the rip time compared to having CUDA enabled. I really wanted to use this video card to help with processing, but it's not worth it if I'm getting lower quality. Are the processing instructions different when using CUDA vs. the CPU? Any suggestions?
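To illustrate the question above: a CPU encoder and a GPU encoder are separate H.264 implementations, so the same bitrate target can produce noticeably different quality. Below is a minimal sketch, not the poster's ripping software, that runs the same source through a software encode (libx264) and a GPU encode (h264_nvenc, a modern stand-in for the CUDA encoder in the ripping app) at roughly the settings described. It assumes ffmpeg built with NVENC support is on the PATH; the file name input.m2ts and the 13 Mbps target are placeholders.

```python
# Hypothetical comparison sketch: same source, same bitrate target,
# two different H.264 encoder implementations (CPU vs. GPU).
# Assumes ffmpeg (built with NVENC support) is available on the PATH.
import subprocess

SOURCE = "input.m2ts"  # placeholder source file


def encode_cpu(outfile: str = "cpu.mkv") -> None:
    """Software (CPU) H.264 encode via libx264, AC3 audio, MKV container."""
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "libx264", "-b:v", "13M",
        "-c:a", "ac3",
        outfile,
    ], check=True)


def encode_gpu(outfile: str = "gpu.mkv") -> None:
    """Hardware (NVIDIA GPU) H.264 encode via h264_nvenc, same target bitrate."""
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "h264_nvenc", "-b:v", "13M",
        "-c:a", "ac3",
        outfile,
    ], check=True)


if __name__ == "__main__":
    encode_cpu()
    encode_gpu()
```

Comparing cpu.mkv and gpu.mkv frame by frame on the same scenes is one way to confirm whether the quality gap comes from the encoder itself rather than from the profile settings.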