Developers --
A suggestion:
There is something not quite right about how the two scan modes ("Full Scan" on versus off) report their results. The estimates can be misleading.
Here's a typical example:
I have a Blu-ray disc image that is 36,464,754,688 bytes. I want to use Blu-ray Copy to extract the main movie without compressing it, and then burn the movie onto a BD25 BD-R disc. To do this, I set DVDFab so that when it first sees the disc image, all subpictures and audio except English are unchecked within the "Main Movie" configuration. I leave the HD audio track enabled.
When DVDFab estimates the disc size and compression requirements with "Full Scan" OFF, it displays the following results:
"Size 100% (23.068 GB -> 23.068 GB)" This suggests that DVDFab will not compress the data.
When DVDFab estimates the disc size and compression requirements with "Full Scan" ON, it displays the following more accurate results:
"Size 100% (23.140 GB -> 23.140 GB)" This also suggests that DVDFab will not compress the data.
Assuming these "gigabytes" are of the binary type (i.e., 1 GB = 1,073,741,824 bytes), these figures can also be interpreted as:
Full Scan Off: Size 100% (24,769,076,396 bytes -> 24,769,076,396 bytes)
Full Scan On: Size 100% (24,846,385,807 bytes -> 24,846,385,807 bytes)
Comparing these two sizes with the disc's actual capacity (25,025,314,816 bytes, which DVDFab knows from the "Common Settings" selection of 23866 MB), 100% sounds right. So far, so good.
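For anyone who wants to check the unit arithmetic, here is a small Python sketch. The constants are just the figures from the DVDFab dialogs; the calculation itself is mine, not anything DVDFab exposes:

    # Interpret DVDFab's displayed sizes as binary units
    # (1 GB = 2**30 bytes, 1 MB = 2**20 bytes).
    GB = 2**30
    MB = 2**20

    fast_scan_estimate = 23.068 * GB   # -> 24,769,076,396 bytes
    full_scan_estimate = 23.140 * GB   # -> 24,846,385,807 bytes
    disc_capacity      = 23866 * MB    # -> 25,025,314,816 bytes

    print(f"Fast scan estimate: {fast_scan_estimate:,.0f} bytes")
    print(f"Full scan estimate: {full_scan_estimate:,.0f} bytes")
    print(f"BD25 capacity:      {disc_capacity:,} bytes")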
However ...
When the movie is extracted with DVDFab configured for "Full Scan" ON, DVDFab compresses the movie to a disc size of 24,786,239,488 bytes.
When the movie is extracted with DVDFab configured for "Full Scan" OFF, DVDFab does a simple copy of the movie to a disc size of 24,853,741,568 bytes.
So, to summarize --
Here's what went right:
1. Full scan reported a more accurate size than fast scan (within 0.03% of the actual copied size, versus 0.34%).
2. Fast scan reported 100% and copied the file without compressing it.
Here's what went wrong:
1. Full Scan correctly reported 100% and then needlessly compressed the disc by around 0.27%, taking far longer than it should have to create the "movie only" disc, and producing quality that can at best only approach that of a straight copy. (The percentages above are worked out in the sketch following this list.)
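Here is how those percentages fall out in Python, treating the straight copy (the Full Scan OFF output) as the true uncompressed size. The byte counts are from this post; the labels are mine:

    true_size       = 24_853_741_568   # straight copy, Full Scan OFF
    fast_estimate   = 24_769_076_396   # 23.068 GB, Full Scan OFF
    full_estimate   = 24_846_385_807   # 23.140 GB, Full Scan ON
    compressed_size = 24_786_239_488   # actual output, Full Scan ON

    fast_err = (true_size - fast_estimate) / true_size * 100    # ~0.34%
    full_err = (true_size - full_estimate) / true_size * 100    # ~0.03%
    shrink   = (true_size - compressed_size) / true_size * 100  # ~0.27%

    print(f"Fast scan estimate error: {fast_err:.2f}%")
    print(f"Full scan estimate error: {full_err:.2f}%")
    print(f"Needless compression:     {shrink:.2f}%")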
Because percentage-of-original-size estimates below 100% give the user a rough indication of the quality of DVDFab's compression, the estimates reported are useful and appropriate. However, in the special case where 100% is reported, the estimates can be misleading. In that exception case, the estimates should mean what they imply -- that there will be no compression.
Suggested fix:
At a minimum, since you already apply an algorithm to decide whether or not to use compression, that result could be tied into the scan report so that 100% always means no compression. In the meantime, users should be warned that 100% may or may not mean that no compression will be applied.
A better solution would be to combine the above solution with the creation of a better decision algorithm, since the current one apparently made the wrong choice in deciding to needlessly compress the data in one of the cases above.
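To make the suggestion concrete, here is a minimal sketch of the gating I have in mind. The function name, structure, and return values are mine for illustration only -- they are not DVDFab's actual internals:

    def choose_mode(estimated_bytes, disc_capacity_bytes):
        # If the scan estimate already fits the target disc, report 100%
        # and guarantee a straight copy; only compress when it does not fit.
        if estimated_bytes <= disc_capacity_bytes:
            return "copy"        # 100% shown, no compression ever applied
        return "compress"        # <100% shown, transcode down to fit

    # Example with the Full Scan ON numbers above:
    choose_mode(24_846_385_807, 25_025_314_816)   # -> "copy"

Tying the report to this decision would make 100% a promise rather than an estimate.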
Regards,
Ed S.