Ok, if I understand this properly, nzbget verifies the files in a collection after download, and then downloads extra pars without having to redo the verification. This means there is no speed advantage in knowing upfront how many extra pars we need, right? So for broken releases the speed is already good, but for unbroken releases it isn't. There is therefore no real need to calculate the exact number of missing blocks; a simple checksum check is good enough. If you keep track of the number of successfully verified files, and that number equals the number of files in the par2 archive, you can skip the final verification step and go straight to the post-script. I quickly wrote a par2 parser that you can use for exactly this purpose in nzbget, if you want.
par2chk.h / par2chk.cpp; the interface is:
int par2chk_verify(char *parfile, char *diskfile); // return 1 if OK
int par2chk_numfiles(char *parfile); // return number of files in par2
It uses the MD5 implementation from par2cmdline for the checksum calculation (it turns out to be twice as fast as the GNULIB version on my PC).
I uploaded it to http://apollo.spacelabs.nl/~emiel/par2chk.tar.gz and you can use it however you want, or not at all... but it would be nice if it helps make nzbget even faster.
Oh, I didn't implement a self-check to verify that the par2 file itself is intact... you can add that if you want to use it.