nzbget 14.0-testing-r1150



Post by hugbug » 25 Oct 2014, 21:11

This is a candidate for stable branch. Please report all issues.


Changes since nzbget 14.0-testing-r1145
  • additional parameters (env. vars) are now passed to scan-scripts: NZBNP_DUPEKEY, NZBNP_DUPESCORE, NZBNP_DUPEMODE; scan-scripts can now set dupekey, dupescore and dupemode by printing new special commands;
  • queue-scripts can now declare which events they are interested in; this avoids unnecessarily calling scripts that do not process certain events;
  • fixed a compiling error on OS/2 (bug introduced in r1145);
  • fixed potential crash which could happen in debug mode during program restart;
  • fixed: program could crash during restart if an extension script was running; now all active scripts are terminated during restart;
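As an illustration of the scan-script side of the first change above, here is a minimal sketch. The environment variable names come from this changelog; the "[NZB] NAME=value" print protocol is nzbget's usual channel for scripts to return values, but treat the exact command names (DUPEKEY, DUPESCORE, DUPEMODE) and the trusted-source policy as assumptions made for the example:

```python
#!/usr/bin/env python
# Sketch of a scan-script using the new duplicate-handling hooks.
# Env. vars and "[NZB] ..." commands follow the names in this changelog;
# the ".trusted.nzb" policy is a made-up example.
import os

def scan_commands(environ):
    """Return the special commands a scan-script would print."""
    cmds = []
    # Duplicate properties nzbget now passes to scan-scripts.
    dupekey = environ.get("NZBNP_DUPEKEY", "")
    dupescore = environ.get("NZBNP_DUPESCORE", "0")
    # Example policy: give releases from a trusted source a higher score.
    if environ.get("NZBNP_FILENAME", "").endswith(".trusted.nzb"):
        cmds.append("[NZB] DUPESCORE=%d" % (int(dupescore) + 100))
    if not dupekey:
        # Derive a dupekey from the nzb name if none was supplied.
        name = os.path.splitext(os.path.basename(
            environ.get("NZBNP_FILENAME", "")))[0]
        cmds.append("[NZB] DUPEKEY=%s" % name)
    cmds.append("[NZB] DUPEMODE=SCORE")
    return cmds

if __name__ == "__main__":
    for cmd in scan_commands(os.environ):
        print(cmd)
```

A real scan-script would apply its own policy for deriving dupekey and dupescore; the point here is only the print-based command channel.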
Other changes since 13.0
  • added article cache:
    • new option "ArticleCache" defines memory limit to use for cache;
    • when the cache is active, articles are first written into the cache and then flushed to disk into the destination file;
    • the article cache reduces disk IO and may reduce file fragmentation, improving post-processing (unpack) speed;
    • it works in both writing modes (option "DirectWrite" on or off);
    • when option "DirectWrite" is disabled, the cache should (for best performance) be big enough to accommodate all articles of one file (sometimes up to 500 MB) in order to avoid writing articles into temporary files; otherwise temporary files are used for the articles which do not fit into the cache;
    • when used in combination with "DirectWrite" there is no such limitation, and even a small cache (100 MB or less) can be used effectively; when the cache becomes full it is automatically flushed (directly into the destination file), making room for new articles;
    • new row in the "statistics and status dialog" in web-interface indicates the amount of memory used for cache;
    • new fields "ArticleCacheLo", "ArticleCacheHi" and "ArticleCacheMB" returned by RPC-method "status";
    • see forum topic [New Feature] Article memory cache for more info;
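The three new status fields follow nzbget's convention of returning 64-bit byte counts as a pair of 32-bit Lo/Hi values plus a rounded MB value; a hypothetical client-side helper (not part of nzbget) recombines them:

```python
# Hypothetical client-side helper: recombine nzbget's 32-bit Lo/Hi RPC
# field pairs into a single 64-bit byte count.

def combine_lo_hi(lo, hi):
    """Return the 64-bit value encoded by a (Lo, Hi) field pair."""
    return (hi << 32) | (lo & 0xFFFFFFFF)

# Example with a made-up "status" response fragment (7 GB cache usage):
status = {"ArticleCacheLo": 3221225472, "ArticleCacheHi": 1,
          "ArticleCacheMB": 7168}
cache_bytes = combine_lo_hi(status["ArticleCacheLo"],
                            status["ArticleCacheHi"])
# cache_bytes // (1024 * 1024) equals status["ArticleCacheMB"]
```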
  • renamed option "WriteBufferSize" to "WriteBuffer":
    • changed the unit: the option is now set in kilobytes instead of bytes;
    • old name and value are automatically converted;
    • if the size of an article is below the value defined by the option, the buffer is allocated with the article's size (to not waste memory);
    • therefore the special value "-1" is no longer required; during conversion "-1" is replaced with "1024" (1 megabyte), but it can of course be changed manually to any other value later;
  • integrated par2-module (libpar2) into NZBGet’s source code tree:
    • the par2-module is now built automatically during building of NZBGet;
    • this eliminates the dependency on external libpar2 and libsigc++...
    • ...making it much easier for users to compile NZBGet without patching libpar2;
    • for more info see forum topic [New Feature] Integrated par2-module;
  • added quick file verification during par-check/repair:
    • if par-repair is required for a download, the files downloaded without errors are verified quickly by comparing their checksums against the checksums stored in the par2-file;
    • this makes the verification of undamaged files almost instant;
    • damaged (partially downloaded) files are also verified quickly by comparing block checksums against the checksums stored in the par2-file; when necessary, small amounts of data are read from the files to calculate block checksums;
    • this makes the verification of damaged files very fast;
    • new option "ParQuick" (active by default);
    • when quick par verification is active, repaired files are not verified again, to save time; the only cause of incorrect files after repair would be hardware errors (memory, disk), which is not something NZBGet should care about;
    • if unpack fails (excluding invalid-password errors) and quick par-check did not find any errors, or quick par-check was already performed, a full par-check is performed; this helps in certain rare situations caused by abnormal program termination;
    • see forum topic [New Feature] Quick par verification for more info;
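The core idea of quick verification, reduced to a sketch: compare a whole-file checksum against the one stored in the par2-file instead of scanning the file's contents block by block against the recovery data. par2 stores MD5 checksums for whole files, which this illustrative helper (not nzbget's actual code) relies on:

```python
# Illustrative "quick verification": compare the whole-file MD5 against
# the MD5 stored in the par2-file, instead of scanning block by block.
# Sketch only; not nzbget's actual implementation.
import hashlib

def quick_verify(path, expected_md5):
    """True if the file's MD5 matches the par2-stored whole-file MD5."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            md5.update(chunk)
    return md5.hexdigest() == expected_md5
```

In nzbget the checksums of healthy files are already known from the download phase, which is what makes the check "almost instant".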
  • added multithreaded par-repair:
    • it doesn't depend on other libraries and works everywhere, on all platforms and all CPUs (with multiple cores); no special compiling steps are required;
    • new option "ParThreads" to set the number of threads for repairing;
    • the number of repair threads is automatically reduced to the number of bad blocks if there are too few of them; if there is only one bad block, multithreaded par-repair is switched off to avoid the overhead of thread synchronisation (which makes no sense for one working thread);
    • new option "ParBuffer" to define the memory limit to use during par-repair;
    • for more info see forum topic [New Feature] Multithreading par-repair;
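The thread-reduction rule above can be stated compactly (a sketch of the described behaviour, not nzbget's actual code):

```python
# The thread-reduction rule described in the changelog, as a small pure
# function (sketch of the behaviour, not nzbget's actual code).

def repair_threads(par_threads, bad_blocks):
    """Number of repair threads actually used for a given damage level."""
    if bad_blocks <= 1:
        # Multithreading is switched off: synchronisation overhead buys
        # nothing when only one block must be repaired.
        return 1
    return min(par_threads, bad_blocks)
```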
  • added support for detection of bad downloads (fakes, etc.):
    • queue-scripts are now called after every file of an nzb is downloaded;
    • new events "FILE_DOWNLOADED" and "NZB_DOWNLOADED" of parameter "NZBNA_EVENT"; new env. var "NZBNA_DIRECTORY" passed to queue scripts;
    • queue-scripts have a chance to detect bad downloads while the download is in progress and can cancel them by printing a special command; downloads marked as bad get status "FAILURE/BAD" and are processed by the program as failures (triggering duplicate handling); scripts executed thereafter see the new status and can react accordingly (e.g. inform an indexer or a third-party automation tool);
    • when a script marks an nzb as bad, the nzb is deleted from the queue and no further internal post-processing (par, unrar, etc.) is performed for it, but all post-processing scripts are still executed; if option "DeleteCleanupDisk" is active, the already downloaded files are deleted;
    • new status "BAD" for field "DeleteStatus" of nzb-item in RPC-method "history";
    • queue-scripts can set post-processing parameters by printing special commands, just like post-processing scripts can; this simplifies transferring small amounts of information between queue-scripts and post-processing scripts;
    • scripts supporting two modes (post-processing mode and queue mode) are now executed if selected in post-processing parameters: either in options "PostScript" and "CategoryX.PostScript" or manually on page "Postprocess" of the download details dialog in the web-interface; it is not necessary to select dual-mode scripts in option "QueueScript"; this provides more flexibility: the scripts can be selected per category or activated/deactivated for each nzb individually;
    • added option "EventInterval", allowing the number of queue-script calls to be reduced, which can be useful on slow systems;
    • for more info see forum topic Fake detection;
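A minimal queue-script for bad-download detection might look like the sketch below. The env. vars (NZBNA_EVENT, NZBNA_DIRECTORY), the event names and the "[NZB] MARK=BAD" command follow this changelog; is_fake() is a hypothetical placeholder detector, and the event-subscription mechanism mentioned above is not shown:

```python
#!/usr/bin/env python
# Sketch of a queue-script for bad-download detection. Env. vars and the
# "[NZB] MARK=BAD" command follow this changelog; is_fake() is a
# hypothetical placeholder detector.
import os

def is_fake(directory):
    """Hypothetical detector: flag downloads that contain .exe files."""
    for _root, _dirs, files in os.walk(directory):
        if any(name.lower().endswith(".exe") for name in files):
            return True
    return False

def handle_event(event, directory):
    """Return the special commands the script would print for this event."""
    if event in ("FILE_DOWNLOADED", "NZB_DOWNLOADED") and is_fake(directory):
        # The nzb gets status FAILURE/BAD, is removed from the queue and
        # handled as a failure (triggering duplicate handling).
        return ["[NZB] MARK=BAD"]
    return []

if __name__ == "__main__":
    for cmd in handle_event(os.environ.get("NZBNA_EVENT", ""),
                            os.environ.get("NZBNA_DIRECTORY", "")):
        print(cmd)
```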
  • the list of scripts (pp-scripts, queue-scripts, etc.) is now read once on program start instead of every time a script is executed:
    • this eliminates unnecessary disk access;
    • the settings page of the web-interface loads the available scripts every time the page is shown;
    • this makes it possible to configure newly added scripts without restarting the program first (just as before); a restart is still required to apply the settings (also as before);
    • RPC-method "configtemplates" has a new parameter "loadFromDisk";
  • options "ParIgnoreExt" and "ExtCleanupDisk" are now respected by par-check (in addition to being respected by par-rename): if all damaged or missing files are covered by these options then no par-repair is performed and the download assumed successful;
  • added new search field "dupestatus" for use in rss filters:
    • the search is performed through the download queue and history, testing items with the same dupekey or title as the current rss item;
    • the field contains a comma-separated list of the following possible statuses (if duplicates were found): QUEUED, DOWNLOADING, SUCCESS, WARNING, FAILURE, or an empty string if no matching items were found;
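The "dupestatus" value can be understood through this sketch, which gathers the statuses of queue/history items sharing the rss item's dupekey or title (items are plain dicts here, not nzbget's internal objects):

```python
# Sketch of how the "dupestatus" rss filter field is built: collect the
# statuses of queue/history items matching the rss item's dupekey or
# title. Illustration only, using plain dicts.

def dupe_status(rss_item, known_items):
    """Comma-separated statuses of matching items, '' if none match."""
    statuses = []
    for item in known_items:
        same_key = bool(rss_item["dupekey"]) and \
            item["dupekey"] == rss_item["dupekey"]
        same_title = item["title"] == rss_item["title"]
        if (same_key or same_title) and item["status"] not in statuses:
            statuses.append(item["status"])
    return ",".join(statuses)
```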
  • added log file rotation:
    • options "CreateLog" and "ResetLog" replaced with new option "WriteLog (none, append, reset, rotate)";
    • new option "RotateLog" defines rotation period;
  • improved joining of split files:
    • instead of performing par-repair, the files are now joined by the unpacker, which is much faster;
    • files split before the par-set was created are now joined as well (they were not joined in v13 because par-repair has nothing to repair in this case);
    • the unpacker can detect missing fragments and requests a par-check if necessary;
  • added per-nzb time and size statistics:
    • total time, download, verify, repair and unpack times, downloaded size and average speed, shown in the history details dialog by clicking on the row with the total size in the statistics block;
    • RPC-methods "listgroups" and "history" return new fields: "DownloadedSizeLo", "DownloadedSizeHi", "DownloadedSizeMB", "DownloadTimeSec", "PostTotalTimeSec", "ParTimeSec", "RepairTimeSec", "UnpackTimeSec";
    • see forum topic [New Feature] Per-nzb time statistics for screenshots and more info;
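The new per-nzb fields make quantities like the average speed derivable on the client side; Lo/Hi pairs are nzbget's usual 32-bit halves of a 64-bit byte count, and the helper below is hypothetical client code:

```python
# Hypothetical client-side helper: derive the average download speed
# from the new per-nzb statistics fields. Lo/Hi pairs are nzbget's
# usual 32-bit halves of a 64-bit byte count.

def average_speed_kbs(item):
    """Average download speed in KB/s from a listgroups/history item."""
    size = (item["DownloadedSizeHi"] << 32) | \
        (item["DownloadedSizeLo"] & 0xFFFFFFFF)
    seconds = item["DownloadTimeSec"]
    if seconds == 0:
        return 0.0
    return size / 1024.0 / seconds

# Made-up history item fragment: 700 MB downloaded in 700 seconds.
item = {"DownloadedSizeLo": 734003200, "DownloadedSizeHi": 0,
        "DownloadTimeSec": 700}
```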
  • pp-script "" now supports mail server relays (thanks to l2g for the patch);
  • when compiled in debug mode new field "process id" is printed to the file log for each row (it is easier to identify processes than threads);
  • if an nzb has only a few failed articles, it may show completion as 100%; now it is shown as 99.9% to indicate that not everything was downloaded successfully;
  • updated configure-script to not require gcrypt for newer GnuTLS versions (when gcrypt is not needed);
  • for downloads delayed due to propagation delay (option "PropagationDelay") a new badge "propagation" is now shown near download name;
  • added new option "UrlTimeout" to set timeout for URL fetching and RSS feed fetching; renamed option "ConnectionTimeout" to "ArticleTimeout";
  • improved pp-script: it can now send time statistics (thanks to JVM for the patch);
  • improvement in duplicate check:
    • if a new download with an empty dupekey and empty dupescore is marked as "dupe", and another download with the same name has a non-empty dupekey or dupescore, these properties are copied from that download;
    • this is useful because the new download is most likely another upload of the same file and it should have the same duplicate properties for best duplicate handling results;
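The described inheritance can be sketched as follows (downloads are plain dicts for illustration, not nzbget's internal objects):

```python
# Sketch of the described duplicate-property inheritance: a new "dupe"
# download without duplicate properties inherits them from an existing
# download with the same name. Plain dicts for illustration.

def inherit_dupe_props(new_item, existing_items):
    """Copy dupekey/dupescore from a same-named download, if any."""
    if new_item["dupekey"] or new_item["dupescore"]:
        return new_item  # already has its own duplicate properties
    for other in existing_items:
        if other["name"] == new_item["name"] and (
                other["dupekey"] or other["dupescore"]):
            new_item["dupekey"] = other["dupekey"]
            new_item["dupescore"] = other["dupescore"]
            break
    return new_item
```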
  • when connecting in remote mode using command line parameter "--connect/-C" the option "ControlIP" is now interpreted as "" if it is set to "" (instead of failing with an error message);
  • when option "ContinuePartial" is active the current state is saved not more often than once per second instead of after every downloaded article; this significantly reduce the amount of disk writings on high download speeds;
  • added commands "PausePostProcess" and "UnpausePostProcess" to scheduler;
  • unpack is now immediately aborted if unrar reports CRC errors;
  • unpack is now immediately aborted if unrar reports a wrong password (works for rar5 as well as older formats); the unpack error status "PASSWORD" is now set for older formats too (not only rar5);
  • improved cleanup:
    • disk cleanup is now not performed if unrar failed, even if par-check was successful;
    • queue cleanup (for remaining par2-files) is now smarter: the files are kept (parked) if they can be used by command "post-process again" and are removed otherwise;
  • improved scan-scripts: if the category of an nzb-file is changed by a scan-script, the assigned post-processing scripts are now automatically reset according to the new category;
  • added missing new line character at the end of the help screen printed by "nzbget -h";
  • better error reporting if a temp file could not be found;
  • added the news server name to message "Cancelling hanging download ..." to help identify problematic servers;
  • added column "age" to history tab in web-interface;
  • debug builds for Windows now print call stack on crash to the log-file, which is very useful for debugging;
  • fixed: RPC-method "editqueue" with action "HistoryReturn" caused a crash if the history item did not have any remaining (parked) files;
  • fixed: RPC-method "saveconfig" did not work via XML-RPC (but worked via JSON-RPC);
  • fixed: a superfluous comma at the end of option "TaskX.Time" was interpreted as an error or could cause a crash;
  • fixed: relative destination paths (options "DestDir" and "CategoryX.DestDir") caused failures during unrar;
  • fixed: split .cbr-files were not properly joined;
  • fixed: inner files (files listed in nzb) bigger than 2GB could not be downloaded;
  • fixed: cleanup may leave some files undeleted (Mac OSX only);
  • fixed: renaming of active downloads was broken (bug introduced in r1070);
  • fixed: when rotating log-files, option "TimeCorrection" was not respected when building the new file name; the file name could have a wrong date stamp (bug introduced in r1059);
  • fixed: compiler error if configured using parameter "--disable-gzip";
  • fixed: one log-message was printed only to global log but not to nzb-item pp-log;
  • fixed: par-check could fail on valid files (bug introduced in libpar2 0.3);
  • fixed: scheduler tasks were not checked after wake up if the sleep time was longer than 90 minutes;
  • fixed: the "pause extra pars"-state was missing in the pause/resume-loop of curses interface, key "P";
  • fixed: the web-interface showed an error box when trying to submit files with extensions other than .nzb, although such files could be processed by a scan-script; now the error is not shown if any scan-script is set in the options.
Download link

