Failed downloads

bbddpp
Posts: 3
Joined: 11 Jun 2018, 21:40

Re: Failed downloads

Post by bbddpp » 12 Jun 2018, 01:32

Attaching 2 logs fresh from tonight. In both cases the files did unpack fine and play great; they were sent back to Sonarr, put in the proper directory, renamed and all. But NZBGet indicates failure. nzb.su is the RSS provider.
Attachments
Elementary.S06E07.1080p.HDTV.X264-DIMENSION-xpost.log
(2.08 KiB) Downloaded 211 times
Supergirl.S03E22.720p.HDTV.x264-SVA-xpost.log
(2.01 KiB) Downloaded 199 times

hnorgaar
Posts: 3
Joined: 08 Jun 2018, 05:18

Re: Failed downloads

Post by hnorgaar » 12 Jun 2018, 03:41

Hi there again. Yes, they do indeed download. Just a couple of observations:
It seems to be linked to nzb.su and their xpost releases, which I guess are files that nzb.su has unpacked and reuploaded as a single mkv/video file. SABnzbd does not report them as failed, but it does say that one article is missing from them. This is mentioned on the nzb.su forum as well; the suggested solution there is to avoid xpost files, but everything from nzb.su is now xpost.

Henrik Norgaard

sanderj
Posts: 184
Joined: 10 Feb 2014, 21:46

Re: Failed downloads

Post by sanderj » 12 Jun 2018, 06:29

hnorgaar wrote:
12 Jun 2018, 03:41
SABnzbd does not report them as failed, but it does say that one article is missing from them. This is mentioned on the nzb.su forum as well; the suggested solution there is to avoid xpost files, but everything from nzb.su is now xpost.

Henrik Norgaard
Ah, yes: on the SABnzbd forum it was discovered that private indexers put a fake segment/article into their NZBs (as a tracking method). SABnzbd was made resilient against that some months ago.

If you inspect the NZB, it's easy to spot that fake segment/article by its file extension. I can't remember which extension it is right now, though.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Failed downloads

Post by hugbug » 12 Jun 2018, 08:53

bbddpp wrote:
12 Jun 2018, 01:32
Attaching 2 logs fresh from tonight.
From Supergirl.S03E22.720p.HDTV.x264-SVA-xpost.log:
Mon Jun 11 2018 21:28:57 WARNING 1 of 1317 article downloads failed for "Supergirl.S03E22.720p.HDTV.x264-SVA-xpost/Supergirl.S03E22.720p.HDTV.x264-SVA.mkv"
File "Supergirl.S03E22.720p.HDTV.x264-SVA.mkv" has one missing article. "play great" huh? Maybe, depending where this one missing article was. One article of 1317 means 0,07% percent damage or about 2.7 seconds of video missing. You may have not noticed that especially if the missing fragment was on opening or closing credits. Nonetheless the video is damaged.

From Elementary.S06E07.1080p.HDTV.X264-DIMENSION-xpost.log:
Mon Jun 11 2018 20:33:55 WARNING 1 of 2 article downloads failed for "Elementary.S06E07.1080p.HDTV.X264-DIMENSION-xpost/newz[NZB].nfo"
File "newz[NZB].nfo" missing one from two articles. That's not critical. The video is (probably) OK. NZBGet can ignore errors in files of certain types and it does ignore errors in .nfo files by default. Unfortunately that ignoring is part of par-check (option ParIgnoreExt) which wasn't take place for that download due to lack of par2-files. If more logs will indicate similar problem a special solution could be developed, for example an extension script to remove all non-mkv files from nzb before adding to queue.

Summary
So far three log-files were provided. In all three cases the posts contained mkv-files (not archive files) and no par2-files. The lack of par2-files is a real problem: it prevents adequate checking of file integrity. At the end of processing the downloads contain mkv-files; in two cases they are damaged, but the users think they are OK: "did unpack fine" (there was nothing to unpack; if these had been archive files, the unpack would have failed) and "play great" (missing segments can go unnoticed during playback).

There is still a chance that the nzbs contain fake articles for identification purposes, but in these three cases it doesn't look that way.

renzz
Posts: 2
Joined: 11 Jun 2018, 13:00

Re: Failed downloads

Post by renzz » 12 Jun 2018, 11:08

I too had the failure in the nfo file for Elementary, and looking at the NZB file, I can see why.

The "File" entry says there is one segment for the file (where it says (1/1) at the end):

admin edit: code removed

but there are actually two "Segment" entries, with the second being the non-existent tracking one:

admin edit: code removed

I also had one where the extra "segment" was in the MKV file. Again, the "File" entry specifies 5662 segments:

admin edit: code removed

but there is an extra segment number 5663:

admin edit: code removed

I guess this explains why the MKV isn't corrupt: the genuine 5662 segments were all processed, so the entire file is intact, but the extra (unavailable) segment causes NZBGet to think the download failed.
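
To show the shape without the real ids, here is a made-up fragment (the message-ids are invented placeholders, not the ones from my nzbs):

<file poster="poster@example.com" subject="Example.S01E01.720p.HDTV.x264-GRP.mkv (1/1)">
  <segments>
    <segment bytes="716800" number="1">real-part@news.example.com</segment>
    <segment bytes="716800" number="2">fake-tracker@indexer.example.com</segment>
  </segments>
</file>

The subject promises one segment, but a second, unfetchable one is tacked on.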

I guess one solution is for NZBGet to only process the number of segments specified in the "File" entry and ignore any extras. We don't want it to delete any files that it thinks are incomplete because they actually aren't.

I've put the full nzb files at admin edit: link removed

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Failed downloads

Post by hugbug » 12 Jun 2018, 11:36

renzz:
The purpose of fake articles is to identify nzbs leaked to other sites. These fake article ids may be unique to each user and may make it possible to identify users. I don't know whether that's the case with the posted nzbs, but to protect you I've removed all references. I have your original post saved for my own use.

renzz
Posts: 2
Joined: 11 Jun 2018, 13:00

Re: Failed downloads

Post by renzz » 12 Jun 2018, 11:46

Thanks hugbug, it didn't even occur to me that they may be unique to me. Just as well you removed them...

Still, I hope they are of some use.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Failed downloads

Post by hugbug » 12 Jun 2018, 12:11

A poorly implemented leak protection defeats its own purpose. Leak protection that leads to download errors makes users investigate the failures, and that in turn makes the existence of the leak protection public. Now everyone knows how it was supposed to work, and anyone can easily strip the fake articles from the nzbs before uploading them to other sites.

I don't think I should do anything in NZBGet. The site runners should stop doing what they are doing now. If they want some kind of nzb identification, they should implement it in a way that doesn't cause download errors.
renzz wrote:
12 Jun 2018, 11:08
I guess one solution is for NZBGet to only process the number of segments specified in the "File" entry and ignore any extras.
Someone could write an extension script (scan script) that modifies the nzb-file before it is added to the queue. As a starting point, use for example this script.
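
A minimal sketch of renzz's idea, under the same assumptions as the sketch earlier in the thread (NZBNP_FILENAME env var, exit codes 93/95, untested; the "(1/N)" subject parsing is guessed from the examples above):

#!/usr/bin/env python3
# Sketch: drop <segment> entries beyond the count declared in the
# file subject, e.g. "(1/5662)" -> keep only segments 1..5662.
import os
import re
import sys
import xml.etree.ElementTree as ET

NZB_NS = 'http://www.newzbin.com/DTD/2003/nzb'
ET.register_namespace('', NZB_NS)

path = os.environ.get('NZBNP_FILENAME')  # assumed scan-script env var
if not path:
    sys.exit(95)

tree = ET.parse(path)
changed = False
for f in tree.getroot().findall('{%s}file' % NZB_NS):
    m = re.search(r'\(\d+/(\d+)\)', f.get('subject', ''))
    segs = f.find('{%s}segments' % NZB_NS)
    if not m or segs is None:
        continue
    declared = int(m.group(1))
    for seg in list(segs):
        if int(seg.get('number', '0')) > declared:
            segs.remove(seg)  # the extra, likely fake, segment
            changed = True

if changed:
    tree.write(path, encoding='utf-8', xml_declaration=True)
sys.exit(93)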

bbddpp
Posts: 3
Joined: 11 Jun 2018, 21:40

Re: Failed downloads

Post by bbddpp » 12 Jun 2018, 16:06

Got it. So we either need to talk the owners of nzb.su out of what they have started doing to their nzbs, or just live with it.

Thanks for taking the time to look into this and respond; I appreciate the attention!

Thanks for all you do to make such great software for the community.

B.

darkhomb
Posts: 5
Joined: 11 Jun 2018, 03:01

Re: Failed downloads

Post by darkhomb » 12 Jun 2018, 23:51

Thank you, hugbug, for your time and input. I agree it's a problem that should be discussed with nzb.su.
