Only download NZB if required - RSS feeds?

Discuss newly added features or request new features.
get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 26 Jul 2018, 09:03

Thanks, but unfortunately nzbget is crashing while it processes the RSS feed; it does this with both the previous and the newer version (23 July). The log file ends abruptly in the middle of the RSS/URL duplicate processing, so it is difficult to know what happened. When nzbget is restarted it continues processing, but it has missed matches. No idea what has changed, as it was working fine before, though with only very limited checking. I am going to try disabling all extensions and a few other things and see if the problem persists.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Only download NZB if required - RSS feeds?

Post by hugbug » 26 Jul 2018, 14:07

Please compile with debug info (./configure --enable-debug), then activate the option "CrashTrace=yes". It should print the call stack on a crash.
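For reference, a typical debug build would look roughly like this (paths and options may vary by nzbget version and platform; treat this as a sketch, not exact instructions):

```shell
# Rebuild nzbget with debug symbols so CrashTrace can print a usable call stack.
./configure --enable-debug
make
sudo make install

# Then in nzbget.conf (or via the web UI under Settings -> Logging), set:
#   CrashTrace=yes
```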

get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 27 Jul 2018, 09:32

Running the latest version with no extensions. Typically, however, it has run without crashing this time.

RSS had 9 new items, 6 duplicates, 3 downloads. The indexer reports only 3 downloads made, which is great: before this new feature it would have reported 9 downloads!
hugbug wrote:
Fixed. The items should now have status DUPE instead of DELETED.
DELETED/URL items are still being added to the History (3 entries), so it hasn't fixed it for me.
hugbug wrote:
activate option "CrashTrace=yes". It should print call stack on crash.
Shall I keep running as I am, with no extensions, and see if I can get it to crash over a few days, or should I put the extensions back and try to make it crash again?

Thanks

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Only download NZB if required - RSS feeds?

Post by hugbug » 27 Jul 2018, 09:46

Does it crash more often with extensions?
The easier it is to reproduce the crash the better.
get39678 wrote:
27 Jul 2018, 09:32
DELETED/URL items are still being added in the History (3 entries) so it hasn't fixed it for me.
Strange, it worked in my tests but they are rather synthetic. I must have missed something.

get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 27 Jul 2018, 12:39

Regarding the crash:
I have put the extensions back, as it was definitely crashing with the extensions; I have not done enough testing to know whether an extension is causing the problem or not.
Regarding DELETED/URL:
All I can say so far is that if there are X unique dupekeys in the RSS (each dupekey may have several items), I seem to get X DELETED/URL History entries that are not removed, unlike the entries set to DUPE when marked as Good (but this may be pure coincidence).

Perhaps making the logs more detailed in this area would help track down the problem? For example:

INFO Moving collection <snip1> with lower duplicate score to history [DELETE/DUPE, DUPE/DELETED etc + more useful information]
INFO Collection <snip1> added to history [DUPE/DELETED/SUCCESS etc + more useful information]

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Only download NZB if required - RSS feeds?

Post by hugbug » 27 Jul 2018, 15:24

I've made another fix for DELETED/DUPE status. Please update and test.
get39678 wrote:
27 Jul 2018, 12:39
I have put the extensions back as it was definitely crashing with the extensions
Have you tried recompiling in debug mode? If you can post call stack that would be very helpful.

get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 28 Jul 2018, 09:44

hugbug wrote:
I've made another fix for DELETED/DUPE status. Please update and test.
Only limited testing so far, but that has fixed it. Thanks.
hugbug wrote:
Have you tried recompiling in debug mode? If you can post call stack that would be very helpful.
I have extensions enabled, so I have the same setup where I had a couple of days of it working and then a couple of days of it crashing. Yes, if it crashes I will post the trace, but as it is intermittent I just need to wait for it to happen. My current thinking is that, since it is intermittent, it may depend on the contents of the RSS feed.

get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 31 Jul 2018, 09:37

Not had any crashes, but I haven't had many matches, so it's not really an ideal test. I will continue to monitor.

I have noticed that the number of nzb downloads is sometimes higher than it should be.

The feed may report 8 items found, with 2 items downloaded, but it may incorrectly make 4 nzb downloads (or at least attempts, so they are counted).

Is this what should be expected?

Could it be that the system starts downloading URLs before it has looked through all the items and checked whether they are duplicates or not?

For example, this was an RSS item that was a lower-score duplicate and was added to the queue in a paused state due to its size, but the nzb download was still attempted.

Code:

Tue Jul 31 06:25:05 2018	23540	140613915756288	DETAIL	Downloading <match1> @ <snip>
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Adding URL to queue (UrlCoordinator.cpp:378:AddUrlToQueue)
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Notifying observers (Observer.cpp:37:Notify)
Tue Jul 31 06:25:05 2018	23540	140614335194880	INFO	Moving collection <match1> with lower duplicate score to history
Tue Jul 31 06:25:05 2018	23540	140614335194880	INFO	Deleting active URL <match1>
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Trying to stop WebDownloader (WebDownloader.cpp:642:Stop)
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Stopping Thread (Thread.cpp:122:Stop)
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Cancelling connection (Connection.cpp:894:Cancel)
Tue Jul 31 06:25:05 2018	23540	140613915756288	WARNING	URL <match1> @ <snip> failed: 
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Releasing connection (WebDownloader.cpp:657:FreeConnection)
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	WebDownloader stopped successfully (WebDownloader.cpp:650:Stop)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Disconnecting (Connection.cpp:181:Disconnect)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Do disconnecting (Connection.cpp:854:DoDisconnect)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Destroying Connection (Connection.cpp:140:~Connection)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Disconnecting (Connection.cpp:181:Disconnect)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Notifying observers (Observer.cpp:37:Notify)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Notification from UrlDownloader received (UrlCoordinator.cpp:239:Update)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	URL downloaded (UrlCoordinator.cpp:252:UrlCompleted)
Tue Jul 31 06:25:05 2018	23540	140613915756288	DEBUG	Filename: [<snip>] (UrlCoordinator.cpp:270:UrlCompleted)
Tue Jul 31 06:25:05 2018	23540	140614335194880	INFO	Moving collection <match1> with lower duplicate score to history
Tue Jul 31 06:25:05 2018	23540	140614335194880	INFO	Deleting active URL <match1>
Tue Jul 31 06:25:05 2018	23540	140614335194880	DEBUG	Trying to stop WebDownloader (WebDownloader.cpp:642:Stop)

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Only download NZB if required - RSS feeds?

Post by hugbug » 31 Jul 2018, 11:03

get39678 wrote:
31 Jul 2018, 09:37
Could it be that the system starts downloading URLs before it has looked through all the items and checked whether they are duplicates or not?
Yes. I'll think of something to fix this.
get39678 wrote:
31 Jul 2018, 09:37
added to the queue in a paused state due to its size but the nzb was still attempted to be downloaded.
The paused state has probably gone missing when the URL was moved to history and then back. How did that work before: when a downloaded nzb in a paused state was moved to history and then moved back by duplicate handling, was it still paused in the queue? In that case it would not be downloaded automatically and would require manual handling (unpausing), but that's probably desired?

get39678
Posts: 222
Joined: 09 Jun 2014, 10:49

Re: Only download NZB if required - RSS feeds?

Post by get39678 » 31 Jul 2018, 12:23

hugbug wrote:
31 Jul 2018, 11:03
get39678 wrote:
31 Jul 2018, 09:37
Could it be that the system starts downloading URLs before it has looked through all the items and checked whether they are duplicates or not?
Yes. I'll think something out to fix this.
Thanks for looking into a fix. What we have now, though, is still a great improvement over downloading *all* the URLs regardless of dupekey/score. Thanks again.
hugbug wrote:
31 Jul 2018, 11:03
get39678 wrote:
31 Jul 2018, 09:37
added to the queue in a paused state due to its size but the nzb was still attempted to be downloaded.
The paused state has probably gone missing when the URL was moved to history and then back. How did that work before: when a downloaded nzb in a paused state was moved to history and then moved back by duplicate handling, was it still paused in the queue? In that case it would not be downloaded automatically and would require manual handling (unpausing), but that's probably desired?
A bit of confusion here: I agree that if the item was added to the queue in a Paused state and then moved to History and back again, it should remain Paused. I do not think the Paused state is going missing; I think it is currently working that way, i.e. correctly. I will confirm at the next opportunity and reply if required.

My reason for mentioning the Paused state was different: should an item added to the Queue in the Paused state have its URL downloaded at all? It should still be duplicate checked/scored and moved to History if required, but since it is Paused, the URL is not needed until (or unless) it is Resumed.

Due to the ordering of RSS items and so on, this will not always be the case, but as both of my extra URL downloads this time were Paused items, these two extra nzb downloads would more than likely have been avoided if Paused URLs were not downloaded.
