Search found 6742 matches

by hugbug
Today, 07:32
Forum: Support
Topic: Only accept files with "language" in it
Replies: 1
Views: 6

Re: Only accept files with "language" in it

I don't know what you mean. Anyway, nzbget downloads all files added to the queue. If you think some files aren't worth downloading, you shouldn't add them to the queue in the first place. Therefore it's Sonarr's responsibility not to send files you don't need. That said, you can still mark download...
by hugbug
08 Dec 2017, 11:03
Forum: Feature discussion
Topic: Feature request / search: Post-processing script return code WARNING
Replies: 2
Views: 22

Re: Feature request / search: Post-processing script return code WARNING

Status "FAILURE" is what you want to use. A failure during downloading and a failure during post-processing are different things and result in different statuses. A failure of a pp-script produces status "WARNING/SCRIPT", which an external app should treat as WARNING. See https://nzbget.net/api/history for detai...
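The distinction above can be sketched in code. This is a minimal, hedged example: the exit-code values (93 for success, 94 for error) are taken from NZBGet's pp-script documentation, and the `Status` strings ("SUCCESS/ALL", "WARNING/SCRIPT", "FAILURE/...") follow the history API page linked above; the helper function and its name are my own illustration, not part of NZBGet.

```python
# NZBGet post-processing script exit codes (values assumed from the
# nzbget scripting documentation).
POSTPROCESS_SUCCESS = 93  # pp-script finished successfully
POSTPROCESS_ERROR = 94    # pp-script failed; the item's history status
                          # becomes "WARNING/SCRIPT", not "FAILURE/..."

def classify_history_status(status: str) -> str:
    """Map an nzbget history 'Status' field (e.g. 'SUCCESS/ALL',
    'WARNING/SCRIPT', 'FAILURE/PAR') to a coarse level for an
    external app. A pp-script failure yields WARNING/SCRIPT: the
    download itself succeeded, so treat it as a warning."""
    kind = status.split("/", 1)[0]
    if kind == "SUCCESS":
        return "ok"
    if kind == "WARNING":
        return "warning"
    return "failure"

# A real pp-script would end with sys.exit(POSTPROCESS_SUCCESS) or
# sys.exit(POSTPROCESS_ERROR); here we only compute the value.
exit_code = POSTPROCESS_SUCCESS
```

An external app polling the history method would then branch on `classify_history_status(item["Status"])` rather than treating every non-success status as a hard failure.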
by hugbug
06 Dec 2017, 07:43
Forum: Support
Topic: RSS Feed Number of Results with NZBPlanet
Replies: 3
Views: 34

Re: RSS Feed Number of Results with NZBPlanet

You are trying to misuse a feature that was designed for another purpose (automation). What you should do instead is browse the indexer website and choose nzb-files there. To make it easier to add nzbs to nzbget, you can use the bookmark, AKA "shopping cart", feature. When browsing the ...
by hugbug
06 Dec 2017, 07:36
Forum: Support
Topic: NZBGet always hangs, and I don't understand why
Replies: 4
Views: 119

Re: NZBGet always hangs, and I don't understand why

@mattgphoto
NZBGet is a user-space app. If your whole system hangs or you can't kill the nzbget process, it looks more like a hardware problem or a low-level software problem (drivers, etc.).
by hugbug
05 Dec 2017, 21:56
Forum: Support
Topic: RSS Feed Number of Results with NZBPlanet
Replies: 3
Views: 34

Re: RSS Feed Number of Results with NZBPlanet

You are right, the limit is set by the RSS provider. You need to ask them if there is a way to increase the number of returned items. However, due to quick DMCA takedowns, you'd better keep your downloader running uninterrupted, because even if you could get older nzbs you may not be able to download their c...
by hugbug
03 Dec 2017, 20:19
Forum: Support
Topic: Large files are failing to unpack error: 5
Replies: 17
Views: 82

Re: Large files are failing to unpack error: 5

2 GHz with 2 cores is a lot of power, more than enough for nzbget; even 1 Gbps should be saturated. If you can attach an SSD for intermediate files, it would run even smoother (but that's not necessary).
by hugbug
03 Dec 2017, 19:55
Forum: Support
Topic: Large files are failing to unpack error: 5
Replies: 17
Views: 82

Re: Large files are failing to unpack error: 5

Why not run nzbget directly on the NAS?
by hugbug
03 Dec 2017, 19:45
Forum: Support
Topic: Large files are failing to unpack error: 5
Replies: 17
Views: 82

Re: Large files are failing to unpack error: 5

sirace135 wrote:
03 Dec 2017, 19:39
I am willing to bet that lockup also causes some sort of network halt which is that "Unexpected network error"
I think it's the other way around: there is a network issue (probably a hardware problem), which causes low-level subsystems (drivers, etc.) to hang.
by hugbug
03 Dec 2017, 19:37
Forum: Support
Topic: Large files are failing to unpack error: 5
Replies: 17
Views: 82

Re: Large files are failing to unpack error: 5

Unrar: An unexpected network error occurred.
That's the root cause of all the problems. The connection between the PC and the NAS isn't working properly. Try to unpack using the WinRAR UI 20 times and see if it succeeds all 20 times. I guess it will not.