For a test, try FlushQueue=no; if that doesn't help, it's better to set it back to "yes". This added 50-60 MB/s. In other words, this fixed the issue; thank you!! Follow-up: why does this help? What is the benefit of flushing the queue? Is that a cleanup of sorts? Also set ContinueParti...
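For reference, the option lives in nzbget.conf (or under the Download Queue settings in the web UI). A minimal sketch of the fragment being tested; the comments reflect my understanding of the documented behavior, so verify against your NZBGet version:

```
# nzbget.conf fragment (sketch)
# With FlushQueue=yes, NZBGet flushes queue state to disk after every
# change, which protects the queue against power loss but costs extra
# disk I/O on every update. Setting it to "no" trades that safety for speed.
FlushQueue=no
```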
Check CPU usage and disk activity to find out where the bottleneck is. Maybe some process is taking most of the CPU time. While downloading: Average CPU: ~6%, with ~3% for system resources and 2-3% for nzbget. Average memory: 2.65 GB / 15.5 GB used (16 GB installed). Put QueueDir and TempDir on the SSD. I did this an...
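Moving the intermediate directories to the SSD is a small nzbget.conf change. A sketch, assuming the SSD is mounted at /mnt/ssd (an illustrative path; substitute your own mount point):

```
# nzbget.conf fragment (paths are illustrative)
QueueDir=/mnt/ssd/nzbget/queue   # queue state files
TempDir=/mnt/ssd/nzbget/tmp      # temporary files written during download
```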
I bought an SSD to use as a place for files to be downloaded to until they are completed and can be moved to the HDD. Like many others, I'm sure, I am working with larger files, maybe 30 GB on average. Here are some highlights of my settings from reading the performance tips section of this website: Direct W...
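The "Direct Write" setting mentioned above is also an nzbget.conf option. A sketch of the write-related options the performance tips discuss (values are illustrative, not a recommendation):

```
# nzbget.conf fragment (sketch)
DirectWrite=yes    # write decoded article data directly into the destination file
WriteBuffer=1024   # per-file write buffer in KB; larger values reduce disk seeks
```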
Trying to get a better idea of how to optimize my setup and understand the items from https://nzbget.net/performance-tips for my benefit... Current setup: 4-bay NAS (QNAP TS-453Be), 16 GB of RAM, 2 NAS HDDs (5400 RPM). I've been thinking of adding either an SSD in one of the slots, an M.2 SSD, or simply...
The path is probably incorrect. Path "/ssd" looks suspicious; it should be something like /mnt/ssd, /media/ssd, /Volumes/ssd, etc. The command "df" in a terminal may be helpful to see how the drives are mounted. Found that my disk is "/share/external/DEV3305_2" - you're right, I was still getting "Could not...
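The df step above can be sketched like this; the directory path is illustrative (substitute the mount point df reports for your drive):

```shell
# List every mounted filesystem; the mount point is the last column.
df -h

# Before putting a path into NZBGet, confirm it exists and is writable
# (illustrative path; use your own mount point, e.g. /share/external/...):
dir=/tmp/nzbget-test
mkdir -p "$dir" && [ -w "$dir" ] && echo "ok: $dir is writable"
```

If mkdir itself fails here, NZBGet's "Could not create directory" error is coming from the same underlying permission or path problem.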
I'm trying to set my download paths to an external SSD but am getting a "Could not create directory" error. Everything works fine with the near-default paths that are local to where NZBGet is installed (although slower than it should be, which is why I'm trying this). My volume is HFS+ and f...
A couple of issues, I believe. Thank you ahead of time for the support, as I'm still pretty new. "Could not close file /mnt/wd500gb/maindir/nzbtemp/450.010.tmp": /mnt/wd500gb/ is my mount point and has plenty of free room. nzbget was pausing with an error code that, upon researching, said not enough free space...
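When NZBGet pauses for "not enough free space" despite the drive looking empty, it is worth checking what the filesystem actually reports for the mount point, since NZBGet pauses when free space drops below its DiskSpace threshold (under the Download Queue settings). A quick check, with an illustrative path in place of the real mount point:

```shell
# Show free space on the filesystem backing a given path
# (/tmp is illustrative; the post's mount point was /mnt/wd500gb).
df -h /tmp
```

Compare the "Avail" column against the DiskSpace value configured in NZBGet; a reserved-blocks quota or a wrong mount can make "Avail" far smaller than the disk's size suggests.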
Another update: when I was testing, I was having Sick Beard snatch NZB files. It ended up grabbing a different download for the one that went through fast. The timing of my posting this was pure coincidence; I think the two test files I used had been 'taken down', resulting in slow download speeds wh...