Performance Tuning - Tried Everything

northernivy
Posts: 10
Joined: 01 Feb 2017, 22:33

Performance Tuning - Tried Everything

Post by northernivy » 20 Aug 2018, 23:00

Another "I can't saturate my gigabit line" question. I am using three different news providers, all set at level 0, and each configured to use one less connection than its maximum. I have followed the advice in the performance tuning guides, but on my Synology 1517+ with 8 GB of RAM I can only see about 57 MB/s down. When I look at the NAS I am never peaking my RAM or CPU, so I do not think I am hardware limited. I am running the most recent version of NZBGet.

I am 100% open to ideas....

Would running everything in Docker give better performance?

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Performance Tuning - Tried Everything

Post by hugbug » 21 Aug 2018, 09:15

Test nzbget on a PC if you have one. What's the max speed? Then configure nzbget on the NAS to use only half of the connections and nzbget on the PC to use the other half. Run nzbget on both simultaneously. What is the combined speed?

northernivy
Posts: 10
Joined: 01 Feb 2017, 22:33

Re: Performance Tuning - Tried Everything

Post by northernivy » 21 Aug 2018, 16:42

When I tested directly on my MacBook Pro via Ethernet I got the same figures. I then used 50% of the connections on my MBP and the other 50% on the Synology; doing this split the download speed evenly between the two systems in a simultaneous download of the same NZB file.

Any ideas?

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Performance Tuning - Tried Everything

Post by hugbug » 22 Aug 2018, 08:39

I do not quite understand whether the sum of the speeds was higher than the speed on one device. Can you please post rough numbers?

When you test on the Mac alone, does changing the connection count between 50 and 25 change the speed?

northernivy
Posts: 10
Joined: 01 Feb 2017, 22:33

Re: Performance Tuning - Tried Everything

Post by northernivy » 22 Aug 2018, 14:29

Sorry, on the Mac alone the total speed was roughly 60 MB/s using all 50 connections. When I did the simultaneous download using 25 connections on each, the Mac saw 30 MB/s and the NAS saw 30 MB/s.


When I did it on the Mac alone and reduced the connections, I got roughly the same results as in the simultaneous test. I am using Giganews at 50 connections, Usenet Farm at 40 connections and XSnews at 30 connections, all set at level zero.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Performance Tuning - Tried Everything

Post by hugbug » 22 Aug 2018, 14:38

How far geographically are you from the news servers? How high is the ping?

It would be good to run a test with SAB (probably easier to do on the Mac). If you get a similar speed, the issue is likely with the network or the routing to the news servers.
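Why ping matters here: with TCP, each connection can move at most one window of data per round trip, so latency to a distant news server caps per-connection speed regardless of line rate. A rough back-of-the-envelope sketch (illustrative numbers only, not measurements from this thread):

```python
# Bandwidth-delay-product arithmetic: per-connection TCP throughput is
# bounded by (window size / round-trip time), so aggregate speed scales
# with connection count until the line rate becomes the limit.
def max_throughput_mb_s(window_bytes: int, rtt_ms: float, connections: int) -> float:
    """Upper bound on aggregate throughput in MB/s for `connections` streams."""
    per_conn = window_bytes / (rtt_ms / 1000.0)   # bytes per second, one stream
    return per_conn * connections / 1e6

# Example: a 256 KiB window at 100 ms RTT gives ~2.6 MB/s per connection,
# so 50 connections would be needed to exceed a gigabit line; at low RTT
# the line rate is the limit long before connection count matters.
print(round(max_throughput_mb_s(256 * 1024, 100, 1), 1))    # one connection
print(round(max_throughput_mb_s(256 * 1024, 100, 50), 1))   # fifty connections
```

This is why adding connections helps on high-latency paths but does nothing once the line itself is saturated.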

sanderj
Posts: 184
Joined: 10 Feb 2014, 21:46

Re: Performance Tuning - Tried Everything

Post by sanderj » 22 Aug 2018, 20:20

northernivy wrote:
22 Aug 2018, 14:29
on the Mac alone the total speed was roughly 60 MB/s using all 50 connections... when I did the simultaneous download using 25 connections on each, the Mac saw 30 MB/s and the NAS saw 30 MB/s.
So the total does not get above 60 MB/s, even when you split across different systems.

I would say this is proof the bottleneck is not in the NZBGet hardware/OS/software, but in the network: LAN, router, modem, ISP, throttling, peering, news servers, round-trip delays.

As a side step, or rather as a first-things-first test: can you measure your pure, non-NNTP network speed with
https://fast.com/
http://www.speedtest.net/

Both can measure Gbps speeds. Post the results here.
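For a scriptable version of the same check, a small Python sketch can time a raw HTTP download (the URL below is a placeholder, not a real endpoint; point it at any large test file hosted near you):

```python
import time
import urllib.request

def average_mb_s(total_bytes: int, elapsed_s: float) -> float:
    """Convert a byte count and a duration into an average speed in MB/s."""
    return total_bytes / elapsed_s / 1e6

def measure_download(url: str, chunk: int = 1 << 20) -> float:
    """Download `url`, discard the data, and return the average MB/s."""
    start = time.monotonic()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            data = resp.read(chunk)
            if not data:
                break
            total += len(data)
    return average_mb_s(total, time.monotonic() - start)

# Placeholder URL -- substitute a large test file from a nearby mirror:
# print(f"{measure_download('https://example.com/100MB.bin'):.1f} MB/s")
```

If this raw-HTTP number is well above what nzbget reaches, the bottleneck is specific to the NNTP path (peering, provider throttling); if it matches, the line itself is the limit.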

NexusBender
Posts: 6
Joined: 15 Aug 2018, 18:02

Re: Performance Tuning - Tried Everything

Post by NexusBender » 23 Aug 2018, 14:17

I am testing (and have largely switched to) NZBGet, coming from Sabnzbd, partly because I was seeking better performance. When I first switched, I didn't really see any difference in throughput, although CPU usage was lower; I was still capping out at around 50 MB/s. My setup includes a high-speed storage caching tier, so all recent writes and reads effectively go through an SSD cache, so at first I didn't feel I was storage limited.

Eventually, I came to realize two things were happening in my case:
1) Initially, storage performance wasn't the problem: more or less any SSD can handle a single 100 MB/s write stream. But what I realized was happening was a slowdown whenever the decode operations started, because then I had multiple parallel operations hitting storage at once.

2) I was using network storage, with everything virtualized. Although it was connected through enterprise-grade managed switches, I realized I was sharing a single NIC with other VMs, and using it for both storage and regular traffic. The same is true even on a physical machine, unless you go to the effort of setting up multiple NICs, which I assume most non-IT, non-network people don't do.

Fortunately, there are mitigations for both. First, and this one is easy: I configured NZBGet to pause downloads during unpacks. This avoids disk contention, which I would strongly suggest you check into even if you aren't hitting memory/CPU limits. Write cache won't help as much here, and with combined read/write operations you are probably running into IOPS limits in storage, especially on hard disks. Things often end up more performant if you can do high-speed serial operations instead of trying to do everything in parallel.
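In NZBGet this pause-during-unpack behavior is a single option; if memory serves from the stock nzbget.conf it is called UnpackPauseQueue (check your own config for the exact name and section):

```
# nzbget.conf -- pause the download queue while unrar/par-repair is running
UnpackPauseQueue=yes
```

It can also be toggled from the web UI under Settings → Unpack.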

Second, in my case I moved the NAS storage traffic onto an internal virtual network riding 10 Gbps adapters (these are actually virtualized in-memory transfers). The point is: if you have NAS storage, moving storage transfers to a different network adapter may make a huge difference, because think of what is happening (this is the parallel thing again):
1) You start a download. It comes into NZBGet at, say, 900 Mbit/s. No problem so far. It has to be saved to the NAS, so 900 Mbit/s goes back out the same network adapter. Also generally no problem: you have a gigabit in each direction.
2) The download completes and the next one starts, trying to pull 900 Mbit/s. Here is where your problems begin: NZBGet may also be unpacking the first transfer, so now you are reading from and writing to the NAS at the same time. That means potentially two reads and two writes running as fast as they can (read from the news server, read from the NAS for the unpack, write the download to the NAS, and write the unpacked files back to the NAS). You have two things trying to do gigabit reads and gigabit writes, but only 1 Gbit/s in each direction, so the traffic splits roughly equally and you end up with about 50 MB/s from the news server and 50 MB/s to the NAS. Putting the storage traffic on another NIC entirely may help a ton (this mostly applies when NZBGet isn't running on the NAS itself, but even then you could be saturating your disks, especially with only one or two spindles).
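The arithmetic above can be sketched in a few lines (my own illustration, assuming idealized fair sharing and ignoring protocol overhead):

```python
# Two flows sharing one direction of a full-duplex gigabit NIC: the download
# from the news server and the NAS read for the unpack compete for the same
# 1 Gbit/s inbound path, so under fair sharing each gets about half the rate.
LINE_RATE_MBIT = 1000          # gigabit link, one direction
MBIT_TO_MB = 1 / 8             # 8 bits per byte

def per_flow_mb_s(flows: int, line_rate_mbit: float = LINE_RATE_MBIT) -> float:
    """Idealized per-flow throughput when `flows` streams share one direction."""
    return line_rate_mbit / flows * MBIT_TO_MB

print(per_flow_mb_s(1))   # one flow: the full ~125 MB/s
print(per_flow_mb_s(2))   # two flows: 62.5 MB/s each -- near the ~50 MB/s observed
```

Real traffic never splits perfectly evenly, but the halving effect is why a second simultaneous gigabit-scale transfer roughly cuts download speed in two.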

When I moved my storage traffic to a separate network and stopped hitting my storage system with both operations at once, I started being able to hit 80-90 MB/s in my transfers. I think the separate network for storage traffic made the larger difference, but I haven't turned pause-for-unpack off again to check; I rather like having the unpacks happen as fast as possible too, and I'd rather pause the download for a few seconds so the unpack finishes much sooner.

Hope that helps!

northernivy
Posts: 10
Joined: 01 Feb 2017, 22:33

Re: Performance Tuning - Tried Everything

Post by northernivy » 24 Aug 2018, 14:31

After reading over all the replies, the only thing I had not tried was a new service provider, thinking that a provider closer to my location might give better saturation. Initially my setup was Giganews at 47 connections, Usenet Farm at 37 connections and XSnews at 27 connections, each set at level zero with 3 connections under the max for overhead.

The tier-one provider that seemed closest to my location and allowed 150 connections was UsenetExpress. As soon as I added them to the mix I started seeing speeds averaging 90 MB/s. This is a great improvement, and now I can certainly start to remove the other providers that are no longer needed. I had been using Giganews without issue and never really thought the provider would be the problem.
