Maximising Throughput

Discuss newly added features or request new features.
rxp
Posts: 55
Joined: 07 Mar 2014, 18:51

Maximising Throughput

Post by rxp » 23 May 2014, 15:57

I'm using Supernews, and very old posts (>1000 days) get a slow download rate.

When I have a large queue downloading, it's frustrating that an old article will hold up the entire queue. Is there a way I can set up NZBGet to force downloading of other items in the queue?

Thanks,

salami
Posts: 68
Joined: 14 Apr 2014, 11:09
Location: Switzerland

Re: Maximising Throughput

Post by salami » 23 May 2014, 16:10

Giganews/Supernews once claimed to be the only provider not suffering from slower speeds on older articles. What kind of difference are you seeing? Are you sure the age is the only factor? Article size can have an impact on the overall transfer rate, but you could fix that by using more connections.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Maximising Throughput

Post by hugbug » 23 May 2014, 16:49

rxp wrote: Is there a way I can set up NZBGet to force downloading of other items in the queue?
Why not use priorities?
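The idea behind priorities can be sketched conceptually (this is not NZBGet's actual code, just an illustration in Python; the class and method names are made up, and it only assumes NZBGet's convention that a higher priority value means more urgent):

```python
import heapq

# Conceptual sketch (not NZBGet's implementation): a download queue that
# always hands out the highest-priority NZB first, breaking ties by
# insertion order. Higher priority value = more urgent.

class NzbQueue:
    def __init__(self):
        self._heap = []      # entries: (negated priority, insertion order, name)
        self._counter = 0

    def add(self, name, priority=0):
        # heapq is a min-heap, so negate the priority to pop the
        # highest-priority (and, on ties, oldest) entry first.
        heapq.heappush(self._heap, (-priority, self._counter, name))
        self._counter += 1

    def next_download(self):
        return heapq.heappop(self._heap)[2]

queue = NzbQueue()
queue.add("old-slow.nzb")             # default priority 0
queue.add("new-fast.nzb", priority=100)
print(queue.next_download())          # prints "new-fast.nzb"
```

Raising the priority of a newer NZB in the web interface has the same effect: it jumps ahead of the old, slow one instead of waiting behind it.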

rxp
Posts: 55
Joined: 07 Mar 2014, 18:51

Re: Maximising Throughput

Post by rxp » 23 May 2014, 19:18

It's definitely the age, only ever happens to the oldest articles. The connection settings are precisely the same - newer articles download faster.

Priorities are an option, but what I'd really like (I guess a feature request) is for nzbget to understand that old articles are slower, so it'll also start downloading a newer nzb while downloading the older one.

Or perhaps "organise by age" like with Sabnzbd.

hugbug
Developer & Admin
Posts: 7645
Joined: 09 Sep 2008, 11:58
Location: Germany

Re: Maximising Throughput

Post by hugbug » 27 May 2014, 12:30

rxp wrote: nzbget to understand that old articles are slower so it'll also start downloading a newer nzb while downloading the older one.
Split connections? Half for the old nzb and the other half for another nzb? Wouldn't that make the download of the old nzb even slower?

sanderj
Posts: 184
Joined: 10 Feb 2014, 21:46

Re: Maximising Throughput

Post by sanderj » 27 May 2014, 21:21

hugbug wrote:
rxp wrote: nzbget to understand that old articles are slower so it'll also start downloading a newer nzb while downloading the older one.
Split connections? Half for the old nzb and the other half for another nzb? Wouldn't that make the download of the old nzb even slower?
Maybe, and maybe not. But the main advantage is that the other download is no longer waiting; it is being downloaded, and can finish even before the slow download does.

It's like dividing a road into two separate lanes: a lane for slow traffic and a lane for fast traffic. The slow traffic might or might not get slower, but the fast traffic is no longer stuck behind slow traffic and will arrive sooner.

Note: if the usenet download of one NZB is already at full speed, running two downloads in parallel will make each download take more time, since the two downloads then share bandwidth.
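The lane analogy can be put into rough numbers. Here is a back-of-the-envelope model in Python (all figures hypothetical: two 10 GB NZBs, the old one capped at 5 MB/s by the provider, on a line with 26 MB/s total capacity):

```python
# Hypothetical figures, loosely based on the speeds mentioned in this thread.
SLOW_CAP = 5      # MB/s the old articles can sustain (provider-side cap)
LINE_CAP = 26     # MB/s total line capacity
SIZE = 10_000     # MB per NZB

# Sequential: the new NZB waits behind the old one.
seq_old_done = SIZE / SLOW_CAP                    # 2000 s
seq_new_done = seq_old_done + SIZE / LINE_CAP     # about 2385 s

# Parallel: the old NZB keeps its 5 MB/s cap; the new one uses the rest.
par_old_done = SIZE / SLOW_CAP                    # still 2000 s
par_new_done = SIZE / (LINE_CAP - SLOW_CAP)       # about 476 s

print(f"sequential: new NZB finishes at {seq_new_done:.0f} s")
print(f"parallel:   new NZB finishes at {par_new_done:.0f} s")
```

In this model the fast download finishes roughly five times sooner when run in parallel, while the slow one is unaffected because it was capped by the provider anyway, not by the line. If both downloads could saturate the line, they would instead split the bandwidth, as the note above says.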

rxp
Posts: 55
Joined: 07 Mar 2014, 18:51

Re: Maximising Throughput

Post by rxp » 28 May 2014, 19:26

Nice summary sanderj, that's exactly the scenario I have.

My old downloads run at 5 MB/s, my new ones run at 26 MB/s.

That's a combination of Supernews having slow speeds on older articles and the newer downloads coming off my local ISP's server as well as Supernews, so more connections and a much faster local server.
