I used to like nzbget, but now I use hellanzb. Try it yourself and you will find it better than nzbget, I assure you!
You are right, hellanzb is great and provides some features that nzbget does not.
My main problem is memory footprint: hellanzb, because of the Python interpreter, consumes a lot of memory on my 32 MB machine.
Is there a way to make Python consume less RAM (maybe off-topic)?
Is there a way to evolve nzbget towards a hellanzb-like program (adding the missing features or enhancements)?
I do not know how to optimize Python, maybe ask in #python? On my machine, hellanzb takes up 17 MB at rest while waiting for connections. That is a bit high, but I think the ease of use makes up for the large memory footprint. Memory is cheap. Developer's time is not.
You could fix up nzbget, but you are going to need to add auto-par2/unrar. That's mostly the reason I switched from nzbget to ninan and then to hellanzb. So just add the ability to do that, plus an easy NZB import mechanism...
I just switched from hellanzb to nzbget because hellanzb is too heavy on resources, especially the CPU!
I also went the other way, from hella to nzbget, because CPU usage is too high on hella.
nzbget now has par-repair, and I guess you can do unrar via post-processing scripts.
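For the unrar part, something like the following post-processing script sketch could work. This is only an illustration: the script name, the assumption that the download directory arrives as the first argument, and the cleanup globs are all hypothetical; check your nzbget version's documentation for the exact parameters it passes to post-processing scripts, and make sure `unrar` is installed.

```shell
#!/bin/sh
# Hypothetical nzbget post-processing sketch.
# Assumption: nzbget passes the finished download's directory as $1
# (verify the argument order against your nzbget version's docs).

unpack_rars() {
    dir="$1"
    # Pick the first volume of the rar set (alphabetically first .rar file).
    first_rar=$(find "$dir" -maxdepth 1 -name '*.rar' | sort | head -n 1)
    if [ -z "$first_rar" ]; then
        echo "no rar archives in $dir"
        return 1
    fi
    # x = extract with full paths, -o+ = overwrite existing files.
    # On success, delete the now-redundant archive volumes.
    unrar x -o+ "$first_rar" "$dir/" \
        && rm -f "$dir"/*.r[0-9][0-9] "$dir"/*.rar
}

if [ -n "$1" ]; then
    unpack_rars "$1"
fi
```

Pointing nzbget at a script like this would give a rough equivalent of hellanzb's automatic unpacking; par2 repair would be a separate step before it.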
hellanzb supports SSL which nzbget does not.
nzbget includes the currently downloading NZB as part of the queue. This makes it easy to start a small download even while a big NZB is in progress. With hella, you can only jump an NZB to the top of the waiting queue, which was a pain when I needed to grab some file quickly during a large download.