
broken files, pet project

Posted: 06 Nov 2005, 23:15
by lumpy2000
Hi everyone,

I discovered nzbget a few days ago and it is pretty close to what I am looking for. I have browsed through the code and it looks fairly well architected.

The first problem I encountered was that I had a lot of broken files. I think this is due to my ISP's news servers disconnecting frequently. It seems to me that nzbget gives up a little too easily in this case, resulting in a broken file. I have a patch that remedies this.
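
The patch itself isn't attached here, but the idea can be sketched roughly like this (all names are hypothetical, not the actual nzbget code): instead of marking an article as failed on the first dropped connection, retry it a few times before giving up.

```cpp
#include <cassert>

// Hypothetical sketch: retry an article download a few times before
// declaring the file broken, rather than giving up on the first
// disconnect. Names and limits are illustrative only.
const int MAX_RETRIES = 3;

// Simulated download attempt: returns true on success. In nzbget this
// would reconnect to the news server and re-request the article.
bool DownloadArticle(int& transientFailuresLeft)
{
    if (transientFailuresLeft > 0)
    {
        transientFailuresLeft--;   // simulate a dropped connection
        return false;
    }
    return true;
}

// Retry wrapper: only report failure after several consecutive
// attempts have failed.
bool DownloadWithRetry(int transientFailures)
{
    for (int attempt = 0; attempt < MAX_RETRIES; attempt++)
    {
        if (DownloadArticle(transientFailures))
        {
            return true;           // got the article
        }
        // here: reconnect and/or back off before the next attempt
    }
    return false;                  // article really unavailable
}
```

With flaky ISP servers that drop connections mid-transfer, two or three retries are usually enough to turn a broken file into a complete one.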

I also added a new feature that I wanted, which is the ability to have multiple servers in each level. My use case for this is that my ISP provides 6 servers that each allow 4 connections. I want to use these as my level 0 servers, since they have good bandwidth and no quotas. Then, in level 1, I want my fill servers, which have less bandwidth, and quota restrictions.

So now I can have 24 threads downloading until an article is not available at level 0, then I can get it from level 1. The implementation of this is not perfect yet. I would like to try all level 0 servers before going to the next levels, but this currently does not happen.

I have been using BNR2 up until now, and I really like the way that it manages the article queue and the servers. I would like to modify nzbget to behave more like BNR2 for downloading, which I think it does optimally. Of course, BNR2 has its quirks in other areas.

I love the client/server feature of nzbget. I think a daemon mode would be useful. I'm really getting used to one-click downloads.

I think it would be cool to integrate par2 handling, so only the minimal required par2 files would be downloaded, and recovery would be automagic. I see that nget does this, and it is also written in C++, so I think it is possible to take the nget code and glue it into nzbget.

Anyway, I think I will be doing a bit of work on nzbget over the next few months. I would be keen to share ideas and patches with others. I think that with the "broken file" problem fixed, this could be a release-worthy project. There are also some minor issues, like not failing gracefully if your downloads or temp directories don't exist.

cheers,
kevin

RE: broken files, pet project

Posted: 07 Nov 2005, 10:26
by nibor2005
Sounds great,

But if you or placebodk change the protocol, I hope you don't break compatibility with FastNZB.

I would like to see some added features to the network protocol:
* Changing the queue from the client (I don't think it should be a server task).
* When an nzb file is placed at the top of the queue, it should start downloading that one directly, instead of first completing the one it is busy with.

Let me know if you have any plans to implement these as well :). If you don't want to do it, maybe I can look into it myself.

I'm the maintainer of FastNZB, the GUI Win32/Linux client for nzbget. (http://www.sf.net/projects/fastnzb)

Robin.

RE: broken files, pet project

Posted: 07 Nov 2005, 15:13
by jens_nordmann
Hi all,
I'm new here. I have been toying with nzbget-server and it is a very fine program. nzbget is running only in server mode here, and I have written a few features:

- Return the active queue size and the approximate remaining time.
- Since my router has a small hard drive, I wanted a better solution for nzbget to remember downloaded files. Now the server stores the names of all downloaded files in a text file.
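
That history feature could be sketched like this (a hypothetical C++ class, not the actual code): remember downloaded file names, one per line, in a plain text file, so duplicates can be skipped without keeping the files themselves on the small drive.

```cpp
#include <fstream>
#include <set>
#include <string>

// Hypothetical sketch of a download history kept as a plain text
// file, one file name per line. The class name and file format are
// invented for illustration.
class DownloadHistory
{
public:
    // Load any previously recorded names from the history file.
    explicit DownloadHistory(const std::string& path) : m_path(path)
    {
        std::ifstream in(m_path);
        std::string line;
        while (std::getline(in, line))
            if (!line.empty()) m_seen.insert(line);
    }

    bool AlreadyDownloaded(const std::string& name) const
    {
        return m_seen.count(name) > 0;
    }

    // Record a completed download, appending it to the text file so
    // the history survives a restart.
    void Record(const std::string& name)
    {
        if (m_seen.insert(name).second)
        {
            std::ofstream out(m_path, std::ios::app);
            out << name << "\n";
        }
    }

private:
    std::string m_path;
    std::set<std::string> m_seen;
};
```

Appending on each completion keeps the file consistent even if the server is killed mid-session, at the cost of one small write per finished file.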


@Robin
Your requested features will be my next task, since I wanted them also :)

I don't know where to put my version of nzbget, so feel free to contact me and I will post it somewhere on the net.

jens

RE: broken files, pet project

Posted: 07 Nov 2005, 15:28
by nibor2005
Hi Jens,

Great that you are working on nzbget.

I think the best way to get your modifications included in nzbget is to create a patch against the latest version and post it in the 'Patches' section of this project (https://sourceforge.net/tracker/?group_ ... tid=632378).

You could also contact placebodk, the current maintainer of nzbget.

Robin.


RE: broken files, pet project

Posted: 08 Nov 2005, 21:25
by placebodk
Hiya people,

Sounds really really great :o)

I think it's fantastic that you guys have gotten some new ideas and some drive to implement them to the project.

Personally I have been neglecting development on nzbget for quite some time, for the following reasons (here come the lame excuses):

First of all, my personal setup is actually rather stable. It almost never crashes and from time to time runs for more than a week without problems. Also, some of the bugs that remain in the code seem to be related to how the news service performs, and since my news service performs rather well, I would need to go looking for a bad one to approach these problems. And I really don't need the extra download bandwidth (my EPIA server can't use my entire bandwidth at the moment).

The second and more important reason is that I'm spending my free time coding on a bigger project that I intend to make a living off sometime in the future. That has higher priority, so I'm unlikely to spend much time extending the features of nzbget in the near future.

I'm open to suggestions as to how to proceed from here :o) People could submit patches to be tried out, I could give developer/admin rights to one or more people, whoever wants them, or somebody could take over as admin for the project.

/placebodk

PS. In relation to Nibor's remarks:
* "Change queue with client" - I had thought about implementing this, but never got to it, it seems. Anyhow, it shouldn't be a big problem to implement.
* "when an nzbfile will be placed at top in the queue, it should start to download that one directly" - this was actually one of the first things I tried to look into when I took over the project from siddy. But as I discovered, getting this functionality demands some refactoring of the code. Currently the download part of the code only tracks the files belonging to the .nzb file it is currently downloading.
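
One possible shape for that refactoring, sketched with invented names (not the actual nzbget code): keep a single flat queue of file entries, each tagged with the .nzb it came from. Moving an .nzb to the top is then just a reorder of the queue, and the downloader naturally picks up the new front file next.

```cpp
#include <deque>
#include <string>

// Hypothetical sketch: one flat queue of files tagged with their
// .nzb of origin, instead of tracking only the currently
// downloading .nzb. Names are illustrative only.
struct QueuedFile
{
    std::string nzbName;    // which .nzb this file belongs to
    std::string fileName;
};

// Move all files of the given .nzb to the front of the queue,
// preserving their relative order (stable partition by hand).
void MoveNzbToTop(std::deque<QueuedFile>& queue, const std::string& nzbName)
{
    std::deque<QueuedFile> promoted, rest;
    for (const QueuedFile& f : queue)
    {
        (f.nzbName == nzbName ? promoted : rest).push_back(f);
    }
    promoted.insert(promoted.end(), rest.begin(), rest.end());
    queue.swap(promoted);
}
```

Because the downloader only ever asks for the current front of the queue, no special "interrupt the current .nzb" logic is needed; in-flight articles finish and the next request simply comes from the promoted .nzb.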

RE: broken files, pet project

Posted: 08 Nov 2005, 22:11
by vinnaaah
Hi Placebodk and all,

First off, I have been an enthusiastic user of the nzbget tool. Recently, this changed. It might be a bit odd to post about this in an nzbget thread, but I'd like to tell you all about the newsleeching tool I currently use.

Ninan. Also an SF project (ninan.sf.net). It's a web-enabled application written in Java, which works very well on Linux and Windows. I have been using it for a month or so and it has processed about 60GB. Three coworkers of mine also use it, and we have all experienced few problems with it. Important features include automatic par repair and unrarring, which basically rocks.

I'm aware that some people explicitly want a console-based newsleecher or dislike Java (and there are probably some more differences), but I also think quite a number of people will be interested in using Ninan.

I do not want to offend anyone or advertise my favorite tool. But since nzbget is in need of a new admin, it might be worth considering stopping active development in favor of Ninan and mentioning this on the summary page, so that people who use nzbget or come across the project page also know about the alternative.

Bye,
Vinnaaah

RE: broken files, pet project

Posted: 08 Nov 2005, 23:54
by khermansen
Ninan does look promising, but the bloat of Java is something that makes me think twice about using it. It has a slick UI, and I just installed it myself. Would be nice not to rely on the JVM. nzbget is still great for CLI stuff.

Btw, there is a slight vulnerability in your startup script ninancore.sh. It's probably not too much of a problem, given that it shouldn't be run as root and the end user would need to be really dumb to fall for something like this, but...

$ mkdir /tmp/bin
$ cat malware.sh > /tmp/bin/java
$ cat /tmp/bin/java
#!/bin/sh
touch ~/w00w00
$ chmod 755 /tmp/bin/java
$ export JAVA_HOME="/tmp"

When this line runs,

nohup $JAVA_HOME/bin/java -Djava.awt.headless=true -classpath ...

our /tmp/bin/java script runs and can do whatever it wants with the privileges of the user. Pretty dumb, but just something to keep in mind. (A simple mitigation would be to verify that $JAVA_HOME points at a trusted location before invoking it.)

Cool prog though :-)
--
Kristian Hermansen

RE: broken files, pet project

Posted: 09 Nov 2005, 02:36
by lumpy2000
Thanks for giving me developer access. I will commit my change for the broken files issue soon, since it fits well with the established code base. Some of my other changes are working well for me, but they are not very elegant.

I think that some refactoring is necessary to implement some more robust downloading behaviours. In particular, I think the server "level" has more importance than it should. I think that articles should decide where they want to be downloaded from, given server preferences, server performance, and article availability on a particular server. The level should be part of your server preferences.

I also find that nzbget is remarkably stable. Looking at the code, I can see that it is very simple and has some thought behind it. I also see that the changes I would like to make will have a big impact on the core of the application. However, I don't think they will affect the user experience, other than to make downloads more robust in the face of server constraints and difficult network conditions.

I think that some of the changes I would like to make are fairly ambitious. However, my time is also fairly limited by work, other hobbies, etc. I would like to mould this project to my requirements, but it may take some time.

In the future I will try to elaborate more on where I think this project should go, which is largely where I think I can get it to, with some help from the users. My goal is to maintain the usability of the current project, and add more reliability and possibly some new features.

cheers,
kevin

RE: broken files, pet project

Posted: 01 Jan 2006, 19:45
by ninny
Hi Kevin,

Have you already started work on this project? If you need more coders, I will volunteer (I think I can mostly help with bug fixing). Send me a mail if you are interested.

I believe the following are essential bug fixes:
- Fix the glibc double-free bug on exit.
  Impact: nzbget can't be used in shell scripts.
  NOTE: I submitted a patch which fixes this.
- Way too many broken files are being left; this can be verified by running the older versions.
  Unsure why this happens; it seems to be some issue at the final file decode stage after all parts are obtained.
- Sometimes zero-sized broken files are left even when multiple parts have been downloaded.

--
Dinesh