Hi!
First, a big thanks for this nice new feature, which really speeds things up a lot.
But I have a problem with several BIG downloads: about 30-60 GB of data split across about 200 .rar files.
Everything works fine, and I can also see that the files are unpacked during the download in the /unpack subfolder, but after the process finishes successfully the directory is empty.
I've also tried some single-file downloads of 2-12 GB in size, where this did not happen.
With direct unpack:
Code:
Statistics:
Downloaded size: 59.84 GB
Average download speed: 49.50 MB/s
Total time: 0:37:52
Download time: 0:20:38
Verification time: 0:00:18
Repair time: 0:00:00
Unpack time: 0:16:37
Files:
<no files found in the destination directory (moved by a script?)>
Without direct unpack:
Code:
Statistics:
Downloaded size: 59.84 GB
Average download speed: 47.46 MB/s
Total time: 0:54:44
Download time: 0:21:31
Verification time: 0:00:17
Repair time: 0:00:00
Unpack time: 0:32:36
Files:
So no files are listed there either, but everything is in its place when direct unpack is not used.
Does anyone have an idea?
I'm running NZBGet from the latest linuxserver.io Docker container. As I am on the stable build and v19 should be added tomorrow or on Saturday, I've just updated NZBGet inside the Docker container, which normally doesn't cause any problems.
Thanks in advance for the help!
BTW, the only post-processing script I am running is the EMail script...