Comments (19)
Will let you know as soon as I can 👍
from deluge.
Will do 👍
Good stuff! Give me a little bit to get back into this project. 👍 We'll see if we can get it working with the new Deluge. Thank you for the report!
Man they did a number on the WebUI API. It's a completely different layout now. The unpacker-poller app only uses like 3 items out of the data structure and none of those seem to have changed, so more or less it still works. I kicked up a fresh install of Deluge 2.0.3 on my MacBook (what a pain), and I loaded one torrent. I got a different error than you even:
2019/06/30 00:35:13 Deluge Error: json.Unmarshal(xfers): json: cannot unmarshal number -1.0 into Go struct field XferStatus.max_download_speed of type int64
So yeah, lots of quick easy fixes, but more or less I'm going to have to update this library and release a new version. Will take me a bit as I try to figure out if I can support both the old and the new format.
I'm also not ready to upgrade my server to Deluge 2 because there is not a macOS app available yet. I just want to know they've put time into macOS before I switch over, my entire setup runs on a Mac and it's currently working flawlessly.
Hope to have a fix for this soon!
So it would seem they've added a bunch of additional metadata that's available via the API for torrents currently active in the daemon.
It's generally working for me too right now unless I pass in the additional parameters to a torrent. I'm using a global default ratio setting for all torrents right now and anything passed in via CouchPotato uses a label and custom ratio/time params on the label.
I really want to like Radarr enough to switch to it, but I'm not sure it's ready for prime time yet, based on some of the current open issues on the repo.
I've been rocking Radarr and Sonarr for a long time. Tied in with all the tools you see on the autotyed repo. I haven't looked at the issue tracker :D Seems ok to me! haha
I sure am having a hard time reproducing the error you're getting. `stop_at_ratio` always comes back true or false, and `stop_ratio` contains the float64 for the ratio. I'm not sure why you're getting a number where a boolean belongs.
Are you setting that on torrents imported from Sonarr? These are the settings I'm referring to, which seem to trigger the error message...
I think this must be a bug in Deluge 2. I don't know what else to make of it. Seed time isn't even a per-torrent option that I can find in any of the Deluge 2 GUIs (web or GTK). I'm just testing locally with Deluge; I do not have Radarr or Sonarr hooked up in my test environment at the moment. I just added a couple torrents manually for now, so I could play with the new API data.
Are you able to re-compile the binary if I upload some test/debug code?
These contributions add a few new things, including better debug logging (it now prints out the failed JSON payload in debug mode).
#2
Unpackerr/unpackerr#12
I think, from what I can surmise, that Sonarr manages the seed time period and may remove torrents after that has elapsed.
I have very little experience with Go, sorry; the extent of my knowledge is `go get; go install; go build`
😨
My container images on my host get refreshed every 24 hours, so I'm happy to update my build pipeline schedule to run every few hours while you're doing work on this. I can then report results after testing / reviewing logs.
I'll have a new release out in a bit. It won't solve your problem, but if you throw it in debug mode (which is now set in the config file), it should print out the JSON that is failing. I'd like you to confirm what we really already know: `stop_at_ratio` is not `true` or `false`, but instead a number. Can you try one other test for me? Instead of 2, enter 2.1 and 2.0 (try both). I'd just like to know how it renders on your system. Please let me know what the `stop_at_ratio` and `stop_ratio` values are with these tests. Can you also tell, by looking at the JSON, where the seed time is being stored? If you paste the JSON into a file, `cat file | jq` will make it easier to read.
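For example (assuming the failing payload was saved to a file; `payload.json` is a placeholder name):

```shell
# Pretty-print the saved payload for easier reading.
cat payload.json | jq .

# Pull out just the two fields in question for every torrent hash
# (the top level is an object keyed by hash, so .[] iterates its values).
cat payload.json | jq '.[] | {stop_at_ratio, stop_ratio}'
```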
This just in ...
"acb26c334bf006c69458f8275305a0e884cce3e2": {
"active_time": 843,
"seeding_time": 834,
"finished_time": 834,
"all_time_download": 262922369,
"storage_mode": "allocate",
"distributed_copies": 0.0,
"download_payload_rate": 0,
"file_priorities": [4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4],
"hash": "acb26c334bf006c69458f8275305a0e884cce3e2",
"auto_managed": true,
"is_auto_managed": true,
"is_finished": true,
"max_connections": -1,
"max_download_speed": -1,
"max_upload_slots": -1,
"max_upload_speed": -1,
"message": "OK",
"move_on_completed_path": "/root/Downloads",
"move_on_completed": false,
"move_completed_path": "/root/Downloads",
"move_completed": false,
"next_announce": 0,
"num_peers": 1,
"num_seeds": 0,
"owner": "localclient",
"paused": false,
"prioritize_first_last": false,
"prioritize_first_last_pieces": false,
"sequential_download": false,
"progress": 100.0,
"shared": false,
"remove_at_ratio": false,
"save_path": "/downloads/complete",
"download_location": "/downloads/complete",
"seeds_peers_ratio": 5.230769157409668,
"seed_rank": 268435661,
"state": "Seeding",
"stop_at_ratio": 1,
"stop_ratio": 2.5,
"time_added": 1562035118,
"total_done": 255785997,
"total_payload_download": 262922369,
"total_payload_upload": 196875570,
"total_peers": 13,
"total_seeds": 68,
"total_uploaded": 196875570,
"total_wanted": 255785997,
"total_remaining": 0,
"tracker": "https://tracker.org/a/randomnum/announce",
"tracker_host": "org",
"trackers": [{
"url": "https://tracker.org/a/randomnum/announce",
"trackerid": "",
"message": "",
"last_error": {
"value": 0,
"category": "system"
},
"next_announce": null,
"min_announce": null,
"scrape_incomplete": -1,
"scrape_complete": -1,
"scrape_downloaded": -1,
"tier": 0,
"fail_limit": 0,
"fails": 0,
"source": 1,
"verified": false,
"updating": false,
"start_sent": false,
"complete_sent": false,
"send_stats": true
}],
"tracker_status": "Announce OK",
"upload_payload_rate": 0,
"comment": "",
"creator": "mktorrent 1.0",
"num_files": 20,
"num_pieces": 488,
"piece_length": 524288,
"private": true,
"total_size": 255785997,
"eta": 0,
"file_progress": [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0],
"files": [{
"index": 0,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.nfo",
"size": 9208,
"offset": 0
}, {
"index": 1,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r00",
"size": 15000000,
"offset": 9208
}, {
"index": 2,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r01",
"size": 15000000,
"offset": 15009208
}, {
"index": 3,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r02",
"size": 15000000,
"offset": 30009208
}, {
"index": 4,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r03",
"size": 15000000,
"offset": 45009208
}, {
"index": 5,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r04",
"size": 15000000,
"offset": 60009208
}, {
"index": 6,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r05",
"size": 15000000,
"offset": 75009208
}, {
"index": 7,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r06",
"size": 15000000,
"offset": 90009208
}, {
"index": 8,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r07",
"size": 15000000,
"offset": 105009208
}, {
"index": 9,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r08",
"size": 15000000,
"offset": 120009208
}, {
"index": 10,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r09",
"size": 15000000,
"offset": 135009208
}, {
"index": 11,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r10",
"size": 15000000,
"offset": 150009208
}, {
"index": 12,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r11",
"size": 15000000,
"offset": 165009208
}, {
"index": 13,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r12",
"size": 15000000,
"offset": 180009208
}, {
"index": 14,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r13",
"size": 15000000,
"offset": 195009208
}, {
"index": 15,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r14",
"size": 15000000,
"offset": 210009208
}, {
"index": 16,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r15",
"size": 15000000,
"offset": 225009208
}, {
"index": 17,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r16",
"size": 775925,
"offset": 240009208
}, {
"index": 18,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.rar",
"size": 15000000,
"offset": 240785133
}, {
"index": 19,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.sfv",
"size": 864,
"offset": 255785133
}],
"orig_files": [{
"index": 0,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.nfo",
"size": 9208,
"offset": 0
}, {
"index": 1,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r00",
"size": 15000000,
"offset": 9208
}, {
"index": 2,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r01",
"size": 15000000,
"offset": 15009208
}, {
"index": 3,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r02",
"size": 15000000,
"offset": 30009208
}, {
"index": 4,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r03",
"size": 15000000,
"offset": 45009208
}, {
"index": 5,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r04",
"size": 15000000,
"offset": 60009208
}, {
"index": 6,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r05",
"size": 15000000,
"offset": 75009208
}, {
"index": 7,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r06",
"size": 15000000,
"offset": 90009208
}, {
"index": 8,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r07",
"size": 15000000,
"offset": 105009208
}, {
"index": 9,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r08",
"size": 15000000,
"offset": 120009208
}, {
"index": 10,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r09",
"size": 15000000,
"offset": 135009208
}, {
"index": 11,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r10",
"size": 15000000,
"offset": 150009208
}, {
"index": 12,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r11",
"size": 15000000,
"offset": 165009208
}, {
"index": 13,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r12",
"size": 15000000,
"offset": 180009208
}, {
"index": 14,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r13",
"size": 15000000,
"offset": 195009208
}, {
"index": 15,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r14",
"size": 15000000,
"offset": 210009208
}, {
"index": 16,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r15",
"size": 15000000,
"offset": 225009208
}, {
"index": 17,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.r16",
"size": 775925,
"offset": 240009208
}, {
"index": 18,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.rar",
"size": 15000000,
"offset": 240785133
}, {
"index": 19,
"path": "episode.S03E02.WEBRiP.x264/episode.s03e02.webrip.x264.sfv",
"size": 864,
"offset": 255785133
}],
"is_seed": true,
"peers": [],
"queue": -1,
"ratio": 0.769688606262207,
"completed_time": 1562035127,
"last_seen_complete": 1562035284,
"name": "episode.S03E02.WEBRiP.x264",
"pieces": null,
"seed_mode": false,
"super_seeding": false,
"time_since_download": 834,
"time_since_upload": 44,
"time_since_transfer": 44
}
So, in the above, you can see that `stop_at_ratio` is an integer instead of a boolean value.
- How is the json.Unmarshal function inferring that the value should be boolean?
- Likewise, with the error you reported, how is `-1.0` an integer?
oooh... it's `1`. Which probably means `true`. That's an easy fix. The error I reported was that `-1.0` is not an integer. It's a float; I already fixed that one.
The latest release should fix this. Lemme know!
I had another issue come up, but I'm not sure if that was just a one-off. If I see it again I'll post a new issue; otherwise, it's looking pretty solid now 👍 Nice one!
Glad it's working! If you still have the error message you saw in the log file I'd love to see it! Please open a new issue, I may never look at this one again. :D
Unsure exactly why this happened, and there doesn't seem to be any more detail than what I've posted below.
2019/07/04 09:37:23.067905 helpers.go:100: Error Deleting /downloads/complete/episode.S01E01.720p.WEBRip.X264/episode.S01E01.720p.WEBRip.X264.mkv: remove /downloads/complete/episode.S01E01.720p.WEBRip.X264/episode.S01E01.720p.WEBRip.X264.mkv: no such file or directory
A few hours later...
2019/07/04 15:06:23.067756 pollers.go:66: Extract Statuses: 0 actively extracting, 0 queued, 0 extracted, 0 imported, 1 failed, 0 deleted
2019/07/04 15:06:23.067822 start.go:189: [DEBUG] Extract Status: episode.S01E01.720p.WEBRip.X264 (status: Delete Failed, elapsed: 5h29m0s)
If I see it again, will create a new issue
Something seems wrong about that path, but I realize you've truncated it a bit (that's ok). Is it possible something else moved the file before the poller got around to cleaning it? I've specifically set the app up to write the extracted files into a temporary folder and then move the files back into the download location. This prevents the Starr apps from trying to import a partially extracted file. Is it possible something deviated from that behavior? I believe there should be a dot at the beginning, but I really don't remember the exact naming scheme as I write this; I'd have to go dig in some code or look at my logs to remember.
Very interesting. I'll snoop around in the code as I have some more time. Thanks!