I'm trying to back up to Hubic... It fails regularly with a 503 status code on certain chunks, and the retries do not help.
I observed that once a chunk starts getting 503, it keeps getting it, while other chunks upload without a problem at the same time.
What is interesting is that:

- it is always the same chunk causing the problem: when I re-run the backup, it fails at the same file
- it has something to do with the total number of requests before or after the problematic chunk; when I set up ignore patterns so that the total number of files "around" the problematic file is smaller, the chunk gets uploaded correctly
- the number of threads seems to have no influence; the problematic chunk always returns 503 unless I limit the total number of new (not yet uploaded) files "around" it
- it makes no difference whether the remaining files in the batch are new (i.e. not uploaded yet) or already uploaded with the new revision not yet saved (the only difference is that the latter are processed at 50 MB/s instead of, for example, 1.4 MB/s)
- `limit-rate` has no effect on the problem
- once I limit the number of files and the chunk gets uploaded and the revision saved, that chunk is no longer a problem
So in practice, to make an initial backup of a new, larger folder, I have to bisect to find the chunk where the upload stops (repeated 503), limit the files around that chunk (using filters), let it upload and the revision save, and then continue to the next failing chunk.
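To illustrate the workaround, my filters file looks roughly like this (the paths are placeholders for this example; duplicacy reads the patterns from `.duplicacy/filters`, where `+` includes, `-` excludes, and the first matching pattern wins):

```
# Include only the subtree around the failing chunk; exclude everything else.
+photos/
+photos/2019/
+photos/2019/*
-*
```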
My guess (from watching the debug lines) is that duplicacy fires the requests too fast and on certain occasions exceeds Hubic's (and potentially other services') threshold, after which the service keeps returning 503 for that chunk indefinitely.
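If that's the case, retrying with an exponential backoff instead of at a fixed pace might help. A minimal sketch in Go of what I mean (not duplicacy's actual retry code; `upload` is a hypothetical stand-in for the chunk upload call):

```go
package backoff

import (
	"errors"
	"math/rand"
	"net/http"
	"time"
)

// uploadWithBackoff calls upload (which reports the HTTP status code of the
// attempt) and, on a 503, sleeps for an exponentially growing delay with
// random jitter before retrying, so parallel threads don't hammer the
// service in lockstep.
func uploadWithBackoff(upload func() (int, error), maxRetries int) error {
	delay := time.Second
	for attempt := 0; attempt <= maxRetries; attempt++ {
		status, err := upload()
		if err != nil {
			return err // transport error, not throttling
		}
		if status != http.StatusServiceUnavailable {
			return nil // accepted (or failed for a non-throttling reason)
		}
		// Wait delay plus up to 100% jitter, then double the delay (capped).
		time.Sleep(delay + time.Duration(rand.Int63n(int64(delay))))
		if delay < 64*time.Second {
			delay *= 2
		}
	}
	return errors.New("chunk still throttled (503) after all retries")
}
```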
I hope the description makes sense... has anyone had the same problem and fixed it?