Dav and a lack of multipart support

We have been using WebDAV for some file transfers with rclone.
It seems like one of the options we used, the multipart chunking option, doesn't quite work as advertised.
Basically, rclone splits your file up into, say, 2M chunks and sends them as a multipart upload. The only trouble is that Pydio needs to understand what it's receiving, and it doesn't.
So I've ended up with a million fragments under
/pydio/data/.minio.sys/multipart
How do I clean this up?
I'm assuming there is an entry for them in the DB? Or, if there isn't, will the MinIO index system get upset if I just delete them?
The other question is how you see whose fragments are whose. All the files are owned by the pydio user, but they may have been uploaded by different users. So how do you trace back whose is whose?
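For context, a quick way to gauge the size and age of the leftovers (a sketch, assuming shell access to the data directory; the path is the one above):

du -sh /pydio/data/.minio.sys/multipart                              # total space taken by the fragments
find /pydio/data/.minio.sys/multipart -maxdepth 2 -type d | head     # a sample of the per-upload directories
find /pydio/data/.minio.sys/multipart -type f -mtime +7 | wc -l      # how many fragment files are older than a week

That tells me how big the problem is, but not whether deleting them by hand is safe for the index, hence the question.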

Hi!
Do you have any info about the exact spec of DAV + multipart? I'm a bit stunned: the .minio.sys/multipart you are referring to is linked to the S3 multipart API, which has nothing to do with WebDAV.
Are you doing rclone with the S3 protocol, maybe?
-c
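If in doubt, a quick way to check which backend a remote actually uses (a sketch; <remote-name> is whatever the remote is called in your rclone.conf):

rclone config show <remote-name>   # prints the remote's settings; look for "type = webdav" vs "type = s3"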

Are you doing rclone with the S3 protocol, maybe?

… and if you are, would you be so kind as to show what command you're using? I have been trying in vain to get rclone to 'talk' to Cells, but, alas, I cannot do much more than list the workspaces. For example, rclone lsd pydio-cells:io/ will list the workspaces correctly with the following configuration:

[pydio-cells]
type = s3
provider = Minio
env_auth = false
access_key_id = <Personal Access Token generated with ./cells admin user token -u <my_username> -a 365d>
secret_access_key = gatewaysecret
endpoint = https://my.server.name
acl = bucket-owner-full-control

I can list a specific file if I know its name in advance; transferring it will basically fail with corrupted-on-transfer errors, claiming that the MD5 hashes differ… but I'll spare you the details, which shall be posted in a separate thread.
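For concreteness, the sort of commands I'm attempting look something like this (workspace and file names are placeholders):

rclone lsd pydio-cells:io/                               # listing the workspaces works
rclone ls pydio-cells:io/personal-files/test.txt         # listing a single, known file works
rclone copy pydio-cells:io/personal-files/test.txt ./    # the transfer itself fails with MD5 mismatch errors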

All I want is to see a working example!

Hmm, no. I've told rclone to talk WebDAV.
So this is from my rclone config:

[pydio1]
type = webdav
url = https://pydio.myorganization.org.au/dav/personal-files
vendor = other
user = stestuser
pass = pVamj3v1skrMykx4DtRpQEB_zSA8AlNQjvE

So that will get you a WebDAV connection using rclone.
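With that remote defined, transfers are just the usual rclone commands, for example (local and remote paths are placeholders):

rclone lsd pydio1:                                       # list folders at the root of the personal-files workspace
rclone copy ./bigfile.bin pydio1:SomeFolder --progress   # upload a file over WebDAV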

So this is what some of the users have been using:
rclone copy --progress --buffer-size=100M --drive-chunk-size=128M --max-backlog=999999 --transfers=8 --checkers=64

And I think the way it seems to work is that --drive-chunk-size is trying to upload the file in 128M chunks.
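For comparison, if rclone were going through the Cells S3 gateway instead of WebDAV, the multipart chunk size would be controlled by the s3 backend flags; a rough sketch (the pydio-s3 remote name and paths are placeholders for an s3-type remote like the one shown earlier in the thread):

rclone copy ./bigfile.bin pydio-s3:io/personal-files --s3-upload-cutoff 200M --s3-chunk-size 128M --progress
# --s3-upload-cutoff is the file size above which multipart is used; --s3-chunk-size sets the size of each part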

