[Bug] Pydio fails when I upload big files (PutObjectPart failed)

I am creating a new topic because the existing ones are old and seem to get no attention; readers skip them assuming the information is outdated.

First of all, here is my configuration information and server details:

- Platform: Docker inside LXC (do not ask why I chose this structure; it's so I don't have to reinstall the OS when I mess something up)
- Pydio Cells version: Home Edition 3.0.7 (rev 2ecc999d9d6e5e88470da8765eb531dda768a409)
- Database: MariaDB 10.7.x (using the latest Docker tag)
- Server hardware: i7-8550U with 16 GiB of RAM, Pydio installed on an HDD volume
- Network: Half-1Gbps
- Client platform: Web, Chrome
- Client machine: M1 with 16 GiB of RAM

The problematic situation: Pydio constantly fails to upload big files, anything over a few gigabytes.
To investigate, I tested several scenarios with a 125 GiB file.

  1. Every time I let the laptop go to sleep, I see a 403 error.

So I clicked the refresh button in the navigation bar of the Pydio web client to work around the session expiration. That did not fix the problem, so I searched the logs.

Ts : 1652544804
Level : error
Logger : pydio.gateway.data
Msg : PutObjectPart has failed - Put "": context canceled
UserName : ec25519
UserUuid : 87b8efa6-8b37-4f2f-848f-93904397624e
GroupPath : /
RemoteAddress :
UserAgent : Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.64 Safari/537.36
HttpProtocol : HTTP/1.1
SpanUuid : bf4625b4-d3a0-11ec-9e13-0242ac140002
JsonZaps : {"ContentType":"application/octet-stream"}

This is the only relevant log entry I could find, and there is no direct answer to it on the forum. But as you know, some configuration tweaks may help. Following the post below, I adjusted the uploader configuration on the Cells settings page.

Brand new installation from docker-compose not able to upload big file - #9 by zayn

MPart threshold: 50MB
MPart size: 20MB
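To put those values in perspective, here is a rough back-of-the-envelope sketch of how many parts each setting produces for my 125 GiB test file (the per-part overhead of a PutObjectPart round trip is what makes small parts expensive; note also that S3-compatible backends cap a multipart upload at 10,000 parts):

```python
# Rough estimate of multipart-upload part counts for a 125 GiB file.
# Part sizes are the values discussed in this thread.

GIB = 1024 ** 3
MIB = 1024 ** 2

file_size = 125 * GIB

def part_count(part_size_bytes):
    """Number of parts needed to cover file_size (ceiling division)."""
    return -(-file_size // part_size_bytes)

print(part_count(20 * MIB))   # 20 MB parts  -> 6400 parts
print(part_count(50 * MIB))   # 50 MB parts  -> 2560 parts
print(part_count(200 * MIB))  # 200 MB parts -> 640 parts
```

Each part costs its own request plus hashing, so fewer, larger parts mean less per-part overhead on both client and server.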

Pydio still failed to upload, but the upload lasted longer before failing.

  2. So is it a timeout issue?

This is what I am looking for currently.

I also checked the server status with htop and iotop.

When the server is idle, CPU usage stays under 2%.
However, when I restart the upload in the web client, the server turns into a monster, with the cells binary consuming over 80% CPU:

In line with the CPU usage on the server, uploading a single chunk now takes over 15 s. I remember it took under 2 s at first.


I think this means there should be a fix in an update. (Unfortunately, I cannot analyze or fix it myself since I am not a Go developer.) What do you think?


In general, when you log in successfully, Cells gives you an access token (whose validity is limited, e.g. to 15 minutes) and a refresh token. Before the timeout, Cells uses the refresh token to renew the access token and extend your web session. If your laptop goes into sleep mode, the access token expires without being renewed, so you get a 403 error.
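That failure mode can be sketched with a toy simulation (the class, method names, and the 15-minute lifetime are illustrative, not Cells internals):

```python
# Toy model of access-token renewal in a web session.
# All names and numbers are illustrative, not the actual Cells code.

ACCESS_TOKEN_LIFETIME = 15 * 60  # seconds (example value)

class Session:
    def __init__(self, now=0):
        self.expires_at = now + ACCESS_TOKEN_LIFETIME

    def tick(self, now):
        """Called periodically by the client while the page is awake:
        renews the access token (via the refresh token) before expiry."""
        if now < self.expires_at:
            self.expires_at = now + ACCESS_TOKEN_LIFETIME  # renewed
            return 200
        return 403  # token already expired: renewal never happened

s = Session(now=0)
print(s.tick(now=10 * 60))       # awake, renewed in time -> 200
print(s.tick(now=70 * 60))       # laptop slept for an hour -> 403
```

While the laptop sleeps, `tick` never runs, so the first request after waking finds the token expired.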

I think you can't upload the big file because of the web session timeout.
For big file uploads, it's recommended to use the Cells client tool (cec).
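For reference, a typical upload with the Cells client looks like this (the workspace name and file name are placeholders; check `cec --help` on your version for the exact commands and flags):

```shell
# One-time setup: store the server address and credentials locally.
cec configure

# Upload a large local file into a workspace; cec handles the
# multipart upload itself, independent of the browser session.
cec scp ./big-file.bin cec://personal-files/
```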

We highly recommend setting the multipart upload parameters to optimized values; this can reduce the CPU load:
MPart threshold: 200MB
MPart size: 50MB (or higher if you have very big files; the bigger the chunk size, the lower the CPU usage)

When you increase the chunk size, please also increase the timeout for uploading a chunk.
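As a rough rule of thumb for sizing that timeout (the 100 Mbit/s effective throughput is just an assumed example; measure your own link):

```python
# Estimate the worst-case time to upload one chunk, to pick a safe
# per-chunk timeout. The effective bandwidth is an assumed example.

MIB = 1024 ** 2

def chunk_seconds(chunk_mib, mbps):
    """Seconds to move one chunk at the given effective Mbit/s."""
    return (chunk_mib * MIB * 8) / (mbps * 1_000_000)

t = chunk_seconds(50, 100)   # 50 MiB chunk at ~100 Mbit/s
print(round(t, 1))           # ~4.2 s on the wire
print(round(t * 3))          # with a 3x safety margin: ~13 s timeout
```

A generous margin matters because the observed per-chunk time already degrades (from under 2 s to over 15 s) as the upload progresses.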

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.