Scheduler task not writing to S3 datasource - "Check your key and signing method."


I am developing a pydio cells v3 environment on an EC2 instance and using S3 for storage.
I am coming across an issue with the cells flows scheduler writing to files in the S3 storage.

The environment was installed using these instructions: and S3 was configured during installation using the browser-based installer.

The FQDN is configured using A records, and the following is the /cells configure sites output:

| # | --------- BIND(S) --------- | ---- TLS ---- | EXTERNAL URL |
| 0 | | Lets Encrypt | |

The Issue
Everything is running very smoothly and, for the most part, I have been able to solve anything that comes up, but this one has me stumped: Cells Flows isn't writing to workspaces connected to an S3 bucket when triggered by an event.

Example job:

```json
{
  "ID": "d0990ed3-fd9f-4b3c-b6c0-d3b0960cbf26",
  "Label": "Test Writing To S3 on Path Change",
  "Owner": "pydio.system.user",
  "Custom": true,
  "EventNames": [ ... ],
  "Actions": [
    {
      "ID": "actions.tree.put",
      "Label": "Write to Common",
      "Parameters": {
        "contents": "I have successfully written to S3.",
        "is_local": "false",
        "target_file": "pydiods1/TEST_FILE.txt"
      }
    }
  ]
}
```
This works when triggered manually, but when triggered by an event we get:

logs/tasks.log:{"level":"error","ts":"2022-07-12T12:19:11Z","logger":"pydio.grpc.tasks","msg":"Error while running action actions.tree.put","LogType":"tasks","SpanRootUuid":"d4254e3f-01dc-11ed-9f71-028cd8c94434","SpanParentUuid":"d4254e3f-01dc-11ed-9f71-028cd8c94434","SpanUuid":"d4a105e6-01dc-11ed-9b11-028cd8c94434","OperationUuid":"d0990ed3-fd9f-4b3c-b6c0-d3b0960cbf26-c036ace8","SchedulerJobUuid":"d0990ed3-fd9f-4b3c-b6c0-d3b0960cbf26","SchedulerTaskUuid":"c036ace8-fab3-4030-a6e9-41d763b9fd4d","SchedulerTaskActionPath":"ROOT/actions.tree.put$2","error":"The request signature we calculated does not match the signature you provided. Check your key and signing method."}
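For context, this error means the Signature Version 4 signature computed by the client did not match the one S3 computed server-side. The signing key is derived from the secret key, the request date, the region, and the service, so a mismatch in any one of those (stale credentials, wrong region, a mangled secret) produces exactly this message. A minimal sketch of the documented derivation, stdlib only:

```python
import hashlib
import hmac


def sigv4_signing_key(secret_key: str, date: str, region: str, service: str = "s3") -> bytes:
    """AWS Signature Version 4 key derivation (the documented algorithm):
    each stage HMAC-SHA256s the next scope component into the key."""
    k_date = hmac.new(("AWS4" + secret_key).encode(), date.encode(), hashlib.sha256).digest()
    k_region = hmac.new(k_date, region.encode(), hashlib.sha256).digest()
    k_service = hmac.new(k_region, service.encode(), hashlib.sha256).digest()
    return hmac.new(k_service, b"aws4_request", hashlib.sha256).digest()


# Any difference in secret, date, or region yields a different signing key,
# hence "signature we calculated does not match" on the server side.
assert sigv4_signing_key("secret", "20220712", "us-east-1") != \
       sigv4_signing_key("secret", "20220712", "eu-west-1")
```

This is only an illustration of why the server rejects the request; the actual signing is done inside Cells' S3 client.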

This looks like an S3 error, so I reconfigured the S3 storage and also reassigned the SSL certificate. We have also updated to the latest version, 3.0.9.

Any assistance greatly appreciated,


Hello @cameron_penwern we will re-test that and get back to you asap.

Mmmm, I cannot reproduce that.
I tried with a standard S3 datasource, triggering a Write To File action on a "Node Path Changed" event, and it works. I tried with different users.
In your case, is the setup fully S3 (is pydiods1 in S3)?

Thank you for getting back to me @charles.

Yes, the setup is fully S3, including pydiods1.
I have tested with a datasource on the local file system, and that works.
I have created new security credentials in AWS and updated the datasources, but no success.
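As an aside, one known cause of this exact error with freshly created AWS credentials is a secret key containing characters such as "/" or "+" that get mangled when pasted into a config file or URL-encoded twice. A quick, purely illustrative check (the character set and function name here are assumptions, not Cells code):

```python
# Flag characters in an AWS secret key that are commonly mangled by
# copy/paste or double URL-encoding in configuration files.
SUSPECT = set("/+%=")


def suspect_chars(secret: str) -> list:
    """Return the suspect characters present, sorted for stable output."""
    return sorted(SUSPECT & set(secret))


print(suspect_chars("abc/def+ghi"))  # prints ['+', '/']
```

If the stored secret contains any of these, regenerating a key pair until the secret is free of them (or verifying it survived the paste intact) is a cheap way to rule this out.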