Another (empty) directory that cannot be deleted (using Cells 2.0.5):
pydio.log:
{"level":"info","ts":"2020-04-17T18:11:50Z","logger":"pydio.rest.tree","msg":"Deletion: moving [common-files/Foo Bar/baz] to recycle bin","SpanUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","RemoteAddress":"x.x.x.x","UserAgent":"cells-client/2.0.2","ContentType":"application/json","HttpProtocol":"HTTP/1.1","UserName":"admin","UserUuid":"xxxx","GroupPath":"/","Profile":"admin","Roles":"ROOT_GROUP,ADMINS,ADMINS,894e22cd-5b92-4832-bf08-c3fc60ad38e7","RecycleRoot":{"Uuid":"DATASOURCE:ovh","Path":"ovh/","MTime":"2020-02-25T22:23:36Z","MetaStore":{"name":"\"\"","pydio:meta-data-source-path":"\"\""}}}
{"level":"info","ts":"2020-04-17T18:11:51Z","logger":"pydio.rest.tree","msg":"Recycle bin created before launching move task","SpanUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","RemoteAddress":"x.x.x.x","UserAgent":"cells-client/2.0.2","ContentType":"application/json","HttpProtocol":"HTTP/1.1","UserName":"admin","UserUuid":"xxxx","GroupPath":"/","Profile":"admin","Roles":"ROOT_GROUP,ADMINS,ADMINS,894e22cd-5b92-4832-bf08-c3fc60ad38e7","NodePath":"ovh/recycle_bin"}
{"level":"info","ts":"2020-04-17T18:11:51Z","logger":"pydio.grpc.tasks","msg":"Run Job copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86 on timer event <nil>","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e98a8ceb-80d6-11ea-ab49-fa163ee72a12"}
{"level":"info","ts":"2020-04-17T18:11:51Z","logger":"pydio.grpc.tasks","msg":"Setting Lock on Node with session 404aada5-9dc9-4097-b5a5-47d9e7cb1552","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e98a8ceb-80d6-11ea-ab49-fa163ee72a12","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa"}
{"level":"error","ts":"2020-04-17T18:11:51Z","logger":"pydio.grpc.acl","msg":"Filtered Error","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e996e3b3-80d6-11ea-94e9-fa163ee72a12","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","UserName":"admin","UserUuid":"xxxx","GroupPath":"/","Profile":"admin","Roles":"ROOT_GROUP,ADMINS,ADMINS,894e22cd-5b92-4832-bf08-c3fc60ad38e7","error":"Error 1062: Duplicate entry '23-lock--1--1' for key 'acls_u1'"}
{"level":"warn","ts":"2020-04-17T18:11:51Z","logger":"pydio.grpc.tasks","msg":"Could not init lockSession","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e98a8ceb-80d6-11ea-ab49-fa163ee72a12","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","error":"{\"id\":\"dao.error\",\"code\":500,\"detail\":\"DAO error received\",\"status\":\"Internal Server Error\"}"}
{"level":"info","ts":"2020-04-17T18:11:52Z","logger":"pydio.grpc.data.sync.ovh","msg":"Filtering TreePatch took","time":0.00000586,"source":"s3://y.y.y.y:9001/pydio","target":"index://ovh"}
tasks.log:
{"level":"info","ts":"2020-04-17T18:11:53Z","logger":"pydio.grpc.tasks","msg":"There are 2 children to move","LogType":"tasks","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e98a8ceb-80d6-11ea-ab49-fa163ee72a12","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa"}
{"level":"error","ts":"2020-04-17T18:13:38Z","logger":"pydio.grpc.tasks","msg":"Error while running action actions.tree.copymove","LogType":"tasks","SpanRootUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanParentUuid":"e88adb63-80d6-11ea-94e9-fa163ee72a12","SpanUuid":"e98a8ceb-80d6-11ea-ab49-fa163ee72a12","OperationUuid":"copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86-b38fcdaa","error":"We encountered an internal error, please try again."}
{"level":"info","ts":"2020-04-17T18:15:23Z","logger":"pydio.grpc.tasks","msg":"Deleted 0 expired revoked tokens","LogType":"tasks","SpanUuid":"67b9eed2-80d7-11ea-ab49-fa163ee72a12","OperationUuid":"actions.auth.prune.tokens-d39020c1"}
{"level":"info","ts":"2020-04-17T18:15:23Z","logger":"pydio.grpc.tasks","msg":"Auth.PruneJob.ResetToken","LogType":"tasks","SpanUuid":"67b9eed2-80d7-11ea-ab49-fa163ee72a12","OperationUuid":"actions.auth.prune.tokens-d39020c1"}
{"level":"info","ts":"2020-04-17T18:15:23Z","logger":"pydio.grpc.mailer","msg":"Successfully sent 0 messages","LogType":"tasks","SpanUuid":"67ce7319-80d7-11ea-94e9-fa163ee72a12","OperationUuid":"flush-mailer-queue-435be272"}
and cells-client says:
2020/04/17 15:13:39 could not monitor job, copy-move-37d2ec47-6cc8-44ff-92d8-924b98631f86
Nodes have been moved to the Recycle Bin
…but the directory is still there. Is there a way to simply "force-delete" it and get back to a clean state? Looking directly at the ACL table:
> select * from idm_acls WHERE node_id != '-1';
+-----+----------------+--------------------------------------+---------+---------+--------------+---------------------+---------------------+
| id  | action_name    | action_value                         | role_id | node_id | workspace_id | created_at          | expires_at          |
+-----+----------------+--------------------------------------+---------+---------+--------------+---------------------+---------------------+
|   1 | read           | 1                                    |       1 |       1 |            1 | 2020-03-30 17:17:33 | NULL                |
|   2 | write          | 1                                    |       1 |       1 |            1 | 2020-03-30 17:17:33 | NULL                |
|   4 | read           | 1                                    |       4 |       3 |            4 | 2020-03-30 17:17:33 | NULL                |
|   5 | write          | 1                                    |       4 |       3 |            4 | 2020-03-30 17:17:33 | NULL                |
|   6 | deny           | 1                                    |       6 |       1 |            1 | 2020-03-30 17:17:33 | NULL                |
|  24 | recycle_root   | 1                                    |      -1 |      14 |           -1 | 2020-03-30 17:17:33 | NULL                |
|  25 | workspace-path | ovh                                  |      -1 |      15 |           20 | 2020-03-30 17:17:33 | NULL                |
|  26 | recycle_root   | 1                                    |      -1 |      15 |           20 | 2020-03-30 17:17:33 | NULL                |
|  27 | read           | 1                                    |       1 |      15 |           20 | 2020-03-30 17:17:33 | NULL                |
|  28 | write          | 1                                    |       1 |      15 |           20 | 2020-03-30 17:17:33 | NULL                |
|  47 | lock           | 8131a6e8-fd8a-4750-8eb4-23554dca6121 |      -1 |      22 |           -1 | 2020-03-30 17:31:16 | 2020-03-30 17:31:18 |
|  48 | lock           | af721d20-f548-4298-94b1-527564013b0e |      -1 |      23 |           -1 | 2020-03-30 17:31:51 | 2020-03-30 17:31:53 |
|  86 | read           | 1                                    |      71 |      15 |           20 | 2020-04-14 17:10:37 | NULL                |
| 101 | read           | 1                                    |      86 |      15 |           20 | 2020-04-14 17:12:42 | NULL                |
| 102 | write          | 1                                    |      86 |      15 |           20 | 2020-04-14 17:12:42 | NULL                |
+-----+----------------+--------------------------------------+---------+---------+--------------+---------------------+---------------------+
So there seems to be at least one bug here: expired lock ACLs are never removed (see ids 47 and 48 above, both expired back on 2020-03-30), so the next attempt to set a lock on the same node hits the MySQL duplicate-key error (Error 1062 on key acls_u1) and the move to the recycle bin fails.
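For what it's worth, a possible manual cleanup, assuming those stale lock rows in idm_acls really are what blocks the move: delete only the lock ACLs whose expiry is already in the past. This is a sketch, not an official procedure; back up the database first.

-- remove only lock ACLs that have already expired
DELETE FROM idm_acls
 WHERE action_name = 'lock'
   AND expires_at IS NOT NULL
   AND expires_at < NOW();

If the duplicate-key error in pydio.grpc.acl was indeed the only thing preventing the lockSession from being created, retrying the deletion afterwards should go through.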