After fixing "Request signature we calculated does not match (bis) [chunked-upload]", and now running 3.0.5, I can observe the following problems with chunked upload (behind Cloudflare) to an unstructured S3 object storage backend.
./cells configure sites
gives:
+---+--------------------------+--------------+----------------------+
| # | BIND(S)                  | TLS          | EXTERNAL URL         |
+---+--------------------------+--------------+----------------------+
| 0 | https://mydomain.org:443 | Lets Encrypt | https://mydomain.org |
+---+--------------------------+--------------+----------------------+
The first one is, I think, only a notice, but I can see it dozens of times during a single upload (a 100 MB file split into 5 MB chunks):
2022-02-28T19:13:21.777Z DEBUG http: TLS handshake error from 172.69.102.71:54547: EOF
The IP address changes often; here is an extract:
141.101.109.13
162.158.113.26
162.158.117.186
162.158.196.20
162.158.224.82
162.158.248.73
172.68.206.73
172.69.102.71
172.69.193.35
172.69.198.69
172.70.101.71
172.70.103.69
172.70.121.8
172.70.84.49
They all belong to Cloudflare. My interpretation is that, for some unknown reason, there are a lot of requests (initiated by Cells? going to Cloudflare, or to itself through CF?) that result in an error.
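These addresses can be checked against the IPv4 ranges Cloudflare publishes at https://www.cloudflare.com/ips-v4. A minimal sketch in Go (the hard-coded list is just a few of the addresses from the logs above):

```go
package main

import (
	"bufio"
	"fmt"
	"net"
	"net/http"
)

func main() {
	// A few of the source IPs from the "TLS handshake error" log lines.
	ips := []string{"141.101.109.13", "162.158.113.26", "172.69.102.71", "172.70.121.8"}

	// Cloudflare publishes its IPv4 ranges, one CIDR per line, at this URL.
	resp, err := http.Get("https://www.cloudflare.com/ips-v4")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var ranges []*net.IPNet
	sc := bufio.NewScanner(resp.Body)
	for sc.Scan() {
		if _, n, err := net.ParseCIDR(sc.Text()); err == nil {
			ranges = append(ranges, n)
		}
	}

	// Report whether each logged IP falls inside a Cloudflare range.
	for _, s := range ips {
		ip := net.ParseIP(s)
		inCF := false
		for _, n := range ranges {
			if n.Contains(ip) {
				inCF = true
				break
			}
		}
		fmt.Printf("%-16s cloudflare=%v\n", s, inCF)
	}
}
```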
This is clearly not desirable, and it is worth noting that these errors are logged at DEBUG level (instead of WARNING).
I'm already running at debug log level; how can I investigate this further?
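One data point: this DEBUG line can apparently be reproduced by opening a TCP connection to the TLS port and closing it before sending a ClientHello, so my working hypothesis is that Cloudflare opens (or pre-warms) connections that it then drops before completing the handshake. A minimal sketch, assuming the site from ./cells configure sites above:

```go
package main

import "net"

func main() {
	// Open a raw TCP connection to the TLS port and close it without
	// starting the TLS handshake: the Go HTTP server on the other end
	// then logs "http: TLS handshake error from <addr>: EOF".
	conn, err := net.Dial("tcp", "mydomain.org:443")
	if err != nil {
		panic(err)
	}
	conn.Close()
}
```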
The other kind of error (which is, again, arguably mislabeled as DEBUG instead of WARNING/ERROR) is:
2022-02-28T19:09:59.422Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "token_expired", "errorVerbose": "token_expired\ngithub.com/pydio/cells/vendor/github.com/ory/fosite.(*Fosite).IntrospectToken\n\tgithub.com/pydio/cells/vendor/github.com/ory/fosite/introspect.go:69\ngithub.com/pydio/cells/common/auth.(*oryprovider).Verify\n\tgithub.com/pydio/cells/common/auth/jwt_ory.go:381\ngithub.com/pydio/cells/common/auth.(*JWTVerifier).verifyTokenWithRetry\n\tgithub.com/pydio/cells/common/auth/jwt.go:245\ngithub.com/pydio/cells/common/auth.(*JWTVerifier).Verify\n\tgithub.com/pydio/cells/common/auth/jwt.go:293\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd.pydioAuthHandler.ServeHTTP\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/gateway-pydio-auth-handler.go:83\ngithub.com/pydio/cells/common/service/context.HttpSpanHandlerWrapper.func1\n\tgithub.com/pydio/cells/common/service/context/span.go:198\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2042\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd.criticalErrorHandler.ServeHTTP\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/generic-handlers.go:775\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/http.(*Server).Start.func1\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/http/server.go:115\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2042\nnet/http.serverHandler.ServeHTTP\n\tnet/http/server.go:2843\nnet/http.(*conn).serve\n\tnet/http/server.go:1925\nruntime.goexit\n\truntime/asm_amd64.s:1374"}
2022-02-28T19:09:59.426Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "{\"id\":\"\",\"code\":0,\"detail\":\"token_expired\",\"status\":\"\"}"}
2022-02-28T19:09:59.427Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "{\"id\":\"token.invalid\",\"code\":401,\"detail\":\"Cannot validate token\",\"status\":\"Unauthorized\"}"}
2022-02-28T19:09:59.428Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "token_expired", "errorVerbose": "token_expired\ngithub.com/pydio/cells/vendor/github.com/ory/fosite.(*Fosite).IntrospectToken\n\tgithub.com/pydio/cells/vendor/github.com/ory/fosite/introspect.go:69\ngithub.com/pydio/cells/common/auth.(*oryprovider).Verify\n\tgithub.com/pydio/cells/common/auth/jwt_ory.go:381\ngithub.com/pydio/cells/common/auth.(*JWTVerifier).verifyTokenWithRetry\n\tgithub.com/pydio/cells/common/auth/jwt.go:245\ngithub.com/pydio/cells/common/auth.(*JWTVerifier).verifyTokenWithRetry\n\tgithub.com/pydio/cells/common/auth/jwt.go:254\ngithub.com/pydio/cells/common/auth.(*JWTVerifier).Verify\n\tgithub.com/pydio/cells/common/auth/jwt.go:293\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd.pydioAuthHandler.ServeHTTP\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/gateway-pydio-auth-handler.go:83\ngithub.com/pydio/cells/common/service/context.HttpSpanHandlerWrapper.func1\n\tgithub.com/pydio/cells/common/service/context/span.go:198\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2042\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd.criticalErrorHandler.ServeHTTP\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/generic-handlers.go:775\ngithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/http.(*Server).Start.func1\n\tgithub.com/pydio/cells/vendor/github.com/pydio/minio-srv/cmd/http/server.go:115\nnet/http.HandlerFunc.ServeHTTP\n\tnet/http/server.go:2042\nnet/http.serverHandler.ServeHTTP\n\tnet/http/server.go:2843\nnet/http.(*conn).serve\n\tnet/http/server.go:1925\nruntime.goexit\n\truntime/asm_amd64.s:1374"}
2022-02-28T19:09:59.430Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "{\"id\":\"\",\"code\":0,\"detail\":\"token_expired\",\"status\":\"\"}"}
2022-02-28T19:09:59.432Z DEBUG pydio.gateway.data jwt rawIdToken verify: failed {"error": "{\"id\":\"token.invalid\",\"code\":401,\"detail\":\"Cannot validate token\",\"status\":\"Unauthorized\"}"}
2022-02-28T19:09:59.432Z DEBUG pydio.gateway.data error verifying token {"token": "A7pZEOJIuknwJxUguVuHbsHI8L1JG80KdxDyCqkxBsM.XxvpHcy06UOsNG5BrrIsjYTlqKuEuIlFmclcJNbAISE", "error": "empty idToken"}
- About the "empty idToken": I read Users logged out immediately after login - #2 by bsinou, but it can't be a cache problem (the Firefox console was open during the upload, which rules out any caching issue), and the server time is correct (server = UTC+00, client = UTC-03).
- About the "token_expired": I can't tell.
But it seems clearly related to the random 403 (or Cloudflare 520) responses returned to the browser for these PUT /io/foobar.mp4?partNumber=4&uploadId=xxxx requests.
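To double-check the token_expired side, the exp claim of the id_token can be compared against the server clock. A minimal sketch (note it only works for JWT-format tokens with three dot-separated parts, not the opaque token shown in the last log line):

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"os"
	"strings"
	"time"
)

func main() {
	// Pass the raw id_token (e.g. copied from the browser's storage)
	// as the first command-line argument.
	raw := os.Args[1]
	parts := strings.Split(raw, ".")
	if len(parts) != 3 {
		fmt.Println("not a JWT (opaque token?)")
		return
	}

	// The payload is the second part, base64url-encoded without padding.
	payload, err := base64.RawURLEncoding.DecodeString(parts[1])
	if err != nil {
		panic(err)
	}
	var claims struct {
		Iat int64 `json:"iat"`
		Exp int64 `json:"exp"`
	}
	if err := json.Unmarshal(payload, &claims); err != nil {
		panic(err)
	}

	// Compare issue/expiry times with the current clock (all in UTC).
	fmt.Println("iat:", time.Unix(claims.Iat, 0).UTC())
	fmt.Println("exp:", time.Unix(claims.Exp, 0).UTC())
	fmt.Println("now:", time.Now().UTC())
}
```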
The uploader is able to overcome these issues (re-uploading chunks until they succeed), but this is still an instability problem I'd like to see resolved before I roll this setup into production.
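For completeness, this is roughly the client-side behavior I observe, reconstructed as a sketch (the URL shape is taken from the failing requests above, the uploadId placeholder stays as-is, and the linear backoff is my assumption):

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"time"
)

// Retry a part upload with a linear backoff until it succeeds,
// mirroring what the web uploader appears to do when a PUT comes
// back with a random 403 or Cloudflare 520.
func putPartWithRetry(url string, body []byte, maxAttempts int) error {
	for attempt := 1; attempt <= maxAttempts; attempt++ {
		req, err := http.NewRequest(http.MethodPut, url, bytes.NewReader(body))
		if err != nil {
			return err
		}
		resp, err := http.DefaultClient.Do(req)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil
			}
			// 403 / 520 observed intermittently behind Cloudflare.
			fmt.Printf("attempt %d: HTTP %d\n", attempt, resp.StatusCode)
		} else {
			fmt.Printf("attempt %d: %v\n", attempt, err)
		}
		time.Sleep(time.Duration(attempt) * time.Second)
	}
	return fmt.Errorf("part upload failed after %d attempts", maxAttempts)
}

func main() {
	// uploadId is elided in the logs above; "xxxx" stays a placeholder.
	url := "https://mydomain.org/io/foobar.mp4?partNumber=4&uploadId=xxxx"
	if err := putPartWithRetry(url, []byte("chunk bytes"), 5); err != nil {
		fmt.Println(err)
	}
}
```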
thx!