Cells Server closed the stream without sending trailers

Hello all,

Just started using Cells on my remote server. I am now attempting to set up automatic file sync. The sync client establishes a successful connection to Pydio Cells, but when I try to access the files to set up a sync task, it reports: "server closed the stream without sending trailers".

I am running behind NGINX reverse proxy and have followed the (rather vague) instructions here. NGINX has been restarted since adding the config and it looks like so:

server {
  listen 33060 ssl http2;
  listen [::]:33060 ssl http2;
  ssl_certificate     /etc/letsencrypt/live/pydio.example.com/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/pydio.example.com/privkey.pem;
  ssl_protocols       TLSv1 TLSv1.1 TLSv1.2;
  ssl_ciphers         HIGH:!aNULL:!MD5;
  keepalive_timeout 600s;

  location / {
    grpc_pass grpcs://localhost:33060;
  }

  error_log /var/log/nginx/proxy-grpc-error.log;
  access_log /var/log/nginx/proxy-grpc-access.log;
}

(Obviously replacing “example.com” with my own site.)

Is there an additional step I need to perform in Cells Sync to connect correctly? Is there a port I have to input somewhere in Cells Sync?

Hey folks,

After fighting with it all day, I have figured out a fix. I am documenting it here for reference.

The instructions found here are unfortunately rather vague and leave a lot unexplained. With the most recent versions of Pydio Cells, the additional server block provided in the “Cells Sync” section of that guide is not necessary when using SSL/HTTPS.

If you already have an NGINX reverse proxy configured, all you need to do to allow Cells Sync through is add this line to the root location block: grpc_pass grpcs://[binding ip]:[binding port]

In my case, this resulted in my SSL server block looking like this:

server {
    server_name pydio.example.com;
    client_max_body_size 200M;

    location / {
        proxy_pass https://localhost:8008;
        grpc_pass grpcs://localhost:8008; # This is the important one! Replace the port as necessary to match your proxy_pass.
    }

    location /ws/ {
        proxy_pass https://localhost:8008;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
    }

    error_log /var/log/nginx/cells-proxy-error.log;
    access_log /var/log/nginx/cells-proxy-access.log;

    listen [::]:443 ssl;
    listen 443 ssl http2;
    # Configure your SSL certificates however necessary, in my case via Let's Encrypt.
    ssl_certificate /etc/letsencrypt/live/pydio.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/pydio.example.com/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}
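For anyone following along: after editing the config, it is worth validating the syntax and reloading NGINX before testing from Cells Sync. A minimal sketch, assuming a standard systemd-based install:

```shell
# Check the configuration syntax without applying it
sudo nginx -t

# Reload NGINX so the new grpc_pass directive takes effect
sudo systemctl reload nginx
```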


The documentation on NGINX reverse proxy + Cells Sync could really use some work. It threw me off-track, leading me to believe I needed a separate server block, when Pydio Cells handles it just fine in the same block as your proxy.

As a closing note: this allowed me to connect Cells Sync using just my front-facing domain name, with no additional port specification needed on the client side, which kept setup simple.

Cheers, all!

Hello @Marc_Seamus,

Thank you for your contribution. Indeed, the NGINX part might be a bit misleading; I will see if I can update the documentation using the same configuration layout as yours.

Hello @zayn

Thanks! I think that would help a lot in the future.

I discovered another issue and came across the fix myself. I am running Ubuntu 20.04 and was getting a "grpc stream terminated by rst_stream internal error" when attempting to begin a sync task.

As it turns out, the default Ubuntu packages don't provide the most recent version of NGINX (1.18.x, while the current mainline is 1.19.10). I found that following this guide for setting up the NGINX packages and repositories on Ubuntu was sufficient to update to 1.19.10. After updating, the error stopped entirely, and my Cells Server and Cells Sync are now working flawlessly with the aforementioned config!
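For reference, the upgrade essentially boils down to adding the official nginx.org repository. A sketch for Ubuntu 20.04 ("focal"); the key path and repo line are assumptions on my part, so verify them against the instructions on nginx.org before running:

```shell
# Fetch the nginx.org signing key so apt trusts the packages
curl -fsSL https://nginx.org/keys/nginx_signing.key | sudo tee /etc/apt/trusted.gpg.d/nginx.asc

# Add the mainline repository for Ubuntu 20.04 (focal)
echo "deb https://nginx.org/packages/mainline/ubuntu focal nginx" | sudo tee /etc/apt/sources.list.d/nginx.list

# Update the package lists and install the mainline build
sudo apt update && sudo apt install nginx
```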

Just a clarification that might be important to note.


This topic was automatically closed 35 days after the last reply. New replies are no longer allowed.