We have a Pydio 8.2.2 instance running on CentOS 7.6, and have an indexing issue.
When we open a certain file repository containing a large number of files and folders, the connection frequently times out before the listing completes (or it gets so stuck that it never completes). In other cases it shows only the first 1,000 files or so and nothing more, even though the repository holds well over 1,000 files/folders.
I think the indexing is done by the clients, not by the server itself…? Or is it done by the server, but because the repositories change via a different path (not through Pydio), it keeps restarting the indexing?
I believe indexing and searching do not function properly when ‘odd’ characters such as & ^ % $ @ [ ] are used in file names. Or should that not make a difference…?
We use the Pydio system as a remote File Management tool, not as cloud storage for users.
We use it to move files around from one repository to another. The repositories are, for example, our FTP server and an archive server.
Files delivered to the FTP server are moved from the FTP repository to the archive repository via a Pydio command. This sometimes involves files and folders totalling over 500 GB at a time.
We have already tweaked the php.ini file so that larger requests are accepted, and the execution time is set to a full day.
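For reference, our php.ini changes look roughly like this (the size values are our own illustrative choices, not Pydio recommendations):

```ini
; Allow long-running operations to finish (one full day, in seconds)
max_execution_time = 86400

; Illustrative values for larger requests; actual sizes depend on the files being moved
post_max_size = 2G
upload_max_filesize = 2G
memory_limit = 1G
```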
Are there any more tweaks we can apply to make the Pydio system operate faster and complete the indexing of the repositories?
Sorry for packing so many questions into one topic, but I think they are all related.