What is the fastest way to do a recursive file glob on a remote server?
Keep in mind that I want to do some filtering based on the directory
name, file properties, and file extensions.
(I want to copy these files; I assume that once I have the list,
fetching them with Net::FTP will be trivial.)
I am concerned that just using ftp.chdir and ftp.list will be slow.
Perhaps there is a faster way using Net::SSH. I had an idea to try to
run a Ruby program on the server (Ruby is installed on the remote
server):
cmd = "ruby -e '#{ File.read( ruby_glob_program ) }'"
Net::SSH.start(SERVER, :username => u, :password => p) do |session|
  input, output, error = session.process.popen3( cmd )
end
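A sketch of what that ruby_glob_program could look like (the skip-list, extensions, and root path here are illustrative, not from the original post). It walks the tree once, prunes directories by name, and filters files by extension, printing one path per line so the SSH caller can collect them from stdout:

```ruby
require 'find'

# One possible shape for ruby_glob_program (filter values illustrative):
# walk root, prune unwanted directory names, keep matching files.
def matching_files(root, extensions: %w[.log .txt], skip_dirs: %w[tmp cache])
  results = []
  Find.find(root) do |path|
    if File.directory?(path)
      # Skip whole subtrees by directory name without descending into them
      Find.prune if path != root && skip_dirs.include?(File.basename(path))
    elsif extensions.include?(File.extname(path))
      results << path   # file-property filters (size, mtime) would go here
    end
  end
  results
end

# On the server, print one path per line for the caller to read:
# puts matching_files('/var/app')
```

Because the whole walk happens server-side in a single process, this avoids the per-directory round trips that make ftp.chdir/ftp.list slow.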
If you're talking about a Linux or other Unix-like server, indexing a
filesystem is not something you want to do often. It is slow, even
with native utilities like "find" or its cousin "slocate". What most
sites do is run a cron job in the off hours to rebuild the slocate
database ("slocate -u"), and then do searches against that database.
If you need more current indexing, you probably need to look at the
application and have it do its own index maintenance.
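In Ruby terms, once a nightly cron run has rebuilt the database, the search side reduces to shelling out to locate and filtering the result in Ruby. A minimal sketch, assuming the prefix and extension filters below (they are illustrative, not from the thread):

```ruby
# Filter a list of paths (e.g. the output of `locate`) down to the ones
# we care about; the directory prefix and extensions are illustrative.
def filter_paths(paths, prefix: '/var/app/', extensions: %w[.log])
  paths.select do |p|
    p.start_with?(prefix) && extensions.include?(File.extname(p))
  end
end

# On the server, after the off-hours "slocate -u" cron run:
#   puts filter_paths(`locate .log`.split("\n"))
```

The database lookup is fast precisely because the expensive walk was paid once by the cron job, which is the trade-off the paragraph above describes.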