The semi-solution is to use find piped to cut as the argument to rsync's --files-from option.

Note: You do have to limit the number of files that rsync is syncing. It will always sort the files-from list by name, even if you provide it sorted by date. However, you could programmatically do batches of 100 or 1000 files (see the sketch after the main command below).

bash # not fish, zsh, ksh, etc

rsync -avPhHz \
  --files-from=<(ssh dropsha.re 'find /srv/webapps/vhosts/dropsha.re/files -type f -size +100M -print | cut -d/ -f7-100') \
  dropsha.re:/srv/webapps/vhosts/dropsha.re/files/ \
  /Volumes/CoolAJ86\ 5TB/dropsha.re/files/
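
If you do go the batching route, here is a rough, untested sketch of one way to do it (the /tmp file names are just placeholders): dump the remote file list once, split it into chunks of 1000 with split -l, and run one rsync per chunk.

# dump the remote list once, relative to the files/ directory
ssh dropsha.re 'find /srv/webapps/vhosts/dropsha.re/files -type f -print' \
  | cut -d/ -f7-100 > /tmp/dropshare-files.txt

# break the list into 1000-line chunks: /tmp/dropshare-batch-aa, -ab, ...
split -l 1000 /tmp/dropshare-files.txt /tmp/dropshare-batch-

# one rsync run per chunk
for batch in /tmp/dropshare-batch-*; do
  rsync -avPhHz \
    --files-from="${batch}" \
    dropsha.re:/srv/webapps/vhosts/dropsha.re/files/ \
    /Volumes/CoolAJ86\ 5TB/dropsha.re/files/
done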

Larger than 100M

find /srv/webapps/vhosts/dropsha.re/files -type f -size +100M -print | cut -d/ -f7-100

Newer than one week

For date, use -mtime -7 (modified less than 7 days ago).
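
For example, adapting the find above to feed rsync only the last week's files (same path prefix assumed):

find /srv/webapps/vhosts/dropsha.re/files -type f -mtime -7 -print | cut -d/ -f7-100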

By Date

find ./ -printf "%T+\t|%p\n" | sort | cut -d"|" -f2-100      # oldest first

find ./ -printf "%T+\t|%p\n" | sort -r | cut -d"|" -f2-100   # newest first

By Size

find ./ -printf "%s bytes|%p\n" | sort -n | cut -d"|" -f2-100    # smallest first

find ./ -printf "%s bytes|%p\n" | sort -nr | cut -d"|" -f2-100   # largest first

# Note that the cut -d/ -f7-100 must match the depth of your path prefix:
# the first field to keep is the number of components in the prefix plus two
# (the leading / makes field 1 empty).
# If your path were /srv/webapps/example.com/ you would use cut -d/ -f5-100

rsync -avPhHz \
  --files-from=<(ssh dropsha.re 'find /srv/webapps/vhosts/dropsha.re/files -type f -printf "%s bytes|%p\n" | sort -n | cut -d"|" -f2-100 | cut -d/ -f7-100') \
  dropsha.re:/srv/webapps/vhosts/dropsha.re/files/ \
  /Volumes/CoolAJ86\ 5TB/dropsha.re/files/
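
To sanity-check the field number, you can run the cut against a sample path first (the path below is just an example):

echo /srv/webapps/example.com/img/photo.jpg | cut -d/ -f5-100    # prints img/photo.jpg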

Source: http://superuser.com/questions/297342/rsync-files-newer-than-1-week

How to delete the 15 largest files in a directory

bash # not fish, zsh, ksh, etc

ls -lahSr ./    # preview: sorted by size, largest files listed last

# -S sorts largest first, -r reverses that, so tail grabs the 15 largest
# grep '' passes everything through; swap in a pattern to only match certain files
ls -Sr ./ \
  | grep '' \
  | tail -n 15 \
  | while IFS= read -r F; do
      echo "${F}"
      rm "./${F}"
    done
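
ls -S only looks at the top level of the directory, so if the files live in subdirectories, a similar pipeline built on the find-by-size listing above should also work. A sketch (assuming GNU find, like the earlier examples; keep the echo until you have verified the list):

find ./ -type f -printf "%s|%p\n" \
  | sort -n \
  | tail -n 15 \
  | cut -d"|" -f2-100 \
  | while IFS= read -r F; do
      echo "${F}"
      rm "${F}"
    done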

By AJ ONeal
