Automatic compression before downloading
Hey all,
I'm trying to implement automated website backups from FTP.
For now, my script successfully downloads files and puts them in my local dir:
option batch on
option confirm off
open session
option transfer binary
get /* "path\%TIMESTAMP#(dd.mm.yyyy)%\*"
synchronize local "local\path\%TIMESTAMP#(dd.mm.yyyy)%" /
close
But I also want to have a zipped file instead of a folder as the result.
I am not a programmer, so it's hard for me to figure out how to integrate the "Automatically compress files before download" snippet from this website into my script.
I've been trying to play around with it but, unsurprisingly, it doesn't work:
open session
call tar -czf /tmp/archive.tar.gz /
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#(dd.mm.yyyy)%\*"
exit
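From what I understand of the snippet, the idea is to run tar on the server via call and then download the resulting archive as a single file, so I guess the combined script should look roughly like the sketch below. I'm not sure this is right (the remote path and the archive file name are just my guesses), which is why I'm asking:

```
option batch on
option confirm off
open session
# build the archive on the server first (this assumes tar is available there)
call tar -czf /tmp/archive.tar.gz ./path
# then download that one file into the timestamped local folder
get /tmp/archive.tar.gz "local\path\%TIMESTAMP#(dd.mm.yyyy)%\archive.tar.gz"
# and remove the temporary archive from the server
call rm /tmp/archive.tar.gz
exit
```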
Could someone please point me in the right direction? Any help would be much appreciated.
Thanks!