Okay, so you don’t have SSH access to your server, but you do have FTP access to it, and you need *all* the files in a folder.
Worse, your directory structure is many levels deep and extremely messy. Plain command-line FTP won't cut it, because it can't download folders recursively. Even turning off interactive mode and using mget doesn't work.
The easiest solution? Use wget to recursively download folders using FTP. Here’s the wget command:
wget -r ftp://username:password@yourftphostname.com/directory1/directory2/
You can replace yourftphostname.com with an IP address, too. So that's all you need: wget with the -r flag (for recursive) to download folders recursively over FTP. Took me a while to figure that out, but what a relief when I finally did 😀
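If you'd rather not embed the password in the URL (where it can end up in your shell history), wget can also take the credentials as separate options. This is just a sketch; the host, user, password, and paths below are placeholders:

```shell
# Same recursive download, with credentials passed as options instead of in the URL.
# -r  : recurse into subdirectories
# -np : don't ascend to the parent directory
wget -r -np --ftp-user=username --ftp-password='s3cret' \
    ftp://yourftphostname.com/directory1/directory2/
```

The -np (--no-parent) flag is worth adding whenever you start from a subdirectory, so wget doesn't wander up and mirror more than you asked for.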
And if you happen to get disconnected partway through, don't fret, because wget also has a -c flag (short for --continue), which resumes partially downloaded files. Very handy when you are transferring large files!

wget -r -c ftp://username:password@yourftphostname.com/directory1/directory2/
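If wget isn't installed on the machine you're working from, the same recursive walk can be sketched with Python's standard-library ftplib. This is a minimal sketch under assumptions, not the definitive approach: `download_tree`, the host, and the credentials are all hypothetical, and it assumes the server accepts `cwd("..")` and that `nlst()` returns plain names.

```python
import os
from ftplib import FTP, error_perm

def download_tree(ftp, remote_dir, local_dir):
    """Recursively download remote_dir into local_dir over an open FTP session."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        if name in (".", ".."):  # some servers list these; skip them
            continue
        try:
            # cwd succeeds only for directories; for plain files it raises error_perm.
            ftp.cwd(name)
            ftp.cwd("..")
            download_tree(ftp, name, os.path.join(local_dir, name))
        except error_perm:
            # Plain file: fetch it in binary mode.
            with open(os.path.join(local_dir, name), "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)
    ftp.cwd("..")  # restore the caller's working directory

# Hypothetical usage (host and credentials are placeholders):
# with FTP("yourftphostname.com") as ftp:
#     ftp.login("username", "password")
#     download_tree(ftp, "directory1/directory2", "directory2")
```

Probing with `cwd` is a crude but portable way to tell files from directories, since plain `nlst()` output doesn't distinguish them on every server.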
(Of course, you really shouldn't be running plain old insecure FTP when SFTP is available…)