How to Transfer a Folder Full of Files and Sub-Directories to an FTP Server

Today I had to figure out how to transfer 65GB of data (including dozens of files in separate subfolders) from a Linux server (where it was stored after being pulled from a FilesAnywhere account using a recursive wget command, "wget -r -l 0") to another storage account at Egnyte.

I first tried tarring up all the files and folders and transferring that as one file, but I couldn't get the support people at Egnyte to untar it for me. So I began writing a Perl script to do the task. After a while of reading about how to make multidimensional arrays in Perl for the file structure (technically this is not possible, but working with arrays of references in Perl is very similar to working with multidimensional arrays in PHP), I decided to look for a simpler solution.

What I found was a CPAN module that allows for recursive gets and puts (rget and rput, respectively). The details of the module are described in its CPAN documentation. After a bunch of trial and error I came up with the following script, which you can download here or copy from below.

All you need to do (once you have Perl and the Net::FTP::Recursive CPAN module installed; see the links below for help with that) is put your FTP credentials into the file, put the Perl script in the directory containing the files and folders you want to transfer (I actually put the Perl file in the folder above the one I wanted to transfer from so I wouldn't transfer the Perl file too), then execute it by typing 'perl ""' (or in my case, perl ../ "").

# Filename: - this script will recursively transfer files in a directory tree to an ftp server
# Inputs:
#	$host is an ip address or domain name of the ftp server to connect to
# Configurable parameters:
#	$user is the ftp username
#	$password is the ftp password
#	$remoteBaseDir is the remote directory to transfer files and folders into (defaults to the top-level directory)
# Example usages:
#	perl ""
#	perl ""
use strict;
use warnings;
use Net::FTP::Recursive;

my $numParams = scalar(@ARGV);
if ($numParams < 1) {
	print "Usage:\n";
	print 'perl "ftphost"'."\n";
	print "ftphost is an ip address or domain name of the ftp server to connect to\n";
	exit 1;
}

my $host = $ARGV[0];
my $user = 'ftpuser'; # ftp username; single quotes so special characters are not interpolated
my $password = 'ftppassword'; # ftp password; single quotes so special characters are not interpolated
my $remoteBaseDir = '/path/to/sub/directory'; # remote base directory to put all the files under

my $ftp = Net::FTP::Recursive->new($host, Debug => 1) # use Debug => 0 if you don't want to see what is happening or detailed error messages
	or die "Could not connect to $host: $@\n";
$ftp->login($user, $password) or die "Login failed: " . $ftp->message;
$ftp->binary(); # binary mode so non-text files are not corrupted in transit
if (length($remoteBaseDir) > 0) {
	$ftp->cwd($remoteBaseDir) or die "Could not change to $remoteBaseDir: " . $ftp->message;
}
$ftp->rput(); # recursively upload everything under the current local directory
$ftp->quit;
print "Done transferring files\n";

In order to install the Net::FTP::Recursive CPAN module you can follow the instructions in the module's CPAN documentation. Alternatively, if you don't even have CPAN set up yet, the cpan client that ships with Perl is a good place to start.
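For reference, installing the module usually comes down to one command. This is a sketch assuming the standard cpan client is on your path (cpanminus is an alternative, if you happen to have it):

```shell
# Install the recursive FTP module from CPAN (may prompt for configuration
# the first time the cpan client runs)
cpan Net::FTP::Recursive

# Or, with cpanminus installed:
# cpanm Net::FTP::Recursive
```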

How to Easily Prepare Database Import Using Perl

I'm revisiting this topic because in my last post, about translating Excel data into a text file for database import, I used a PHP script and still had some manual steps (the one that bugged me most was adding custom line-termination characters for multiline data). I mentioned that if I wanted to put in more time I could use a Perl script to do the whole process. Well, the next set of data I needed to import was almost 700 records spanning almost ten times as many lines in the tab-delimited text file I exported from Excel (because the description fields contained newline characters). So here's my latest solution:

  1. First I added a numeric column in the left-most position (within Excel) and filled it with incremental integers.
  2. Then I exported the spreadsheet as tab-delimited text.
  3. Then I wrote a Perl script to loop through the file and add line-termination characters before each new record; this time I had to use double equal signs (==) because the data contained single equal signs.
  4. I also found a Perl module similar to the PHP htmlentities() function that actually worked for me, as opposed to the PHP function, which I abandoned in favor of hard-coded string-replacement commands.
  5. Then I ran the Perl script on my tab-delimited text file to create a tab-delimited file with double equal signs as the line terminator and all the funny characters encoded so they would show up correctly in a browser.
  6. Finally I imported my file using the LOAD DATA INFILE option in the phpMyAdmin interface, specifying the field delimiter and line termination as \t and == respectively.
  7. Then I looked at a webpage that pulled that data, and it looked just like it did in the Excel file!
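
Step 6 corresponds roughly to the SQL statement phpMyAdmin issues behind the scenes. This is a sketch, not what I ran verbatim; the table name (products) and the server-side file path are made up for illustration:

```sql
-- Hypothetical table name and file path; adjust to your own schema
LOAD DATA INFILE '/tmp/outputFileName.txt'
INTO TABLE products
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '==';
```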

Here is the code for the Perl script I created:

# Filename: - this script will take a list of tab-delimited data,
#	including multiline data, where the first field of each record is an
#	incremental record number. It will add == before each record number
#	(after the first) to enable mysql importing, encode html entities,
#	and convert newline characters to <br>
use strict;
use warnings;
use HTML::Entities;

my $input = $ARGV[0]; # this is the file to parse
my $output = $ARGV[1]; # this is the file to create
my $lineTerminator = "==";
my $count = 1;

open (IN, $input) or die "Can't open file $input\n";
open (OUT, ">$output") or die "Can't open outfile: $output\n";
while (<IN>) {	# each line of the current input file
	chomp; # strip the trailing newline; it is replaced by <br> below
	my $line = encode_entities($_); # this is the function which encodes html symbols
	# Check if this is the beginning of a new record (the next record
	# number followed by a tab). If so then add the line terminator.
	if ($line =~ /^$count\t/) {
		$line = $lineTerminator . $line if $count > 1; # no terminator before the first record
		$count++;
	}
	print OUT $line . "<br>"; # newline characters become <br> tags
}
close IN;
close OUT;
printf "there were %d records processed.\n", $count - 1;

The syntax for running the script (assuming Perl is installed on your system; I use ActivePerl on my PC, which is free):

perl inputFileName.txt outputFileName.txt
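
To make the transformation easier to eyeball, here is the same record-marking logic sketched in Python. This is an illustration, not the script I used; the sample rows and the prepare() helper name are made up:

```python
import html

def prepare(lines, terminator="=="):
    """Encode HTML entities, turn each newline into <br>, and put the
    terminator before every record after the first (a record starts with
    the next incremental record number followed by a tab)."""
    out = []
    count = 1
    for raw in lines:
        line = html.escape(raw.rstrip("\n"))  # encode &, <, >, quotes
        if line.startswith(f"{count}\t"):     # start of a new record
            if count > 1:
                line = terminator + line      # no terminator before record 1
            count += 1
        out.append(line + "<br>")             # newline becomes <br>
    return "".join(out), count - 1

# Hypothetical rows exported from Excel: record 2's description spans two lines
data = [
    "1\tWidget\tFirst line of description\n",
    "second line & more\n",
    "2\tGadget\tAnother <short> description\n",
]
text, records = prepare(data)
```

Running this gives one == marker (before record 2), & and <short> encoded as entities, and every source newline replaced by a <br> tag.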