APLawrence.com -  Resources for Unix and Linux Systems, Bloggers and the self-employed
Perl Net::FTP



Before the widespread availability of Perl, I would script ftp transfers with .netrc, ksh scripts and other clumsy methods. None of those is fun, flexible or easy. Perl's Net::FTP module, on the other hand, is all of that.

With Net::FTP, you have total control. You know when there are errors, timeouts, whatever. It's not at all difficult: anyone with basic scripting skills can understand and use this.

I'm going to present three programs here. The first is very simple; you can probably understand it even if you know no Perl at all: it just logs into my ftp site, gets a listing, and displays it. The second watches a directory, uploads whatever it finds, and deletes the local copies. The third is a fairly complicated program that goes out to a list of hosts and gets files with a date equal to or newer than the one you specify. Even with the extra complexity, you should be able to follow it, and perhaps modify it for your own needs.

Here's the first:


#!/usr/bin/perl
use Net::FTP;

my $host="aplawrence.com";
my $directory="pub";

$ftp=Net::FTP->new($host,Timeout=>240) or $newerr=1;
  push @ERRORS, "Can't ftp to $host: $!\n" if $newerr;
  myerr() if $newerr;
print "Connected\n";

$ftp->login("ftp","apl\@") or $newerr=1;
  push @ERRORS, "Can't login to $host: $!\n" if $newerr;
  $ftp->quit if $newerr;
  myerr() if $newerr;
print "Logged in\n";

$ftp->cwd($directory) or $newerr=1;
  push @ERRORS, "Can't cd to $directory: $!\n" if $newerr;
  $ftp->quit if $newerr;
  myerr() if $newerr;

print "Getting file list\n";
@files=$ftp->dir or $newerr=1;
  push @ERRORS, "Can't get file list: $!\n" if $newerr;
  myerr() if $newerr;
print "Got file list\n";
foreach(@files) {
  print "$_\n";
}
$ftp->quit;


sub myerr {
  print "Error: \n";
  print @ERRORS;
  exit 1;   # non-zero, so calling shell scripts can see the failure
}
 

Here's another. This watches a directory for files, ftps them somewhere and deletes them.

#!/usr/bin/perl
use strict;
use Net::FTP;
#
# Following variables are needed
my $directory="/usr/ftpdir";
my $logfile="/tmp/xfer.log";
my $destination="xyz.com";
my $login="ftplogin";
my $password="ftppassword";
my $putdir="/pub";
my $delay=60;
#
# program starts here
#
my $ftp;
my $newerr;
my @files;
my $file;
my $filecount;
my $line;
my $date;
chdir($directory) or die("$! Can't cd to $directory");
while (1)  {
   @files=();
   $newerr=0;
   logit("Starting new loop");
   foreach(<*>) {
          push @files,$_;
   }
   $filecount=@files;
   logit("$filecount files to transfer");
   if  (not $filecount) {
            logit("Nothing to do");
            sleep $delay;
            next;
   }
   $ftp=Net::FTP->new($destination,Timeout=>240) or $newerr=1;
   if ($newerr) {
      logit("Can't connect to $destination");
      sleep $delay;
      next;
   }
   $ftp->login("$login","$password") or $newerr=1;
   if ($newerr) {
      logit("Can't login $destination with $login,$password");
      $ftp->quit;
      sleep $delay;
      next;
   }
   $ftp->binary(); # set binary mode
   $ftp->cwd($putdir) or $newerr=1;
   if ($newerr) {
      logit("Can't cd to $putdir on $destination");
      $ftp->quit;
      sleep $delay;
      next;
   }
   foreach(@files) {
      $file=$_;
      $newerr=0;
      $ftp->put($file,$file) or $newerr=1;
      if ($newerr) {
         logit("Error transferring $file");
         next;
      }
      unlink($file) or logit("$! can't unlink $file");
   }
   $ftp->quit;

   # delay for next loop
   sleep $delay;
}
 
sub logit {
  $line=shift;
  $date=gmtime(time);
  open(LOG,">>$logfile") or die("$! Can't open $logfile");
  print LOG "$date: $line\n";
  print "$date: $line\n";
  close LOG;
}
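Before pointing that watcher at a live server, the local half of the loop (glob the directory, count, transfer, unlink) can be dry-run with no FTP involved at all. This is just a sketch against a throwaway temp directory; the file names are made up for the demo:

```perl
#!/usr/bin/perl
# Dry-run of the watch loop's local file handling: gather files with a
# glob, count them, then unlink them, as the loop does between puts.
use strict;
use File::Temp qw(tempdir);

my $dir = tempdir(CLEANUP => 1);   # throwaway directory for the demo
for my $name (qw(one.txt two.txt)) {
    open(my $fh, '>', "$dir/$name") or die("$! Can't create $name");
    close $fh;
}

chdir($dir) or die("$! Can't cd to $dir");
my @files = glob('*');             # same job as foreach(<*>) above
my $filecount = @files;
print "$filecount files to transfer\n";
unlink($_) or warn("$! can't unlink $_") for @files;
my @left = glob('*');
print scalar(@left), " files left\n";
```

Swap the plain unlink for a real $ftp->put followed by unlink and you have the heart of the loop above.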
 

Pretty simple, right? Net::FTP makes it all so easy, so let's do something that would absolutely drive me batty without it.

#!/usr/bin/perl
use Net::FTP;
$date=shift @ARGV;
@months=qw(null Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
@hosts=qw(pcunix.org pcunix.com xyz.com);
@dirs=qw(pub pub pub);
@logins=qw(ftp anonymous fred);
@passwords=('tony@','apl@','fxdfed');  # qw() would keep the backslashes, so use a plain list

$x=0;
foreach(@months) {
 $months{$_}=$x++;
 }
# we need this hash later

if (not $date) {
  $now=time();
  $now -= (24 * 3600 );
  # yesterday
  ($nowsec,$nowmin,$nowhr,$nowday,$nowmon,$nowyr,$nowdow,$nowdoy,$nowdst)=localtime($now);
  $nowyr+=1900;$nowmon++;
  $date=sprintf("%.2d/%.2d/%.4d",$nowmon,$nowday,$nowyr);
  print "Using $date\n";
  }

$now=time();
($nowsec,$nowmin,$nowhr,$nowday,$nowmon,$nowyr,$nowdow,$nowdoy,$nowdst)=localtime($now);
$nowyr+=1900;
 
# need $nowyr later 

($month,$day,$year)=split /\//,$date;

#
# broken next century - blame me then
#
$year+=2000 if $year < 100;

$x=0;
foreach (@hosts) {
   $newerr=0;
   $ftp=Net::FTP->new($_,Timeout=>240) or $newerr=1;
     push @ERRORS, "Can't ftp to $_: $!\n" if $newerr;
     next if $newerr;
   print "Connected $_\n";

   $ftp->login($logins[$x],$passwords[$x]) or $newerr=1;
     push @ERRORS, "Can't login to $_: $!\n" if $newerr;
     $ftp->quit if $newerr;
     next if $newerr;
   print "Logged in $_\n";

   $ftp->cwd($dirs[$x]) or $newerr=1;
     push @ERRORS, "Can't cd $dirs[$x] on $_ $!\n" if $newerr;
     $ftp->quit if $newerr;
     next if $newerr;
   print "Getting file list $_\n";

   @files=$ftp->dir or $newerr=1;
     push @ERRORS, "Can't get file list on $_ $!\n" if $newerr;
     $ftp->quit if $newerr;
     next if $newerr;
   print "Got list $_\n";

   print "Looking for $date\n";
   foreach(@files) {
    $_=substr($_,41);   # offset into the date/name columns; adjust for your server's dir format
    s/  */ /g;
    s/^  *//g;
    chomp;
    @stuff=split / /;
    # if it's today, the year slot will have time instead
    # so make it this year
    $stuff[2]=$nowyr if /:/;

      # skip anything older than the requested date
      next if ($stuff[2] < $year);
      next if ($stuff[2] == $year and $months{$stuff[0]} < $month);
      next if ($stuff[2] == $year and $months{$stuff[0]} == $month and $stuff[1] < $day);

    print "Getting $_\n";
    $ftp->get($stuff[3],$stuff[3]) or $newerr=1;
      push @ERRORS, "Couldn't get $stuff[3] $!\n" if $newerr;

   }

   $ftp->quit;
} continue {
   $x++;  # keep logins/passwords/dirs in step with hosts even when a "next" skips ahead
}

print @ERRORS;
exit(@ERRORS ? 1 : 0);
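One more note: the fragile part of that last script is the $_=substr($_,41) parse of the dir output, since the column offset varies between servers. Splitting the whole line on whitespace is more forgiving. Here's a sketch of that idea; it still assumes the usual nine-column Unix listing format, and parse_dir_line / newer_than are names I made up for the demo:

```perl
#!/usr/bin/perl
use strict;

# Split one line of $ftp->dir output into its date fields and name.
# Assumes the common nine-column Unix listing; returns nothing otherwise.
sub parse_dir_line {
    my $line = shift;
    my @f = split ' ', $line, 9;   # limit of 9 keeps spaces in the filename
    return unless @f == 9;
    return ($f[5], $f[6], $f[7], $f[8]);  # month, day, year-or-time, name
}

# True if the listing date (numeric month/day/year) is on or after
# the requested month/day/year.
sub newer_than {
    my ($lmon, $lday, $lyr, $mon, $day, $yr) = @_;
    return 1 if $lyr > $yr;
    return 0 if $lyr < $yr;
    return 1 if $lmon > $mon;
    return 0 if $lmon < $mon;
    return $lday >= $day;
}

my %mon;
my $x = 0;
$mon{$_} = $x++ for qw(null Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);

my ($m, $d, $y, $name) =
    parse_dir_line('-rw-r--r-- 1 pcunix ftpadmin 3353 Jul 11 2004 404.pl');
print "$name dated $m $d $y\n";
print newer_than($mon{$m}, $d, $y, 7, 1, 2004) ? "new enough\n" : "too old\n";
```

Files modified recently show a time instead of a year in that third field, so you would still map anything matching /:/ to the current year before comparing, as the script above does.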

 








127 comments









Article:
http://www.aplawrence.com/Unixart/perlnetftp.html
If you want a simple list of files in an ftp dir, you can replace dir with ls (i.e. @files=$ftp->ls or print "Cannot get file listing\n").

--

Sure, but that produces a different list. In this example, we want to compare dates.

--TonyLawrence

--

How did you know what I needed? Thank you for being perversely helpful. I hope this won't encourage my already significantly deep lazy streak. . .

Sam

Coolz shiznit manz, wellz dunz
I tihnk I may join yur clan of wiki tips for editing
thank you very mutch... they wont gimme anythin else to do here aight






Sat Mar 12 15:45:47 2005: 169   Timok


I was searching for information about the Net::FTP library for Perl. Your article was very helpful. Now I can automate many tasks. Many tx

http://www.achsms.pl



Wed Mar 30 12:47:44 2005: 251   Jack


As listed on the page the script does not increment $x which means that it does not cycle through the remote directories properly. Adding an $x++; after the final $ftp->quit; fixes that.

The extra $ftp->quit s in the section where the script is checking the age of the file mean that the connection is dropped when the first old file is met and not reestablished before it attempts to download any files of the right age. Removing them seems to do no harm so I imagine that they are a hangover from some less general purpose script.

After fixing those I found this extremely valuable. Thank you Mr. Lawrence.



Wed Mar 30 13:33:45 2005: 252   TonyLawrence

Sorry about that.. yes, this was taken from a more specific script, so yes, I must have left stuff in I shouldn't have :-)



Fri Apr 29 20:05:48 2005: 411   anonymous


is the second example ftp supposed to get/download the files to the current directory where the perl script is located?



Fri Apr 29 21:12:24 2005: 412   TonyLawrence

is the second example ftp supposed to get/download the files to the current directory where the perl script is located?

Wherever you are sitting when you run it. If the script is /xyz/script.pl, you could

cd /tmp
/xyz/script.pl

and the files would be in /tmp







Thu May 12 20:11:15 2005: 494   anonymous


Thank you, Tony. This was very helpful. - Rick Brandfass



Thu May 12 21:55:12 2005: 495   TonyLawrence

I'm sure there's more than one Rick Brandfass in the world.. but I assume this is the same Rick that I see in person once a year or so?



Fri May 13 12:55:40 2005: 496   anonymous


Yup, hope to see you soon. - Rick



Fri May 13 13:58:55 2005: 497   anonymous


I tried your first script, but unfortunately it failed when it tried to get the file list.

Here are the error messages:
Connected
Getting file listLogged in
Can't locate object method "new" via package "Net::FTP::A" at /usr/share/lib/perl5/irix-n32/5.00405/IO/Socket.pm line 253.
Attempt to free unreferenced scalar during global destruction.

What do I miss here?

Roger



Fri May 13 14:35:12 2005: 498   TonyLawrence

I don't know: did you cut and paste or type it yourself?



Mon May 16 06:38:58 2005: 505   anonymous


By cut and paste...

Roger



Mon May 16 10:19:11 2005: 507   TonyLawrence

Well, I just cut and pasted it and it works..

If you didn't have Net::FTP you'd get "Can't locate Net/FTP.pm in @INC" immediately, so maybe you have it but it is somehow corrupt?



Tue May 17 19:01:24 2005: 523   Gabriel



I tried your script but I was not able to download any files. Instead I get the following error message: "Bad file descriptor". I tried using a manual FTP and I was able to download the file with no problems. What do you suggest?



Tue May 17 19:29:05 2005: 525   TonyLawrence

I suggest you review the code. You have done something wrong.



Tue May 17 19:37:22 2005: 526   TonyLawrence

Just so you know I'm not trying to be difficult, I just pasted the first example of the above code into a Linux machine:

"t.pl" 38L, 808C written
[root@kerio root]# chmod 755 t.pl
[root@kerio root]# ./t.pl
Connected
Getting file listLogged in
Got file list
-rw-r--r-- 1 pcunix ftpadmin 3353 Jul 11 2004 404.pl
-rw-r--r-- 1 pcunix ftpadmin 7146 Mar 3 2001 Driver.c
-rw-r--r-- 1 pcunix ftpadmin 36821 Dec 23 2002 Linuxsamples.zip
-rw-r--r-- 1 pcunix ftpadmin 40016 Jan 20 2003 MacOSXsamples.zip
-rw-r--r-- 1 pcunix ftpadmin 6640 Mar 16 2002 README.ssh
-rwxr-xr-x 1 pcunix ftpadmin 1474560 Jul 28 2004 RESCO218.RWI
-rw-r--r-- 1 pcunix ftpadmin 37550 Dec 23 2002 SCOsamples.zip


etc.

The code works. If it doesn't work for you, you need to figure out why.







Wed May 25 09:25:33 2005: 572   Jack


I think the Bad file descriptor problem is due to the legacy $ftp->quit() calls which leave you referring to a closed ftp connection. Make sense of those and you'll be on your way.



Wed May 25 10:57:04 2005: 575   TonyLawrence

But it should be exiting after an ftp->quit



Tue Jun 14 18:10:43 2005: 653   anonymous


did you figure out how to get rid of that Bad File Descript error. I can run a standard get and put inside the perl script but the way it is fails with that error message as stated before.



Tue Jun 14 21:43:46 2005: 654   TonyLawrence

I dunno. I must just not be seeing something here.. works for me, but not for you..



Wed Jul 13 23:05:49 2005: 795   anonymous


OK. I had the same error and I solved forcing passive mode this way:
$ftp=Net::FTP->new($host, Debug => 1, Passive => 1);
Note that Debug is ON, it really helps to see what is going on.
Something else: maybe the file you're trying to send is already there; I guess this could be the reason for this message too. Try deleting the file(s) manually and run your code again with passive mode ON.
Enjoy!!! revac71



Thu Jul 21 15:47:15 2005: 835   anonymous


Tony - this is very useful, thanks for sharing. I need to do opposite, upload newer files, not download. My ftp and perl skills are minimal. Appreciate any help. -Debster



Thu Jul 21 18:06:10 2005: 838   TonyLawrence

It's just "put" instead of "get".



Mon Aug 1 22:15:52 2005: 916   anonymous


Not that minimal :) Need to get the local list and upload any newer files.



Mon Aug 1 22:24:30 2005: 917   TonyLawrence

Well, that's similar to what the second example does.

If you can't grok it, I or dozens of other folks at the Consultants page could write a script to spec; probably wouldn't cost much.



Fri Sep 9 15:43:30 2005: 1068   RobG


This doesn't seem to work very well. When I sign on, I am in (say) /home/user1, but I need to get files from /u1/scratch. I can cwd to u1, but only see subdirectories belonging to root. Any idea what's going on?



Fri Sep 9 16:14:18 2005: 1069   TonyLawrence

I don't understand your question. You "signed on"? To what? What are you supposed to be seeing?

Perl isn't doing anything different than you'd do by hand, so if you aren't getting what you expect, you did something wrong or ignored an error return you need to pay attention to.



Sat Sep 10 15:40:52 2005: 1071   BigDumbDinosaur


Hey, everyone who had "trouble" with Tony's script! I copied and pasted it onto my rusty old SCO box and guess what? After removing the extra $ftp->quit it worked for me. So, please stop complaining that it *doesn't* work, 'cuz it does.

One bad thing, tho. Now I have to scrap the ancient ksh script I wrote oh so many years ago to "automate" FTP sessions. Now what am I going to do with all my new-found spare time? <Grin>



Mon Sep 12 17:08:16 2005: 1078   RobG


Sorry, my comment was a bit terse.

I have a perl script using Net::FTP to sign onto a remote server, change to a specific directory on that server, and look for certain files. Now, if I do this "by hand" (ftp in UNIX), I have no problem doing a cwd, but $ftp->cwd only works for some directories. And even if it works, it does not "see" everything in that directory. So, there seem to be some restrictions on what $ftp->cwd can do.



Mon Sep 12 17:12:54 2005: 1079   TonyLawrence

There's something wrong in your script. If you can do it by hand, Net::FTP can do it.







Mon Sep 12 17:21:40 2005: 1080   RobG


In the following (with Debug on), I first look for files rg* in remote directory /home/rgrig/ftptmp - success. Then, look for files rg* in remote directory /u3/tsqadmin/scratch (same system) - failure ("No such file or directory")

rgrig@sc2{250} ftptst3 "/home/rgrig/ftptmp" "rg*"
Net::FTP>>> Net::FTP(2.74)
Net::FTP>>> Exporter(5.562)
Net::FTP>>> Net::Cmd(2.25)
Net::FTP>>> IO::Socket::INET(1.25)
Net::FTP>>> IO::Socket(1.26)
Net::FTP>>> IO::Handle(1.21)
Net::FTP=GLOB(0xd1acc)<<< 220 sc2 FTP server ready.
Net::FTP=GLOB(0xd1acc)>>> user rgrig
Net::FTP=GLOB(0xd1acc)<<< 331 Password required for rgrig.
Net::FTP=GLOB(0xd1acc)>>> PASS ....
Net::FTP=GLOB(0xd1acc)<<< 230 User rgrig logged in.
Net::FTP=GLOB(0xd1acc)>>> CWD /home/rgrig/ftptmp
Net::FTP=GLOB(0xd1acc)<<< 250 CWD command successful.
Net::FTP=GLOB(0xd1acc)>>> PORT 127,0,0,1,177,182
Net::FTP=GLOB(0xd1acc)<<< 200 PORT command successful.
Net::FTP=GLOB(0xd1acc)>>> NLST rg*
Net::FTP=GLOB(0xd1acc)<<< 150 Opening ASCII mode data connection for file list.
Net::FTP=GLOB(0xd1acc)<<< 226 Transfer complete.
rgtst1
rgtst2
Net::FTP=GLOB(0xd1acc)>>> QUIT
Net::FTP=GLOB(0xd1acc)<<< 221-You have transferred 0 bytes in 0 files.
Net::FTP=GLOB(0xd1acc)<<< 221-Total traffic for this session was 363 bytes in 1 transfers.
Net::FTP=GLOB(0xd1acc)<<< 221-Thank you for using the FTP service on sc2.
Net::FTP=GLOB(0xd1acc)<<< 221 Goodbye.

rgrig@yowie{83} ls /home/rgrig/ftptmp
abc rgtst1 rgtst2


rgrig@sc2{251} ftptst3 "/u3/tsqadmin/scratch" "rg*"
Net::FTP>>> Net::FTP(2.74)
Net::FTP>>> Exporter(5.562)
Net::FTP>>> Net::Cmd(2.25)
Net::FTP>>> IO::Socket::INET(1.25)
Net::FTP>>> IO::Socket(1.26)
Net::FTP>>> IO::Handle(1.21)
Net::FTP=GLOB(0xd1acc)<<< 220 sc2 FTP server ready.
Net::FTP=GLOB(0xd1acc)>>> user rgrig
Net::FTP=GLOB(0xd1acc)<<< 331 Password required for rgrig.
Net::FTP=GLOB(0xd1acc)>>> PASS ....
Net::FTP=GLOB(0xd1acc)<<< 230 User rgrig logged in.
Net::FTP=GLOB(0xd1acc)>>> CWD /u3/tsqadmin/scratch
Net::FTP=GLOB(0xd1acc)<<< 550 /u3/tsqadmin/scratch: No such file or directory.

rgrig@yowie{86} ls /u3/tsqadmin/scratch/rg*
/u3/tsqadmin/scratch/rgtst1 /u3/tsqadmin/scratch/rgtst2






Sun Dec 18 04:33:31 2005: 1447   anonymous


The simple script is exactly what I was looking for.

Thanks for doing this.

Don




Wed Jan 11 16:06:19 2006: 1500   jrg


When doing an "@files=$ftp->ls or $newerror=1", if there aren't any files, this returns with $newerror equal to 1, but if there is at least one file, then it works fine. Has anybody seen this before? Is there a workaround?



Wed Jan 11 18:39:59 2006: 1501   TonyLawrence

Workaround for what? What do you want to do if there are no files?



Thu Jan 12 13:46:39 2006: 1504   jrg


I think it should return with $newerr=0 and @files="", since there really weren't any problems doing the ls, it's just that there are no files.






Wed Jan 25 08:00:14 2006: 1555   anonymous


Exactly what I was looking for... You Rock!!! Worked like a charm



Wed Jan 25 08:53:28 2006: 1557   anonymous


I am trying to print the size of the file (using examle #1) like so:

foreach (@files) {
$size = ftp->size($file); #Get file size then later convert to KB's
print "$_ is $size (bytes) in size.";

I am getting an error that I perhaps forgot to load the module. Can you help?



Wed Jan 25 11:21:14 2006: 1558   TonyLawrence

We can't help when you are vague and coy. You got an error. Say what the error is, don't tell us what you think it means.

Getting free help demands more from YOU. If you want to pay for help, you can be confusing, vague, and incomplete. When you want free help, you have to be accurate, concise and intelligent. See http://aplawrence.com/newtosco.html#newsg for advice on what you need to do to get free advice.



Wed Jan 25 17:51:53 2006: 1563   anonymous


Didn't mean to sound "coy", I just wanted to know if the following is the correct syntax in order to get the size of the file:

$size = ftp->size($_);

No if you don't want answer this which ...was... my original question, then thanks for what you have given me so far and I'll research further.



Wed Jan 25 18:07:32 2006: 1564   TonyLawrence

I call b.s.

You explicitly did NOT ask that question. Go back and read what you wrote. I don't mean to sound nasty, but you didn't even really ask a question at all, and certainly didn't ask this question.

Your syntax is incorrect if it's as you wrote it here. Look closely at the examples and note that you need $ftp->size



Wed Jan 25 18:43:10 2006: 1565   anonymous


OK, first, I appreciate your help!!! I didn't mean to sound coy... I don't want to come across as vague... (It was 5AM and I had been searching *a long time* and finding a *lot* of bad examples on other sites.)

But thanks. I'll try it

NEXT PROJECT: Learn how to properly ask a question after hours of frustration



Wed Jan 25 18:54:15 2006: 1566   TonyLawrence

Good luck - hope you have it now.







Fri Jan 27 19:01:57 2006: 1577   Tom


Like others here, I found this very helpful, so thanks!

One problem I had was using $ftp->ls on a directory with a really large number of files (5k plus). It returns with an error "Arguments too long". I'm assuming I'm pushing up against the size of the buffer or something. Do you have any suggestions as to how to deal with this? For example, is there a way to grab the directory listing in pieces?

Thank you for any suggestions.



Fri Jan 27 19:27:33 2006: 1578   TonyLawrence

That's probably coming from the server - it's screwing up.



Fri Jan 27 20:57:09 2006: 1579   Tom


Turns out you're right. The server is running SCO Unix. When I did a test on another client's server with lots of files in it (running Redhat Fedora) it worked fine. So now I've got to figure out why the SCO box is doing that and either figure out a fix or a workaround.

Thanks.



Fri Jan 27 22:21:05 2006: 1580   TonyLawrence

Increase MAXEXECARGS (kernel tunable)



Tue Jan 31 21:58:57 2006: 1595   anonymous


I see that information gets printed out if debug is on in a previous message, but the POD for Net::FTP says to use $ftp->message. When I do this, I get Net::FTP=GLOB(0x1f6c0e8)->message but there is no associated message...???

How can I get this to print the actual messages? $! doesn't give me what I would expect either, a bad password gives me Bad file descriptor



Tue Jan 31 22:13:56 2006: 1596   TonyLawrence




To print the errors:

 
$ cat t.pl
#!/usr/bin/perl
use Net::FTP;

my $host="aplawrence.com";
my $directory="pub";

$ftp=Net::FTP->new($host,Timeout=>60) or die "Can't connect";
print "Connected\n";

$ftp->login("dxftp","apl\@") or die "Cannot login ", $ftp->message;

$ ./t.pl
Connected
Cannot login Login incorrect.








Tue Jan 31 22:17:56 2006: 1597   anonymous


But I don't want to die, I just want to log the message. I tried to treat it like an array ref, but that barfed...



Tue Jan 31 22:26:36 2006: 1598   TonyLawrence

Then use "warn" or store it:

 
$err="";
$ftp->login("dxftp","apl\@") or $err= $ftp->message;
print $err if $err;
exit 1 if $err;




Tue Jan 31 22:28:00 2006: 1599   anonymous


Got it, Thanks so much for your help!



Mon Nov 13 10:45:15 2006: 2607   anonymous


Hi, can you help me turn off the interactive user prompt for FTP using this script? ... Thanks



Mon Nov 13 11:21:13 2006: 2608   TonyLawrence

No idea what you mean - explain more fully or see http://aplawrence.com/rates.html





Fri Jan 5 23:06:49 2007: 2801   anonymous


Your second Net::FTP example has some very clumsy Perl constructs that could be much simplified. Let me know if you'd like to see a much neater, easier, simpler version.



Fri Jan 5 23:12:17 2007: 2802   TonyLawrence

That depends.

I try to write so that people who don't know much about Perl have a chance of following along. I admit that sometimes you can write great Perl that is also easy for the neophyte to understand, and if that's what you had in mind, yeah, I'm all for it.

On the other hand, there's nothing wrong with posting a higher level example in the comments so folks can learn from that too..



Thu Apr 19 14:55:00 2007: 2959   anonymous


I have a perl ftp script which has an statement like
$ftp->get($file1) || warn "can't get $file1";

It has to download hundreds of files from an ftp server. When running, it usually works ok, but suddenly I get the following error:

Timeout at /usr/local/lib/perl5/site_perl/Net/FTP.pm line 503

I get the error at different files each time, so it's nothing to do with the files on the ftp site. I would like to retry with the file when this happens, but the problem is that control doesn't return to the script as it exits with a -1, my warn statement isn't reached, and the script fails.

Do you know how to get the module not to abort and kill the script too? Or can this not be avoided?



Thu Apr 19 17:37:17 2007: 2960   tonylawrence


Are you trapping ALL errors or just the get?

You can set a longer timeout too..



Thu Apr 19 18:06:43 2007: 2961   anonymous


I'm trapping all errors, but it is always 'get' that fails, and always on a different file.

Files are pretty small, so a timeout=10 is normally ok. But anyway, I've tried with 1200 for example, and suddenly on one file it keeps thinking and thinking until it reaches the timeout (without a timeout it would probably keep thinking forever...), and then kills the program with the message: Timeout at FTP.pm line 503.

I think it's a problem with the ftp server, not with the perl module, but I was wondering if I could wait for the timeout to run out and then just warn that the file could not be downloaded, but without killing the program with the Timeout message.



Thu Apr 19 18:09:58 2007: 2962   tonylawrence


Beyond my knowledge; suggest trying at a Perl newsgroup.



Fri Apr 20 07:43:04 2007: 2964   anonymous


Ok. Many thanks for your answer anyway!
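A note for anyone hitting this later: wrapping the call in eval is the usual way to keep a die inside a module from killing the whole script. The sketch below uses a made-up fake_get that dies with the same message reported here, just to show the trap working; with the real module you'd eval the $ftp->get and probably reconnect before retrying, since the connection may be dead after a timeout:

```perl
#!/usr/bin/perl
use strict;

# Stand-in for $ftp->get that dies the way the poster's Net::FTP did.
sub fake_get { die "Timeout at FTP.pm line 503.\n" }

my @failed;
for my $file (qw(a.dat b.dat)) {
    my $ok = eval { fake_get($file); 1 };   # trap the die instead of exiting
    unless ($ok) {
        warn "can't get $file: $@";
        push @failed, $file;   # remember it for a retry pass
        next;
    }
}
print scalar(@failed), " files to retry\n";
```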



Wed May 30 23:14:41 2007: 3009   Mac


In the @months array the abbreviation for May is incorrect (at least on my installation) it is listed as My when it should be May. Here I am testing in May, and it failed. Once I corrected this all worked fine. Thanks for this.



Mon Jul 16 15:37:51 2007: 3059   anonymous


This was really helpful



Thu May 1 17:04:19 2008: 4162   anonymous


Why would you try and do $ftp->quit and things like that AFTER calling myerr() which has an exit statement?



Thu May 1 17:09:17 2008: 4163   TonyLawrence

Shouldn't be that way :-)



Mon Dec 22 19:10:23 2008: 4957   anonymous


any reason you did not use something like
(cut/paste from my own ftp upload script)

sub ftp_upload {
my $file = shift;
print "checking file $file on ftp server - ";

# my ISP supports modtime, my local NAS does not.
# we need to handle both
my $ftpmodtime = $ftp->mdtm($file);
if (defined $ftpmodtime) {
my $stamp = gmtime $ftpmodtime;
}
else {
$ftpmodtime = 0;
}
my $modtime = (stat ($file))[9];

if ($modtime > $ftpmodtime) {
$ftp->put($file,$file);
print "uploaded.\n";
}
else {
print "skipped\n";
}
}

in the second script, it would avoid all the date handling hassle



Mon Dec 22 19:26:26 2008: 4958   TonyLawrence

I had the very best of reasons: I didn't notice that function!

Thanks! That will be handy.







Mon Apr 6 17:03:57 2009: 6018   TonyLawrence

The examples given above will no longer work with my site as we stopped all ftp. Substitute a known working site.



Sat Jul 18 19:11:43 2009: 6645   Rajiv

Thanks Lawrence, out of the many example programs that I found on the internet for Net::FTP this was the easiest and most helpful. I used part of this script to implement an automated solution for testing an FTP server's availability / uptime.



Sat Jul 18 19:19:28 2009: 6646   TonyLawrence

Happy to have been helpful!



Wed Oct 7 07:22:41 2009: 7084   anonymous

Hi Tony,

Is it possible to still move on in the code even if an ftp error is thrown? In your first example, If I would like to set some flag when the ftp error is thrown e.g

$ftp=Net::FTP->new($host,Timeout=>240) or $newerr=1;
push @ERRORS, "Can't ftp to $host: $!\n" if $newerr;
myerr() if $newerr;
print "Connected\n";

$Flag = "No Error"; ## no ftp error


sub myerr {
print "Error: \n";
print @ERRORS;
$Flag = "Error"; ## if there is an ftp error
exit 0;
}

ftp-> quit;

print " Message: $Flag";

how do I not exit after I set the flag to "Error" in the myerr() subroutine, and instead move on in the code and print the value of the flag after ftp->quit? Currently, the code exits after throwing the FTP error and does not print $Flag.

Thanks



Wed Oct 7 11:08:54 2009: 7085   TonyLawrence

Basic misunderstanding: The code ends because myerr() calls exit. If myerr() did a "return;" instead of "exit;" , then the code continues. It's that simple.



Mon Oct 12 18:23:06 2009: 7203   anonymous

Thanks for putting this out there for use. I'm having an issue with it sorting through the array for what I'm specifying. I've added debugging and "print @files" after it builds the array to be sure it's pulling in the directory's contents. I'm calling it with today's date (10/12/2009). There are 3 files that exist with today's date so far. Here's the last part of the printed array, which shows the three files with a oct 12 date and what happens:

1 utcftp System 259845 Oct 12 03:04 ParkHoldAdd.100912.0301.txt-rwxr-x--- 1 utcftp System 259845 Oct 12 07:06 ParkHoldAdd.100912.0704.txt-rwxr-x--- 1 utcftp System 260559 Oct 12 11:14 ParkHoldAdd.100912.1113.txtLooking for 10/12/2009
Net::FTP=GLOB(0x1b28078)>>> QUIT
Net::FTP=GLOB(0x1b28078)<<< 221 bye

Calling it with yesterday's date does the same thing... it doesn't "see" the files it needs to pull in so it exits. Can you help out?






Mon Oct 12 18:32:27 2009: 7204   TonyLawrence

What you need to debug is the matching part - print what the code is looking for vs. what it actually found. That's what is going to show you the problem.

Probably you'll find that your server formats differently than what I tested this against.



Mon Oct 12 19:29:10 2009: 7205   anonymous

You were right, it was including the last digit of the file size descriptor, i.e.:

9 Oct 12 11:14 ParkHoldAdd.100912.1113.txt

I changed the position from 41 to 42:

$_=substr($_,42);

It's now finding the files at least, but it's pulling in bad file descriptors:

Looking for 10/12/2009
Net::FTP=GLOB(0x9ef488)>>> QUIT
Net::FTP=GLOB(0x9ef488)<<< 221 bye
Getting Oct 12 15:20 .
Getting Oct 12 03:04 ParkHoldAdd.100912.0301.txt
Getting Oct 12 07:06 ParkHoldAdd.100912.0704.txt
Getting Oct 12 11:14 ParkHoldAdd.100912.1113.txt
Getting Oct 12 15:20 ParkHoldAdd.100912.1519.txt
Couldn't get .
Couldn't get ParkHoldAdd.100912.0301.txt Bad file descriptor
Couldn't get ParkHoldAdd.100912.0704.txt Bad file descriptor
Couldn't get ParkHoldAdd.100912.1113.txt Bad file descriptor
Couldn't get ParkHoldAdd.100912.1519.txt Bad file descriptor

I read the previous comment about this same error message. I have forced PASV mode but I'm still getting that returned.



Mon Oct 12 19:38:40 2009: 7206   TonyLawrence

Have you tried manually with the same credentials etc. ?







Mon Oct 12 19:42:50 2009: 7207   anonymous

Yes I can see/download these files manually. Is there a way to print exactly what the operation's command is and what is returned as it happens?



Mon Oct 12 19:45:56 2009: 7208   TonyLawrence

I don't recall anything like that, but check the docs.



Thu Apr 15 12:55:31 2010: 8421   anonymous



Hi, can I use the code in Windows as well? Especially the 2nd script. It's exactly what I am looking for.



Thu Apr 15 18:19:05 2010: 8423   TonyLawrence



Yes, you can use it in Windows.



Thu Aug 12 11:03:03 2010: 8891   Picard



Your second script is exactly what I need. Well written and easy to understand (even for a Perl newbie like me). However, the FTP server I'm trying to connect to isn't running on the default port of 21.
When I use $destination="my.destination.com 2121"; it can't connect.
If I try and specify a $port and add it in $ftp=Net::FTP->new($destination $port); it doesn't work either.
I've tried searching for this but can't find anything about Net::FTP using anything specific for connecting to a non-default port.
So I'm hoping you know.

regards

Picard



Thu Aug 12 11:14:52 2010: 8892   TonyLawrence



See http://perldoc.perl.org/Net/FTP.html#CONSTRUCTOR

$ftp=Net::FTP->new($destination,Timeout=>240, Port=>2121)



Mon Sep 27 05:04:39 2010: 9010   MrZ



Tony, 7 years... that's some hang time !
I recently slapped together some routines for Net::SFTP... the actual perl code is similar, but installing all the encryption stuff can be a bit of a pain depending on your perl install. My question? Did you think you'd be answering Net::FTP questions for 7 years? Oh, why do you exit with a state of '0' on an error? Shouldn't you exit with a state > 0 for errors? Works out better for shell scripts calling your code etc.

Just curious







Mon Sep 27 10:21:23 2010: 9011   TonyLawrence



No doubt it will just keep on trucking.

That "myerror" should exit non-zero, I agree.



Wed Oct 27 14:49:05 2010: 9075   anonymous



I was looking for a simple method to test whether an ftp server was up and running. Tried using .netrc but this did not work well when the server was unavailable...

Have not done any perl in years,.. but this method works, much cleaner and so simple ... thanks



Wed Jan 12 14:21:54 2011: 9229   Murugan



I am new to Perl... can I get a script that downloads files from one server and uploads those files to another server using sftp?
Both remote servers are Linux and the local server is CentOS.

Thanks



Wed Jan 12 18:19:56 2011: 9230   TonyLawrence



If by "Get a script", you mean "get somebody to write it for me at no charge", then the answer is no.



Mon Jan 31 06:25:43 2011: 9260   MrZ



For those who are interested, if you're going to write a script for sftp transfers you might want to investigate the lftp utility. Fairly easy to implement in shell for simple requirements... or stay up late a couple nights with Net::SFTP.



Mon Jan 31 11:50:58 2011: 9261   TonyLawrence



I'd advise against that if it will be an unattended script.



Mon Jan 31 21:48:21 2011: 9264   MrZ



I meant that where simple is defined as something that limits your finger movements - 'macro' kind of stuff, not unattended operations. Perl is probably the easiest 'good' way to do it.



Thu Feb 10 15:32:05 2011: 9289   JoeDeal



The examples are priceless even after all these years.... Thanks



Tue Mar 22 15:27:26 2011: 9394   GuillermoParedes



Do you have any plans to add proxy support to the Net::FTP library on CPAN?
best regards







Tue Mar 22 17:00:17 2011: 9395   TonyLawrence



We're just users, we don't write it. See http://perldoc.perl.org/Net/FTP.html#COPYRIGHT



Fri Jun 3 02:39:34 2011: 9523   anonymous



Hi, I want to download a database from an online server. Would this script be helpful? If not, what should I do?



Fri Jun 3 10:01:06 2011: 9524   TonyLawrence



A script using Net::FTP would be helpful if that is something you need to do automatically and unattended. Otherwise, no, you'd just ftp or curl or scp manually.



Sat Jun 4 12:34:33 2011: 9526   anonymous



I ran the first script but got this error:
"Can't ftp to ftp://ftp.ncbi.nih.gov/snp/database/organism_data/rice_4530"
but my statements in the script were:
my $host="ftp://ftp.ncbi.nih.gov/snp/database/";
my $directory="organism_data";

Please tell me clearly what I should assign to the variables $host and $directory.



Sat Jun 4 12:42:35 2011: 9527   TonyLawrence



Your host is NOT "ftp://ftp.ncbi.nih.gov/snp/database/organism_data/rice_4530"

It is, perhaps, "ftp.ncbi.nih.gov/snp/database/organism_data/rice_4530"

The "ftp://" is what you'd use in a browser, not in a script like this. It tells the browser to use FTP protocol rather than http.



Sat Jun 4 12:45:54 2011: 9528   TonyLawrence



Ooops. Hasty paste.

Your host is "ftp.ncbi.nih.gov"

Your directory is "snp/database/organism_data/rice_4530"



Sat Jun 4 12:48:05 2011: 9529   TonyLawrence



Although of course I am only guessing.

Your directory could be "/snp/database/organism_data"

and "rice_4530" might be the file you want.

Hopefully you know what it is you actually want and where it is.



Sun Jun 5 12:16:46 2011: 9530   anonymous



Yup, the host is "ftp.ncbi.nih.gov"
and "/snp/database/organism_data" is the directory that has the file "rice_4530" which I want.
But my point is, I only assigned "ftp.ncbi.nih.gov/snp/database/" to $host and "organism_data" to $directory, so how does it generate the error "Can't ftp to ftp://ftp.ncbi.nih.gov/snp/database/organism_data/rice_4530" when I didn't even mention the file rice_4530 in the script, and the directory has many other files? Moreover, even if I make changes to the URL it generates the same error. How is that possible? And note, I didn't write ftp:// in the script, as you instructed, but I still got the same error. Please help me out!



Sun Jun 5 15:09:11 2011: 9531   TonyLawrence



You cannot set $host to anything more than "ftp.ncbi.nih.gov" . Set $directory to "snp/database/organism_data".

Then "$ftp->get" whatever you want.
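Putting those pieces together, a sketch of fetching that file (assuming rice_4530 really is a plain file in that directory, which is only a guess from the thread):

```perl
use strict;
use warnings;
use Net::FTP;

my $host      = "ftp.ncbi.nih.gov";
my $directory = "snp/database/organism_data";

my $ftp = Net::FTP->new($host, Timeout => 240) or die "Can't connect: $@\n";
$ftp->login("ftp", 'anonymous@') or die "Can't login: " . $ftp->message;
$ftp->cwd($directory)            or die "Can't cwd: "   . $ftp->message;
$ftp->binary;                    # binary mode for a data file
$ftp->get("rice_4530")           or die "Can't get: "   . $ftp->message;
$ftp->quit;
```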



Thu Jul 14 14:41:58 2011: 9615   anonymous



It was really helpful.
Thanks a lot!



Sat Jul 16 14:38:30 2011: 9630   Nazar78



Hi, I'm trying to write an HTTP proxy script which works with http/s 1.0/1.1 connect, get, post, head etc. Those parts are done and working as I expected, and I would like to include ftp. The dir listing part is complete, but I can't find a way to dump $ftp->retr directly to the browser. Debug shows the data connection open, but there's nothing from read ( BUFFER, SIZE [, TIMEOUT ] ). No issues with get(), but I want to skip downloading locally. Can anyone help with the dataconn class? I'd been googling but no luck. Thanks.



Sat Jul 16 14:44:26 2011: 9631   TonyLawrence



I suspect you'd need to hack Net::FTP to get what you want there. Far beyond my interest level and skills, honestly.



Tue Sep 6 04:55:56 2011: 9773   Juan


I just finished modifying the simple display script to actually get a file. I must say that without your listing part it would have been tougher for me. I have not written Perl in a while although I have written a lot in the past. I did read your others and this one seemed to be skeleton enough for me to change any way I liked. I am back to developing perl! Thanks for making my return a success in one day to get a script to work! Great scripts you have here.



Wed Oct 26 11:08:21 2011: 10067   Ak305nTemel



Thank you for your script, it works :).
I want to know, can I back up the file before putting it on the FTP site?

Best Regards



Wed Oct 26 11:11:16 2011: 10069   TonyLawrence



Of course you can. Why wouldn't you be able to? You can do anything you want before, during or after.



Wed Oct 26 11:25:00 2011: 10070   Ak305nTemel



Hi ,
befora begin put command the file is genereting on the time.That way file is not correctly created.How to issues this problem.

Thank you
Best Regards



Wed Oct 26 11:33:07 2011: 10071   TonyLawrence



I have no idea what you mean.

If English is not your native language, find someone who can express it better - that might help.

I'm going to be out the rest of the day so will not be able to see replies until tomorrow.



Thu Apr 5 02:11:12 2012: 10824   anonymous



I wonder what the difference is between
my $directory="/usr/ftpdir"; ==> does it refer to the directory holding the files we want to transfer?
and
my $putdir="/pub"; ==> and what about this one: is "pub" a file name?



Thu Apr 5 03:04:52 2012: 10826   anonymous



Dear TonyLawrence,

My purpose is to transfer files from my server to another server.
I have many files in the directory /home/sophea/file_mis/ and I want to transfer them to the directory /emd_data/tsophea on the other server.
Below is the script I have created, but it sends only the one exact file name; I want it to loop and send all the files in my directory /home/sophea/file_mis/ to /emd_data/tsophea.

sub ftp_mis {
    my $src    = "/home/sophea/file_mis/file_20120404_music.txt";
    my $host   = '203.122.26.33';
    my $user   = 'abc';
    my $passwd = 'abc';
    my $ftp = Net::FTP->new($host, Timeout => 20) or mlog("Cannot access $host via FTP");
    $ftp->login($user, $passwd) or mlog($ftp->message);
    $ftp->binary                or mlog($ftp->message);
    $ftp->put($src)             or mlog($ftp->message);
    $ftp->quit                  or mlog($ftp->message);
}






Thu Apr 5 11:47:57 2012: 10828   TonyLawrence



To the person asking about "$putdir" - no, both are directories - notice the "cwd".

To the multiple files question:

$ftp->login($user,$passwd) or mlog($ftp->message);
$ftp->binary or mlog($ftp->message);
foreach (<*>) {
    $ftp->put($_) or mlog($ftp->message);
}
$ftp->quit or mlog($ftp->message);


The "foreach (<*>)" can be "foreach (<*.txt>)" etc.
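One caveat: <*> globs the current working directory and will also match subdirectory names, so a sketch with an explicit chdir and a plain-file test (the path is the one from the question above):

```perl
chdir "/home/sophea/file_mis" or mlog("Can't chdir: $!");
foreach my $file (<*>) {
    next unless -f $file;                    # skip subdirectories and specials
    $ftp->put($file) or mlog($ftp->message);
}
```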



Mon Apr 16 07:16:03 2012: 10857   anonymous



Hi,

I'm trying to get data rates when the file is downloaded or uploaded.

Is there any function or trick that can help me out?

Will appreciate your reply.

Regards,
Akram



Mon Apr 16 10:00:00 2012: 10858   TonyLawrence



If I understand you, just grab time() at the start of each transfer and at the end and calculate the rate from the file size.
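A sketch of that, assuming $ftp and $file are already set up as in the earlier examples; Time::HiRes gives sub-second resolution for short transfers:

```perl
use Time::HiRes qw(time);

my $start = time();
$ftp->put($file) or die "put failed: " . $ftp->message;
my $elapsed = time() - $start;
my $bytes   = -s $file;                    # local file size in bytes
printf "%.1f KB/s\n", $bytes / 1024 / $elapsed if $elapsed > 0;
```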






Tue May 29 19:40:24 2012: 11034   Senz



Hello Sir,

Could you please help me increase FTP transfer speed in Perl? It is too slow compared to a manual transfer from the command line.

I'm using the code below for connecting via ftp:

sub connectFTP()
{
    my $lRC = 0;
    $ftp = Net::FTP->new("$Host", Timeout => 40, Debug => 0, Bytes_read => 4096, BlockSize => 4096);
    if ( defined $ftp && $ftp->code() == 220 )
    {
        $ftp->login("$UserName","$Password");
        if ( defined $ftp && $ftp->code() == 230)
        {
            &Logtime();
            printf FH "$logtime : FTP Connection Established $Host !!!!\n";
            &Logtime();
            printf FH "$logtime : After connectFTP to $Host of $section\n";
            $gRc = &processFtpCopy() if ( $gRc == 0);
            &Logtime();
        }
        else
        {
            $lRC = 201;
            $Cnt_Mail = sprintf("Error:FTP Login Failed for $Host\n");
            &err_mail();
            &Logtime();
            printf FH "$logtime : Error:FTP Login Failed $Host\n";
        }
    }
    else
    {
        $lRC = 301;
        $Cnt_Mail = sprintf("Error:FTP Connection Failed $Host\n");
        &err_mail();
        &Logtime();
        printf FH "$logtime : Error:FTP Connection Failed \n";
    }
    return $lRC;
    $ftp->quit;
    print "Hia \n";
    &Logtime();
    printf FH "$logtime : FTP Connection closed \n";
}






Wed May 30 10:14:04 2012: 11036   TonyLawrence



All I can suggest is to keep playing with blocksize.
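As far as I can see, BlockSize is the documented knob here (the Bytes_read option in the constructor above does not appear in the Net::FTP docs). A hedged sketch; 10240 is just a starting value to experiment with:

```perl
my $ftp = Net::FTP->new($Host,
                        Timeout   => 40,
                        BlockSize => 10240)   # data-transfer buffer size, in bytes
    or die "Can't connect: $@\n";
```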



Fri Jul 20 11:49:59 2012: 11211   emb



Can we modify an existing Perl script that uses FTP so that it can access both FTP and FTPS sites?
Do we need to use /usr/bin/ftp-ssl for FTPS?






Fri Jul 20 11:54:38 2012: 11212   TonyLawrence



See http://search.cpan.org/~kral/Net-FTPSSL-0.04/FTPSSL.pm



Mon Oct 1 21:28:35 2012: 11362   anonymous



How would you do this logging in with a domain name?

\\Domain\user - please demonstrate, thanks!



Mon Oct 1 21:43:07 2012: 11363   TonyLawrence



The domain is the host.



Sun Nov 11 01:59:05 2012: 11426   Keith



I need a script that I can run from my server that will connect to an external server via ftp. It then needs to select 2 zip files from a particular directory on the external server. It then needs to download those files to my server to the desired directory.

THEN, it needs to unzip the files from each zip into a particular directory. I'm willing to pay to get this done. Can you help?






Sun Nov 11 02:07:43 2012: 11427   TonyLawrence



Sure, but that's a very simple script. I'd charge you $100.00 - you could probably find someone to do it for a quarter of that.



Wed Jan 16 01:03:19 2013: 11736   anonymous



Hi,

I was using Net::FTP for transferring files and was wondering if there is any Perl or ftp command that I could use to output/display the throughput of the transfer? Please let me know.

Thanks,
Arun



Wed Jan 16 01:38:16 2013: 11737   TonyLawrence



ftp->size(file) will give you the size of a file you put on the other server; use ordinary stat() to get sizes of received files.



Wed Mar 13 11:59:18 2013: 11958   armin



Like others here, I found this very helpful, so thanks! You made my day!







Thu Sep 26 13:39:58 2013: 12325   anonymous



Hi
I am using the Net::FTP module to connect to an FTP site. I need it to reconnect automatically if the FTP connection fails. The script should not end when the connection fails; it should keep trying to connect until it succeeds. Please provide a solution and a sample script.

It is very helpful to us

Suresh



Thu Sep 26 13:51:48 2013: 12326   TonyLawrence



I don't write scripts for free.

Everything you need to know is covered in the article and comments.

My minimum charge for writing anything is $400.00. That's outrageously high because I have no interest at all in doing it.


