
Download Big Files Via FTP With Python

I'm trying to download a daily backup file from my server to my local storage server, but I've run into some problems. I wrote this code (with the irrelevant parts, such as the email function, removed):

Solution 1:

Sorry to answer my own question, but I found the solution.

I tried ftputil with no success, so I tried several approaches and finally this works:

from ftplib import FTP
from time import strftime

def ftp_connect(path):
    link = FTP(host='example.com', timeout=5)  # Keep the timeout low
    link.login(passwd='ftppass', user='ftpuser')
    debug("%s - Connected to FTP" % strftime("%d-%m-%Y %H.%M"))
    link.cwd(path)
    return link

downloaded = open('/local/path/to/file.tgz', 'wb')  # keep the handle open; tell() tracks progress

def debug(txt):
    print(txt)

link = ftp_connect(path)
file_size = link.size(filename)

max_attempts = 5  # I don't want infinite loops.

while file_size != downloaded.tell():
    try:
        debug("%s while > try, run retrbinary\n" % strftime("%d-%m-%Y %H.%M"))
        if downloaded.tell() != 0:
            # Resume the transfer: rest= sends a REST offset to the server.
            link.retrbinary('RETR ' + filename, downloaded.write, rest=downloaded.tell())
        else:
            link.retrbinary('RETR ' + filename, downloaded.write)
    except Exception as myerror:
        if max_attempts != 0:
            debug("%s while > except, something going wrong: %s\n \tfile lenght is: %i > %i\n" %
                (strftime("%d-%m-%Y %H.%M"), myerror, file_size, downloaded.tell())
            )
            link = ftp_connect(path)
            max_attempts -= 1
        else:
            break
debug("Done with file, attempt to download m5dsum")
[...]

In my log file I found:

01-12-2011 23.30 - Connected to FTP
01-12-2011 23.30 while > try, run retrbinary
02-12-2011 00.31 while > except, something going wrong: timed out
    file length is: 1754695793 > 1754695793
02-12-2011 00.31 - Connected to FTP
Done with file, attempt to download md5sum
Sadly, I have to reconnect to the FTP server even if the file has already been fully downloaded; in my case that is not a problem, because I have to download the md5sum too.

As you can see, I wasn't able to detect the timeout and retry on the same connection; when I hit a timeout, I simply reconnect. If someone knows how to reconnect without creating a new ftplib.FTP instance, let me know ;)
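For what it's worth, FTP objects expose connect() and login() as separate methods, so something along these lines might let you reuse the existing instance after a timeout. This is a rough, untested sketch that reuses the host, credentials and debug() helper from above:

def ftp_reconnect(link, path):
    # Drop whatever is left of the dead control connection, ignoring errors.
    try:
        link.close()
    except Exception:
        pass
    # connect() and login() can be called again on the same FTP object.
    link.connect('example.com', timeout=5)
    link.login(user='ftpuser', passwd='ftppass')
    link.cwd(path)
    debug("%s - Reconnected to FTP" % strftime("%d-%m-%Y %H.%M"))
    return link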

Solution 2:

I implemented code with ftplib that can monitor the connection, reconnect and re-download the file in case of failure. Details here: How to download big file in python via ftp (with monitoring & reconnect)?
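The core of that approach is a retry loop that checks how much of the file is already on disk and asks the server to resume from that offset via retrbinary's rest argument. A condensed, hypothetical sketch of the idea (not the exact code from the linked answer):

import os
from ftplib import FTP

def download_with_resume(host, user, passwd, remote, local, attempts=5):
    for _ in range(attempts):
        ftp = None
        try:
            ftp = FTP(host, user, passwd, timeout=30)
            ftp.voidcmd('TYPE I')                 # binary mode, so SIZE is reliable
            size = ftp.size(remote)
            offset = os.path.getsize(local) if os.path.exists(local) else 0
            if offset >= size:
                return True                       # nothing left to download
            with open(local, 'ab') as f:          # append, so partial data is kept
                ftp.retrbinary('RETR ' + remote, f.write, rest=offset or None)
            if os.path.getsize(local) >= size:
                return True
        except Exception:
            pass                                  # timed out / connection dropped; retry
        finally:
            if ftp is not None:
                try:
                    ftp.quit()
                except Exception:
                    pass
    return False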

Solution 3:

You might try setting the timeout. From the docs:

# timeout in seconds
link = FTP(host=ftp_host, user=ftp_user, passwd=ftp_pass, acct='', timeout=3600)
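With a timeout set, a stalled transfer surfaces as a socket.timeout exception, which you can catch to trigger the reconnect/resume logic from Solution 1. A hedged sketch, reusing the filename and downloaded handle from that answer:

import socket

try:
    link.retrbinary('RETR ' + filename, downloaded.write)
except socket.timeout:
    # The 3600-second timeout elapsed; reconnect and resume from downloaded.tell().
    pass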
