A way to load big data in Python from an SFTP server, without using my hard disk

My hard disk does not have enough space, so I uploaded the data to an SFTP server. How can I access the server from Python and load the data for analysis? I want to work entirely in Python and send the output back to the server, without downloading the files to my computer.



Solution 1:[1]

With the Paramiko library, you can use the SFTPClient.open method to retrieve a file-like object that works with remote data. Most Python APIs can then use this file-like object in place of a local file handle.

with sftp.open('filename.txt', bufsize=32768) as f:
    # use f as if you had opened a local file with open()
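
For instance, a minimal end-to-end sketch, assuming a Paramiko SSHClient connection, a CSV file, and pandas for the analysis (the hostname, credentials, and paths below are placeholders, not part of the original answer):

import paramiko
import pandas as pd

# Placeholder connection details -- replace with your own server's.
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('sftp.example.com', username='me', password='secret')

sftp = ssh.open_sftp()
with sftp.open('data/big.csv', bufsize=32768) as f:
    # pandas accepts any file-like object, so the CSV is parsed
    # straight from the SFTP stream; nothing is written to disk
    df = pd.read_csv(f)

ssh.close()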

But you will still be downloading the data. You cannot work with remote data locally without downloading it (that is, without using network bandwidth). You just won't be storing the data on the local file system.
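
If the data does not fit in memory either, the file-like object can be consumed incrementally; a rough sketch, assuming a connected paramiko.SFTPClient named sftp and a line-oriented text file:

# Assumes `sftp` is a connected paramiko.SFTPClient, as above.
count = 0
with sftp.open('data/big.csv', bufsize=32768) as f:
    for line in f:    # SFTPFile supports line-by-line iteration
        count += 1    # replace with your real per-line processing
print(count, 'lines processed without touching the local disk')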


For the purpose of bufsize=32768, see the question "Reading file opened with Python Paramiko SFTPClient.open method is slow".
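
As a hedged illustration of that advice, combining the larger buffer with SFTPFile.prefetch() lets Paramiko request blocks ahead of time, so sequential reads overlap with the network transfer (the file name is again a placeholder):

with sftp.open('data/big.csv', bufsize=32768) as f:
    f.prefetch()        # server starts streaming blocks ahead of the reads
    payload = f.read()  # sequential read now overlaps with the transfer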

Solution 2:[2]

Visit https://pypi.org/project/pysftp/

    import pysftp

    with pysftp.Connection('hostname', username='me', password='secret') as sftp:
        with sftp.cd('public'):             # temporarily chdir to public
            sftp.put('/my/local/filename')  # upload file to public/ on remote
            sftp.get('remote_file')         # get a remote file
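
Since pysftp exposes the same file-like open method as Paramiko, the asker's full round trip (read remote data, analyze it, write the result back) might look roughly like this; the host, paths, and the line-counting step are placeholder assumptions:

    import pysftp

    with pysftp.Connection('hostname', username='me', password='secret') as sftp:
        with sftp.open('input/big.txt') as f:        # remote file-like object
            result = sum(1 for _ in f)               # placeholder analysis
        with sftp.open('output/result.txt', 'w') as out:
            out.write('%d lines\n' % result)         # send the output back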

Visit http://docs.paramiko.org/en/stable/api/sftp.html

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution     Source
Solution 1
Solution 2   Jeyasuriya Natarajan