S3 - auto compression before transfer
I want to know:
whether any available tool compresses a file by default before transferring it to S3,
whether a tool has an option to compress and then transfer a file to S3,
or whether I have to compress the file myself with Python libraries and then transfer it to S3.
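For a tool-based route: the AWS CLI (`aws s3 cp`) transfers files but, as far as I know, does not compress them for you, so a common pattern is to gzip first and upload the result. A minimal sketch, with bucket name and paths as placeholders:

```shell
# Create a small JSON file and compress it locally.
printf '[{"name": "test"}]\n' > data.json
gzip -c data.json > data.json.gz

# Upload the compressed file (requires configured AWS credentials;
# "mybucket" and the key are placeholders):
# aws s3 cp data.json.gz s3://mybucket/destination_dir/data.json.gz

# Or stream without keeping a local .gz file ("-" reads from stdin):
# gzip -c data.json | aws s3 cp - s3://mybucket/destination_dir/data.json.gz
```

Setting `Content-Encoding: gzip` on the object (via `--content-encoding gzip`) is worth considering if clients should decompress transparently on download.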
Solution 1:
You can do this with simple Python code if you want to.
import gzip
import json
import boto3

# Data to compress
data = [{'name': 'test'}, {'name': 'test2'}]

# Serialize to a JSON string, then encode to bytes
json_str = json.dumps(data) + "\n"
json_bytes = json_str.encode('utf-8')

# Write the gzip-compressed file locally
jsonfilename = "s3_compressed_file.json.gz"
with gzip.GzipFile(jsonfilename, 'w') as f:
    f.write(json_bytes)

# Upload to S3
s3BucketName = 'mybucket'
s3_resource = boto3.resource('s3')
# The S3 key can differ from the local file name if you want to rename on upload
file_name = 's3_compressed_file-1.json.gz'
# upload_file(local_path, bucket_name, s3_key)
s3_response = s3_resource.meta.client.upload_file(
    jsonfilename, s3BucketName, 'destination_dir/{}'.format(file_name))
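As a variant of the above, if the data is already in memory you can skip the intermediate file entirely: compress the bytes with gzip.compress and upload them with the client's put_object call. A sketch, assuming the same placeholder bucket and key names:

```python
import gzip
import json


def compress_json(data):
    """Serialize data to JSON and gzip-compress it in memory."""
    return gzip.compress((json.dumps(data) + "\n").encode("utf-8"))


def upload_json_gz(data, bucket, key):
    """Compress and upload without writing a local file.

    bucket and key are placeholders here; requires AWS credentials.
    """
    # boto3 is imported here so compress_json works without it installed.
    import boto3

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=compress_json(data),
        ContentType="application/json",
        ContentEncoding="gzip",  # lets clients decompress transparently
    )


# Example (requires AWS credentials):
# upload_json_gz([{'name': 'test'}], 'mybucket',
#                'destination_dir/s3_compressed_file.json.gz')
```

The ContentEncoding metadata is optional but useful: browsers and many HTTP clients will then decompress the object automatically on download.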
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow