I'm trying to do a "hello world" with the new boto3 client for AWS.
The use-case I have is fairly simple: get an object from S3 and save it to a file.
In boto 2.X I would do it like this:
import boto
key = boto.connect_s3().get_bucket('foo').get_key('foo')
key.get_contents_to_filename('/tmp/foo')
In boto3 I can't find a clean way to do the same thing, so I'm manually iterating over the "Streaming" object:
import boto3
key = boto3.resource('s3').Object('fooo', 'docker/my-image.tar.gz').get()
with open('/tmp/my-image.tar.gz', 'wb') as f:
    chunk = key['Body'].read(1024*8)
    while chunk:
        f.write(chunk)
        chunk = key['Body'].read(1024*8)
And it works fine. I was wondering: is there any "native" boto3 function that will do the same task?
import boto3
s3_client = boto3.client('s3')
s3_client.download_file('fooo', 'docker/my-image.tar.gz', '/tmp/my-image.tar.gz')
# where the bucket name is fooo and the key name is docker/my-image.tar.gz
# This writes the file to /tmp/my-image.tar.gz