Wednesday, September 2, 2015

Out of memory error while uploading a large CSV file to Amazon S3

I am using the AWS SDK for JavaScript to upload a CSV file (~50 MB) to an S3 bucket.

I get an out-of-memory error while uploading this file, although a 35 MB file uploads successfully.

Code so far:

<script src="http://ift.tt/1LKixaU"></script>
<script src="http://ift.tt/1LdO4RA"></script>

<script type="text/javascript">
AWS.config.update({accessKeyId: 'xxxxxxxxxxxxx', secretAccessKey:'xxxxxxxxxxx'});
AWS.config.region = 'ap-southeast-2';

var s3 = new AWS.S3();

$.get("gs_filename.csv", function(data) {
    var params = {Bucket: 'gstest', Key: 'gs_filename.csv', Body: data};
    var options = {partSize: 10 * 1024 * 1024, queueSize: 1};

    s3.upload(params, options, function(err, data) {
        console.log(err, data);
    });
});
</script>

I've changed the bucket's CORS configuration to expose the ETag header as well:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://ift.tt/1f8lKAh">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <ExposeHeader>ETag</ExposeHeader>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

Any input on where I am going wrong?



