Thursday, September 3, 2015

Backup script written in Node.js for AWS S3: making it synchronous to back up millions of files

I have searched Google, Stack Overflow, and Bing many times, but I have found no answer to my problem.

How can I make my backup script run synchronously? With too many files in the folder and subfolders, the script stops with a "too many open files" error, because every call to fs.createReadStream opens a file at once.

I hope somebody can help me. Thank you. Greetings, Sven

var AWS = require('aws-sdk'),
    s3 = new AWS.S3(),
    fs = require('fs'),
    wrench = require('wrench');

var smadir = "/Users/myuser/myfolder",
    smafiles = wrench.readdirSyncRecursive(smadir);

smafiles.forEach(function (file) {
  var params = {Bucket: 'mybucked', Key: file, Body: fs.createReadStream(smadir + '/' + file)};
  var options = {partSize: 10 * 1024 * 1024, queueSize: 1};
  s3.upload(params, options, function (err, data) {
    console.log(err, data);
  });
});
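The forEach above starts every s3.upload (and opens every read stream) at once, which is what exhausts the file-descriptor limit. One way to fix it, sketched below under the assumption that the rest of the setup stays the same, is a small helper (the name uploadSequentially is mine, not from any library) that only starts the next upload after the previous one's callback has fired:

```javascript
// Hedged sketch: process the file list strictly one at a time, so at most
// one read stream is open at any moment.
function uploadSequentially(files, uploadOne, done) {
  var i = 0;
  (function next(err) {
    if (err) return done(err);                // abort on the first failure
    if (i >= files.length) return done(null); // every file has been processed
    uploadOne(files[i++], next);              // the callback triggers the next file
  })();
}

// Intended use with the script above (smadir, smafiles, s3, fs come from it):
// uploadSequentially(smafiles, function (file, cb) {
//   s3.upload(
//     { Bucket: 'mybucked', Key: file, Body: fs.createReadStream(smadir + '/' + file) },
//     { partSize: 10 * 1024 * 1024, queueSize: 1 },
//     cb
//   );
// }, function (err) {
//   if (err) console.error('backup failed:', err);
//   else console.log('backup finished');
// });
```

Because the next upload is only started from inside the previous callback, there is never more than one file open at a time; the same pattern generalizes to a small fixed pool if strict one-at-a-time turns out to be too slow.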



