Sunday, January 11, 2015

Uploading multiple files to Amazon S3 from PHP

Is there a way to upload multiple files in one go, rather than having to reconnect for each one?


I am using S3 as storage for my PHP application, which needs to store large numbers of files (100 at a time) of mostly small (about 10 KB) images. Currently I am looping through them and uploading each one individually with this code:



$s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ);
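The surrounding loop looks roughly like this (a sketch reconstructed from the description above; the `$files` array and its source are my assumptions, not part of the original code):

```php
// Hypothetical calling loop: upload each image individually,
// performing a separate HTTPS request (and handshake) per file.
$files = glob('/path/to/images/*.jpg'); // assumed source of the ~100 files

foreach ($files as $uploadFile) {
    $s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ);
}
```

With ~100 files, the per-request connection overhead dominates the transfer time for such small payloads.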


This takes a LONG time: about a minute for 1.5 MB of files. Turning off SSL, as suggested in other answers, reduces this to about 40 seconds, but that's still very slow.


Here is my current code, using the Amazon S3 REST implementation for PHP:



$s3 = new S3($awsAccessKey, $awsSecretKey, false);

function send_to_s3($s3, $bucketName, $uploadFile)
{
    $start = microtime(true);

    // Check that the upload file exists
    if (!file_exists($uploadFile) || !is_file($uploadFile))
        exit("\nERROR: No such file: $uploadFile\n\n");

    // Check for the cURL extension
    if (!extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll'))
        exit("\nERROR: cURL extension not loaded\n\n");

    if ($s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ))
    {
        $end  = microtime(true);
        $took = $end - $start;
        echo "S3::putObjectFile(): File copied to {$bucketName}/{$uploadFile}" . PHP_EOL
            . ' - ' . filesize($uploadFile) . ' bytes in ' . $took . ' seconds<br />';
        return $took;
    }
    else
    {
        print 'error';
    }
}
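For completeness, the function is driven per file like this (a sketch, assuming a `$files` array of local paths; the variable names and the total-time accounting are illustrative, not from the original post):

```php
// Hypothetical driver: call send_to_s3() once per file and sum the
// per-file timings to see where the minute goes.
$files = glob('/path/to/images/*.jpg'); // assumed list of ~100 small images

$total = 0;
foreach ($files as $file) {
    $total += send_to_s3($s3, $bucketName, $file);
}
echo 'Total upload time: ' . $total . ' seconds' . PHP_EOL;
```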


I'd appreciate any help.




