I'm migrating files from a remote server to S3. There are about 10k files, all accessible via HTTP URLs on the remote server. The total size is about 300GB, and no individual file is larger than 1GB. I'm trying to figure out the most efficient way to do this migration. So far I have an EC2 instance with s3cmd and the AWS PHP SDK installed, plus a text file listing all the URLs. I'm able to move files from EC2 to S3 without any issue, but if I download everything to EC2 first, I run out of storage. Is there a way to download one file at a time to EC2 (reading the URLs from the text file), move it to S3 (using s3cmd or the SDK), and then delete it from EC2 before moving on to the next file?
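Something along these lines is what I'm picturing, just a rough sketch using the PHP SDK (v3), where `urls.txt`, the bucket name, and the region are placeholders for my actual values:

```php
<?php
// Sketch: download each URL, push it to S3, delete the local copy, repeat.
// Assumes the AWS SDK for PHP v3 via Composer, credentials from the EC2
// instance role, and allow_url_fopen enabled for copy() on HTTP URLs.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',        // placeholder: the bucket's region
]);

$bucket = 'my-target-bucket';        // placeholder bucket name
$urls   = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($urls as $url) {
    $localFile = '/tmp/' . basename(parse_url($url, PHP_URL_PATH));

    // Download one file from the remote server to local disk
    if (copy($url, $localFile) === false) {
        error_log("Download failed: $url");
        continue;
    }

    // Upload it to S3
    $s3->putObject([
        'Bucket'     => $bucket,
        'Key'        => basename($localFile),
        'SourceFile' => $localFile,
    ]);

    // Delete the local copy before moving on to the next URL,
    // so only one file sits on the EC2 instance at a time.
    unlink($localFile);
}
```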
Ideally I would like to transfer everything straight from the remote server to S3 without staging it on EC2 at all, but I don't think that is possible, unless someone here says it is.
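If a direct transfer is possible, I imagine it would look roughly like this, streaming the remote file into S3 with the PHP SDK's `upload()` helper so nothing is written to the instance's disk (again with placeholder names, and assuming allow_url_fopen):

```php
<?php
// Sketch: stream one remote URL straight into S3, no local copy.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$url    = 'http://remote.example.com/files/archive-0001.zip'; // placeholder URL
$bucket = 'my-target-bucket';                                  // placeholder bucket

// upload() accepts a stream and falls back to a multipart upload for large
// bodies, so the file is piped from the remote server into S3 directly.
$s3->upload($bucket, basename($url), fopen($url, 'r'));
```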
Thanks in advance for the help.