I'm pretty sure I'm doing things the wrong way and I'd like to know the right way.
The problem I face is:
- I have a distributed batch job
- I use AWS spot instances to run it
- It's always a one-off request (so I'm after simple solutions)
- The instances need some configuration and setup, and then they need to be sent tasks to do
I typically solve it like this:
- Bring up one worker and configure it
- Save an image of it
- Then bring up all the other instances
- Then I do a big, annoying copy-paste job to build a shell script like this:
    HOST=123.123.123.1 TASK=task1; ssh ubuntu@$HOST task_runner $TASK &
    HOST=123.123.123.2 TASK=task2; ssh ubuntu@$HOST task_runner $TASK &
    HOST=123.123.123.3 TASK=task3; ssh ubuntu@$HOST task_runner $TASK &
    ...
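For what it's worth, I know that copy-paste blob really wants to be a loop over two files, something like this sketch (hosts.txt and tasks.txt are names I've made up, one entry per line):

    #!/bin/bash
    # Pair line N of hosts.txt with line N of tasks.txt and
    # dispatch each task to its host in the background.
    while read -r HOST TASK; do
        ssh ubuntu@"$HOST" task_runner "$TASK" &
    done < <(paste hosts.txt tasks.txt)
    wait    # don't return until every remote task_runner exits

But that still leaves the setup, inventory, and teardown steps by hand, which is what I'm really asking about.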
I'd like to do this better, so I'm after ways to make it easier to:
- Set up AWS instances. Sometimes there's boot-time code that needs to run, which means more big copy-paste jobs like the one above (see the CLI sketch after this list)
- Get the instance list. No more copy-pasting!
- Run all the tasks on all the instances.
- Shut them down when I'm done
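For the setup and instance-list points, the closest I've cobbled together is the plain AWS CLI. This is only a sketch: the AMI ID, instance type, tag value, and setup.sh are all placeholders for my saved image and boot script:

    # Launch ten spot workers from the saved image; setup.sh (placeholder
    # name) runs at boot via user data, and the tag makes them findable.
    aws ec2 run-instances \
        --image-id ami-XXXXXXXX \
        --count 10 \
        --instance-type c5.large \
        --instance-market-options 'MarketType=spot' \
        --user-data file://setup.sh \
        --tag-specifications 'ResourceType=instance,Tags=[{Key=Name,Value=batch-worker}]'

    # Collect the running workers' public IPs -- no more copy-pasting.
    aws ec2 describe-instances \
        --filters 'Name=tag:Name,Values=batch-worker' \
                  'Name=instance-state-name,Values=running' \
        --query 'Reservations[].Instances[].PublicIpAddress' \
        --output text | tr '\t' '\n' > hosts.txt

Is this roughly what people actually do, or is there a tool that wraps all of it?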
Ideally I'd like to cover the whole lifecycle, end to end.
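To close out the lifecycle, I'd guess teardown is the same tag query in reverse, something like:

    # Terminate every running instance carrying the batch-worker tag.
    aws ec2 terminate-instances --instance-ids $(
        aws ec2 describe-instances \
            --filters 'Name=tag:Name,Values=batch-worker' \
                      'Name=instance-state-name,Values=running' \
            --query 'Reservations[].Instances[].InstanceId' \
            --output text)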
Lastly, I want something pretty simple. Because it's a one-off task, I don't really want to have to deal with a whole lot of config options and wiring.