Throughout the day, we receive multi-GB files from various clients that we process and load into our MS SQL database. The format of the files is the same, but each file has a unique set of rules applied to it when it's processed. Note that the jobs run with no UI; we've set them up to run on a schedule: wake up, check for a file from a client, and process it if one exists.
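For concreteness, here's a minimal sketch of that wake-check-process cycle. The names (IncomingPath, ProcessFile) are placeholders I'm using for illustration, and the actual rules engine and SQL load step are omitted:

```csharp
// Minimal sketch of one scheduled job: wake up, look for a new client file,
// process it if one exists. IncomingPath and ProcessFile are placeholders.
using System;
using System.Configuration;
using System.IO;

class ClientFileJob
{
    static void Main()
    {
        // Folder the client drops files into; read from app.config <appSettings>.
        string incomingPath = ConfigurationManager.AppSettings["IncomingPath"];

        foreach (string file in Directory.GetFiles(incomingPath))
        {
            ProcessFile(file);   // roughly 2 hours per multi-GB file
        }
    }

    static void ProcessFile(string path)
    {
        // Placeholder for the client-specific rules and the load into MS SQL.
        Console.WriteLine($"Processing {path}");
    }
}
```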
The processing time is about 2 hours per file, and speed isn't a huge concern. The core code is the same; the only differences are in the settings file (app.config). We'd like to move this to a cloud environment so we can scale better, as we're expecting additional clients over the next few months. As I see it, we have a few options:
- Azure: create a cloud service with multiple worker roles, one for each client. The problem I see here is deploying the same worker role with separate settings for each client and then keeping them consistent during upgrades, etc. (one possible workaround is sketched after this list).
- AWS: dynamically launch small instances (t2.small?) to process a file when we get a new one. The initial setup would be more complex, but I think the ongoing maintenance, if built correctly, would be much lighter.
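For the Azure concern above, one approach I've been considering is shipping a single package whose app.config carries a section of keys per client, and selecting the client at runtime (for example via a per-instance environment variable or a queue message). The CLIENT_NAME variable and the "ClientA:*" key convention below are my own assumptions for illustration, not anything Azure prescribes:

```csharp
// One deployment for all clients: app.config holds per-client keys
// ("ClientA:IncomingPath", "ClientA:RuleSet", ...) and the client is chosen
// at runtime. CLIENT_NAME and the key naming are assumptions for illustration.
using System;
using System.Configuration;

static class ClientSettings
{
    public static string Get(string key)
    {
        string client = Environment.GetEnvironmentVariable("CLIENT_NAME");
        if (string.IsNullOrEmpty(client))
            throw new InvalidOperationException("CLIENT_NAME is not set");

        // e.g. CLIENT_NAME=ClientA and key=RuleSet -> appSettings["ClientA:RuleSet"]
        return ConfigurationManager.AppSettings[$"{client}:{key}"];
    }
}
```

With something like this, upgrades would touch a single deployment artifact, and only the per-instance setting would differ between clients.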
Does Azure or AWS offer other products that might be better suited for this? I like the idea of worker roles within Azure, but I'm unclear on how we could deploy and maintain them effectively.