I have tried both of the commands below, and I set the environment variables before launching the scripts, but I keep hitting an "AWS was not able to validate the provided access credentials" error. I don't think there is an issue with the keys themselves. I would appreciate any sort of help fixing this. I am on an Ubuntu t2.micro instance. Regards, Mohan
export AWS_SECRET_ACCESS_KEY=
export AWS_ACCESS_KEY_ID=
./spark-ec2 -k admin-key1 -i /home/ubuntu/admin-key1.pem -s 3 launch my-spark-cluster
./spark-ec2 --key-pair=admin-key1 --identity-file=/home/ubuntu/admin-key1.pem --region=ap-southeast-2 --zone=ap-southeast-2a launch my-spark-cluster
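Before re-running either command, it may help to confirm that both variables are actually set (and non-empty) in the same shell that launches spark-ec2; a common cause of this AuthFailure is an export that never took effect or was made in a different terminal session. A minimal sanity check, printing only the lengths so the secrets are not echoed:

```shell
# Print the length of each credential variable; 0 means it is unset or empty
# in this shell, so spark-ec2/boto would send blank credentials to AWS.
echo "AWS_ACCESS_KEY_ID length: ${#AWS_ACCESS_KEY_ID}"
echo "AWS_SECRET_ACCESS_KEY length: ${#AWS_SECRET_ACCESS_KEY}"
```

If either length is 0, re-export the keys in the current shell (no spaces around `=`) before launching the cluster. Also note that a typical access key ID is 20 characters and a secret key is 40, so other lengths suggest a copy-paste problem.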
AuthFailure
AWS was not able to validate the provided access credentials (request ID: bbe3e1bf-5337-4fdd-9b49-e98e12361148)
Traceback (most recent call last):
  File "./spark_ec2.py", line 1465, in <module>
    main()
  File "./spark_ec2.py", line 1457, in main
    real_main()
  File "./spark_ec2.py", line 1277, in real_main
    opts.zone = random.choice(conn.get_all_zones()).name
  File "/cskmohan/spark-1.4.1/ec2/lib/boto-2.34.0/boto/ec2/connection.py", line 1759, in get_all_zones
    [('item', Zone)], verb='POST')
  File "/cskmohan/spark-1.4.1/ec2/lib/boto-2.34.0/boto/connection.py", line 1182, in get_list
    raise self.ResponseError(response.status, response.reason, body)
boto.exception.EC2ResponseError: EC2ResponseError: 401 Unauthorized
AuthFailure
AWS was not able to validate the provided access credentials