Friday, August 28, 2015

Running Spark 1.4.1 on EC2

I am trying to launch a cluster using spark-ec2. I use the following command:

./spark-ec2 /Users/homeFolder/spark-east.pem -k spark-east -t m3.xlarge -s 1 --region=us-east-1 --zone=us-east-1a --spark-version=1.4.1 --hadoop-major-version=2 --ebs-vol-size=100 launch test-cluster 
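For reference, the spark-ec2 usage in the Spark documentation passes the key file behind the -i/--identity-file flag rather than as a bare positional argument, so (as an assumption about what was intended) the same launch in that form would look like:

    ./spark-ec2 -k spark-east -i /Users/homeFolder/spark-east.pem -s 1 \
        --region=us-east-1 --zone=us-east-1a --spark-version=1.4.1 \
        --hadoop-major-version=2 --ebs-vol-size=100 launch test-cluster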

I receive the following error:

Deploying files to master...
Traceback (most recent call last):
  File "./spark_ec2.py", line 1465, in <module>
    main()
  File "./spark_ec2.py", line 1457, in main
    real_main()
  File "./spark_ec2.py", line 1293, in real_main
    setup_cluster(conn, master_nodes, slave_nodes, opts, True)
  File "./spark_ec2.py", line 794, in setup_cluster
    modules=modules
  File "./spark_ec2.py", line 1059, in deploy_files
    subprocess.check_call(command)
  File "/usr/lib/python2.7/subprocess.py", line 535, in check_call
    retcode = call(*popenargs, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 522, in call
    return Popen(*popenargs, **kwargs).wait()
  File "/usr/lib/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1335, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
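An OSError: [Errno 2] raised inside Popen usually means the program that subprocess.check_call tried to run could not be found at all, rather than that it ran and exited with an error. If deploy_files is shelling out to rsync at this point (an assumption, since the traceback does not show the command it was building), a quick check on the machine running spark-ec2 would be:

    # Assumes the failing subprocess is rsync, used to copy the generated
    # config files to the master; verify it is on the local PATH.
    command -v rsync || echo "rsync not found on PATH"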



