python - "File does not exist" with spark-submit
I'm trying to launch a Spark application using this command:
time spark-submit --master "local[4]" optimize-spark.py
but I got these errors:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/01/27 15:43:32 INFO SparkContext: Running Spark version 1.6.0
16/01/27 15:43:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/01/27 15:43:32 INFO SecurityManager: Changing view acls to: damianfox
16/01/27 15:43:32 INFO SecurityManager: Changing modify acls to: damianfox
16/01/27 15:43:32 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(damianfox); users with modify permissions: Set(damianfox)
16/01/27 15:43:33 INFO Utils: Successfully started service 'sparkDriver' on port 51613.
16/01/27 15:43:33 INFO Slf4jLogger: Slf4jLogger started
16/01/27 15:43:33 INFO Remoting: Starting remoting
16/01/27 15:43:33 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.0.102:51614]
16/01/27 15:43:33 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 51614.
16/01/27 15:43:33 INFO SparkEnv: Registering MapOutputTracker
16/01/27 15:43:33 INFO SparkEnv: Registering BlockManagerMaster
16/01/27 15:43:33 INFO DiskBlockManager: Created local directory at /private/var/folders/8m/h5qcvjrn1bs6pv0c0_nyqrlm0000gn/T/blockmgr-defb91b0-50f9-45a7-8e92-6d15041c01bc
16/01/27 15:43:33 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
16/01/27 15:43:33 INFO SparkEnv: Registering OutputCommitCoordinator
16/01/27 15:43:33 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/01/27 15:43:33 INFO SparkUI: Started SparkUI at http://192.168.0.102:4040
16/01/27 15:43:33 ERROR SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: Added file file:/project/minimumfunction/optimize-spark.py does not exist.
    at org.apache.spark.SparkContext.addFile(SparkContext.scala:1364)
    at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
    at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
    at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:491)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:491)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)
16/01/27 15:43:34 INFO SparkUI: Stopped Spark web UI at http://192.168.0.102:4040
16/01/27 15:43:34 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/01/27 15:43:34 INFO MemoryStore: MemoryStore cleared
16/01/27 15:43:34 INFO BlockManager: BlockManager stopped
16/01/27 15:43:34 INFO BlockManagerMaster: BlockManagerMaster stopped
16/01/27 15:43:34 WARN MetricsSystem: Stopping a MetricsSystem that is not running
16/01/27 15:43:34 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/01/27 15:43:34 INFO SparkContext: Successfully stopped SparkContext
16/01/27 15:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/01/27 15:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/01/27 15:43:34 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
ERROR - failed to write data to stream: <open file '<stdout>', mode 'w' at 0x10bb6e150>
16/01/27 15:43:34 INFO ShutdownHookManager: Shutdown hook called
16/01/27 15:43:34 INFO ShutdownHookManager: Deleting directory /private/var/folders/8m/h5qcvjrn1bs6pv0c0_nyqrlm0000gn/T/spark-c00170ca-0e05-4ece-a962-f9303bce4f9f
spark-submit --master "local[4]" optimize-spark.py  6.12s user 0.52s system 187% cpu 3.539 total
How can I fix this? Is something wrong with my variables? I've been searching for a long time and can't find a solution. Thanks!
I moved the project folder to the Desktop folder, and now it's working.
It wasn't working before because I had put the project in a folder whose name contains spaces, so the command most likely couldn't find the file.
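For anyone who hits this with a path that has to keep its spaces: quoting or backslash-escaping the path stops the shell from splitting it into separate arguments, which is worth trying before relocating the project. A minimal sketch, with /Users/damianfox/My Project/ standing in as a purely hypothetical location:

# hypothetical path containing a space; the quotes keep the shell
# from passing "My" and "Project/optimize-spark.py" as two arguments
time spark-submit --master "local[4]" "/Users/damianfox/My Project/optimize-spark.py"

# backslash-escaping the space is equivalent
time spark-submit --master "local[4]" /Users/damianfox/My\ Project/optimize-spark.py

Whether this particular Spark version then handles the space correctly internally is a separate question, so moving the project to a space-free path, as above, remains the simplest fix.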