How to manually deploy a 3rd-party utility jar for an Apache Spark cluster?


I have an Apache Spark cluster (multi-node) and I want to manually deploy utility jars to each Spark node. Where should I put these jars? Example: spark-streaming-twitter_2.10-1.6.0.jar

I know I can use Maven to build a fat jar that includes these jars, but I would rather deploy these utilities manually. That way, programmers do not have to bundle and deploy these utility jars themselves.
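For context, the fat-jar route I want to avoid looks roughly like this (assuming the maven-shade-plugin is configured in the pom; the main class and the shaded artifact name are placeholders and depend on your build config):

    mvn clean package
    spark-submit --class com.example.TwitterJob --master yarn-cluster target/my-app-1.0-shaded.jar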

Any suggestions?

1. Copy the 3rd-party jars to a reserved HDFS directory,
for example hdfs://xxx-ns/user/xxx/3rd-jars/
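A one-time upload could be done like this (the directory and jar name simply follow the example above):

    hdfs dfs -mkdir -p hdfs://xxx-ns/user/xxx/3rd-jars/
    hdfs dfs -put spark-streaming-twitter_2.10-1.6.0.jar hdfs://xxx-ns/user/xxx/3rd-jars/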

2. In spark-submit, specify these jars using the HDFS path;
with the hdfs: scheme, executors pull the files and jars down from the HDFS directory:

--jars hdfs://xxx-ns/user/xxx/3rd-jars/xxx.jar   
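A full submit command might then look like this (the application class and application jar are placeholders):

    spark-submit \
      --master yarn-cluster \
      --class com.example.TwitterJob \
      --jars hdfs://xxx-ns/user/xxx/3rd-jars/spark-streaming-twitter_2.10-1.6.0.jar \
      my-app.jar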

3. spark-submit will not repeatedly upload these jars.

Client: Source and destination file systems are the same. Not copying hdfs://xxx-ns/user/xxx/3rd-jars/xxx.jar
