apache spark - Override default cookbook chef variables


I am looking for specific instructions on how to override default values in a third-party cookbook. For example, I am using the apache_spark cookbook (https://github.com/clearstorydata-cookbooks/apache_spark)

and want to override the attribute default['apache_spark']['standalone']['master_host'].

I tried making a main recipe in which I add node.default['apache_spark']['standalone']['master_host'] = 'foo.com'

and execute it using chef-solo with a run list like:

run_list(
  'recipe[main]',
  'recipe[apache_spark::spark-standalone-worker]'
)

but it does not seem to work. Any suggestions on how this needs to be done? My main recipe is here: https://github.com/vibhuti/chef-main
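
For reference, the setup described above amounts to roughly the following (the role file name is just illustrative; the attribute value and recipe names are the ones from above):

# main/recipes/default.rb -- the attempted override
node.default['apache_spark']['standalone']['master_host'] = 'foo.com'

# roles/spark_worker.rb -- one possible home for the run list shown above
name 'spark_worker'
run_list(
  'recipe[main]',
  'recipe[apache_spark::spark-standalone-worker]'
)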

The correct fix is to make a wrapper cookbook and set the values in that cookbook's attributes file (main/attributes/default.rb):

override['apache_spark']['standalone']['master_host'] = 'foo.com' 
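
In context, the attributes file is just that one line; the comments below are a sketch of why it works, based on Chef's standard attribute precedence, where the override level beats the default level used by the upstream cookbook's own attribute files:

# main/attributes/default.rb -- wrapper cookbook attribute file
# 'override' sits at a higher precedence level than the 'default' level used by
# apache_spark's own attributes, so this value wins once both cookbooks are loaded.
override['apache_spark']['standalone']['master_host'] = 'foo.com'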

Also make sure to add a dependency in main's metadata.rb to force the load ordering to be correct:

depends 'apache_spark' 
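
Putting it together, the wrapper cookbook's metadata.rb would look roughly like this; the optional recipe shown after it is an assumption for illustration, useful only if you prefer to have the wrapper include the Spark recipe itself rather than listing it separately in the run list:

# main/metadata.rb -- declares the dependency so the attribute load ordering is correct
name    'main'
version '0.1.0'
depends 'apache_spark'

# main/recipes/default.rb (optional) -- lets the run list be just 'recipe[main]'
include_recipe 'apache_spark::spark-standalone-worker'

With that optional recipe in place, the run_list from the question can be reduced to 'recipe[main]'; without it, keep both entries as before.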
