The error "Invalid job type for this context" in spark SQL job with Spark job server -


I am creating a Spark SQL job for Spark Job Server, using HiveContext and following the sample below: https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/HiveTestJob.scala

I am able to start the server, but when I run my application (a Scala class that extends SparkSqlJob), I get the following response:

{
  "status": "error",
  "result": "Invalid job type for this context"
}

Can anyone suggest what is going wrong, or provide a detailed procedure for setting up Spark Job Server for Spark SQL?

The code is below:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark._
import org.apache.spark.sql.hive.HiveContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkHiveJob}

object NewHiveRest extends SparkHiveJob {

  // Accept every request without inspecting the configuration.
  def validate(hive: HiveContext, config: Config): SparkJobValidation = SparkJobValid

  def runJob(hive: HiveContext, config: Config): Any = {
    hive.sql(s"USE default")
    // Count the rows of the passenger table in the default database.
    val maxRdd = hive.sql(s"SELECT count(*) FROM `default`.`passenger`")
    maxRdd.count()
  }
}

For Spark SQL you can use the following sample:

https://github.com/spark-jobserver/spark-jobserver/blob/master/job-server-extras/src/spark.jobserver/SqlTestJob.scala
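For reference, a minimal job in the style of that sample would look roughly like the sketch below. The object name NewSqlRest and the table name passenger are placeholders, and the SparkSqlJob trait and method signatures are taken from the job-server-extras module of that era, so check them against the job server version you are actually running:

import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

// Hypothetical example; any object extending SparkSqlJob follows the same pattern.
object NewSqlRest extends SparkSqlJob {

  // Accept every request; a real job would validate its input config here.
  def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  // The job server passes in the SQLContext backing the context the job was posted to.
  def runJob(sql: SQLContext, config: Config): Any = {
    sql.sql("SELECT count(*) FROM passenger").collect()
  }
}

Also note that, as far as I understand, the job type has to match the type of the context it is submitted to: a SparkSqlJob needs a context created with the SQL context factory and a SparkHiveJob needs one created with the Hive context factory (both ship with job-server-extras). Posting a Hive or SQL job to a plain SparkContext is exactly what produces the "Invalid job type for this context" response.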

