Spark: Convert String to spark.sql.types object in Scala


I have an array of strings, {"StringType", "IntegerType", "LongType", "StringType"}, in Scala, and I need to convert each string to the corresponding spark.sql.types object while iterating over the array.

for example:

"StringType" = org.apache.spark.sql.types.StringType, "IntegerType" = org.apache.spark.sql.types.IntegerType, "LongType" = org.apache.spark.sql.types.LongType.

One solution is to create a 1:1 HashMap from the strings to the spark.sql.types objects and use it while iterating over the array. Is there a cleaner way to do this?
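For reference, the map-based approach described above might look like the following (a minimal sketch; the `inputArray` contents, the `typeMap` name, and the exact set of supported types are assumptions for illustration):

```scala
import org.apache.spark.sql.types._

// Hypothetical 1:1 lookup table from type names to Spark SQL type objects
val typeMap: Map[String, DataType] = Map(
  "StringType"  -> StringType,
  "IntegerType" -> IntegerType,
  "LongType"    -> LongType
)

val inputArray = Array("StringType", "IntegerType", "LongType", "StringType")

// getOrElse makes the failure mode explicit for an unknown type name
val typeList = inputArray.map(s =>
  typeMap.getOrElse(s, throw new RuntimeException(s"unknown type: $s")))
```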

You can use Scala pattern matching, something like:

import org.apache.spark.sql.types._

val typeList = inputArray.map {
  case "StringType"  => StringType
  case "IntegerType" => IntegerType
  // etc...
  case _ => throw new RuntimeException("unknown type")
}
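If the end goal is to build a schema from the converted types, the resulting list can be zipped with column names into a StructType (a sketch; the column names here are hypothetical and not from the question):

```scala
import org.apache.spark.sql.types._

val typeList = Seq(StringType, IntegerType, LongType, StringType)
val columnNames = Seq("name", "age", "id", "city")  // hypothetical column names

// Pair each name with its DataType and wrap the result in a StructType
val schema = StructType(
  columnNames.zip(typeList).map { case (name, dt) => StructField(name, dt) }
)
```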
