Spark: Convert String to spark.sql.types object in Scala
I have an array of strings, e.g. Array("StringType", "IntegerType", "LongType", "StringType"), in Scala. I need to convert each string to the corresponding spark.sql.types object while iterating, for example:
"StringType" => org.apache.spark.sql.types.StringType, "IntegerType" => org.apache.spark.sql.types.IntegerType, "LongType" => org.apache.spark.sql.types.LongType.
One solution is to create a one-to-one HashMap from the strings to the spark.sql.types objects and look each one up while iterating over the array. Is there a cleaner way to do this?
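The one-to-one map the question describes could be sketched like this (a sketch, assuming the input strings exactly match the Spark type names; typeMap and inputArray are illustrative names):

```scala
import org.apache.spark.sql.types._

object MapLookupExample {
  // Hypothetical lookup table from string names to Spark SQL type objects.
  val typeMap: Map[String, DataType] = Map(
    "StringType"  -> StringType,
    "IntegerType" -> IntegerType,
    "LongType"    -> LongType
  )

  def main(args: Array[String]): Unit = {
    val inputArray = Array("StringType", "IntegerType", "LongType", "StringType")
    // Map itself is a String => DataType function; unknown names throw NoSuchElementException.
    val typeList: Array[DataType] = inputArray.map(typeMap)
    println(typeList.mkString(", "))
  }
}
```

One advantage of the map over a match expression is that it can be built or extended at runtime, e.g. from a configuration file.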
I would use Scala pattern matching, something like:
import org.apache.spark.sql.types._

val typeList = inputArray.map {
  case "StringType"  => StringType
  case "IntegerType" => IntegerType
  case "LongType"    => LongType
  // etc. for the remaining types...
  case other         => throw new RuntimeException(s"Unknown type: $other")
}
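If you control the input format, another option is to use Spark's own DDL type names ("string", "int", "bigint") instead of the Scala class names, and let Spark parse them with DataType.fromDDL (available since Spark 2.3), avoiding the hand-written match entirely. A sketch under that assumption:

```scala
import org.apache.spark.sql.types._

object DdlParseExample {
  def main(args: Array[String]): Unit = {
    // Assumes the inputs use Spark's DDL type names rather than "StringType" etc.
    val ddlNames = Array("string", "int", "bigint")
    // DataType.fromDDL parses each DDL name into the corresponding DataType object.
    val typeList: Array[DataType] = ddlNames.map(DataType.fromDDL)
    println(typeList.mkString(", "))
  }
}
```

This also handles complex types such as "array<string>" or "map<string,int>" for free, which a hand-written match would have to enumerate.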