How to flatmap a nested Dataframe in Spark

Spark 2.0+

Convert the DataFrame to a strongly typed Dataset and use Dataset.flatMap:

import spark.implicits._  // assumes a SparkSession named spark

val ds = df.as[(String, String, String, String)]
ds.flatMap {
  case (x1, x2, x3, x4) => x3.split(",").map((x1, x2, _, x4))
}.toDF("x1", "x2", "x3", "x4")  // without explicit names, columns become _1 .. _4
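The row expansion itself is ordinary Scala collection semantics, independent of Spark; a minimal Spark-free sketch of the same transformation (plain Scala, values are illustrative):

```scala
// One input tuple whose third field is comma-separated becomes
// one output tuple per split value.
val rows = Seq(("A", "B", "x,y,z", "D"))
val expanded = rows.flatMap {
  case (x1, x2, x3, x4) => x3.split(",").map((x1, x2, _, x4))
}
// expanded == Seq(("A","B","x","D"), ("A","B","y","D"), ("A","B","z","D"))
```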

Spark 1.3+

Use the split and explode functions:

import org.apache.spark.sql.functions.{explode, split}

val df = Seq(("A", "B", "x,y,z", "D")).toDF("x1", "x2", "x3", "x4")
df.withColumn("x3", explode(split($"x3", ",")))
// yields three rows: (A, B, x, D), (A, B, y, D), (A, B, z, D)

Spark 1.x

Use DataFrame.explode (deprecated since Spark 2.0):

// Note: this appends the split values as a new column (named _1)
// rather than replacing x3.
df.explode($"x3")(_.getAs[String](0).split(",").map(Tuple1(_)))
