How to explode an array into multiple columns in Spark

Index the array column with `Column`'s `apply` method (the `(i)` syntax, equivalent to `getItem(i)`), producing one aliased column per element:

import org.apache.spark.sql.functions.col

// Project id plus one column per array element; 3 is the
// (hardcoded) number of elements to pull out of DataArray.
df.select(
  col("id") +: (0 until 3).map(i => col("DataArray")(i).alias(s"col$i")): _*
)
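A fuller sketch of the same idea, with the column count derived from the data instead of hardcoded. The `DataFrame` name, sample rows, and the use of `max(size(...))` to find the widest array are assumptions for illustration; elements past the end of a shorter array come back as null:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, max, size}

object ArrayToColumns {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("array-to-columns")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data: an id plus an array column
    val df = Seq(
      (1, Array(10, 20, 30)),
      (2, Array(40, 50))
    ).toDF("id", "DataArray")

    // Find the widest array so the column count need not be hardcoded
    val n = df.agg(max(size(col("DataArray")))).head().getInt(0)

    // Single select projecting id, col0 .. col(n-1)
    val result = df.select(
      col("id") +: (0 until n).map(i => col("DataArray")(i).alias(s"col$i")): _*
    )
    result.show()
    spark.stop()
  }
}
```

The extra `max(size(...))` aggregation costs a pass over the data, so for very large inputs a known fixed width (as in the snippet above) is cheaper.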
