How are jobs assigned to executors in Spark Streaming?

Actually, in the current implementation of Spark Streaming, under the default configuration, only one job is active (i.e. under execution) at any point of time. So if one batch's processing takes longer than the batch interval (say, 10 seconds), then the next batch's jobs will stay queued. This can be changed with an experimental Spark property, "spark.streaming.concurrentJobs", which is 1 by default.
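As a minimal sketch of how you would set this property (the app name and batch interval below are arbitrary placeholders, and the property itself is experimental and undocumented, so use it with care):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Set the experimental property before the StreamingContext is created.
// The default is 1, meaning batches are processed one at a time.
val conf = new SparkConf()
  .setAppName("ConcurrentJobsExample") // hypothetical app name
  .set("spark.streaming.concurrentJobs", "2")

// 10-second batch interval, matching the example above
val ssc = new StreamingContext(conf, Seconds(10))
```

Note that with more than one concurrent job, batches are no longer guaranteed to be processed strictly one after another, which is part of why the property is still marked experimental.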