How to load a local file in sc.textFile instead of HDFS

Try explicitly specifying the scheme: sc.textFile("file:///path to the file/"). The error occurs when a Hadoop environment is configured, because a path without a scheme is then resolved against HDFS rather than the local filesystem.
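A minimal sketch of the fix in PySpark. The local path `/home/user/data.txt` and the variable names are hypothetical; `pathlib.Path.as_uri()` is just a convenient way to build a well-formed `file://` URI from an absolute path:

```python
from pathlib import Path

# Build an explicit file:// URI so Hadoop does not fall back to fs.defaultFS.
# "/home/user/data.txt" is a hypothetical local path.
local_path = Path("/home/user/data.txt")
uri = local_path.as_uri()
print(uri)  # file:///home/user/data.txt

# With a SparkContext `sc` (assumed already created), the call would be:
# rdd = sc.textFile(uri)
```

Note that the file must exist at that path on every worker node, not just on the driver, since each executor reads its own splits locally.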

SparkContext.textFile internally calls org.apache.hadoop.mapred.FileInputFormat.getSplits, which in turn uses org.apache.hadoop.fs.FileSystem.getDefaultUri when the scheme is absent. This method reads the "fs.defaultFS" parameter of the Hadoop configuration. If the HADOOP_CONF_DIR environment variable is set, that parameter is usually "hdfs://…"; otherwise it defaults to "file://".
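The resolution logic can be sketched as follows. This is a simplified Python illustration of the behavior described above, not Hadoop's actual implementation, and `hdfs://namenode:8020` is a hypothetical fs.defaultFS value:

```python
from urllib.parse import urlparse

def resolve(path: str, fs_default: str) -> str:
    """Sketch of Hadoop-style path resolution: a path with an explicit
    scheme is used as-is; a bare path is qualified against fs.defaultFS."""
    if urlparse(path).scheme:
        return path  # scheme present (file://, hdfs://, ...): use as-is
    return fs_default.rstrip("/") + path  # no scheme: prepend default FS

# With HADOOP_CONF_DIR set, fs.defaultFS is typically something like
# hdfs://namenode:8020, so a bare path resolves to HDFS:
print(resolve("/data/input.txt", "hdfs://namenode:8020"))
# -> hdfs://namenode:8020/data/input.txt

# An explicit file:// scheme overrides the default filesystem:
print(resolve("file:///data/input.txt", "hdfs://namenode:8020"))
# -> file:///data/input.txt
```

This is why the same sc.textFile call behaves differently depending on whether a Hadoop configuration is present: the path itself is ambiguous until a scheme is attached.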
