Cannot read a file from HDFS using Spark
Here is the solution: sc.textFile("hdfs://nn1home:8020/input/war-and-peace.txt"). How did I find out nn1home:8020? Just search for the file core-site.xml and look for the XML property fs.defaultFS.
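To make the lookup concrete, here is a small sketch. The core-site.xml fragment below is a hypothetical example (the real file usually lives somewhere like /etc/hadoop/conf/, depending on your distribution); the grep pipeline shows one way to pull out the NameNode URI:

```shell
# Hypothetical core-site.xml fragment, written locally just for illustration:
cat > core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://nn1home:8020</value>
  </property>
</configuration>
EOF

# Extract the NameNode URI (the value of the fs.defaultFS property):
grep -A1 '<name>fs.defaultFS</name>' core-site.xml | grep -o 'hdfs://[^<]*'
```

The URI this prints is exactly the prefix to put in front of your HDFS path when calling sc.textFile.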
On the command line, you can use spark-shell -i file.scala to run code written in file.scala.
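As a sketch, the script below writes a hypothetical count.scala (reusing the HDFS path from the first tip; both the file name and path are assumptions) and then shows the spark-shell invocation, which of course requires a Spark installation to actually run:

```shell
# Hypothetical count.scala: count the lines of a file on HDFS.
# The HDFS URI is an assumption taken from the earlier tip.
cat > count.scala <<'EOF'
val lines = sc.textFile("hdfs://nn1home:8020/input/war-and-peace.txt")
println(s"line count: ${lines.count()}")
EOF

# Run it non-interactively (requires Spark on the PATH):
# spark-shell -i count.scala
```

Note that spark-shell predefines sc (the SparkContext), so the script can use it without any setup.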
Forget about the Hadoop UGI: a JDBC driver just needs the raw JAAS configuration to create a Kerberos ticket on the fly (with useKeyTab set to true and useTicketCache set to false). System properties:
java.security.krb5.conf => (optional) non-default Kerberos conf
java.security.auth.login.config => JAAS config file
javax.security.auth.useSubjectCredsOnly => must be forced to "false" (the default has changed in some Java release, duh)
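For illustration, a minimal JAAS configuration file might look like the fragment below. The entry name, keytab path, and principal are all placeholders you would replace with your own; Krb5LoginModule and its useKeyTab/useTicketCache options are the standard Sun/Oracle Kerberos login module:

```
/* hypothetical client.jaas -- entry name, keytab path, and principal are placeholders */
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  useTicketCache=false
  keyTab="/etc/security/keytabs/alice.keytab"
  principal="alice@EXAMPLE.COM";
};
```

You would then point the JVM at it with the system properties listed above, e.g. -Djava.security.auth.login.config=client.jaas -Djavax.security.auth.useSubjectCredsOnly=false (plus -Djava.security.krb5.conf=... only if your krb5.conf is in a non-default location).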