Wednesday, June 6, 2012

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, java.io.IOException: Exception reading file:/

Exception in Hive:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask, java.io.IOException: Exception reading file:/

Error stack:


Error initializing attempt_201206070234_0004_m_000002_0:
java.io.IOException: Exception reading file:/../Hadoop/hdfs/tmp/mapred/local/ttprivate/taskTracker/shashwat/jobcache/job_201206070234_0004/jobToken
	at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:135)
	at org.apache.hadoop.mapreduce.security.TokenCache.loadTokens(TokenCache.java:165)
	at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1154)
	at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1091)
	at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2360)
	at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: File file:/../Hadoop/hdfs/tmp/mapred/local/ttprivate/taskTracker/shashwat/jobcache/job_201206070234_0004/jobToken does not exist.
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:372)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
	at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:418)
	at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:129)
	... 5 more

Error initializing attempt_201206070234_0004_m_000002_1:
java.io.IOException: Exception reading file:/../Hadoop/hdfs/tmp/mapred/local/ttprivate/taskTracker/shashwat/jobcache/job_201206070234_0004/jobToken
	at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:135)
	at org.apache.hadoop.mapreduce.security.TokenCache.loadTokens(TokenCache.java:165)
	at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1154)
	at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1091)
	at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2360)
	at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: File file:/../Hadoop/hdfs/tmp/mapred/local/ttprivate/taskTracker/shashwat/jobcache/job_201206070234_0004/jobToken does not exist.
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:372)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
	at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
	at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:418)
	at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:129)
	... 5 more


Probable Reason: 
You are running a query that launches a MapReduce job, such as (select count(*) from tablename), and you have defined something like this in core-site.xml:

<property>
  <name>hadoop.tmp.dir</name>
  <value>../Hadoop/hdfs/tmp</value>
</property>

Because the value is a relative path, when the MapReduce job runs it looks for that directory relative to wherever the daemon was started, does not find it, and throws a file-not-found error.
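To see why a relative value is fragile, the sketch below (using purely illustrative directories that have nothing to do with your cluster) shows the same relative path resolving to two different absolute locations depending on the current working directory:

```shell
# The same relative path resolves differently depending on where the
# process was started; the directories are illustrative and need not exist.
cd /usr/bin
echo "from /usr/bin: $(readlink -m ../Hadoop/hdfs/tmp)"   # /usr/Hadoop/hdfs/tmp
cd /tmp
echo "from /tmp:     $(readlink -m ../Hadoop/hdfs/tmp)"   # /Hadoop/hdfs/tmp
```

This is exactly the situation the TaskTracker ends up in: it resolves the relative hadoop.tmp.dir against its own working directory and finds nothing there.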


Solution:
Remove the hadoop.tmp.dir property from core-site.xml and try again. :) It worked in my case. :)
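If you still want a custom temporary directory rather than deleting the property, an alternative (the path below is just an example; use a directory that actually exists on your machine and is writable by the Hadoop user) is to give hadoop.tmp.dir an absolute path, so every daemon resolves it identically:

```
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/shashwat/Hadoop/hdfs/tmp</value>
</property>
```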

