Hadoop Usage Learning Notes
3. Local Map-Reduce Debugging, Full Debug Walkthrough (Part 1)
Delete everything in the Resources directory of the earlier project except the log4j configuration. Then add the local library (the Hadoop folder previously copied down from the cluster): add all of the files under its share/hadoop directory as a single library in the IDE.
Next, comment out the line that deletes /test/output, because a local run cannot delete the remote HDFS directory this way:
// First delete the output directory
// deleteDir(jobConf, args[1]);
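For context, deleteDir is just a small helper around the HDFS FileSystem API. A minimal sketch of what it might look like is below; the method name and arguments come from the call above, but the body and the HdfsUtil class name are my assumptions, not the original code:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUtil {
    // Recursively delete a directory on whatever file system the
    // Configuration points to (HDFS on the cluster, the local FS when debugging).
    public static void deleteDir(Configuration conf, String dir) throws IOException {
        FileSystem fs = FileSystem.get(conf);
        Path path = new Path(dir);
        if (fs.exists(path)) {
            fs.delete(path, true); // true = delete recursively
        }
    }
}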
Instead, we delete it manually on a cluster machine:
./bin/hdfs dfs -rm -r /test/output
Run the job, and we hit an exception:
16/08/04 19:17:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/04 19:17:52 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
16/08/04 19:17:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
16/08/04 19:17:53 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/08/04 19:17:54 INFO input.FileInputFormat: Total input paths to process : 2
16/08/04 19:17:55 INFO mapreduce.JobSubmitter: number of splits:2
16/08/04 19:17:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local2007553514_0001
16/08/04 19:17:56 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-862911/mapred/staging/sfdba2007553514/.staging/job_local2007553514_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:125)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:240)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at com.hash.test.hadoop.mapred.wordcount.WordCount.run(WordCount.java:54)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.hash.test.hadoop.mapred.wordcount.WordCount.main(WordCount.java:59)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
This is caused by the Windows directory-permission check at NativeIO.java:609 (see the stack trace above). Rewrite that code:
public static boolean access(String path, NativeIO.Windows.AccessRight desiredAccess) throws IOException {
    return true;
    // return access0(path, desiredAccess.accessRight());
}
Make the method return true unconditionally. (To rewrite it, create a class named NativeIO under the same package org.apache.hadoop.io.nativeio in your own project, copy in the full original source, and replace only the method above; the project's class then takes precedence over the copy inside the Hadoop jars.) Run again and the job succeeds. Now we can set breakpoints and start debugging.
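If you want to confirm that your own copy of NativeIO is really the one being loaded (this check is my own addition, not part of the original notes), printing the code source of the loaded class is a quick way to tell:

import java.security.CodeSource;

public class NativeIOCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // Load the class by name and print where it was loaded from.
        Class<?> c = Class.forName("org.apache.hadoop.io.nativeio.NativeIO");
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // Expect a path inside the project's build output, not hadoop-common-*.jar,
        // if the shadowing copy is in effect.
        System.out.println(src == null ? "unknown (bootstrap)" : src.getLocation());
    }
}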