13/07/02 13:37:04 WARN conf.Configuration: DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
Exception in thread "main" java.io.IOException: Call to Master.Hadoop/172.20.145.22:9000 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
    at org.apache.hadoop.ipc.Client.call(Client.java:1075)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at $Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
    at org.apache.hadoop.hdfs.DFSClient.<init>
    at org.apache.hadoop.hdfs.DFSClient.<init>
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:372)
    at org.apache.hadoop.examples.WordCount.main(WordCount.java:68)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(Unknown Source)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:804)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:749)
I'm a beginner who has just started out and would appreciate expert advice. Thank you very much.
------ Solution --------------------------------------------------
Have you configured the hosts file on the Windows machine?
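For reference, the entry would look something like the following. The IP and hostname are the ones from the error message above; the path is the standard Windows location, so adjust it if your system differs:
# C:\Windows\System32\drivers\etc\hosts
172.20.145.22    Master.Hadoop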
------ Solution --------------------------------------------------
"Call to Master.Hadoop/172.20.145.22:9000 failed" means your MR program cannot connect to HDFS at all.
Try hard-coding the JobTracker address in your code:
conf.set("mapred.job.tracker", "Master.Hadoop");
------ For reference only -----------------------------------------
That's described very clearly.
------ For reference only -----------------------------------------
I had configured that before. I also just noticed that the "Advanced parameters" under "Map/Reduce Locations" on Windows are not the same as the parameters under Linux; the Windows side has some extra parameters.
------ For reference only -----------------------------------------
conf.set("mapred.job.tracker", "172.20.145.22:9001");
conf.set("fs.default.name", "hdfs://172.20.145.22:9000");
I added both of the lines above, but I still get the same error. I'm a bit confused now; what else could the problem be?
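As an aside (not from the original posts): the stack trace fails while DFSClient opens the connection to the NameNode at port 9000, so a tiny standalone check that only talks to HDFS can show whether the problem is basic HDFS connectivity rather than the job itself. A minimal sketch, assuming the same Hadoop 1.x client API; the URI is the one from this thread and the class name HdfsConnectivityCheck is made up.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectivityCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect directly to the NameNode address from the error message.
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://172.20.145.22:9000"), conf);

        // Listing the root directory forces an RPC to the NameNode.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}

If this listing throws the same EOFException, the MapReduce code itself is not the issue; the Windows client simply cannot reach the NameNode.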
------ For reference only -----------------------------------------
The issue has been resolved. I deleted the original data, re-created the MapReduce project, and reconfigured "Map/Reduce Locations"; the job now runs successfully.
Thanks to moderator tntzbzc for the advice. Thank you, and please keep the pointers coming.