Thursday, August 1, 2013

Eclipse Hadoop plugin configuration

With the Hadoop Eclipse plugin set up, I wrote a test class that prints the configuration information. It turns out that the configuration printed under Eclipse comes from the plugin's own bundled defaults rather than from the Hadoop installation directory. For example, my core-site.xml sets fs.default.name to hdfs://localhost:9000, but Eclipse prints fs.default.name=file:///. When the same class is packaged into a jar and run from the shell command line, the printed information is the Hadoop installation directory's configuration, i.e. hdfs://localhost:9000.
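
One way to make a run inside Eclipse see the installed settings is to add the real core-site.xml to the Configuration by hand. A minimal sketch, assuming the installation keeps its configuration in /usr/local/hadoop/conf (the class name InstalledConfCheck and the path are placeholders, not part of the original code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class InstalledConfCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // The Eclipse run classpath only contains the plugin's bundled *-default.xml
        // files, so load the installed core-site.xml explicitly.
        conf.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
        // Should now print hdfs://localhost:9000 instead of file:///
        System.out.println("fs.default.name=" + conf.get("fs.default.name"));
    }
}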

The test code is from page 135 of Hadoop: The Definitive Guide, Second Edition (revised).

package lin;

import java.util.Map.Entry;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class ConfigurationPrinter extends Configured implements Tool {

    static {
        // core-default.xml and core-site.xml are loaded by Configuration itself;
        // add the HDFS and MapReduce configuration files as well.
        Configuration.addDefaultResource("hdfs-default.xml");
        Configuration.addDefaultResource("hdfs-site.xml");
        Configuration.addDefaultResource("mapred-default.xml");
        Configuration.addDefaultResource("mapred-site.xml");
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Print every key/value pair in the merged configuration.
        for (Entry<String, String> entry : conf) {
            System.out.printf("%s=%s\n", entry.getKey(), entry.getValue());
        }
        return 0;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new ConfigurationPrinter(), args);
        System.exit(exitCode);
    }
}
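
For reference, this is roughly how the class can be packaged and run from the shell; the jar name, output directory, and Hadoop core jar name are assumptions and depend on the Hadoop version. The bin/hadoop script puts the installation's conf directory on the classpath, which is why the command-line run sees the installed settings:

mkdir -p classes
javac -classpath $HADOOP_HOME/hadoop-core-*.jar -d classes lin/ConfigurationPrinter.java
jar cf configprinter.jar -C classes .
hadoop jar configprinter.jar lin.ConfigurationPrinter | grep fs.default.name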


Output under Eclipse:
fs.default.name=file:///

Output after packaging into a jar and running it from the command line:
fs.default.name=hdfs://localhost:9000

Is this situation normal?

------ Solution --------------------------------------------
It's not a problem; no need to worry about it.
