java - Accessing Hadoop from a remote machine


I have Hadoop set up (pseudo-distributed) on a server VM, and I'm trying to use the Java API to access HDFS.

The fs.default.name on the server is hdfs://0.0.0.0:9000 (since localhost:9000 wouldn't accept requests from remote hosts).

I can connect to the server on port 9000:

    $ telnet srv-lab 9000
    Trying 1*0.*.30.95...
    Connected to srv-lab.
    Escape character is '^]'.
    ^C
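The same reachability test can be run from the Java side with a plain TCP socket, which helps rule out firewall or routing differences between the machine where telnet was run and the machine running the client. This is only a sketch; the host and port (srv-lab, 9000) are the ones from the question.

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Attempt a raw TCP connect, equivalent to the telnet test above.
    static String check(String host, int port, int timeoutMillis) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMillis);
            return "connected to " + host + ":" + port;
        } catch (Exception e) {
            return "connection failed: " + e;
        }
    }

    public static void main(String[] args) {
        // srv-lab:9000 is the NameNode address from the question; adjust as needed.
        System.out.println(check("srv-lab", 9000, 5000));
    }
}
```

If this prints "connection failed" on the client machine while telnet succeeds elsewhere, the problem is network-level rather than in the HDFS client code.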

which indicates to me that the connection should work fine. The Java code I'm using is:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    try {
        Path pt = new Path("hdfs://srv-lab:9000/test.txt");
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://srv-lab:9000");
        FileSystem fs = FileSystem.get(conf);
        BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
        String line = br.readLine();
        while (line != null) {
            System.out.println(line);
            line = br.readLine();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }

but what I get is:

    java.net.ConnectException: Call From clt-lab/1*0.*.2*2.205 to srv-lab:9000 failed on connection
    exception: java.net.ConnectException: Connection refused; For more details see:
    http://wiki.apache.org/hadoop/ConnectionRefused

So, any hints on why the connection is refused even though connecting through telnet works fine?

Your HDFS entry is wrong. fs.default.name has to be set to hdfs://srv-lab:9000 on the server, not hdfs://0.0.0.0:9000. Set it and restart the cluster; that will fix the issue.
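For reference, the corresponding entry in core-site.xml would look roughly like this (a sketch; the file's location depends on your Hadoop install, typically under conf/ or etc/hadoop/, and on Hadoop 2.x the preferred key name is fs.defaultFS, with fs.default.name kept as a deprecated alias):

```xml
<configuration>
  <property>
    <!-- NameNode URI; srv-lab:9000 is the hostname/port from the question -->
    <name>fs.default.name</name>
    <value>hdfs://srv-lab:9000</value>
  </property>
</configuration>
```

Binding to the hostname rather than 0.0.0.0 matters because the client embeds this value in its RPC handshake, so the server-side and client-side values need to agree.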

