Getting ClassNotFound Exception While Dumping Configuration in Hadoop


I'm trying a simple program from the "Hadoop in Action" book that merges a series of files from the local file system into one file in HDFS. The code snippet is the same as the one provided in the book.

import java.io.*;
import java.util.*;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.Path;

public class PutMerge {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        Path inputDir = new Path(args[0]);  // first argument has the input directory
        Path hdfsFile = new Path(args[1]);  // concatenated HDFS file name

        try {
            FileStatus[] inputFiles = local.listStatus(inputDir); // list of local files
            FSDataOutputStream out = hdfs.create(hdfsFile);       // target file creation

            for (int i = 0; i < inputFiles.length; i++) {
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                int bytesRead = 0;
                byte[] buff = new byte[256];
                while ((bytesRead = in.read(buff)) > 0) {
                    out.write(buff, 0, bytesRead);
                }
                in.close();
            }
            out.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The program compiles, but when I try to run it I get the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
    at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:217)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
    at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:210)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:468)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1519)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1420)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
    at PutMerge.main(PutMerge.java:16)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    ... 17 more

Based on inputs from other posts, I added the Commons packages. My classpath definition is:

/usr/java/jdk1.7.0_21:/data/commons-logging-1.1.2/commons-logging-1.1.2.jar:/data/hadoop-1.1.2/hadoop-core-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-adapters-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-api-1.1.2.jar:. 

Any clue on why this is not working?

You didn't include Apache Commons Configuration in the classpath.
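For example, Hadoop 1.1.2 ships a commons-configuration jar under its lib directory; appending it to your existing classpath would look something like the line below (the exact path and version are assumptions here, so check your own lib folder):

/usr/java/jdk1.7.0_21:/data/commons-logging-1.1.2/commons-logging-1.1.2.jar:/data/hadoop-1.1.2/hadoop-core-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-adapters-1.1.2.jar:/data/commons-logging-1.1.2/commons-logging-api-1.1.2.jar:/data/hadoop-1.1.2/lib/commons-configuration-1.6.jar:.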

Really, though, you shouldn't need to include anything besides Hadoop itself. Make sure you are running your jar through Hadoop itself:

> hadoop jar myjar.jar
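For example, assuming the compiled class is packaged into putmerge.jar (the jar name and the two paths below are only placeholders), the invocation would look something like:

> hadoop jar putmerge.jar PutMerge /data/input /user/hadoop/merged.txt

The hadoop launcher script adds the jars under its own lib directory (including commons-configuration) to the classpath for you, which is why the NoClassDefFoundError goes away when you run through it.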

