Controlling logging functionality in hadoop

How do you control logging functionality in Hadoop? Hadoop uses a default log4j.properties file for controlling logs. My use case is to control the logs generated by my own classes.

Hadoop daemon processes like the JobTracker, TaskTracker, NameNode and DataNode use the log4j.properties file from their respective host node's Hadoop conf directory. The rootLogger is set to "INFO,console", which logs all messages at level INFO and above to the console.
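For reference, the default setting described above looks roughly like this in a stock log4j.properties (a sketch; the exact file shipped with your distribution may differ):

```properties
# Default root logger: INFO-level messages and above go to the console
log4j.rootLogger=INFO,console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```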

I trigger Hadoop jobs using an Oozie workflow. I tried passing my custom log4j.properties file to the job by setting the -Dlog4j.configuration=path/to/ system property, but it does not work. The job still takes its log4j properties from the default file.

I am not supposed to touch the default log4j.properties file.

I am using Oozie v3.1.3-incubating, Hadoop v0.20 and Cloudera CDH v4.0.1.

How can I override the default log4j.properties file, or how can I control logging for my own classes?

1 Reply

@soujanyabargavi Accomplishing this goal is a bit involved and may require some additional troubleshooting. That said, there are a couple of ways to set custom log levels in Hadoop, two of which are listed below.

  1. Custom levels in code
  2. Custom levels in configuration
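For option 1, with log4j 1.x on the classpath you can raise or lower the level of a single logger programmatically, without touching the cluster-wide configuration. A minimal sketch (the class name `MyMapper` is a hypothetical stand-in for one of your own classes):

```java
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class MyMapper {
    // Logger named after this class; MyMapper is a hypothetical example class
    private static final Logger LOG = Logger.getLogger(MyMapper.class);

    static {
        // Override the cluster-wide default (INFO) for this one logger only
        LOG.setLevel(Level.DEBUG);
    }

    public void map() {
        // Visible even though rootLogger is set to "INFO,console"
        LOG.debug("processing record");
    }
}
```

The trade-off is that the level is now hard-coded; changing it means recompiling and redeploying the job.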

Since Hadoop uses Apache log4j via the Apache Commons Logging framework, you can find additional information on how to achieve custom log levels in the Apache Logging manual.
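For option 2, log4j lets you set a level per logger name (typically a fully qualified class or package name) in the properties file, leaving the root logger alone. A sketch, where `com.example.myjob` is a hypothetical package name standing in for your own:

```properties
# Keep the cluster default for everything else
log4j.rootLogger=INFO,console
# But emit DEBUG (and above) for your own classes
log4j.logger.com.example.myjob=DEBUG
```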


Hope this helps!

