Failed to locate the winutils binary in the hadoop binary path

I am getting the following error while starting the namenode for the latest hadoop-2.2 release. I did not find a winutils.exe file in the Hadoop bin folder. I tried the commands below:

$ bin/hdfs namenode -format
$ sbin/yarn-daemon.sh start resourcemanager

ERROR [main] util.Shell (Shell.java:getWinUtilsPath(303)) - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
	at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
	at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:863)

Answers

You can download winutils.exe here: http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe

Then copy it to your HADOOP_HOME/bin directory.
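
If you want to double-check that the file ended up where Hadoop will look for it, a minimal sketch like the following (a hypothetical helper class, not part of Hadoop) prints the path Hadoop expects, i.e. %HADOOP_HOME%\bin\winutils.exe:

import java.io.File;

// Hypothetical sanity check: verifies that winutils.exe sits where Hadoop's Shell
// class will look for it, i.e. %HADOOP_HOME%\bin\winutils.exe.
public class CheckWinutils {
    public static void main(String[] args) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            System.err.println("HADOOP_HOME is not set");
            return;
        }
        File exe = new File(home, "bin" + File.separator + "winutils.exe");
        System.out.println(exe.getAbsolutePath() + (exe.exists() ? " - found" : " - missing"));
    }
}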

Posted on by Soumya Kanti
  1. Download winutils.exe from https://github.com/steveloughran/winutils/tree/master/hadoop-<version>/bin
  2. Paste it under HADOOP_HOME/bin

Note: you should also set the environment variable:

User variable:
Variable: HADOOP_HOME
Value: your Hadoop or Spark directory

Posted on by Anonymous

I was facing the same problem. Removing the bin\ from the HADOOP_HOME path solved it for me. The path for the HADOOP_HOME variable should look something like this:

C:\dev\hadoop2.6\

System restart may be needed. In my case, restarting the IDE was sufficient.
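
The reason dropping bin\ matters is that Hadoop itself appends \bin\winutils.exe to HADOOP_HOME, so an extra \bin yields a doubled path. A small illustration (the paths are just examples):

import java.io.File;

// Illustration only: Hadoop appends "bin\winutils.exe" to HADOOP_HOME itself,
// so a HADOOP_HOME that already ends in \bin produces a doubled, non-existent path.
public class HadoopHomePathDemo {
    public static void main(String[] args) {
        String good = "C:\\dev\\hadoop2.6";
        String bad = "C:\\dev\\hadoop2.6\\bin";
        System.out.println(new File(good, "bin\\winutils.exe")); // C:\dev\hadoop2.6\bin\winutils.exe
        System.out.println(new File(bad, "bin\\winutils.exe"));  // C:\dev\hadoop2.6\bin\bin\winutils.exe
    }
}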

Posted on by Anonymous

winutils.exe is used for running shell commands for Spark on Windows. You need this file when you want to run Spark without installing Hadoop.

Steps are as follows:

  1. Download winutils.exe for Hadoop 2.7.1 from the following location:

https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin [NOTE: If you are using a different Hadoop version, download winutils from the corresponding Hadoop version folder on GitHub at the location mentioned above.]

  2. Now create a folder 'winutils' in the C:\ drive, create a folder 'bin' inside 'winutils', and copy winutils.exe into that folder.

So the location of winutils.exe will be C:\winutils\bin\winutils.exe

  3. Now open Environment Variables and set HADOOP_HOME=C:\winutils

[NOTE: Do not add \bin to HADOOP_HOME, and there is no need to add HADOOP_HOME to Path.]

Your issue should now be resolved.
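
As a quick check that the setup works, a minimal local Spark program along the following lines should start without the winutils error. This is a hedged sketch: it assumes the Spark SQL dependency is on the classpath, and the System.setProperty call is simply an in-code equivalent of setting HADOOP_HOME=C:\winutils as described above.

import org.apache.spark.sql.SparkSession;

// Sketch only: assumes spark-sql is on the classpath and that winutils.exe
// lives under C:\winutils\bin as described in the steps above.
public class LocalSparkCheck {
    public static void main(String[] args) {
        // Optional: mirrors the HADOOP_HOME environment variable from code.
        System.setProperty("hadoop.home.dir", "C:\\winutils");
        SparkSession spark = SparkSession.builder()
                .appName("winutils-check")
                .master("local[*]")
                .getOrCreate();
        System.out.println("Spark started, version " + spark.version());
        spark.stop();
    }
}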

Posted on by Anurag

Simple solution: download it from here and add it to $HADOOP_HOME/bin.

(Source)

IMPORTANT UPDATE:

For hadoop-2.6.0 you can download binaries from the Titus Barik blog.

I not only needed to point HADOOP_HOME to the extracted directory [path], but also to provide the system property -Djava.library.path=[path]\bin to load the native libraries (dll).

Posted on by Prasad D

I was getting the same issue on Windows. I fixed it by:

  • Downloading hadoop-common-2.2.0-bin-master from link.
  • Creating a user variable HADOOP_HOME in Environment Variables and assigning the path of the hadoop-common bin directory as its value.
  • Verifying it by running hadoop in cmd.
  • Restarting the IDE and running it.
Posted on by Anonymous

In PySpark, to run a local Spark application using PyCharm, use the lines below:

import os

os.environ['HADOOP_HOME'] = "C:\\winutils"
print(os.environ['HADOOP_HOME'])
Posted on by Narsireddy

Download the desired version of the Hadoop folder as a zip from this link (say, if you are installing Spark on Windows, then the Hadoop version your Spark is built for).

Extract the zip to the desired directory. You need a directory of the form hadoop\bin (explicitly create such a hadoop\bin directory structure if you want), with bin containing all the files from the bin folder of the downloaded Hadoop. This will contain many files such as hdfs.dll, hadoop.dll, etc. in addition to winutils.exe.

Now create the environment variable HADOOP_HOME and set it to <path-to-hadoop-folder>\hadoop. Then add ;%HADOOP_HOME%\bin; to the PATH environment variable.

Open a "new command prompt" and try rerunning your command.

Posted on by Mahesha999

Set the HADOOP_HOME variable in Windows to resolve the problem.

You can find the answer in org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0-sources.jar!/org/apache/hadoop/util/Shell.java :

The IOException comes from

  public static final String getQualifiedBinPath(String executable) 
  throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin" 
      + File.separator + executable;
    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }
    return exeFile.getCanonicalPath();
  }


HADOOP_HOME_DIR comes from

// first check the Dflag hadoop.home.dir with JVM scope
String home = System.getProperty("hadoop.home.dir");
// fall back to the system/user-global env variable
if (home == null) {
  home = System.getenv("HADOOP_HOME");
}
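
Based on the lookup quoted above, a small standalone sketch (not part of Hadoop) can show which value your JVM would actually resolve and whether winutils.exe exists there; the null in the error message is exactly this value when neither the property nor the variable is set:

import java.io.File;

// Mirrors the lookup quoted above for debugging: hadoop.home.dir first, then HADOOP_HOME.
// If both are unset, "home" is null, which is where "null\bin\winutils.exe" comes from.
public class ResolveHadoopHome {
    public static void main(String[] args) {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            home = System.getenv("HADOOP_HOME");
        }
        System.out.println("Resolved Hadoop home: " + home);
        if (home != null) {
            File exe = new File(home + File.separator + "bin" + File.separator + "winutils.exe");
            System.out.println("winutils.exe exists: " + exe.exists());
        }
    }
}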



Posted on by Anonymous

If you face this problem when running a self-contained local application with Spark (i.e., after adding spark-assembly-x.x.x-hadoopx.x.x.jar or the Maven dependency to the project), a simpler solution would be to put winutils.exe (download from here) in "C:\winutil\bin". Then you can add winutils.exe to the hadoop home directory by adding the following line to the code:

System.setProperty("hadoop.home.dir", "c:\\\winutil\\\")

Source: Click here

Posted on by TrnKh

I just ran into this issue while working with Eclipse. In my case, I had the correct Hadoop version downloaded (hadoop-2.5.0-cdh5.3.0.tgz), I extracted the contents and placed them directly in my C: drive. Then I went to

Eclipse -> Debug/Run Configurations -> Environment (tab) and added

variable: HADOOP_HOME

Value: C:\hadoop-2.5.0-cdh5.3.0

Posted on by Anonymous

winutils.exe is required for Hadoop to perform Hadoop-related commands. Please download the hadoop-common-2.2.0 zip file. winutils.exe can be found in its bin folder. Extract the zip file and copy it to the local hadoop/bin folder.

Posted on by Mohan Raj

If we take the binary distribution of the Apache Hadoop 2.2.0 release directly and try to run it on Microsoft Windows, we'll encounter ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path.

The binary distribution of the Apache Hadoop 2.2.0 release does not contain some Windows native components (like winutils.exe, hadoop.dll, etc.). These are required (not optional) to run Hadoop on Windows.

So you need to build a Windows-native binary distribution of Hadoop from the source code, following the "BUILD.txt" file located inside the source distribution. You can also follow the posts below for a step-by-step guide with screenshots:

Build, Install, Configure and Run Apache Hadoop 2.2.0 in Microsoft Windows OS

ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path

Posted on by Abhijit

The statement java.io.IOException: Could not locate executable null\bin\winutils.exe

explains that the null comes from expanding or replacing an environment variable. If you look at the source of Shell.java in the Common package, you will find that the HADOOP_HOME variable is not set, so null is used in its place, hence the error.

So HADOOP_HOME needs to be set properly for this, or else the hadoop.home.dir property.

Hope this helps.

Thanks, Kamleshwar.

Posted on by Anonymous