I have installed hadoop-3.2.2 on Windows 10.
I followed the installation instructions from here:
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/ClusterSetup.html
Everything works fine until I try to start YARN:
sbin/start-yarn.sh
This gives the following error:
2021-01-28 00:13:05,968 ERROR util.SysInfoWindows: ExitCodeException exitCode=1: PdhAddCounter Network Interface(*)Bytes Received/Sec failed with 0xc0000bb8.
Error in GetDiskAndNetwork. Err:1
at org.apache.hadoop.util.Shell.runCommand(Shell.java:1008)
at org.apache.hadoop.util.Shell.run(Shell.java:901)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1213)
at org.apache.hadoop.util.SysInfoWindows.getSystemInfoInfoFromShell(SysInfoWindows.java:86)
at org.apache.hadoop.util.SysInfoWindows.refreshIfNeeded(SysInfoWindows.java:101)
at org.apache.hadoop.util.SysInfoWindows.getPhysicalMemorySize(SysInfoWindows.java:153)
at org.apache.hadoop.yarn.util.ResourceCalculatorPlugin.getPhysicalMemorySize(ResourceCalculatorPlugin.java:64)
at org.apache.hadoop.yarn.server.nodemanager.NodeResourceMonitorImpl$MonitoringThread.run(NodeResourceMonitorImpl.java:143)
2021-01-28 00:13:09,882 INFO impl.MetricsSystemImpl: Stopping NodeManager metrics system...
2021-01-28 00:13:09,884 INFO impl.MetricsSystemImpl: NodeManager metrics system stopped.
2021-01-28 00:13:09,885 INFO impl.MetricsSystemImpl: NodeManager metrics system shutdown complete.
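The PdhAddCounter failure indicates that the Windows performance counter "Network Interface(*)\Bytes Received/Sec", which the NodeManager's SysInfoWindows polls, could not be registered on this machine. As a hedged diagnostic (not a confirmed fix), the counter can be inspected from PowerShell and, if it turns out to be missing or corrupted, the performance counter registry can be rebuilt from an elevated command prompt:

powershell -Command "Get-Counter '\Network Interface(*)\Bytes Received/sec'"
lodctr /R

Get-Counter should print one sample per network interface; lodctr /R rebuilds the counter configuration from the system backup store, which is a common remedy when PDH counters go missing.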
Environment variables:
HADOOP_HOME=~/hadoop/hadoop-3.3.0/bin
PATH=$PATH:$HADOOP_HOME/bin
PATH=$PATH:$HADOOP_HOME/sbin
HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
HADOOP_MAPRED_HOME=$HADOOP_HOME
HADOOP_COMMON_HOME=$HADOOP_HOME
HADOOP_HDFS_HOME=$HADOOP_HOME
YARN_HOME=$HADOOP_HOME
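Since this is a Windows installation, the Unix-style assignments above would normally correspond to Windows environment variables; a minimal sketch, assuming the distribution was extracted to C:\hadoop\hadoop-3.2.2 (the path is illustrative):

setx HADOOP_HOME "C:\hadoop\hadoop-3.2.2"
setx HADOOP_CONF_DIR "C:\hadoop\hadoop-3.2.2\etc\hadoop"

with %HADOOP_HOME%\bin and %HADOOP_HOME%\sbin added to PATH. Note that HADOOP_HOME conventionally points at the extracted root directory rather than its bin subdirectory.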
Here are my config files:
core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<!-- <value>hdfs://localhost:9000</value> -->
<value>hdfs://0.0.0.0:19000</value>
</property>
</configuration>
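For reference, the filesystem URI that the daemons actually pick up (and therefore whether this core-site.xml is the one being read from HADOOP_CONF_DIR) can be checked with:

hdfs getconf -confKey fs.defaultFS

which should print hdfs://0.0.0.0:19000 given the value above.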
hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:///D:/Dev/Hadoop/hadoop-3.1.0/data/namenode</value>
<!-- <value>/disk1/hdfs/name,/remote/hdfs/name</value> -->
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:///D:/Dev/Hadoop/hadoop-3.1.0/data/datanode</value>
<!-- <value>/disk1/hdfs/data,/disk2/hdfs/data</value> -->
</property>
</configuration>
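As a sanity check on the HDFS side (assuming the NameNode was formatted with hdfs namenode -format and start-dfs has been run), the DataNode backed by the directories above should appear in:

hdfs dfsadmin -report

with non-zero configured capacity and one live datanode.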
mapred-site.xml
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
<property>
<name>mapreduce.application.classpath</name>
<value>
HADOOP_HOME;/share/hadoop/mapreduce/*,;
HADOOP_HOME;/share/hadoop/mapreduce/lib/*,;
HADOOP_HOME;/share/hadoop/common/*,;
HADOOP_HOME;/share/hadoop/common/lib/*,;
HADOOP_HOME;/share/hadoop/yarn/*,;
HADOOP_HOME;/share/hadoop/yarn/lib/*,;
HADOOP_HOME;/share/hadoop/hdfs/*,;
HADOOP_HOME;/share/hadoop/hdfs/lib/*</value>
</property>
</configuration>
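For reference, the classpath that the launcher scripts actually build from HADOOP_HOME and HADOOP_CONF_DIR (useful for checking whether the mapreduce.application.classpath entries above expand as intended) can be printed with:

hadoop classpath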
yarn-site.xml
<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.env-whitelist</name>
<value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
</property>
</configuration>
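Once start-yarn succeeds, NodeManager registration can be verified with (a hedged check, assuming the ResourceManager comes up on its default ports):

yarn node -list

which should report one node in RUNNING state. In the failing case above, the corresponding NodeManager log under HADOOP_HOME/logs is the place to look for the full stack trace.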
I am not able to figure out why YARN does not start.