'0.0.0.0: Permission denied (publickey)'
This happens because, by default, the Hadoop start scripts launch the secondary namenode on the address 0.0.0.0, which points to the local machine. To do that, they open an SSH connection to 0.0.0.0 and, as the error message states, the connection fails because passwordless SSH is not set up correctly for the host 0.0.0.0. The steps below will help you fix this.
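Before changing anything, you can reproduce the failure on its own, outside of the Hadoop scripts. This is a quick sketch; `BatchMode=yes` is a standard OpenSSH option that makes ssh fail immediately instead of prompting for a password:

```shell
# Try a non-interactive SSH connection to 0.0.0.0.
# If passwordless SSH is broken, this fails with
# "Permission denied (publickey)" instead of printing the message.
ssh -o BatchMode=yes 0.0.0.0 'echo passwordless ssh works'
```

If this command succeeds, the Hadoop scripts should be able to connect as well.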
1) If you are using a config file for passwordless SSH (common on AWS servers)
In order to set up passwordless SSH, you might have created a config file in the .ssh folder of your home directory with entries for the hostname, user id and the key file to be used for each host, as shown below:

Host namenode
    User hadoopuser
    HostName namenode.xyz.com
    IdentityFile ~/.ssh/hadoop_key.pem

Host localhost
    HostName 127.0.0.1
    User hadoopuser
    IdentityFile ~/.ssh/hadoop_key.pem

Host datanode1
    User hadoopuser
    HostName datanode1.xyz.com
    IdentityFile ~/.ssh/hadoop_key.pem

Host datanode2
    User hadoopuser
    HostName datanode2.xyz.com
    IdentityFile ~/.ssh/hadoop_key.pem

Host datanode3
    User hadoopuser
    HostName datanode3.xyz.com
    IdentityFile ~/.ssh/hadoop_key.pem
Just add the following entry to the same config file:
Host 0.0.0.0
    User hadoopuser
    IdentityFile ~/.ssh/hadoop_key.pem
and then try to connect to 0.0.0.0 via SSH:
$ ssh 0.0.0.0
The authenticity of host '0.0.0.0' can't be established.
ED25519 key fingerprint is e4:ff:65:d7:be:5d:c8:44:1d:89:6b:50:f5:50:a0:ce.
Are you sure you want to continue connecting (yes/no)?

Type 'yes' and you should be all set.
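If you would rather not answer the host-key prompt interactively (for example in a provisioning script), the key can be added ahead of time with ssh-keyscan. This is an optional sketch, not part of the original steps:

```shell
# Fetch the ED25519 host key for 0.0.0.0 and append it to
# known_hosts so the "authenticity of host" prompt never appears.
ssh-keyscan -t ed25519 0.0.0.0 >> ~/.ssh/known_hosts
```

Typing 'yes' at the prompt achieves the same result; ssh-keyscan is just the non-interactive equivalent.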
2) If you are using an ssh-keygen generated key for passwordless SSH
In this case, you don't need to do anything other than updating the known hosts file. Try to connect to 0.0.0.0 via SSH:
$ ssh 0.0.0.0
The authenticity of host '0.0.0.0' can't be established.
ED25519 key fingerprint is e4:ff:65:d7:be:5d:c8:44:1d:89:6b:50:f5:50:a0:ce.
Are you sure you want to continue connecting (yes/no)?

Type 'yes' and you should be all set.
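If the connection still fails with "Permission denied (publickey)" even after updating known hosts, the public key may not be authorized yet. A minimal sketch, assuming the key pair was generated with `ssh-keygen -t rsa` at the default path ~/.ssh/id_rsa; since 0.0.0.0 resolves to the local machine, authorizing the key is just a local file operation:

```shell
# Append the local public key to authorized_keys so SSH to
# 0.0.0.0 (i.e. this machine) accepts it, then restrict the
# file permissions, which sshd requires.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```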