After configuring password-free SSH login, starting Hadoop still reports an error
[hadoop@spark1 sbin]$ ./start-dfs.sh
17/04/09 17:41:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [spark1]
The authenticity of host 'spark1 (192.168.10.91)' can't be established.
RSA key fingerprint is 14:89:07:e9:b0:99:3e:a9:ef:0b:1c:a1:6a:92:ee:24.
Are you sure you want to continue connecting (yes/no)? no
spark1: Host key verification failed.
spark2: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hadoop-datanode-spark02.out
spark3: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hadoop-datanode-spark03.out
Starting secondary namenodes [spark1]
The authenticity of host 'spark1 (192.168.10.91)' can't be established.
RSA key fingerprint is 14:89:07:e9:b0:99:3e:a9:ef:0b:1c:a1:6a:92:ee:24.
Are you sure you want to continue connecting (yes/no)? no
spark1: Host key verification failed.
17/04/09 17:43:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
The cause was forgetting to edit the SSH daemon configuration:
# vi /etc/ssh/sshd_config
and uncomment the following options:
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
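After uncommenting, restart the SSH daemon (e.g. `service sshd restart`) so the change takes effect. As a minimal sketch, the uncommenting can also be done non-interactively with `sed`; the block below works on a temporary copy of the file so it is safe to try as-is, and on a real node you would point the same `sed` at `/etc/ssh/sshd_config` as root:

```shell
# Sketch only: demonstrates the uncommenting on a temporary copy of the
# config, not on the real /etc/ssh/sshd_config.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
#RSAAuthentication yes
#PubkeyAuthentication yes
#AuthorizedKeysFile .ssh/authorized_keys
EOF

# Strip the leading '#' from the three options the post mentions.
sed -i -E 's/^#(RSAAuthentication|PubkeyAuthentication|AuthorizedKeysFile)/\1/' "$cfg"
cat "$cfg"
```

Two additional notes: on newer OpenSSH versions the `RSAAuthentication` option has been removed, and only `PubkeyAuthentication yes` is needed. Also, the `Host key verification failed` lines in the log above appear because `no` was typed at the "Are you sure you want to continue connecting" prompt; answering `yes` once per host (or pre-seeding `~/.ssh/known_hosts` with `ssh-keyscan spark1 spark2 spark3 >> ~/.ssh/known_hosts`) removes that prompt.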