I've followed this installation procedure, but when I reach the Inspect Role
Assignments stage I see only one managed host:
ip-000-000-000-000.ap-northeast-1.compute.internal, even though Cloudera
Manager set up all three EC2 servers successfully.
I've attached two PNG screenshots of the installation process: the first
shows Cloudera Manager installing two servers, and the second shows that the
host inspector finds only one of them.
Can anyone provide instructions for resolving hosts in EC2 with the Cloudera
Manager 4 Free Edition?
These questions appear to describe the same problem:
How to resolve hosts using Cloudera Manager 4.0 installed on EC2
Cloudera Manager fails to add hosts
My environment:
OS: EC2 Red Hat Enterprise Linux 64-bit Large Instance * 3
Storage: EBS 500 GB * 3
Global IP: Elastic IP * 3
DNS: I don't run my own DNS server.
Each server's /etc/hosts file:
127.0.0.1 localhost.localdomain localhost
::1 localhost6.localdomain6 localhost6
# These IPs are actually Elastic IPs; `mydomain` is not a domain I own.
000.000.000.001 master.cloudera.mydomain master
000.000.000.002 slave1.cloudera.mydomain slave1
000.000.000.003 slave2.cloudera.mydomain slave2
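To show what I expect these entries to do, here is the kind of sanity check I can run on each node (a sketch; the hostnames `master`, `slave1`, and `slave2` are the ones from the /etc/hosts file above):

```python
import socket

def check(name):
    """Resolve a hostname forward, then reverse-resolve the address.

    Returns a human-readable summary line; intended as a sanity check
    for the /etc/hosts entries above ("master", "slave1", "slave2").
    """
    try:
        addr = socket.gethostbyname(name)            # forward lookup
        fqdn = socket.gethostbyaddr(addr)[0]         # reverse lookup
        return "%s -> %s -> %s" % (name, addr, fqdn)
    except socket.error as err:
        return "%s does not resolve: %s" % (name, err)

# Run on every node; all three names should resolve consistently.
for host in ("master", "slave1", "slave2"):
    print(check(host))
```

On my nodes the forward lookups succeed, but I am not sure the reverse lookups return the FQDNs I put in /etc/hosts rather than the EC2-internal names.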
Each server's /etc/resolv.conf file:
search cloudera.mydomain ap-northeast-1.compute.internal
After editing these two files, I run `service network restart`.
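My understanding (an assumption on my part, not taken verbatim from Cloudera's docs) is that the host inspector effectively performs a consistency check like the following on each node, and registers the host under whatever FQDN it reports:

```python
import socket

# Sketch: report the FQDN this node believes it has and the address it
# resolves to. If this prints the EC2-internal name
# (ip-*.ap-northeast-1.compute.internal) instead of
# master.cloudera.mydomain, Cloudera Manager will presumably register
# the host under that internal name.
fqdn = socket.getfqdn()
print("FQDN:", fqdn)
try:
    print("resolves to:", socket.gethostbyname(fqdn))
except socket.error as err:
    print("FQDN does not resolve:", err)
```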
Any help is appreciated.