Joey Echeverria
Dec 29, 2011 at 3:40 pm
1) My guess is that it was included in one of the patches ported to the
0.22 branch.
2) You're correct, you need to limit access to the cluster via a
gateway. Even that isn't really sufficient, as the code in your
MapReduce job will run as the mapred user and will have access to
anything that user can see.
If you care about security, I'd stick with CDH3, 0.20.20x/1.0.0 or
0.23, with 0.23 being the least stable of the bunch.
-Joey
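For anyone following the thread: on the release lines that do ship the security work, Kerberos authentication is switched on in core-site.xml. A minimal sketch (the property names are the standard Hadoop ones; treat the exact values and surrounding config as illustrative, not a complete secure-cluster setup):

```xml
<!-- core-site.xml: switch authentication from the default "simple" to Kerberos.
     Only honored on release lines that include the security work
     (e.g. 0.20.20x/1.0.0, CDH3). -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```

On a release without the security code (such as 0.22), setting these has no effect, which is exactly why the gateway-only approach discussed above is the fallback.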
On Thu, Dec 29, 2011 at 10:37 AM, Praveen Sripati wrote:
Joey,
1) Then what is the following code in
o.a.h.hdfs.server.datanode.SecureDataNodeStarter.java about?

   if (!conf.get(HADOOP_SECURITY_AUTHENTICATION).equals("kerberos"))
     throw new RuntimeException("Cannot start secure datanode in unsecure cluster");
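That check boils down to a fail-fast guard on a single configuration property. A standalone sketch of the same pattern, using a plain Map in place of Hadoop's Configuration (the class and method names here are invented for illustration; only the property key and the error message come from the thread):

```java
import java.util.HashMap;
import java.util.Map;

public class SecureStartCheck {
    // Property key as quoted in the thread; Hadoop defaults it to "simple" when unset.
    static final String HADOOP_SECURITY_AUTHENTICATION = "hadoop.security.authentication";

    // Refuse to start a "secure" datanode unless Kerberos authentication is configured.
    static void checkSecureStart(Map<String, String> conf) {
        String auth = conf.getOrDefault(HADOOP_SECURITY_AUTHENTICATION, "simple");
        if (!auth.equals("kerberos"))
            throw new RuntimeException("Cannot start secure datanode in unsecure cluster");
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put(HADOOP_SECURITY_AUTHENTICATION, "kerberos");
        checkSecureStart(conf);          // passes silently
        conf.put(HADOOP_SECURITY_AUTHENTICATION, "simple");
        try {
            checkSecureStart(conf);      // rejected: wrong auth mode
        } catch (RuntimeException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

So the guard doesn't add security by itself; it only prevents the secure-startup path from running in a cluster that isn't configured for Kerberos.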
2) If Kerberos is not supported, then how is the user authenticated? Is
authenticating the user at the cluster gateway the only way? Once a user
is in the cluster, if I am not wrong, they can impersonate any other user.
Praveen
On Thu, Dec 29, 2011 at 8:49 PM, Joey Echeverria wrote:
Yes, it means that 0.22 doesn't support Kerberos.
-Joey
On Thu, Dec 29, 2011 at 9:41 AM, Praveen Sripati wrote:
Hi,
The release notes for 0.22
(http://hadoop.apache.org/common/releases.html#10+December%2C+2011%3A+release+0.22.0+available)
say:
The following features are not supported in Hadoop 0.22.0:
   > Security.
   > Latest optimizations of the MapReduce framework introduced in the
     Hadoop 0.20.security line of releases.
   > Disk-fail-in-place.
   > JMX-based metrics v2.
With Security missing, which features exactly are absent? Does it mean
Kerberos integration is missing?
In o.a.h.hdfs.server.datanode.SecureDataNodeStarter.java, the following
code is present:

   if (!conf.get(HADOOP_SECURITY_AUTHENTICATION).equals("kerberos"))
     throw new RuntimeException("Cannot start secure datanode in unsecure cluster");
Regards,
Praveen
--
Joseph Echeverria
Cloudera, Inc.
443.305.9434