at Dec 29, 2011 at 3:40 pm
1) My guess is that it was included in one of the patches ported to the
2) You're correct, you need to limit access to the cluster via a
gateway. This isn't really sufficient on its own, as the code in your
MapReduce job will run as the mapred user and have access to anything it can
If you care about security, I'd stick with CDH3, 0.20.20x/1.0.0 or
0.23, with 0.23 being the least stable of the bunch.
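To make the mapred-user point concrete: any code a submitted task runs executes under whatever OS account the TaskTracker launches task JVMs as (commonly mapred on 0.20-era clusters without the LinuxTaskController), not under the identity of whoever submitted the job. A tiny, self-contained sketch (the class name is invented for illustration; this is not Hadoop API):

```java
// Hypothetical snippet: code inside a map() method (or any JVM) observes
// the OS-level account it runs as, regardless of who submitted the job.
public class WhoAmI {
    public static void main(String[] args) {
        // Inside a task JVM on a 0.20-era cluster without a task
        // controller, this would typically print "mapred".
        System.out.println("running as: " + System.getProperty("user.name"));
    }
}
```

So even if the gateway authenticates users, arbitrary job code still acts with the daemon account's filesystem permissions.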
On Thu, Dec 29, 2011 at 10:37 AM, Praveen Sripati wrote:
        throw new RuntimeException("Cannot start secure datanode in unsecure
Then what is this code in
2) If Kerberos is not supported, then how do we authenticate the user? Is
authenticating the user at the cluster gateway the only way? Once the user is
in the cluster, if I am not wrong, the user can impersonate any other user.
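That concern is essentially right: under simple (non-Kerberos) authentication the server trusts whatever identity the client asserts, with no credential check. A minimal, self-contained sketch of why that is weak (class, method, and path names are invented for illustration; this stands in for the behavior, not Hadoop's actual UserGroupInformation API):

```java
import java.util.HashMap;
import java.util.Map;

// Illustration of simple auth: the "server" side accepts whatever
// username the client claims, so authorization checks are only as
// trustworthy as that unverified claim.
public class SimpleAuthDemo {

    // Pretend ACL: which user may read which path.
    static final Map<String, String> OWNER_OF = new HashMap<>();
    static {
        OWNER_OF.put("/secure/report.txt", "hdfs");
    }

    // Nothing verifies claimedUser against a real credential.
    static boolean canRead(String claimedUser, String path) {
        return claimedUser.equals(OWNER_OF.get(path));
    }

    public static void main(String[] args) {
        // A user on the gateway simply claims to be "hdfs" and passes
        // the check, whatever their real login is.
        System.out.println(canRead("hdfs", "/secure/report.txt"));    // true
        System.out.println(canRead("praveen", "/secure/report.txt")); // false
    }
}
```

Kerberos closes this hole by making the asserted identity cryptographically verifiable instead of merely claimed.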
On Thu, Dec 29, 2011 at 8:49 PM, Joey Echeverria wrote:
Yes, it means that 0.22 doesn't support Kerberos.
On Thu, Dec 29, 2011 at 9:41 AM, Praveen Sripati
The release notes for 0.22 say:
The following features are not supported in Hadoop 0.22.0.
    > Security.
    > Latest optimizations of the MapReduce framework introduced in the
    Hadoop 0.20.security line of releases.
    > Disk-fail-in-place.
    > JMX-based metrics v2.
With Security listed as missing, which features exactly are missing? Does it
mean Kerberos integration is missing?
In o.a.h.hdfs.server.datanode.SecureDataNodeStarter.java, the following
code is present:

    if (!conf.get(HADOOP_SECURITY_AUTHENTICATION).equals("kerberos"))
        throw new RuntimeException("Cannot start secure datanode in unsecure
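The quoted guard simply refuses to start the secure datanode unless the cluster is configured for Kerberos. A self-contained sketch of that fail-fast pattern (class and method names are invented, java.util.Properties stands in for Hadoop's Configuration, and the exception message is abbreviated in the quote above, so its tail here is an assumption):

```java
import java.util.Properties;

// Hypothetical, self-contained version of the check quoted above.
// Properties substitutes for Hadoop's Configuration object.
public class SecureStartCheck {

    static final String HADOOP_SECURITY_AUTHENTICATION =
            "hadoop.security.authentication";

    // Fail fast unless the cluster is configured for Kerberos,
    // mirroring the guard in SecureDataNodeStarter.
    public static void checkSecureStart(Properties conf) {
        String auth = conf.getProperty(HADOOP_SECURITY_AUTHENTICATION, "simple");
        if (!auth.equals("kerberos")) {
            throw new RuntimeException(
                    "Cannot start secure datanode in unsecure cluster");
        }
    }

    public static void main(String[] args) {
        Properties conf = new Properties();
        conf.setProperty(HADOOP_SECURITY_AUTHENTICATION, "kerberos");
        checkSecureStart(conf); // passes silently

        conf.setProperty(HADOOP_SECURITY_AUTHENTICATION, "simple");
        try {
            checkSecureStart(conf);
        } catch (RuntimeException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

The presence of this class in 0.22 is consistent with the earlier guess: the secure-startup scaffolding was ported in, even though the Kerberos support it depends on was not.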