create said directory. The superuser in HDFS is typically `hdfs'; the
`root' user has no special privileges there.
You can chmod the path with `sudo -u hdfs hadoop fs -chmod ...', or a
chown would work as well.
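For example (a sketch, not a definitive fix -- the `/hypertable' path is
taken from the error in the original message, and the assumption is that
the master runs as `root'; adjust both to your deployment):

```shell
# Loosen permissions as the HDFS superuser, since `root' carries no
# special rights inside HDFS:
sudo -u hdfs hadoop fs -chmod -R 775 /hypertable

# ...or hand ownership of the tree to whichever user runs the
# Hypertable master (assumed here to be `root'):
sudo -u hdfs hadoop fs -chown -R root /hypertable
```

Either way, the process that calls mkdirs needs WRITE access on the
parent directory in HDFS.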
Cheers,
bc
On Tue, Jun 4, 2013 at 12:10 AM, HyperUser wrote:
Hi,
I am using hypertable on Cloudera Hadoop cdh3.
I accidentally executed
sudo -u hdfs hadoop namenode -format
Then the datanode failed to start because the namespaceID in the namenode
and datanode didn't match. I fixed that by changing the values manually.
#Mon Jun 03 19:00:37 EDT 2013
namespaceID=2098902017
storageID=DS-971380851-10.195.241.242-50010-1369424059347
cTime=0
storageType=DATA_NODE
layoutVersion=-19
#Tue Jun 04 02:28:50 EDT 2013
namespaceID=277954195
cTime=0
storageType=NAME_NODE
layoutVersion=-19
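The "manually changing those" step above amounts to rewriting
namespaceID in the datanode's VERSION file to match the namenode's new
one. A sketch against a local copy (hypothetical `/tmp/VERSION' path; on
a real cluster the file lives under ${dfs.data.dir}/current/VERSION and
the datanode must be stopped first):

```shell
# Recreate the stale datanode VERSION file from the listing above:
cat > /tmp/VERSION <<'EOF'
namespaceID=2098902017
storageID=DS-971380851-10.195.241.242-50010-1369424059347
cTime=0
storageType=DATA_NODE
layoutVersion=-19
EOF

# Replace the stale namespaceID with the namenode's new one:
sed -i 's/^namespaceID=.*/namespaceID=277954195/' /tmp/VERSION
grep '^namespaceID=' /tmp/VERSION
```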
But I am still getting this error when I try to start the Hypertable
master. What is wrong here?
sudo -u hdfs hadoop fs -ls /hyperspace
Found 1 items
drwxr-xr-x - hdfs supergroup 0 2013-06-04 02:52 /hyperspace/dfs
1370329705 ERROR Hypertable.Master : main
(/root/src/hypertable/src/cc/Hypertable/Master/main.cc:369):
Hypertable::Exception: Error mkdirs DFS directory
/hypertable/servers/master/log/mml - DFS BROKER i/o error
at virtual void Hypertable::DfsBroker::Client::mkdirs(const
Hypertable::String&)
(/root/src/hypertable/src/cc/DfsBroker/Lib/Client.cc:517)
at virtual void Hypertable::DfsBroker::Client::mkdirs(const
Hypertable::String&)
(/root/src/hypertable/src/cc/DfsBroker/Lib/Client.cc:514):
org.apache.hadoop.security.AccessControlException:
org.apache.hadoop.security.AccessControlException: Permission denied:
user=root, access=WRITE, inod