Per the stack trace, you are not running Sqoop2 but the Sqoop1 command-line
tool instead.

What was your exact Sqoop command in this case?
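For reference, a minimal sketch of the difference between the two (all
connection details below are hypothetical, not taken from this thread):

    # Sqoop1: a single CLI call that submits a MapReduce job directly
    sqoop import \
      --connect jdbc:mysql://dbhost/mydb \
      --username myuser -P \
      --table mytable

    # Sqoop2 (1.99.x): an interactive client that talks to the Sqoop server;
    # it is started separately (e.g. via sqoop.sh client, or a sqoop2-shell
    # wrapper in some packagings) and jobs are defined as link/job objects
    # rather than passed as one command line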
On Wed, May 14, 2014 at 9:10 PM, MrAkhe83 wrote:
Hi,

When trying to run sqoop2 we get the following error:

14/05/14 10:34:41 WARN security.UserGroupInformation: PriviledgedActionException as:nat_export_d_us_perf (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://nameservice1/usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar
14/05/14 10:34:41 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://nameservice1/usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1128)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:614)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:506)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:240)

This is happening on CDH5.0.1 with YARN. Sqoop is configured to point to
YARN.

How come it's looking for a jar file that lives on the local file system,
but the path is prefixed with hdfs:// ?
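
In Hadoop, a path with no scheme is resolved against fs.defaultFS, so when the
job submitter checks the timestamps of the jars destined for the distributed
cache (the ClientDistributedCacheManager calls in the trace above), an
unqualified local path such as /usr/lib/sqoop/lib/... is looked up on
hdfs://nameservice1 rather than on the local disk. A quick way to see which
filesystem unqualified paths resolve against (assuming shell access to a
gateway node):

    hdfs getconf -confKey fs.defaultFS
    # expected to print hdfs://nameservice1 on this cluster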

File does not exist:
hdfs://nameservice1/usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar

The file is located locally under
/usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar
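
A minimal way to confirm the mismatch from the command line, plus a workaround
that is sometimes used in this situation (mirroring the local Sqoop lib jars at
the same absolute path on HDFS so the distributed-cache lookup succeeds); the
paths come from the error above, everything else is an assumption and not
something confirmed in this thread:

    # the HDFS lookup that the job submitter performs (expected to fail here)
    hadoop fs -ls hdfs://nameservice1/usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar
    # the file that actually exists, on the local disk
    ls -l /usr/lib/sqoop/lib/jackson-mapper-asl-1.9.13.jar

    # possible workaround (assumption): copy the local Sqoop lib directory to
    # the same absolute path on HDFS
    hadoop fs -mkdir -p /usr/lib/sqoop/lib
    hadoop fs -put /usr/lib/sqoop/lib/*.jar /usr/lib/sqoop/lib/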

Thx



--
Harsh J


Discussion Overview
group: scm-users
categories: hadoop
posted: May 14, '14 at 3:40p
active: May 14, '14 at 4:20p
posts: 2
users: 2 (MrAkhe83: 1 post, Harsh J: 1 post)
website: cloudera.com
irc: #hadoop
