FAQ
I'm getting an error when trying to access job logs. Here is a screenshot; I click the "*Logs*" tab:

<https://lh3.googleusercontent.com/-Lvb_or27Ivs/UXeUPAjvl7I/AAAAAAAAA9I/hwP2zYxsh4E/s1600/01_job_page.png>

and I get this screen:

Server Error (500)

Sorry, there's been an error. An email was sent to your administrators.
Thank you for your patience.
[More Info] [View Logs]


Here is the relevant part of the "View Logs" output:

[24/Apr/2013 01:16:34 +0000] access WARNING 10.236.34.228 hdfs - "GET /logs HTTP/1.0"

[24/Apr/2013 01:13:30 +0000] access INFO 10.236.34.228 hdfs - "GET /debug/check_config_ajax HTTP/1.0"

[24/Apr/2013 01:13:30 +0000] middleware INFO Processing exception: Unexpected end tag : td, line 7, column 12: Traceback (most recent call last):
  File "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/share/hue/build/env/lib/python2.6/site-packages/Django-1.2.3-py2.6.egg/django/core/handlers/base.py", line 100, in get_response
    response = callback(request, *callback_args, **callback_kwargs)
  File "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/share/hue/apps/jobbrowser/src/jobbrowser/views.py", line 62, in decorate
    return view_func(request, *args, **kwargs)
  File "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/share/hue/apps/jobbrowser/src/jobbrowser/views.py", line 290, in single_task_attempt_logs
    logs += [ section.strip() for section in attempt.get_task_log() ]
  File "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/share/hue/apps/jobbrowser/src/jobbrowser/models.py", line 451, in get_task_log
    et = lxml.html.parse(data)
  File "/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/share/hue/build/env/lib/python2.6/site-packages/lxml-2.2.2-py2.6-linux-x86_64.egg/lxml/html/__init__.py", line 661, in parse
    return etree.parse(filename_or_url, parser, base_url=base_url, **kw)
  File "lxml.etree.pyx", line 2698, in lxml.etree.parse (src/lxml/lxml.etree.c:49590)
  File "parser.pxi", line 1513, in lxml.etree._parseDocument (src/lxml/lxml.etree.c:71423)
  File "parser.pxi", line 1543, in lxml.etree._parseFilelikeDocument (src/lxml/lxml.etree.c:71733)
  File "parser.pxi", line 1426, in lxml.etree._parseDocFromFilelike (src/lxml/lxml.etree.c:70648)
  File "parser.pxi", line 997, in lxml.etree._BaseParser._parseDocFromFilelike (src/lxml/lxml.etree.c:67944)
  File "parser.pxi", line 539, in lxml.etree._ParserContext._handleParseResultDoc (src/lxml/lxml.etree.c:63820)
  File "parser.pxi", line 625, in lxml.etree._handleParseResult (src/lxml/lxml.etree.c:64741)
  File "parser.pxi", line 565, in lxml.etree._raiseParseError (src/lxml/lxml.etree.c:64084)
XMLSyntaxError: Unexpected end tag : td, line 7, column 12
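For what it's worth, the traceback shows Hue's `get_task_log` handing the TaskTracker's tasklog page to lxml, which refuses to parse it. Interestingly, lxml's HTML parser normally recovers from stray or mismatched tags, so an `XMLSyntaxError` here may mean the page Hue fetched was badly broken or truncated. A minimal sketch of strict-vs-recovering parsing, using a purely hypothetical fragment (not the actual tasklog page):

```python
from lxml import etree, html

# Hypothetical fragment with a mismatched </td>, loosely in the spirit of
# the "Unexpected end tag : td" error above.
broken = "<html><body><table><tr>text</td></tr></table></body></html>"

# Strict XML parsing rejects the mismatched end tag.
try:
    etree.fromstring(broken)
    strict_ok = True
except etree.XMLSyntaxError:
    strict_ok = False

# lxml's recovering HTML parser still builds a tree from the same input.
doc = html.fromstring(broken)

print(strict_ok, doc.tag)
```

So a simple tag mismatch alone would usually survive `lxml.html.parse`; the fact that it didn't suggests checking what the TaskTracker actually returns for that URL.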

[24/Apr/2013 01:13:30 +0000] models INFO Retrieving *http://prod-node021.lol.ru:50060/tasklog?attemptid=attempt_201304111653_1209_m_000014_0*

[24/Apr/2013 01:13:30 +0000] thrift_util DEBUG Thrift call <class 'hadoop.api.jobtracker.Jobtracker.Client'>.getTracker returned in 1ms: ThriftTaskTrackerStatus(taskReports=None, availableSpace=2034783674368, totalVirtualMemory=139483734016, failureCount=41, httpPort=50060, host='prod-node021.lol.ru', totalPhysicalMemory=135290486784, reduceCount=0, lastSeen=1366791208983, trackerName='tracker_prod-node021.lol.ru:localhost/127.0.0.1:53237', mapCount=0, maxReduceTasks=16, maxMapTasks=32)

[24/Apr/2013 01:13:30 +0000] thrift_util DEBUG Thrift call: <class 'hadoop.api.jobtracker.Jobtracker.Client'>.getTracker(args=(RequestContext(confOptions={'effective_user': u'hdfs'}), 'tracker_prod-node021.lol.ru:localhost/127.0.0.1:53237'), kwargs={})


Here is what I get when I access *http://prod-node021.lol.ru:50060/tasklog?attemptid=attempt_201304111653_1209_m_000014_0* directly:


Task Logs: 'attempt_201304111653_1209_m_000014_0'

*stdout logs*

------------------------------


*stderr logs*

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-4.2.0-1.cdh4.2.0.p0.10/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/disk5/mapred/local/taskTracker/hdfs/distcache/1909621032795827811_-309592906_991047864/nameservice1/applications/oozie-workflows/kyc-dataprocessing/sub-workflows/lib/slf4j-simple-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

------------------------------


*syslog logs*

2013-04-24 12:07:58,693 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
2013-04-24 12:07:59,497 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
2013-04-24 12:07:59,499 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=MAP, sessionId=
2013-04-24 12:08:00,104 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
2013-04-24 12:08:00,112 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@4301ac93
2013-04-24 12:08:00,721 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://nameservice1/staging/landing/source/xvlr/2013/04/11/12/xvlr.1365668436571out:0+38722415
2013-04-24 12:08:00,807 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.snappy]
2013-04-24 12:08:07,885 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-04-24 12:08:07,888 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.EOFException
2013-04-24 12:08:07,889 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.EOFException
    at java.io.DataInputStream.readFully(DataInputStream.java:180)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:68)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:106)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:2294)
    at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:2426)
    at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.nextKeyValue(SequenceFileRecordReader.java:68)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:484)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:673)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:331)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-04-24 12:08:07,895 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
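Note that this EOFException in the map task is a separate problem from the Hue 500: `DataInputStream.readFully` hit end-of-stream in the middle of a SequenceFile record, which usually points to a truncated or corrupt input file. A rough Python sketch of the `readFully` contract, under the assumption that a short read is what the task tripped on:

```python
import io

def read_fully(stream, n):
    """Read exactly n bytes, like Java's DataInputStream.readFully;
    raise EOFError if the stream ends early."""
    data = stream.read(n)
    if len(data) < n:
        raise EOFError("expected %d bytes, got %d" % (n, len(data)))
    return data

# Simulate a truncated record: only 10 bytes where a reader expects 16.
buf = io.BytesIO(b"\x00" * 10)
read_fully(buf, 8)        # first read succeeds
try:
    read_fully(buf, 8)    # only 2 bytes remain -> EOFError
    truncated = False
except EOFError:
    truncated = True

print(truncated)
```

If that's the case here, checking the split's file (`/staging/landing/source/xvlr/2013/04/11/12/xvlr.1365668436571out`) with `hadoop fs -text` would likely reproduce the failure outside MapReduce.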

------------------------------

Posted by Serega Sheypak to the scm-users group (cloudera.com), Apr 24, 2013; 1 post, 1 user in the discussion.