AspectJ framework for HDFS code and tests
-----------------------------------------

Key: HADOOP-6003
URL: https://issues.apache.org/jira/browse/HADOOP-6003
Project: Hadoop Core
Issue Type: Sub-task
Reporter: Konstantin Boudnik


This subtask covers only the HDFS part of Hadoop; other parts will be added later as needed. It includes only the development of new aspects and modifications to the build.xml file.

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


  • Konstantin Boudnik (JIRA) at Jun 9, 2009 at 7:22 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik reassigned HADOOP-6003:
    ------------------------------------------

    Assignee: Konstantin Boudnik
  • Konstantin Boudnik (JIRA) at Jun 9, 2009 at 8:01 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch
    HADOOP-6003.sh

    A shell script for folder creation, and the patch
  • Konstantin Boudnik (JIRA) at Jun 9, 2009 at 8:11 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12717807#action_12717807 ]

    Konstantin Boudnik edited comment on HADOOP-6003 at 6/9/09 1:10 PM:
    --------------------------------------------------------------------

    This patch includes the following additions:
    - AspectJ framework (version 1.6.4) is added to the Ivy resolver's configuration
    - the implementation of a simple probability calculation and configuration needed by fault injection
    - two aspects for datanode's classes BlockReceiver and FSDataset are created and tested

    Unit tests are expected to fail with faults in place. We might need to develop different kinds of tests to utilize fault injection in a better way.

    The interface of the new framework is as follows:
    - 'ant injectfaults' will weave the aspects in place after the normal compilation of HDFS classes is complete
    - 'ant run-test-hdfs' will execute unit tests as usual, but faults will be injected according to the rules
    - 'ant jar' will create Hadoop's jar as usual, but if 'injectfaults' has been executed before, then the jar file will include the instrumented classes, i.e. with fault invocations

    The rules for the fault injection probability calculation are as follows:
    * the default probability level is set to 0; thus, even with aspects woven into the classes, faults won't be injected/executed unless specified explicitly
    * to set a certain class's fault probability level, one needs to specify a system property in the following format:
    {code}
    ant run-test-hdfs -Dfault.probability.FSDataset=3
    {code}
    which will set the probability of fault injection into the FSDataset class at about 3%
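    The decision logic described by these rules can be sketched as follows. This is an illustrative model only, assuming the percentage convention and the fault.probability.<ClassName> property prefix from this comment; the class and method names below are hypothetical, not the actual patch code.

```java
import java.util.Random;

// Hypothetical sketch of a per-class fault-injection probability model.
public class ProbabilityModelSketch {
    private static final String PREFIX = "fault.probability.";
    private static final Random RANDOM = new Random();

    // Returns the configured injection level for a class as a value in [0.0, 1.0].
    // The system property value is interpreted as a percentage, per the rules above.
    static double levelFor(String className) {
        String value = System.getProperty(PREFIX + className, "0");
        return Double.parseDouble(value) / 100.0;
    }

    // Decides whether a fault should fire at this invocation:
    // a uniform random draw is compared against the configured level.
    static boolean injectFault(String className) {
        return RANDOM.nextDouble() < levelFor(className);
    }

    public static void main(String[] args) {
        // With no property set, the default level is 0, so no fault ever fires.
        System.out.println(injectFault("FSDataset")); // false
        // At level 100 (i.e. probability 1.0) a fault fires on every call.
        System.setProperty(PREFIX + "FSDataset", "100");
        System.out.println(injectFault("FSDataset")); // true
    }
}
```

    An aspect woven into, say, FSDataset would call such a check at each instrumented join point and raise a fault only when it returns true.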



  • Konstantin Boudnik (JIRA) at Jun 9, 2009 at 8:11 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Fix Version/s: 0.21.0
    Affects Version/s: 0.20.0
    Status: Patch Available (was: Open)


  • Hadoop QA (JIRA) at Jun 12, 2009 at 12:07 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12718670#action_12718670 ]

    Hadoop QA commented on HADOOP-6003:
    -----------------------------------

    -1 overall. Here are the results of testing the latest attachment
    http://issues.apache.org/jira/secure/attachment/12410248/HADOOP-6003.patch
    against trunk revision 783672.

    +1 @author. The patch does not contain any @author tags.

    +1 tests included. The patch appears to include 10 new or modified tests.

    +1 javadoc. The javadoc tool did not generate any warning messages.

    +1 javac. The applied patch does not increase the total number of javac compiler warnings.

    +1 findbugs. The patch does not introduce any new Findbugs warnings.

    -1 Eclipse classpath. The patch causes the Eclipse classpath to differ from the contents of the lib directories.

    +1 release audit. The applied patch does not increase the total number of release audit warnings.

    +1 core tests. The patch passed core unit tests.

    -1 contrib tests. The patch failed contrib unit tests.

    Test results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/491/testReport/
    Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/491/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
    Checkstyle results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/491/artifact/trunk/build/test/checkstyle-errors.html
    Console output: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/491/console

    This message is automatically generated.
  • Konstantin Boudnik (JIRA) at Jun 12, 2009 at 5:48 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12718917#action_12718917 ]

    Konstantin Boudnik commented on HADOOP-6003:
    --------------------------------------------

    As far as I can tell, this failing test isn't related to the patch at all; the two are orthogonal. The test failure seems to be connected to some sort of configuration issue.

    A review of the patch would be highly appreciated!
  • Philip Zeyliger (JIRA) at Jun 15, 2009 at 3:54 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12719382#action_12719382 ]

    Philip Zeyliger commented on HADOOP-6003:
    -----------------------------------------

    It looks like some of the patches have inconsistent usage of tabs/spaces; you may want to clean that up.

    bq. It is expected to see unit tests failing with faults in place. We might need to develop different kind of tests to utilize fault injection in a better way.

    I think you're right on here. It seems like having a global setting for injection probability will mostly lead to test results that are hard to grok. Have you thought about setting up a separate package/target for tests that are designed to interact with injected faults? Once one or two of those tests exist, it'll be clearer whether "ant run-test-hdfs -Dfault.probability.FSDataset=3" is the right API.
  • Konstantin Boudnik (JIRA) at Jun 15, 2009 at 6:32 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

    Indentation inconsistency is fixed.

    As for separate test development: I agree on that. Different tests would have to be developed with FI in mind. However, for Phase 1 (i.e. the framework introduction) we might use existing tests if we need to verify certain aspects of the error handling.

    On the API: I think the current proposal will not stand once we are talking about complex injection scenarios and will have to be replaced with something else. I agree that we need to develop some of these scenarios before a better API can be designed.
  • Tsz Wo (Nicholas), SZE (JIRA) at Jun 15, 2009 at 9:52 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12719803#action_12719803 ]

    Tsz Wo (Nicholas), SZE commented on HADOOP-6003:
    ------------------------------------------------

    I think the ant command may look like:
    {noformat}
    ant test -Dfault.injection.enable -Dfault.injection.conf=CONF_FILE
    {noformat}
    All the FI parameters, including probability values, are specified in CONF_FILE. Then we only have to change the conf file, not the build script, for later development.
  • Konstantin Boudnik (JIRA) at Jun 15, 2009 at 10:34 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12719828#action_12719828 ]

    Konstantin Boudnik commented on HADOOP-6003:
    --------------------------------------------

    Agree, the config file option seems like a good idea. I was planning to add this
    later and have the framework in place ASAP, but I can surely add this
    feature right now.

    One more thing: I believe it would be good to have dynamic configuration, i.e.
    through system properties, as well as the config file. The ability to pass
    system properties provides an ad-hoc testing opportunity if one needs to quickly
    check/reproduce something.

    I'll provide the config file option shortly.


  • Konstantin Boudnik (JIRA) at Jun 16, 2009 at 10:48 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

    This version of the patch includes a way to define probability levels through a standard Hadoop configuration file. The default location of the config file is conf/fi-site.xml.
    An alternative location can be set through the -Dfault.probability.config= system property.

    Also, one can use a standard build.property file to specify all needed probability levels at runtime.
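    The fi-site.xml file mentioned above follows the standard Hadoop configuration format, so a minimal sketch might look like the following (the property name is an assumption based on the fault.probability prefix used earlier in this thread, not a verbatim excerpt from the patch):

```xml
<?xml version="1.0"?>
<!-- Hypothetical fi-site.xml sketch in the standard Hadoop configuration format. -->
<configuration>
  <property>
    <name>fault.probability.FSDataset</name>
    <value>3</value>
    <description>Inject faults into FSDataset at about a 3% level.</description>
  </property>
</configuration>
```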
  • Konstantin Boudnik (JIRA) at Jun 16, 2009 at 10:50 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch
  • Konstantin Boudnik (JIRA) at Jun 16, 2009 at 10:50 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: (was: HADOOP-6003.patch)
  • Konstantin Boudnik (JIRA) at Jun 17, 2009 at 9:08 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch
    HADOOP-6003.sh

    Final modifications:
    - the new package is renamed to org.apache.hadoop.fi
    - the FIConfig class is renamed to FiConfig
    - the system property prefix is changed to 'fi' instead of the rather lengthy 'fault.probability'
    - a default config file for the FI framework is added
    - the missing Apache license boilerplate is added to one of the files
  • Tsz Wo (Nicholas), SZE (JIRA) at Jun 19, 2009 at 2:01 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Tsz Wo (Nicholas), SZE updated HADOOP-6003:
    -------------------------------------------

    Component/s: test

    - Tried the following
    # ant injectfaults
    # set fi.* to 1
    # ant test-core -Dtestcase=TestFileCreation
    It still passed the test. Have I done something wrong?

    - ant clean failed.
    {noformat}
    bash-3.2$ ant clean
    Buildfile: build.xml
    ...
    BUILD FAILED
    d:\@sze\hadoop\latest\build.xml:1604: Unable to delete file d:\@sze\hadoop\latest\build\ivy\lib\Hadoop\common\aspectjtools-1.6.4.jar

    Total time: 0 seconds
    {noformat}

    - I think fi-site.xml should not be placed in conf directory. It may confuse cluster administrators. It is better to put everything under test.
  • Konstantin Boudnik (JIRA) at Jun 19, 2009 at 10:02 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

    A slightly different version of the original patch:
    conf/fi-site.xml is moved to src/test; build.xml is modified to copy it for test runs.
    The <iajc> task definition is moved inside the target 'injectfaults' to guarantee that it is always defined.

    Couldn't reproduce the issue with
    {noformat}
    ant clean
    {noformat}
    it works for me every time. Is it possible that you had some file permission issues?

    Wrt TestFileCreation: I have run this test many times and seen a failure only once. The problem with this test is that while I can confirm the fault methods are called, they aren't called frequently enough (i.e. < 80 calls during the test execution) to reach the necessary threshold of 1% to inject a fault. In other words, the instrumented code isn't being called often enough. I'd suggest running TestDirectoryScanner instead, where these instrumented functions are called often enough that the test fails on every run.
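    The arithmetic behind this observation can be sketched with a simple back-of-the-envelope model (not code from the patch): with n instrumented calls at per-call probability p, the chance that at least one fault fires is 1 - (1 - p)^n.

```java
// Back-of-the-envelope model for how often a fault-injected test run
// actually sees a fault: P(at least one fault) = 1 - (1 - p)^n.
public class FaultChanceSketch {
    static double atLeastOne(int n, double p) {
        return 1.0 - Math.pow(1.0 - p, n);
    }

    public static void main(String[] args) {
        // ~80 instrumented calls at a 1% level: a fault fires in only
        // about 55% of runs, so TestFileCreation often passes.
        System.out.printf("80 calls @ 1%%: %.2f%n", atLeastOne(80, 0.01));
        // A hot path hit ~2000 times sees a fault on essentially every run,
        // matching the TestDirectoryScanner suggestion.
        System.out.printf("2000 calls @ 1%%: %.4f%n", atLeastOne(2000, 0.01));
    }
}
```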

  • Konstantin Boudnik (JIRA) at Jun 19, 2009 at 10:04 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12722045#action_12722045 ]

    Konstantin Boudnik edited comment on HADOOP-6003 at 6/19/09 3:02 PM:
    ---------------------------------------------------------------------

    A slightly different version of the original patch:
    conf/fi-site.xml is moved to src/test; build.xml is modified to copy it for test runs.
    The <iajc> task definition is moved inside the target 'injectfaults' to guarantee that it is always defined.

    Couldn't reproduce the issue with
    {noformat}
    ant clean
    {noformat}
    it works for me every time. Is it possible that you had some file permission issues?

    Wrt TestFileCreation: I have run this test many times and seen a failure only once. The problem with this test is that while I can confirm the fault methods are called, they aren't called frequently enough (i.e. < 80 calls during the test execution) to reach the necessary threshold of 1% to inject a fault. In other words, the instrumented code isn't being called often enough. I'd suggest running TestDirectoryScanner instead, where these instrumented functions are called often enough that the test fails on every run.



  • Tsz Wo (Nicholas), SZE (JIRA) at Jun 20, 2009 at 12:12 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12722100#action_12722100 ]

    Tsz Wo (Nicholas), SZE commented on HADOOP-6003:
    ------------------------------------------------

    - I thought fi.* was the probability, but it actually is the probability as a percentage. It seems to me that it is not common to specify probability as a percentage. I suggest that if it is a percentage, it needs a '%', i.e.:
    ** <value>1</value> means probability = 1
    ** <value>1%</value> means probability = 0.01


    - Tried the latest patch. Everything worked fine: TestFileCreation failed when setting fi.* = 100.0. ant clean succeeded. I probably messed something up last time. My faults!


  • Konstantin Boudnik (JIRA) at Jun 20, 2009 at 1:36 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

I agree with the point - it is misleading when the probability level is set as an integer.
I have fixed the config file comment and the ProbabilityModel code to work with the value as a float between 0.00 and 1.00.
I think adding a special case to handle '%' is excessive.
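A minimal sketch of what such a float-based check could look like (the class, method, and property names here are illustrative assumptions, not Hadoop's actual ProbabilityModel API):

```java
import java.util.Random;

// Hypothetical sketch of a float-based fault-probability check; names are
// illustrative, not the actual ProbabilityModel code from the patch.
public class FiProbability {
    private static final Random RANDOM = new Random();

    // Reads e.g. -Dfi.someFault=0.01 ; defaults to 0.00, i.e. never inject.
    static float levelFor(String faultName) {
        float level = Float.parseFloat(
            System.getProperty("fi." + faultName, "0.00"));
        if (level < 0.0f || level > 1.0f) {
            throw new IllegalArgumentException(
                "fault probability must be between 0.00 and 1.00, got " + level);
        }
        return level;
    }

    // nextFloat() is uniform in [0.0, 1.0), so a level of 1.00 always injects
    // and a level of 0.00 never does.
    static boolean injectCriteria(String faultName) {
        return RANDOM.nextFloat() < levelFor(faultName);
    }
}
```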

  • Konstantin Boudnik (JIRA) at Jun 20, 2009 at 1:38 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

The default probability for all faults has to be 0.00.
  • Konstantin Boudnik (JIRA) at Jun 20, 2009 at 1:42 am
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

Debug output has to use log4j's DEBUG level.
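The guard pattern that goes with this is standard. Hadoop at the time logged through commons-logging over log4j; the sketch below uses java.util.logging purely for illustration (FINE is its closest equivalent of log4j's DEBUG), and the class name is hypothetical.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Illustration only: Hadoop used commons-logging/log4j, but the guard
// pattern is identical. FINE here stands in for log4j's DEBUG level.
public class FiLogging {
    private static final Logger LOG = Logger.getLogger("fi");

    static void reportInjection(String faultName) {
        if (LOG.isLoggable(Level.FINE)) {  // skip string building unless DEBUG is on
            LOG.fine("Injecting fault: " + faultName);
        }
    }
}
```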
  • Tsz Wo (Nicholas), SZE (JIRA) at Jun 20, 2009 at 6:08 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12722234#action_12722234 ]

    Tsz Wo (Nicholas), SZE commented on HADOOP-6003:
    ------------------------------------------------

Code looks good. Two more minor comments:
- Javadoc comments need to be delimited with /** */. For example,
    {code}
    +/*
    + * This class wraps the logic around fault injection configuration file
    + * Default file is expected to be found in src/test/fi-site.xml
    + * This default file should be copied by JUnit Ant's tasks to
    + * build/test/extraconf folder before tests are ran
    + * An alternative location can be set through
    + * -Dfault.property.config=<file_name>
    + */
    {code}

    - Unnecessary space changes should be reverted. e.g.
    {code}
    --- build.xml (revision 786191)
    +++ build.xml (working copy)
    @@ -428,7 +428,7 @@
    <fileset dir="${mapred.src.dir}" includes="mapred-default.xml"/>
    </copy>
    </target>
    -
    +
    <target name="compile-hdfs-classes" depends="compile-core-classes">
    <jsp-compile
    uriroot="${src.webapps}/hdfs"
    @@ -452,7 +452,7 @@
    </jsp-compile>

    <!-- Compile Java files (excluding JSPs) checking warnings -->
    - <javac
    + <javac
    encoding="${build.encoding}"
    srcdir="${hdfs.src.dir};${build.src}"
    includes="org/apache/hadoop/**/*.java"
    {code}
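For the first comment above, the quoted block rewritten with the proper Javadoc delimiter would read as follows (the class name is a placeholder, since the original class name is not shown in the quote):

```java
/**
 * This class wraps the logic around the fault injection configuration file.
 * The default file is expected to be found in src/test/fi-site.xml.
 * This default file should be copied by JUnit Ant's tasks to the
 * build/test/extraconf folder before tests are run.
 * An alternative location can be set through
 * -Dfault.property.config=&lt;file_name&gt;
 */
public class FiConfigExample {  // placeholder name for illustration
}
```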
  • Tsz Wo (Nicholas), SZE (JIRA) at Jun 20, 2009 at 6:12 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Tsz Wo (Nicholas), SZE updated HADOOP-6003:
    -------------------------------------------

    Component/s: dfs

    After the project split, this issue belongs to hdfs.
  • Konstantin Boudnik (JIRA) at Jun 21, 2009 at 5:40 pm
    [ https://issues.apache.org/jira/browse/HADOOP-6003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

    Konstantin Boudnik updated HADOOP-6003:
    ---------------------------------------

    Attachment: HADOOP-6003.patch

Fixed the Javadoc comment tags and wrote better documentation for the FI classes.

The other review comments have been addressed as well.
