
On Tue, Dec 18, 2012 at 1:05 AM, Colin McCabe wrote:
On Mon, Dec 17, 2012 at 11:03 AM, Steve Loughran wrote:
On 17 December 2012 16:06, Tom White wrote:

There are some tests like the S3 tests that end with "Test" (e.g.
Jets3tNativeS3FileSystemContractTest) - unlike normal tests which
start with "Test". Only those that start with "Test" are run
automatically (see the surefire configuration in
hadoop-project/pom.xml). You have to run the others manually with "mvn
test -Dtest=...".
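
(For reference, the surefire selection is just a naming-pattern include; a sketch from memory of the relevant fragment, not the exact contents of hadoop-project/pom.xml:)

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <includes>
      <!-- only classes whose names start with "Test" run automatically -->
      <include>**/Test*.java</include>
    </includes>
  </configuration>
</plugin>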

The mechanism that Colin describes is probably better though, since
the environment-specific tests can be run as a part of a full test run
by Jenkins if configured appropriately.
I'd like that - though one problem with the current system is that you need
to get the s3 (and soon: openstack) credentials into
src/test/resources/core-site.xml, which isn't the right approach. If we
could get them into properties files, things would be easier. That's
overkill for adding a few more openstack tests - but I would like to make
it easier to turn on those and the rackspace ones without sticking my
secrets into an XML file under SCM.
I think the way to go is to have one XML file include another.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration xmlns:xi="http://www.w3.org/2001/XInclude">
  <property>
    <name>boring.config.1</name>
    <value>boring-value</value>
  </property>
  ... etc, etc...
  <xi:include href="../secret-stuff.xml" />
</configuration>

That way, you can keep the boring configuration under version control,
and still have your password sitting in a small separate
non-version-controlled XML file.
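
(Since XInclude splices the root element of the included document in place
of the xi:include element, the simplest shape for the non-version-controlled
file is one property per include - hypothetical contents of secret-stuff.xml,
using the s3n secret-key property as the example:)

<?xml version="1.0"?>
<property>
  <!-- keep this file out of SCM -->
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>REPLACE-WITH-YOUR-SECRET</value>
</property>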

We use this trick a bunch with the HA configuration stuff-- 99% of the
configuration is the same between the Active and Standby Namenodes,
but you can't give them the same dfs.ha.namenode.id or dfs.name.dir.
Includes help a lot here.
Another tactic could be to have specific test projects: test-s3,
test-openstack, test-... which contain nothing but test cases. You'd set
Jenkins up to run those test projects too - the reason for having the
separate names is to make it blatantly clear which tests you've not run.
I dunno. Every time a project puts unit or system tests into a
separate project, the developers never run them. I've seen it happen
enough times that I think I can call it an anti-pattern by now. I
like having tests alongside the code-- to the maximum extent that is
possible.
Just to be clear, I'm not referring to any Hadoop-related project
here, just certain other open source (and not) ones I've worked on.
System/unit tests belong with the rest of the code, otherwise they get
stale real fast.

It sometimes makes sense for integration tests to live in a separate
repo, since by their nature they're usually talking to stuff that
lives in multiple repos.

best,
Colin

Tom

On Mon, Dec 17, 2012 at 10:06 AM, Steve Loughran <stevel@hortonworks.com> wrote:
thanks, I'll have a look. I've always wanted to add the notion of "skipped"
to test runs - all the way through to the XML and generated reports - but
you'd have to do a new JUnit runner for this and tweak the reporting code.
Which, if it involved going near Maven source, is not something I am
prepared to do.
On 14 December 2012 18:57, Colin McCabe wrote:

One approach we've taken in the past is making the junit test skip
itself when some precondition is not true. Then, we often create a
property which people can use to cause the skipped tests to become a
hard error.

For example, all the tests that rely on libhadoop start with these
lines:
import org.junit.Assume;
import org.junit.Test;
import org.apache.hadoop.util.NativeCodeLoader;

@Test
public void myTest() {
  // Skip, rather than fail, when the native library hasn't been loaded.
  Assume.assumeTrue(NativeCodeLoader.isNativeCodeLoaded());
  ...
}
This causes them to be silently skipped when libhadoop.so is not
available or loaded (perhaps because it hasn't been built).

However, if you want to cause this to be a hard error, you simply run
mvn test -Drequire.test.libhadoop
See TestHdfsNativeCodeLoader.java to see how this is implemented.
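
(A minimal sketch of that pattern - the class and test names here are
illustrative, not the actual contents of TestHdfsNativeCodeLoader.java:)

import static org.junit.Assert.assertTrue;

import org.apache.hadoop.util.NativeCodeLoader;
import org.junit.Assume;
import org.junit.Test;

public class TestNativeCodeLoaderSketch {
  @Test
  public void testNativeCodeIsLoaded() {
    boolean loaded = NativeCodeLoader.isNativeCodeLoaded();
    if (System.getProperty("require.test.libhadoop") != null) {
      // -Drequire.test.libhadoop turns a missing libhadoop.so into a failure.
      assertTrue("libhadoop.so is required but was not loaded", loaded);
    } else {
      // Otherwise skip quietly on platforms where the native build is absent.
      Assume.assumeTrue(loaded);
    }
  }
}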

The main idea is that your Jenkins build slaves use all the -Drequire
lines, but people running tests locally are not inconvenienced by the
need to build libhadoop.so in every case. This is especially good
because libhadoop.so isn't known to build on certain platforms like
AIX, etc. It seems to be a good tradeoff so far. I imagine that s3
could do something similar.

cheers,
Colin


On Fri, Dec 14, 2012 at 9:56 AM, Steve Loughran <stevel@hortonworks.com> wrote:
The swiftfs tests only need to run if there's a target filesystem; copying
the s3/s3n tests, something like:

<property>
  <name>test.fs.swift.name</name>
  <value>swift://your-object-store-here/</value>
</property>
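
(A guard in the test itself would look something like this - a hedged
sketch with a hypothetical test class, reusing only the property above:)

import static org.junit.Assert.assertTrue;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Assume;
import org.junit.Test;

public class TestSwiftContractSketch {
  @Test
  public void testTargetFilesystemReachable() throws Exception {
    Configuration conf = new Configuration();
    String fsName = conf.get("test.fs.swift.name");
    // No target object store configured: skip instead of failing.
    Assume.assumeTrue(fsName != null);
    FileSystem fs = FileSystem.get(URI.create(fsName), conf);
    // A trivial probe against the store's root.
    assertTrue("root of " + fsName + " should exist", fs.exists(new Path("/")));
  }
}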

How does one actually go about making junit tests optional in mvn-land?
Should the probe/skip logic be in the code - which can make people think
the test passed when it didn't actually run? Or can I turn it on/off in
Maven?
-steve
