
Comments (76)

philsquared commented on May 21, 2024

A first cut of this has been committed, but it's not fully tested yet, and more examples of real JUnit output would be useful.

from catch2.

wichert commented on May 21, 2024

Jenkins does not like the current output:

ERROR: Publisher hudson.tasks.junit.JUnitResultArchiver aborted due to exception
java.lang.NullPointerException
    at hudson.tasks.junit.CaseResult.getPackageName(CaseResult.java:266)
    at hudson.tasks.junit.TestResult.tally(TestResult.java:500)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:115)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:87)
    at hudson.FilePath.act(FilePath.java:757)
    at hudson.FilePath.act(FilePath.java:739)
    at hudson.tasks.junit.JUnitParser.parse(JUnitParser.java:83)
    at hudson.tasks.junit.JUnitResultArchiver.parse(JUnitResultArchiver.java:123)
    at hudson.tasks.junit.JUnitResultArchiver.perform(JUnitResultArchiver.java:135)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
    at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:649)
    at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:625)
    at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:603)
    at hudson.model.Build$RunnerImpl.post2(Build.java:161)
    at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:572)
    at hudson.model.Run.run(Run.java:1386)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:145)
Finished: FAILURE

Here is an example of valid JUnit output as generated by nose:

<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="nosetests" tests="175" errors="0" failures="0" skip="0">
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testBackwardAccess" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testBackwardDeletion" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testDeletion" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testForwardAccess" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testItemAccessor" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testLength" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testStartEmpty" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testNamedParameters" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testClient.OtherTests" name="testCreateAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.OtherTests" name="testCreateAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testAuthDelay" time="2"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testBind" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testBindClosesSocket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testDoubleAccountDelay" time="3"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testDoubleRetry" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testIgnorePacketError" time="1"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testInvalidReply" time="1"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testNoRetries" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testReopen" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSendPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSingleAccountDelay" time="2"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSingleRetry" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testValidReply" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testConstructionParameters" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testInvalidDataType" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testNamedConstructionParameters" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testValues" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testContainment" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testEmptyDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testReadonlyContainer" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeEncryptionError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeOptions" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeUnknownTypeError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeUnknownVendorError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorTooFewColumns" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorUnknownVendor" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testDictFileParseError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testDictFilePostParse" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorUnbalanced" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorUnknownVendor" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testInclude" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testIntegerValueParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseEmptyDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseMultipleDictionaries" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseSimpleDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testStringValueParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testValueForUnknownAttributeError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testValueTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVenderTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorFormatError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorFormatSyntaxError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorOptionError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testNamedParameters" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreateAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreateAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreatePacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketSendTest" name="testSendPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketSendTest" name="testSendReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorDefaults" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorRawPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testRequestPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testRequestPacketSetsId" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testVerifyAcctRequest" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorDefaults" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptEmptyPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptSetsAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwDecryptEmptyPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwDecryptPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacketCreatesAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacketCreatesID" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAddAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAttributeValueAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testCreateAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithBadAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithEmptyAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithEmptyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithInvalidLength" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithMultiValuedAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithPartialAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithTooBigPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithTwoAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithVendorAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithoutAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDelItem" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testEncodeKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testEncodeKeyValues" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testGenerateID" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testHasKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testHasKeyWithUnknownKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testKeys" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktDecodeVendorAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktEncodeAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktEncodeAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testRawAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testVendorAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testVerifyReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.UtilityTests" name="testGenerateID" time="0"/>
  <testcase classname="pyrad.tests.testProxy.OtherTests" name="testProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testProxy.OtherTests" name="testProcessInputNonProxyPort" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHHandleProxyPacketHandlesWrongPacket" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHandleProxyPacketSetsSecret" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHandleProxyPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testProxy.SocketTests" name="testProxyFd" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacketWrongPort" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacketWrongPort" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testAcctProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testAuthProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testCreateReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.RemoteHostTests" name="testNamedConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.RemoteHostTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testBindDuringConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresPacketErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresPollErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresServerPacketErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunInitializes" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunRunsProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testBind" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testGrabPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketAcctFds" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketAuthFds" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketNoFds" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testAddressDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testAddressEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDateDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDateEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDecodeFunction" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testEncodeFunction" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testIntegerDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testIntegerEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidAddressEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidDataEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidIntegerEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidStringEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testStringDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testStringEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnknownTypeDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnknownTypeEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnsignedIntegerEncoding" time="0"/>
</testsuite>

wichert commented on May 21, 2024

This Stack Overflow discussion might have useful information.

philsquared commented on May 21, 2024

Thanks @wichert. I'm going to have to have another look at this.
I did see that stackoverflow question (and all that it links to) when I was originally looking into it.
I think the only way I'm going to get this working is to install an instance of Jenkins and/or Hudson myself.

wichert commented on May 21, 2024

If you need another example: I put some zope testrunner output online as well. One thing to note: that testrunner creates a separate file for each source file containing tests, so you end up with lots of XML files.

wichert commented on May 21, 2024

Can you provide a status update for this ticket?

philsquared commented on May 21, 2024

I'm really sorry, Wichert. I managed to drop this somewhere along the line.
I'm pretty snowed under at the moment, so I don't know when I will be able to get back to it.
Will do my best.
If you fancy looking into it yourself you'd be very welcome, of course :-)

wichert commented on May 21, 2024

I'll give it a try. I was still using an older version of Catch, so I'll have to test upgrading first. The first thing I see is lots of new warnings produced by Catch and a compile error, so it doesn't appear to be a trivial upgrade.

wichert commented on May 21, 2024

I think we need to think a bit about how to structure the output. As an example, let's assume tests that are set up like this:

TEST_CASE("ClassA", "Short description of class A") {
    SECTION("methodOne", "Tests for method one") {
        SECTION("situation-1", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
        SECTION("situation-2", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
    }

    SECTION("methodTwo", "Tests for method two") {
        SECTION("situation-1", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
        SECTION("situation-2", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
    }
}

TEST_CASE("ClassB", "Short description of class B") {
    SECTION("methodOne", "Tests for method one") {
        // Repeat similar structure as for ClassA
    }
}

When reporting results for these tests in JUnit format, we run into one problem: Catch supports arbitrary nesting of sections, while JUnit only supports two levels (testsuite -> testcase). I suggest that the simplest thing to do here is to take only the two top levels of Catch (TEST_CASE and top-level SECTION). That results in this JUnit structure:

<testsuites>
  <testsuite name="ClassA">
    <testcase name="Tests for method one" classname="methodOne"/>
    <testcase name="Tests for method two" classname="methodTwo"/>
  </testsuite>
  <testsuite name="ClassB">
    <testcase name="Tests for method one" classname="methodOne"/>
    <testcase name="Tests for method two" classname="methodTwo"/>
  </testsuite>
</testsuites>

(ignoring the mandatory attributes such as time, tests, failures, etc.). Looking at the data passed to a Catch reporter, this does not match up: the top level there is a group, but I never see more than one group being generated. I'm not sure if that is due to me not knowing how to create groups, a not-fully-implemented feature in Catch, or a design flaw (an unused group level).

Can you provide some guidance on how I should proceed with this?

wichert commented on May 21, 2024

Hi Phil. Can you spare a few minutes to give me a few tips so I can try to fix this?

wichert commented on May 21, 2024

Hi Phil. Is this still on your radar?

philsquared commented on May 21, 2024

Sorry Wichert, I've still not really caught up.
Hoping to do a big push soon…


wichert commented on May 21, 2024

Hi Phil. I'm still willing to poke at this, but need some input from you. Is this issue still on your radar?

philsquared commented on May 21, 2024

Hey Wichert,

Thanks for your patience. I'm really sorry I've not been getting back to this - and thanks for your help.
This has not dropped off my radar - but I'm trying to get through some work that will impact the reporter interface before I do too much with the JUnit reporter. I'm aiming to get that done in the next week or so.

As for the question you raised before (sorry I didn't respond sooner - that was inbox syndrome):

The way I have been approaching it is that each isolated test case run (one for each leaf section) is a JUnit "test". The name can be constructed from the test case name + section name(s) (I need to formalise that naming scheme too).
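A minimal sketch of that naming idea (the "/" separator and the joining order are my assumptions - Phil says the scheme is not yet formalised):

```cpp
#include <string>
#include <vector>

// Hypothetical helper: build a JUnit "test" name for one isolated test case
// run (one per leaf section) by joining the test case name with the path of
// nested section names. The "/" separator is an assumption, not Catch's
// actual scheme.
std::string makeJUnitTestName(const std::string& testCaseName,
                              const std::vector<std::string>& sectionPath) {
    std::string name = testCaseName;
    for (const std::string& section : sectionPath) {
        name += "/" + section;  // append each nested section name in order
    }
    return name;
}
```

Under this sketch, a leaf-section run of "situation-1" inside "methodOne" inside TEST_CASE "ClassA" would be reported as "ClassA/methodOne/situation-1".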

JUnit test suites are then each set of tests that are matched by the filters provided by a single -t switch on the command line (if no -t is provided then there is just a single suite). Bear in mind that -t can have multiple filters. So, e.g:

-t abc/def* ghi/jkl* -t random/1 random/2

That gives us two suites - one named "abc/def* ghi/jkl*", and one named "random/1 random/2".
I have support for naming suites coming too - and this is all being captured in the TestCaseFilters class and will be passed on to the reporter (it's not, yet).

Does that make sense?

wichert commented on May 21, 2024

This matches the naming scheme, which I admit I have never understood in Catch. I always struggle with two things:

  • why should I have to provide a description for test cases and sections? I already use descriptive names, so the description is often the exact same string. Having to provide it is just annoying.
  • how do test case names map to the command-line option for the testrunner? I would expect that if I have a test case named "foo" with a nested test case named "bar", which contains a section named "buz", I could use options like -t foo, -t foo/bar or -t foo/bar/buz. That never worked, at which point I always gave up trying to use the -t option.

That aside, there is a problem with JUnit XML support: Catch supports nested test cases, which does not map to the JUnit scheme, which only supports three levels: package, class and test. Trying to force the Catch structure into that, I would expect something like this:

  • each top level test case is a package
  • each second level test case is a class. If no second level is used, insert a dummy level?
  • each section is a test

or alternatively:

  • each source file is a package
  • each top level test case is a class
  • each section is a test

The latter matches Python (and, I'm guessing, Java) better, but I am not sure if you can get the source filename in a useful way.
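To illustrate the second mapping, a hypothetical result file for the earlier ClassA example might look like this (the filename and the dotted classname prefix are assumptions; Jenkins derives the "package" from the prefix of classname, as in the nose output above):

```xml
<testsuite name="testClassA" tests="2" errors="0" failures="0">
  <!-- package = source file, class = top-level test case, test = section -->
  <testcase classname="testClassA.ClassA" name="methodOne" time="0"/>
  <testcase classname="testClassA.ClassA" name="methodTwo" time="0"/>
</testsuite>
```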

philsquared commented on May 21, 2024

The idea is that the test name is a short, hierarchical, name - something like: "stuff/sub stuff/details" - so all "stuff" tests can be grouped, and all "stuff/sub stuff" tests can be grouped.
You are not forced to work that way (and I don't always) but it's a useful convention.
If you do that then it's nice to be able to supply a more detailed description string alongside it too. But I have to admit I use the description string a lot less than I expected to - a lot of the time I leave it as "" (you don't have to put anything in the string - but, since C++98 lacks variadic macros, you must provide at least the empty string).

Same with sections, although the hierarchical part is less common as they are already in a hierarchy - so I use descriptions even less there.

If you have used hierarchical test case naming (ignore sections for a moment) then you might have tests like:

"a/b/c"
"a/b/d"
"a/e/f"
"g"

Now you can run the first three with "a*", the first two with "a/b*", or just the first one with "a/b/c".

The idea has always been that this ability would seamlessly extend to sections too - so a section, "1" in the first test case, could be run with "a/b/c/1".
But I have not yet implemented that - and there are some subtleties to it that make me wonder if it would work that way.

Nonetheless, the ability to create groups on-the-fly using wildcards (especially with hierarchical naming) has been useful - and has become even more useful recently, as I have added prefix wildcards (*foo*) and exclusions (exclude:foo - or ~foo), which can also be mixed (~*foo*). Additionally, you can supply a series of these filters, and the group will be the union of the inclusions, less the union of the exclusions.

So a command line like:

-t foo/bar* -t a* ~*b*

Would run two groups. The first is "foo/bar*" and will match anything that starts with foo/bar.
The second is "a* ~*b*" and will match anything that starts with a, except anything that contains b.
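The rules above can be sketched in a few lines (this is my reading of the described semantics, not Catch's actual implementation; names like matchesPattern and groupSelects are hypothetical):

```cpp
#include <string>
#include <vector>

// "foo*" matches names starting with foo, "*foo*" matches names containing
// foo, anything else is an exact match. (Sketch only - not Catch's code.)
bool matchesPattern(const std::string& pattern, const std::string& name) {
    if (pattern.size() >= 2 && pattern.front() == '*' && pattern.back() == '*') {
        std::string inner = pattern.substr(1, pattern.size() - 2);
        return name.find(inner) != std::string::npos;
    }
    if (!pattern.empty() && pattern.back() == '*') {
        std::string prefix = pattern.substr(0, pattern.size() - 1);
        return name.compare(0, prefix.size(), prefix) == 0;
    }
    return name == pattern;
}

// A group selects a test if it matches some inclusion filter and no
// exclusion filter ("~pattern"): union of inclusions, less the exclusions.
bool groupSelects(const std::vector<std::string>& filters,
                  const std::string& name) {
    bool included = false;
    for (const std::string& f : filters) {
        if (!f.empty() && f[0] == '~') {
            if (matchesPattern(f.substr(1), name))
                return false;  // any matching exclusion wins
        } else if (matchesPattern(f, name)) {
            included = true;
        }
    }
    return included;
}
```

For the group "a* ~*b*" this selects "a/e/f" but rejects "a/b/c" (contains b) and "g" (no inclusion matches).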

Two features that are coming - named groups and tags (non-hierarchical matching) - should make this approach more powerful.

So my scheme was to map groups to JUnit suites.

wichert commented on May 21, 2024

What I am struggling with is the need to prefix things manually. To illustrate, this is how I usually structure my tests:

TEST_CASE("MyClass") {
    TEST_CASE("SomeMethod") {
        SECTION("situation-one") {
            ....
        }
        SECTION("situation-two") {
            ....
        }
    }
}

I would expect to be able to tell Catch to run the tests for MyClass using "-t MyClass", or just the tests for MyClass::SomeMethod using "-t MyClass/SomeMethod", or even a specific situation using "-t MyClass/SomeMethod/situation-one". I do not see why I should be forced to manually repeat the hierarchy structure in the test case name, as Catch is currently forcing me to do:

TEST_CASE("MyClass") {
    TEST_CASE("MyClass/SomeMethod") {
        SECTION("MyClass/SomeMethod/situation-one") {
            ....
        }
        SECTION("MyClass/SomeMethod/situation-two") {
            ....
        }
    }
}

philsquared commented on May 21, 2024

I'm not sure where the outer TEST_CASE is coming from.
TEST_CASEs can appear at only one level (they are implemented as free functions).
SECTIONs may be arbitrarily nested within TEST_CASEs (they are implemented as if statements with scoped objects).

It is true that test cases which logically belong together require a common prefix for the hierarchical matching to work. But it's precisely that prefixing that logically groups them together.

Sections do not require prefixes, however. So I believe your example would be written something like:

TEST_CASE("MyClass/SomeMethod", "") {
    SECTION("situation-one", "") {
        ....
    }
    SECTION("situation-two", "") {
        ....
    }
}

Which is not so bad.

Now you can run all MyClass tests with:

-t MyClass*

And everything in that first test case with:

-t MyClass/SomeMethod

Unfortunately, at time of writing, you cannot selectively run Sections within a test case. There are some issues around this that make it not straightforward, but I believe I should be able to get it working to some approximation (the main issue is that discovery of sections only occurs as the test case is running).

Does that clarify anything at all? Is it the section selection that you are particularly missing?

wichert commented on May 21, 2024

I misremembered by code - I was nesting SECTIONs instead of TEST_CASEs. That makes my code look like this:

TEST_CASE("MyClass") {
     SECTION("SomeMethod") {
         SECTION("situation-one") {
             ....
         }
         SECTION("situation-two") {
             ....
         }
     }
}

I still do not see why I need to repeat a prefix in the section name when I already create an explicit nesting level in the code. Is there no way to avoid that?

wichert commented on May 21, 2024

Section selection is certainly something that I am missing. Especially when debugging I often want to run a single (leaf) section so I can test for unintended side-effects and set breakpoints easily without having to worry about other tests triggering them.

TypicalFooBar commented on May 21, 2024

Hey, I just wanted to chime in here on this conversation of two years :)

I experienced the same error in Jenkins that you were experiencing, Wichert: the NullPointerException caused by the output format of the JUnit reporter.

I was able to figure out why Jenkins is crashing, and I have an ugly workaround that lets Jenkins continue to run, but it doesn't solve this issue.

Currently, in the JunitReporter class, there is a function that looks like this:

void OutputTestCases( XmlWriter& xml, const Stats& stats ) {
            std::vector<TestCaseStats>::const_iterator it = stats.m_testCaseStats.begin();
            std::vector<TestCaseStats>::const_iterator itEnd = stats.m_testCaseStats.end();
            for(; it != itEnd; ++it ) {
                xml.writeBlankLine();
                xml.writeComment( "Test case" );

                XmlWriter::ScopedElement e = xml.scopedElement( "testcase" );
                xml.writeAttribute( "classname", it->m_className );
                xml.writeAttribute( "name", it->m_name );
                xml.writeAttribute( "time", "tbd" );

                OutputTestResult( xml, *it );
            }
        }

Using the above, the output from the JunitReporter looks like the following:

<testsuites>
  <testsuite errors="0" failures="0" tests="1" hostname="tbd" time="tbd" timestamp="tbd">

    <!--Test case-->
    <testcase name="SimpleTest" time="tbd"/>
  </testsuite>
  <system-out/>
  <system-err/>
</testsuites>

After some trial and error, I found that what was making Jenkins crash was the fact that the testcase element did NOT have the classname attribute. This is the line of code in the OutputTestCases() function of the JunitReporter class that causes the trouble:

xml.writeAttribute( "classname", it->m_className );

In my case, this line was not actually printing the classname. Perhaps it->m_className is null? When I switched out that line for a random string of "foo", it printed. The modified OutputTestCases() function looks like this:

void OutputTestCases( XmlWriter& xml, const Stats& stats ) {
            std::vector<TestCaseStats>::const_iterator it = stats.m_testCaseStats.begin();
            std::vector<TestCaseStats>::const_iterator itEnd = stats.m_testCaseStats.end();
            for(; it != itEnd; ++it ) {
                xml.writeBlankLine();
                xml.writeComment( "Test case" );

                XmlWriter::ScopedElement e = xml.scopedElement( "testcase" );
                xml.writeAttribute( "classname", "foo" );
                xml.writeAttribute( "name", it->m_name );
                xml.writeAttribute( "time", "tbd" );

                OutputTestResult( xml, *it );
            }
        }

With the above change the JunitReporter output looks like this:

<testsuites>
  <testsuite errors="0" failures="0" tests="1" hostname="tbd" time="tbd" timestamp="tbd">

    <!--Test case-->
    <testcase classname="foo" name="SimpleTest" time="tbd"/>
  </testsuite>
  <system-out/>
  <system-err/>
</testsuites>

Jenkins seems to be okay with this, though it displays the SimpleTest result as if it were a part of the class foo, which is the ugly part I was talking about (but at least it didn't crash!).

I hope this helps you debug the JunitReporter. Great work on Catch Phil! It is VERY easy to use, which I really appreciate! :)

philsquared commented on May 21, 2024

I hadn't realised the missing class name attribute was the only thing (or, at least, the key thing) stopping this from working! (I've still not had a chance to get a Jenkins/ Hudson set-up going to try it for myself).

I've plumbed that attribute in now (it was always using an empty string before).
For test cases that are actually based on methods it will use the class name. For free standing test cases it just uses the string, "global". I'm open to suggestions on making that more useful. But, from the sounds of it, that should unblock you from needing your workaround and at least be a smoother experience.

I realise there are some other attributes that are still set to "tbd". I don't know how important they are to getting this working too.

I've committed those changes to the Integration branch. Would appreciate it if you (both) could let me know how that works for you.

philsquared commented on May 21, 2024

@wichert I noticed, while replying above, that our previous discussion never fully concluded. But I was a bit confused about this:

"I still do not see why I need to repeat a prefix in the section name when I already create an explicit nesting level in the code. Is there no way to avoid that?"

I don't see any repeated prefix in the last code you posted.

wichert commented on May 21, 2024

@philsquared are you mixing up this discussion and another issue where we talked about how -s behaves?

philsquared commented on May 21, 2024

No, it was part of this thread:

Hard link here: #5 (comment)

wichert commented on May 21, 2024

I have to admit I don't remember exactly what my thoughts were at the time. Some current thoughts:

  • there is a prefix involved here: you use MyClass as a prefix (through the MyClass* glob).
  • being able to run individual sections within a test case is extremely useful
  • this should also work if those sections are nested
  • perhaps filtering on sections should be a separate commandline option (Python's zope.testrunner uses the -t parameter to do that, nosetests uses --tests). I'm not sure how that will work with nested sections; you may need to define a separating character and advise people not to use that in test names

TypicalFooBar commented on May 21, 2024

Phil,

I just used the catch.hpp file in the Integration branch and Jenkins was quite happy with the output! I tested it with different successful and unsuccessful test cases, and it seems to be working just fine.

Thank you for taking time to fix this problem! The other tbd values still need work, but at least Jenkins can read and display the output from the JunitReporter.

Thanks again for your help, and again great job on Catch!

philsquared commented on May 21, 2024

That's great! Thanks for letting me know, Derek.
At this point, since we have at least seen it running now, and due to this thread going on a bit - mostly on peripheral issues - I'm going to close this issue.

@wichert - if it still doesn't work at all for you please reopen.

For any other JUnit-related issues please raise a new issue.

SebDyn commented on May 21, 2024

I want to reopen this bug because it does not work for me.

I use Jenkins ver. 1.479
Jenkins Plugin: xunit ver. 1.50 ( https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin )
CATCH: single header distribution. Branch Integration, revision 88b7082

Jenkins console output:

Started by user anonymous
Building in workspace /var/lib/jenkins/jobs/Test Executables/workspace
[SSH] executing pre build script:

/HOMEDIR/vmwarectrl/opensuse/vmstart_test.sh
catched variables: branch=next, cores=4, mode=release, os=linux
[SSH] exit-status: 0
[workspace] $ /bin/sh -xe /tmp/hudson7292242557306021101.sh
[workspace] $ /bin/sh -xe /tmp/hudson4347723387053783514.sh
[SSH] executing post build script:
BUILD_ID="2012-11-13_13-56-01"

ssh rlampert@opensuse /HOMEDIR/devel/build/exec_test.sh -e expensive -e meshdata
mkdir /var/lib/jenkins/jobs/Test\ Executables/workspace/$BUILD_ID
scp rlampert@opensuse:/HOMEDIR/devel/build/*.xml /var/lib/jenkins/jobs/Test\ Executables/workspace/$BUILD_ID
scp rlampert@opensuse:/HOMEDIR/devel/build/test*.txt /var/lib/jenkins/jobs/Test\ Executables/workspace/$BUILD_ID
touch /var/lib/jenkins/jobs/Test\ Executables/workspace/test.xml
[SSH] exit-status: 0
[xUnit] [INFO] - Starting to record.
[xUnit] [INFO] - Processing JUnit
[xUnit] [INFO] - [JUnit] - 1 test report file(s) were found with the pattern '*.xml' relative to '/var/lib/jenkins/jobs/Test Executables/workspace' for the testing framework 'JUnit'.
ERROR: Publisher org.jenkinsci.plugins.xunit.XUnitPublisher aborted due to exception
java.lang.NullPointerException
    at hudson.tasks.junit.CaseResult.getPackageName(CaseResult.java:266)
    at hudson.tasks.junit.TestResult.freeze(TestResult.java:576)
    at hudson.tasks.junit.TestResultAction.setResult(TestResultAction.java:74)
    at hudson.tasks.junit.TestResultAction.<init>(TestResultAction.java:67)
    at org.jenkinsci.plugins.xunit.XUnitPublisher.recordTestResult(XUnitPublisher.java:253)
    at org.jenkinsci.plugins.xunit.XUnitPublisher.performXUnit(XUnitPublisher.java:123)
    at org.jenkinsci.plugins.xunit.XUnitPublisher.perform(XUnitPublisher.java:93)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:717)
    at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:692)
    at hudson.model.Build$BuildExecution.post2(Build.java:183)
    at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:639)
    at hudson.model.Run.execute(Run.java:1527)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:236)
Sending e-mails to: XXXX
Finished: FAILURE

The XML generated by CATCH

<testsuites>
  <testsuite errors="0" failures="0" tests="587" hostname="tbd" time="tbd" timestamp="tbd">

    <!--Test case-->
    <testcase classname="global" name="boost/spirit/width" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/simple" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/simple/individual_files" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/simple/project" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/parser/old" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/osl3/bin/simple" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/osl3/bin/xml" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/osl3/bsp1_part321" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/fixed_width" time="tbd"/>

    <!--Test case-->
    <testcase classname="global" name="sos_import/lsdyna/fixed_width/output" time="tbd"/>
  </testsuite>
  <system-out>
    2012-11-13 13:50:36 [   INFO] thread   0 | Node data in database:
  </system-out>
  <system-err>
2012-11-13 13:51:57 [  ERROR] thread   1 |         Did not find data object with (design, quantity) = ('0004', 'not-existing') in data base.
  </system-err>
</testsuites>

SebDyn commented on May 21, 2024

It looks like it needs name="something" on the testsuite element. If I add this manually, it works.
If I run the tests with 'test -t testcasename', a name is provided and JUnit can use it. A name is only missing when running all tests (without -t).

philsquared commented on May 21, 2024

Thanks for reporting this, @SebDyn. I'll take a look; doesn't sound like it should be too hard to fix.

philsquared commented on May 21, 2024

@SebDyn, please give the latest commit on integration a try now - that's v0.9, build 2 (yay - new versioning support)

wichert commented on May 21, 2024

Looking at the current download I do not see a version number anywhere, nor do I see tagged versions or anything else mentioning a version number. Can I safely assume that "v0.9 build 2" is the current download you added two days ago?

wichert commented on May 21, 2024

Looks like catch_version is not included in the single-file build. A comment at the top of that file declaring the version number would be very practical as well ;)

wichert commented on May 21, 2024

Hmm.. it seems you already did that on a new branch. Should we be using the integration branch instead of master now?

SebDyn commented on May 21, 2024

Hello Phil,

thanks for investigating.

I can not test it, however, because it does not compile on Mac Lion (GCC 4.2.1).
I compile with -Werror -Wall -std=c++98

First of all, there are some warnings:
line 2644 must be
Equals( const Equals& other ) : MatcherImpl<Equals, std::string>(), m_str( other.m_str ){}
line 2660 must be
Contains( const Contains& other ) : MatcherImpl<Contains, std::string>(), m_substr( other.m_substr ){}
line 2676 must be
StartsWith( const StartsWith& other ) : MatcherImpl<StartsWith, std::string>(), m_substr( other.m_substr ){}
line 2692 must be
EndsWith( const EndsWith& other ) : MatcherImpl<EndsWith, std::string>(), m_substr( other.m_substr ){}

Further I get these errors that I can not resolve myself:

../../../ext/catch/catch.hpp: In instantiation of 'Catch::ExpressionLhs<const double&>':
XXX.cpp:118: instantiated from here
../../../ext/catch/catch.hpp:942: error: forming reference to reference type 'const double&'
../../../ext/catch/catch.hpp: In instantiation of 'Catch::ExpressionLhs<const int&>':
YYY.cpp:142: instantiated from here
../../../ext/catch/catch.hpp:942: error: forming reference to reference type 'const int&'
../../../ext/catch/catch.hpp: In member function 'Catch::ExpressionLhs<const T&> Catch::ExpressionDecomposer::operator->(const T&) [with T = double]':
XXX.cpp:118: instantiated from here
../../../ext/catch/catch.hpp:1022: error: no matching function for call to 'Catch::ExpressionLhs<const double&>::ExpressionLhs(const double&)'
../../../ext/catch/catch.hpp:938: note: candidates are: Catch::ExpressionLhs<const double&>::ExpressionLhs(const Catch::ExpressionLhs<const double&>&)
../../../ext/catch/catch.hpp: In member function 'Catch::ExpressionLhs<const T&> Catch::ExpressionDecomposer::operator->(const T&) [with T = int]':
YYY.cpp:142: instantiated from here
../../../ext/catch/catch.hpp:1022: error: no matching function for call to 'Catch::ExpressionLhs<const int&>::ExpressionLhs(const int&)'
../../../ext/catch/catch.hpp:938: note: candidates are: Catch::ExpressionLhs<const int&>::ExpressionLhs(const Catch::ExpressionLhs<const int&>&)

These errors appear when doing
int return_call;
REQUIRE( return_call == 0);
(and the same for a double number).

wichert commented on May 21, 2024

@SebDyn I just reported that compile error as #136 as well.

philsquared commented on May 21, 2024

@wichert - re integration branch: Yes I'm checking all new stuff, even bug fixes, into integration now.
From there I'll either bunch them into periodic "releases" to Master - or for urgent fixes I'll merge them across sooner. Sorry I didn't make that clear.

On integration I have now added versioning support (which will migrate to Master shortly). The version info includes the branch name, so that should help clarify what I'm talking about in future.

The version info will also be written into the README file, so the current version on each branch should be immediately visible when you hit GitHub (that's only on my local clone at the moment).

@SebDyn - I've responded to @wichert on his #136 issue now - hopefully that will help.

philsquared commented on May 21, 2024

@SebDyn - actually I don't recognise those Matcher errors. Could you please open a new Issue for them and I'll take another look later. If you can include any sample code that would be perfect.

wichert commented on May 21, 2024

@philsquared I'm afraid it still fails with the same error:

Recording test results
ERROR: Publisher hudson.tasks.junit.JUnitResultArchiver aborted due to exception
java.lang.NullPointerException
    at hudson.tasks.junit.CaseResult.getPackageName(CaseResult.java:266)
    at hudson.tasks.junit.TestResult.tally(TestResult.java:529)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:118)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:90)
    at hudson.FilePath.act(FilePath.java:851)
    at hudson.FilePath.act(FilePath.java:824)
    at hudson.tasks.junit.JUnitParser.parse(JUnitParser.java:87)
    at hudson.tasks.junit.JUnitResultArchiver.parse(JUnitResultArchiver.java:122)
    at hudson.tasks.junit.JUnitResultArchiver.perform(JUnitResultArchiver.java:134)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:804)
    at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:779)
    at hudson.model.Build$BuildExecution.post2(Build.java:183)
    at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:726)
    at hudson.model.Run.execute(Run.java:1541)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:236)

SebDyn commented on May 21, 2024

In my local copy I modified the code to

                    XmlWriter::ScopedElement e = xml.scopedElement( "testsuite" );
                    xml.writeAttribute( "name", (it->m_name=="") ? "test_suite_name" : it->m_name );

as a quick hack to get it working... Setting a default value != "" for m_name makes no sense because it is overwritten to "" if the -t command line option does not appear. Hence all these checks should be done when writing the XML...

philsquared commented on May 21, 2024

Interesting. I had added:

virtual void StartGroup( const std::string& groupName ) {
            if( groupName.empty() )
                m_statsForSuites.push_back( Stats( "all tests" ) );
            else
                m_statsForSuites.push_back( Stats( groupName ) );
            m_currentStats = &m_statsForSuites.back();
        }
which seems to have the same effect for me. Not sure why it wouldn't be working for you.

SebDyn commented on May 21, 2024

There are 2 more bugs that I found when integrating CATCH into Jenkins:

(1) The testsuite name should be unique for each test executable. In our environment we have multiple executables generating a set of XML files. If each testsuite has the same name, Jenkins will see only the output of a single executable. We resolved this issue by running a 'sed' script that replaces the testsuite name with the name of the executable that produced it.
(2) CATCH stores the output in the <system-out> and <system-err> sections at the end of each file. Jenkins does not parse these blocks and, hence, the output cannot be shown. These sections must appear for each <testsuite> and <testcase> individually. For a brief description of the output format, please see the website http://stackoverflow.com/questions/4922867/junit-xml-format-specification-that-hudson-supports (second post by Lukas Eder). Personally, I'd prefer an output block separated for each testcase...

Best regards
Sebastian

philsquared commented on May 21, 2024

I've checked in a change that should address (1)
I'm not quite sure what you mean by (2) - could you elaborate?

SebDyn commented on May 21, 2024

Hello Phil,

thanks for fixing this. I will check this later on. Regarding point (2):

CATCH produces something like this:

<testsuites disabled="" errors="" failures="" name="" tests="" time="">
    <testsuite disabled="" errors="" failures="" hostname="" id=""
        name="" package="" skipped="" tests="" time="" timestamp="">
        <testcase assertions="" classname="" name="" status="" time="">
            <error message="" type=""/>
        </testcase>
    </testsuite>
    <system-out>
        all the output to std::cout
    </system-out>
    <system-err>
        all the output to std::cerr
    </system-err>
</testsuites>

The problem here is that I can not access the <system-out> and <system-err> blocks from Jenkins when they appear within <testsuites>. Even putting them into <testsuite> won't help me (although at least JUnit supports it, I did not find this in Jenkins, though) because I can not assign an output message to a specific test case.

What I need is something like this:

<testsuites disabled="" errors="" failures="" name="" tests="" time="">
    <testsuite disabled="" errors="" failures="" hostname="" id=""
        name="" package="" skipped="" tests="" time="" timestamp="">
        <testcase assertions="" classname="" name="" status="" time="">
            <error message="" type=""/>
            <system-out>
                all the output to std::cout
            </system-out>
            <system-err>
                all the output to std::cerr
            </system-err>
        </testcase>
    </testsuite>
</testsuites>

That means the position of the system-err and system-out blocks should be assigned to the test cases, not to the whole test suites.

best regards
Sebastian

philsquared commented on May 21, 2024

(I've escaped your inline elements too)

I think some of your previous comment was swallowed due to XML (non)escaping too. It makes a bit more sense now.
I'll change it to dump the stdout/stderr for each test case

wichert commented on May 21, 2024

@philsquared Would it be helpful for you if I setup my jenkins to run the tests from Catch itself, so you can see the errors or output?

philsquared commented on May 21, 2024

Yes please!

wichert commented on May 21, 2024

@philsquared Is there a makefile somewhere I can point jenkins at?

wichert commented on May 21, 2024

I have jenkins running the Catch selftests. Currently I am using a custom branch which adds a makefile for jenkins to use, which is pull request #141. You can see the jenkins output here: https://code.simplon.biz/jenkins/job/Catch/ .

philsquared commented on May 21, 2024

Thanks for doing that - that's really useful!

Looking at the results it seems to be basically working, no?
It is not breaking down the results by section (and looking at the reporter code that's simply because that has not been implemented!).
It also doesn't seem to be capturing all the contextual information it should be.

I'll look at both those things.
Is there anything else more fundamental I'm missing?

wichert commented on May 21, 2024

It does seem to be working. The breakdown of tests run is not quite there yet as you mention: you have to click through (root) and global before you see a list of sections. Contrast that to the Euphorie test results where the breakdown is based on filename, then class containing the tests and finally test method.

philsquared commented on May 21, 2024

Unfortunately I get a 404 when I try to follow that link

wichert commented on May 21, 2024

Sorry about that. I've adjusted the permissions so you should be able to access it now.

SebDyn commented on May 21, 2024

Hello Phil,
For me, the current implementation works perfectly! Thanks for the effort!
Best regards
Sebastian

SebDyn commented on May 21, 2024

May I add another point to my wish list again?

Test failures may lead to several results:

  • some test cases fail
  • the program unexpectedly terminates

The first case works rather well now. To catch the second case, we introduced a script that checks the exit code. The Main() routine of CATCH was wrapped such that a zero exit code is returned even if test cases failed. This is different if an uncaught exception or even a segmentation fault appears: in that case the exit code is scanned, and the script generates its own XML for JUnit/Jenkins that contains the error information on 'calling the executable'.

With the new strategy of capturing the std output, however, I can not track the output in case of a segfault since it is caught by CATCH. I do not know if it is possible to 'move' the std output if no segfault happens and to 'copy' the std output if a segfault happens. If this is not possible, maybe you can simply add an option to send output to both the XML and the console? It would also be helpful if the console output separated the std output of the individual test cases and sections somehow (eg. "Now starting test case 'xy/z'").

Best regards
Sebastian

SebDyn commented on May 21, 2024

Hi Phil!

Today I encountered a small bug in the JUnit reporter: on some occasions (eg. if CHECK_THROWS fails) a wrong XML is written, where the element name is empty, eg. < message=""> is written instead of <failure message="">. I've seen that in the switch-case block you simply do nothing for certain outcomes, so I changed it to:

                case ResultWas::Unknown:
                case ResultWas::FailureBit:
                case ResultWas::Exception:
                case ResultWas::DidntThrowException:
                    stats.m_element = "failure";
                    break;

Best regards
Sebastian

philsquared commented on May 21, 2024

Hmm... Sorry @SebDyn, I don't believe I saw your last two comments before (I suspect because of the way my mail client does threading).

For your segfault issue I intend a more comprehensive handling, as mentioned recently in: #160.

For the CHECK_THROWS issue I can confirm that DidntThrowException should have been handled (the others not, as they are flags that are just there to stop warnings).
I've added the equivalent of your fix (on integration) and will push those changes shortly.

wichert commented on May 21, 2024

I tested this again today with the current (v0.9 build 38) single include from the integration branch. It works, but the created structure is not very optimal:

  • the output has a single top level package called (root)
  • underneath the (root) package is a single class called global
  • underneath the global class all TEST_CASEs are listed as test names

This completely wastes two levels of test structure which would be very useful to use, and if you use the same test case names in multiple source files (for example TEST_CASE("Constructor")) it is impossible to tell them apart.

I would suggest to change the output as follows:

  • use the filename as the package name. So instead of a single (root) package you would have file1.cc, file2.cc. It would be incredibly nice to have a relative path from the top level build directory included, but I can imagine that is not possible.
  • move TEST_CASE to the second class level.
  • expose the top level SECTION inside a test case as the test name.

philsquared commented on May 21, 2024

@wichert thanks for the continuing feedback.

I recently (finally!) got Jenkins installed on my laptop (and I'm told it's now installed on the server where I run my CI builds (thanks Paul) - but I've not had a chance to set that up yet).
My first impression was, "wow it's much worse than I thought". Mostly in unfixable ways, due to limitations in the JUnit/Ant format itself. It's a real shame that it has become the de-facto standard - or at least the closest we have to one. Might have to see if we can change that.

For a while I was thinking it's so bad that the feedback is much nicer if you just run the console reporter and capture that!

But I will go back and improve things. As for your specific suggestions:

"use the filename as the package name"

  • sounds entirely sensible. ISTR thinking the same thing myself. Not sure about the path - will look into that.

"move TEST_CASE to the second class level"

  • this feels wrong, but in some ways it's appropriate. The "global" pseudo-class feels wrong too, so we're probably better off going for the more useful wrongness ;-) I'll look at that too.

"expose the top level SECTION inside a test case as the test name."

  • seems to follow on - although rather than the "top level SECTION" I'd concatenate the whole section path (much as I do in the console reporter now).

wichert commented on May 21, 2024

While junit is the most popular output format it is of course not the only one. Looking at the list of Jenkins plugins there are a bunch of alternatives: Cpptest, Gallio/MbUnit, JavaTest, JSUnit, NUnit and xUnit, which supports a whole bunch of formats. Perhaps one of those will be a better fit.

wichert commented on May 21, 2024

It looks like all those plugins just convert various formats to junit internally, so you are still bound to the three-level package/class/test structure.

philsquared commented on May 21, 2024

Therein, as they say, lies the rub.
Furthermore we're not just talking about Jenkins. Most CI servers support a range of formats but JUnit/Ant is usually the lowest-common-denominator.
So until someone comes up with a compelling alternative that they can encourage broad adoption of we're stuck with it.
It's not so much the three-tier structure that's the issue.
There's a general, "built for Java" air to it (which is natural, given its origins - it wasn't devised as a general purpose format). It's quite limited in what can be reported when and, perhaps most significantly, the failure counts are given as attributes in top level elements - which means you can't even start writing it until all tests have been run!

I wouldn't hold Catch's own XML format up as a shining paradigm of cross framework/ language support either (but maybe we should do something about that). But it has a few characteristics that are in the right direction already - especially that error counts are given in separate elements and at the end - so the report can be streamed as it is being written.
But for CI use the richness of information that is captured is more important. Having places to put file/ line info and assertion expansions are especially welcome.

SebDyn commented on May 21, 2024

Hello,

here are my 5 cents since I do not fully agree with Wichert's ideas

I do not recommend using the source file name for the class name. In my case I only have a single test case per source file…

In my environment I post-process the JUnit XML and replace the class name with the name of the test executable (without path). Each executable groups a set of semantically related test cases. So far I could live with "package name=(root)".

Here is my recommendation:
the "top level package" should be configurable (by command line). The same for the "class name". In this case I would use the following ordering:

"top level package" = name of executable
"class name" = either "test_cases" collecting the tests of CATCH
or "exit_code" collecting information on the exit status of the executable, additional info on crashes (e.g. X11's DISPLAY variable or segfaults), time-outs (e.g. maximum allowed time of 20 minutes exceeded), debugger output, etc.

I agree with having the SECTION definitions as part of the test names, e.g.:

TEST_CASE("name_1/name_2", "description") {
    CHECK(1 == 0);
    SECTION("section_1")
    {
        CHECK(1 == 0);
    }
}

could result in failed tests
(test_executable_name).test_cases.name_1/name_2
(test_executable_name).test_cases.name_1/name_2/SECTION:section_1

Then an important note on a bug (feature?) in Jenkins: even if multiple checks fail within a single test case, Jenkins displays only ONE of them! As a workaround, all failures should be concatenated into a single section.

It would be extremely useful to have the time taken by each test case available in the JUnit output. It would be nice if CATCH detected whether any Boost time headers were included and used the Boost time functions if available:

#include <boost/date_time/posix_time/posix_time.hpp> //  before including catch.hpp

in catch.hpp:

#ifdef POSIX_TIME_HPP__
boost::posix_time::ptime t1,t2;
t1 = boost::posix_time::microsec_clock::universal_time();
…
t2 = boost::posix_time::microsec_clock::universal_time();
std::cout << (t2-t1).total_seconds();
#endif 

Best regards
Sebastian

PS: I am sorry for not replying, Phil. I use an older revision of CATCH in a production environment and currently I do not want to upgrade.


from catch2.

wichert avatar wichert commented on May 21, 2024

While I see some of the points from @SebDyn I have to disagree with others.

I indeed also tend to have very few and often just one test case per source file, so using the test case name only and ignoring the file name would be fine for me.

My current codebases are not large enough to warrant multiple CATCH test runners (I do have others, but those are Python-based, for Python wrappers), so using the executable name did not occur to me. For large codebases I can see that it could be useful. I wonder what percentage of codebases is large enough to warrant that. For my situation it would be a shame to lose that hierarchy level.

from catch2.

philsquared avatar philsquared commented on May 21, 2024

The code base I'm currently working on by day has about 12 Catch test executables (plus a few NUnit executables).
It's not so much about the volume as the partitioning. Each of our executables tests a specific component in our system. Some are quite large. Some have just a handful of tests in.

I think process name is the logical fit for the package name. There's possibly scope for making that configurable, though. When I get back to looking at this again I'll take that into consideration.

Thanks for your additional comments @SebDyn - I'll ruminate on them more when I'm back in the right context to think about it.

from catch2.

m-mcgowan avatar m-mcgowan commented on May 21, 2024

Totally awesome library - so novel and unique and usable!

I just hit a little snag with the junit reporter causing UnitTH to crash - there was a missing name attribute in the testsuite element. The name comes from the testGroup which is set to empty in

context.testGroupStarting( "", 1, 1 ); // deprecated?

It would be great if we could group tests into some higher level grouping,e.g.

TEST_SUITE("Chugger") {
    TEST_CASE(...) {
    }
    TEST_CASE(...) {
    }
}

(I realize this is outside the scope of this issue, which is the JUnit reporter, but having TEST_SUITE defined would then provide a name for the <testsuite> elements in the JUnit report.) In my case, I'd probably organize at least each cpp module as a test group and, for large modules, break these down into subgroups - loosely mimicking the package structure that JUnit had.

from catch2.

wichert avatar wichert commented on May 21, 2024

The test grouping is also requested in #320.

from catch2.

m-mcgowan avatar m-mcgowan commented on May 21, 2024

Yes, by me - sorry for the cross post, but the two are related.

I'm finding tags a bit of a pain, and I often forget them. For example, I just made unit tests for a PRNG, in random.cpp. Each test then has, at a minimum, the tag [random]. It would be great to simply wrap the whole lot in a big group.

from catch2.

philsquared avatar philsquared commented on May 21, 2024

I've moved this part of the discussion back over to #320

from catch2.

Trass3r avatar Trass3r commented on May 21, 2024

Is there anything left for this one?

from catch2.

SebDyn avatar SebDyn commented on May 21, 2024

Is there anything left for this one?

I just tried the recent Catch 1.1 build 1 and frankly I need to reply with: no, it is - again - not really working.

Here is the reason:

  • Old CATCH (working):

    <testsuite errors="0" failures="0" tests="107" hostname="0" time="0.140043" timestamp="0">
    
  • Current Catch (not working):

    <testsuite name="all tests" errors="0" failures="0" tests="1074" hostname="0" time="19.5793" timestamp="0">
    

I use Jenkins 1.596.2 (stable) with JUnit reporter. The interesting thing is: Jenkins does not return with an error. It just does not display the test results when using the current CATCH output.

from catch2.

SebDyn avatar SebDyn commented on May 21, 2024

Ok, it seems that Jenkins has a bug somewhere. The "name" keyword is a mandatory field in the JUnit 6 standard. We are continuing to investigate this issue.

from catch2.

alt- avatar alt- commented on May 21, 2024

The fields with "tbd" really need to go or be fixed.

Jenkins' Junit plugin performs duplication checks on inputs by comparing the name, id and timestamp fields (https://github.com/jenkinsci/junit-plugin/blob/master/src/main/java/hudson/tasks/junit/TestResult.java#L240).

If the timestamp field were missing, the check would behave correctly (strictEq), but the dummy value "tbd" matches the check and Jenkins throws away all other test results.

from catch2.

horenmar avatar horenmar commented on May 21, 2024

Timestamp should now be filled with a proper UTC-based time at the time of writing the results, and most other issues raised in this thread have been addressed as well.

If there are some issues remaining, please open a new issue for them.

from catch2.
