
junit.testlogger's Introduction

JUnit Test Logger

JUnit xml report extension for Visual Studio Test Platform.


Packages

Logger   Stable Package   Pre-release Package
JUnit    NuGet            MyGet

If you're looking for NUnit, xUnit, or AppVeyor loggers, visit the following repositories:

Usage

The JUnit Test Logger generates XML reports in the Ant JUnit format, which the JUnit 5 repository refers to as the de facto standard. While the generated XML complies with that schema, it does not populate every value. For example, the logger currently does not log any properties. Please refer to a sample file to see an example. If you find that the format is missing data required by your CI/CD system, please open an issue or PR.

To use the logger, follow these steps:

  1. Add a reference to the JUnit Logger NuGet package in the test project

  2. Run the tests with the following command line

    > dotnet test --logger:junit
    
  3. Test results are generated in the TestResults directory relative to the test.csproj

A path for the report file can be specified as follows:

> dotnet test --logger:"junit;LogFilePath=test-result.xml"

test-result.xml will be generated in the same directory as test.csproj.

Note: the arguments to --logger should be in quotes, since ; is treated as a command delimiter in most shells.

All common options to the logger are documented in the wiki, e.g. token expansion for {assembly} or {framework} in the result file name. If you are writing multiple files to the same directory or testing multiple frameworks, these options can prevent test logs from overwriting each other.
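As a sketch of how the tokens compose with the file options (MyTests and the target frameworks here are hypothetical; see the wiki for the authoritative option list):

```
> dotnet test --logger:"junit;LogFilePath={assembly}.{framework}.test-result.xml"
```

For a project multi-targeting net6.0 and net7.0, this would be expected to yield one report per framework, e.g. MyTests.net6.0.test-result.xml and MyTests.net7.0.test-result.xml, instead of a single file overwritten by each run.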

Customizing Junit XML Contents

There are several options to customize how the junit xml is populated. These options exist to provide additional control over the xml file so that the logged test results can be optimized for different CI/CD systems.

Platform Specific Recommendations:

After the logger name, command line arguments are provided as key/value pairs with the following general format. Note the quotes are required and key names are case sensitive.

> dotnet test --test-adapter-path:. --logger:"junit;key1=value1;key2=value2"

MethodFormat

This option alters the testcase name attribute. By default, this contains only the method name. Class will prepend the class to the name. Full will prepend the assembly/namespace/class to the method.

We recommend this option for GitLab users.

Allowed Values
  • MethodFormat=Default
  • MethodFormat=Class
  • MethodFormat=Full
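
To illustrate the effect, consider a hypothetical test method SomeNamespace.TestClass.TestMethod; the name attribute would be expected to look like this (other attributes omitted):

```
<!-- MethodFormat=Default -->
<testcase classname="SomeNamespace.TestClass" name="TestMethod" />

<!-- MethodFormat=Class -->
<testcase classname="SomeNamespace.TestClass" name="TestClass.TestMethod" />

<!-- MethodFormat=Full -->
<testcase classname="SomeNamespace.TestClass" name="SomeNamespace.TestClass.TestMethod" />
```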

FailureBodyFormat

When set to Default, the body of a failure element contains only the exception, as captured by vstest. Verbose prepends the body with 'Expected X, Actual Y', similar to how it is displayed in the standard test output; normally 'Expected X, Actual Y' is contained only in the failure message. Additionally, Verbose includes standard output from the test in the failure message.

We recommend this option for GitLab and CircleCI users.

Allowed Values
  • FailureBodyFormat=Default
  • FailureBodyFormat=Verbose
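
A sketch of the difference for a hypothetical failed assertion (the message text and stack trace are illustrative, not real output):

```
<!-- FailureBodyFormat=Default -->
<failure type="failure" message="Expected: 1, Actual: 2">
   at MyTests.FailingTest() in /src/MyTests.cs:line 10
</failure>

<!-- FailureBodyFormat=Verbose -->
<failure type="failure" message="Expected: 1, Actual: 2">
Expected: 1, Actual: 2
Stack Trace:
   at MyTests.FailingTest() in /src/MyTests.cs:line 10
</failure>
```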

License

MIT

junit.testlogger's People

Contributors

codito, darrylmelander, gitfool, peter-darton-i2, siphonophora


junit.testlogger's Issues

Member data not included in output name

Hello.

I am currently seeing an issue similar to #42. I originally thought it was the same issue, but after updating to the latest version (3.0.110) I'm still seeing it. After reading that issue properly, I realised that my cause was slightly different.

Consider the following test in xunit (member data instead of static inputs):

public static IEnumerable<object[]> ValidationTests
{
    get
    {
        List<object[]> tests = new List<object[]>();
        tests.Add(new object[] { new ValidationTest() });
        tests.Add(new object[] { new ValidationTest() });
        return tests;
    }
}

[Theory]
[MemberData(nameof(ValidationTests))]
public async Task When_ValidOrInvalidDataIsProvided_Then_ValidationErrorsOccurAccordingly(ValidationTest test)
{
}

I'm getting the following output

    <testcase classname="My.Test.Class.Name" name="name.When_ValidOrInvalidDataIsProvided_Then_ValidationErrorsOccurAccordingly" time="0.0147381" />
    <testcase classname="My.Test.Class.Name" name="name.When_ValidOrInvalidDataIsProvided_Then_ValidationErrorsOccurAccordingly" time="0.0118888" />
    <testcase classname="My.Test.Class.Name" name="name.When_ValidOrInvalidDataIsProvided_Then_ValidationErrorsOccurAccordingly" time="0.0003662" />

I would expect, similar to #42, that each test name would include perhaps the ToString representation of ValidationTest.

Include stdout for tests that pass

When a test passes, there is no way to include the messages for that test in the output element. It would be helpful if there were a configuration option to enable verbose messaging on successful tests.

Token Expansion Doesn't Work When Running Tests at Solution Level

I'm running tests on an entire solution, as documented here:
https://devblogs.microsoft.com/dotnet/whats-new-in-our-code-coverage-tooling/

My command line looks like:

dotnet test --settings CodeCoverage.runsettings --collect "Code Coverage;Format=cobertura" --logger:"junit;LogFilePath=..\reports\unit-tests\;LogFileName={assembly}.test-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --results-directory ./TestResults

I have also tried

dotnet test --settings CodeCoverage.runsettings --collect "Code Coverage;Format=cobertura" --logger:"junit;LogFilePath=..\reports\unit-tests\{assembly}.test-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --results-directory ./TestResults

My CodeCoverage.runsettings file looks like:

<?xml version="1.0" encoding="utf-8"?>
<!-- File name extension must be .runsettings -->
<RunSettings>
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="Code Coverage" uri="datacollector://Microsoft/CodeCoverage/2.0" assemblyQualifiedName="Microsoft.VisualStudio.Coverage.DynamicCoverageDataCollector, Microsoft.VisualStudio.TraceCollector, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
        <Configuration>
          <Format>cobertura</Format>
          <IncludeTestAssembly>false</IncludeTestAssembly>
          <CodeCoverage>  
            <EnableStaticNativeInstrumentation>False</EnableStaticNativeInstrumentation>
            <EnableDynamicNativeInstrumentation>False</EnableDynamicNativeInstrumentation>
            <ModulePaths>
              <Include>
                <ModulePath>Precisely.Identity.*</ModulePath>
              </Include>
            </ModulePaths>
            <Attributes>
              <Exclude>
                <!-- Don't forget "Attribute" at the end of the name -->
                <Attribute>^System\.Diagnostics\.DebuggerHiddenAttribute$</Attribute>
                <Attribute>^System\.Diagnostics\.DebuggerNonUserCodeAttribute$</Attribute>
                <Attribute>^System\.CodeDom\.Compiler\.GeneratedCodeAttribute$</Attribute>
                <Attribute>^System\.Diagnostics\.CodeAnalysis\.ExcludeFromCodeCoverageAttribute$</Attribute>
              </Exclude>
            </Attributes>
            <Sources>
              <Exclude>
                <Source>.*\\*.g.cs</Source>
                <Source>\\_\\.*</Source>
              </Exclude>
            </Sources>
          </CodeCoverage>
        </Configuration>
      </DataCollector>
    </DataCollectors>
  </DataCollectionRunSettings>
</RunSettings>

I just get a single unit test output file


Test Logger Does not Respect LogFilePrefix

When I run:

dotnet test --logger:"junit;LogFilePrefix=Foo"

I expect to see a file output with the name: Foo_net7.0_20221109102716, instead what I get is TestResults.xml

AppVeyor build not running

Are we missing some config since I transferred this repo over? The Travis build ran right after I merged a PR, but AppVeyor hasn't done anything in the last couple of hours.

Release v3.0.114

A new release has been published on NuGet with version v3.0.114, but there has been no associated release in GitHub or release notes added anywhere. There is also no tag for this version added to the repository to indicate what commits are included in the release.

Without a proper release being performed, it is not possible to update to the latest version of the package confidently.

methodformat always includes namespace and class name regardless of setting

I'm using junit.testlogger 2.1.78 and dotnet 3.1.402.

I have a test class called LayerConverterTest in the directory test/MyProject.Test/Json. Its namespace is MyProject.Test.Json

If I run dotnet test --no-build --logger:"junit;LogFilePath=..\artifacts\{assembly}-test-result.xml;MethodFormat=Default;FailureBodyFormat=Verbose"

The resulting report wrongly includes the full namespace and class in the name attribute.

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="MyProject.Test.dll" tests="4" skipped="0" failures="0" errors="0" time="0.1136953" timestamp="2020-10-01T11:23:36" hostname="executor://xunit/VsTestRunner2/netcoreapp" id="0" package="MyProject.Test.dll">
    <properties />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.CanConvert_LayerSubclass_True(type: typeof(MyProject.Json.Layers.Image))" time="0.0070400" />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.CanConvert_LayerSubclass_True(type: typeof(MyProject.Json.Layers.PreComp))" time="0.0000645" />
    <testcase classname="MyProject.Test.Json.DataTest" name="MyProject.Test.Json.DataTest.Deserialise_ValidJson_PropsPopulated" time="0.0544729" />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.Read_ImageJson_ReturnImageObject" time="0.0521179" />
    <system-out>Junit Logger does not log standard output</system-out>
    <system-err>Junit Logger does not log error output</system-err>
  </testsuite>
</testsuites>

If I change the method format and run dotnet test --no-build --logger:"junit;LogFilePath=..\artifacts\{assembly}-test-result.xml;MethodFormat=Full;FailureBodyFormat=Verbose"

The resulting report includes the namespace and class name twice.

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="MyProject.Test.dll" tests="4" skipped="0" failures="0" errors="0" time="0.0837441" timestamp="2020-10-01T11:46:24" hostname="executor://xunit/VsTestRunner2/netcoreapp" id="0" package="MyProject.Test.dll">
    <properties />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.MyProject.Test.Json.LayerConverterTest.CanConvert_LayerSubclass_True(type: typeof(MyProject.Json.Layers.Image))" time="0.0107667" />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.MyProject.Test.Json.LayerConverterTest.CanConvert_LayerSubclass_True(type: typeof(MyProject.Json.Layers.PreComp))" time="0.0000384" />
    <testcase classname="MyProject.Test.Json.DataTest" name="MyProject.Test.Json.DataTest.MyProject.Test.Json.DataTest.Deserialise_ValidJson_PropsPopulated" time="0.0395664" />
    <testcase classname="MyProject.Test.Json.LayerConverterTest" name="MyProject.Test.Json.LayerConverterTest.MyProject.Test.Json.LayerConverterTest.Read_ImageJson_ReturnImageObject" time="0.0333726" />
    <system-out>Junit Logger does not log standard output</system-out>
    <system-err>Junit Logger does not log error output</system-err>
  </testsuite>
</testsuites>

Gitlab Attachments with .NET - XML missing attachment markup

I originally posted this on another issue and I will delete that comment shortly. The issue I commented on is: #40

We are a .NET shop that utilizes NUnit, and we are a GitLab customer. We use JUnitXml.TestLogger 3.0.134 configured per your recommendation and GitLab's own docs, and it works great... until we tried to add attachments. No matter what I do, I cannot get the necessary markup into the test results XML output. I thought it might simply be a matter of using an extensibility point to insert the markup myself via a globally accessible object or method in the JUnitXml.TestLogger library, but after a quick review of its source and the core logger, nothing jumped out at me.

Per other issues in this project and in the NUnit test logger (which is how I started pulling on this thread; support for attachments was discussed there, and I traced that support back to the core logger), here is what I have done:

  • Made sure the attachments are available from the CI root in the pipeline (I had originally been using temp files, but moved them under the same directory as the test results XML themselves).
  • We are using the following command to run the tests: dotnet test ./v${VERSION}.csproj --test-adapter-path:. --logger:"junit;LogFilePath=..\artifacts\sdktest\{assembly}-${VERSION}-test-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --configuration Release
  • We are adding attachments using TestContext.AddTestAttachment(filename, description), where filename is accessible (bullet 1).
  • Using NUnit 4.0.1 (.NET 8, but the same behavior was observed on .NET 7).
  • Attachments work as expected in VS using MSTest (attachments included in test results, clickable, contents viewable).
  • Artifacts are confirmed as stored in the stage, along with the test results XML.

I would think the easiest solution is to get the attachment detail into the resulting XML before it's written, either because it is handled by the JUnitXml or core logger, or because it was manually inserted. I would love to avoid doing it as a post-processing step on the files after they are written but before the stage completes and GitLab parses the results. The format that GitLab uses for XML attachments is: https://github.com/testmoapp/junitxml?tab=readme-ov-file#attachments-in-test-output

Does anyone have any pointers on how to achieve this?

Release v3.0.125

Apparently the same thing happened again as it did with #54

Maybe it would be beneficial to include a release checklist in the repo to mitigate this problem in the future?

Skipped tests are classified as passed

There is no special handling for Outcome.Skipped, so skipped tests appear as successful in the XML output.

They need a <skipped/> element so tools like Jenkins can recognise and classify the tests appropriately
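
The expected shape of a skipped testcase would be something like this (attribute values illustrative):

```
<testcase classname="MyTests.ExampleTests" name="SkippedTest" time="0">
  <skipped />
</testcase>
```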

if (result.Outcome == TestOutcome.Failed)
{
    var failureBodySB = new StringBuilder();
    if (this.FailureBodyFormatOption == FailureBodyFormat.Verbose)
    {
        failureBodySB.AppendLine(result.ErrorMessage);

        // Stack trace included to mimic the normal test output
        failureBodySB.AppendLine("Stack Trace:");
    }

    failureBodySB.AppendLine(result.ErrorStackTrace);

    var failureElement = new XElement("failure", failureBodySB.ToString());
    failureElement.SetAttributeValue("type", "failure"); // TODO are there failure types?
    failureElement.SetAttributeValue("message", result.ErrorMessage);
    testcaseElement.Add(failureElement);
}

Test Fixture Tests are not logged

Looks like this is the same root cause as spekt/nunit.testlogger#44

TestResultEventArgs.Result.TestCaseFullyQualifiedName produced by a test with TestFixtureData looks like this:

TestFixtureData
Namespace.Class("Arg1","Arg2).Method

TestFixtureData and TestCase data
Namespace.Class("Arg1","Arg2).Method("Arg1","Arg2)

TryParseName returns false in these cases, and the tests aren't logged.

Plan for Parallel test support

Thoughts from #22 and #23

  1. The logger isn't built to support parallel test runs.
  2. Instance members that store test output need to be re-thought, or at least locking needs to be added.
  3. Parallel runs will likely require additional tests against internal members that respond to test events.

Junit result xml files are not being generated for versions > 3.0.98

Dotnet Project

Docker base image used : mcr.microsoft.com/dotnet/sdk:6.0.405-alpine3.17

Command used:
dotnet test --test-adapter-path:. --collect "Code Coverage" /p:CollectCoverage=true /p:CoverletOutputFormat="opencover" /p:CoverletOutput=tests/TestResults /p:UseSourceLink=true --logger:"junit;LogFileName=junit-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --logger "trx;LogFileName=testresults.trx;Parser=Legacy" --results-directory valid/

Issue:
Junit result xml files are not being generated for versions > 3.0.98

But if we either remove the flag --collect "Code Coverage", or instead pass --collect "XPlat Code Coverage", then the junit XMLs are generated.

With --collect "Code Coverage", the junit result XMLs are not generated.

Attachments not attached

Forgive me if I've missed a configuration option for this, but I can't see any code to handle attachments, so I think this is accurate.

I'm using MSTest and making use of the AddResultFile method to attach files to my test cases. I'd like those files to be included in the junit test results - it appears that the closest thing to a 'standard' is the Jenkins plugin which is also supported by GitLab: https://plugins.jenkins.io/junit-attachments/.

Both GitLab and Jenkins support text written to stdout/stderr in a particular format: [[ATTACHMENT|/absolute/path/to/some/file]]. I'm not clear what an 'absolute path' means in the context of CI/CD, given that once files are uploaded to the CI/CD servers the real absolute path isn't meaningful; my guess is that they mean a path relative to the base path for uploading result files.
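
Following that convention, the marker would be expected to land in the report's output element, e.g. (path illustrative):

```
<system-out>
[[ATTACHMENT|/absolute/path/to/screenshot.png]]
</system-out>
```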

Change testsuite name property in JUnit xml

I'm using dotnet test with the junit logger option to run my tests. The tests are compiled into a binary before they are run and the name of the binary is used as the testsuite name in the JUnit XML file, example - E2ETests.dll.

<testsuites>
<testsuite name="E2ETests.dll" tests="5" skipped="1" failures="2" errors="0" time="76.0077336" timestamp="2024-04-11T16:31:15" hostname="hostname" id="0" package="E2ETests.dll">

I would like to change the name of the testsuite in the XML file without modifying the binary. Is there an existing method or plan to allow changing the testsuite name that can be provided to dotnet test --logger option?

Save the XML without BOM mark

Currently, the tool generates the XML as UTF-8 encoded with a BOM. Can we do the following?

  1. Encode without a BOM. You can pass new UTF8Encoding(false) to the XmlTextWriter constructor to achieve this. https://stackoverflow.com/questions/4942825/xdocument-saving-xml-to-file-without-bom?answertab=votes#tab-top

  2. Provide a flag to control this behaviour if this is a breaking change, so I can run this like:

dotnet test --test-adapter-path:. --logger:"junit;LogFilePath=test-result.xml;SkipUtf8Bom=true" test.csproj

GitLab docs / recommendations

Following #34, verify whether this is an issue for GitLab.

Tidy up the docs if needed, remove the test adapter CLI option which isn't needed, and update the example on GitLab.

test with vstest.console.exe

Hey Guys,

I'm trying to get this working with vstest.console.exe and some VC++ MSTest tests. I believe this is a delegate chosen by dotnet test.

I've tried playing with /TestAdapterPath and /TestAdapterLoadingStrategy, but I just can't seem to get your dotnet assemblies on the PATH sufficiently for this executable.

The best I've been able to get is

vstest.console /TestAdapterPath:C:\Users\geoff\Code\HelloStaticLib\JunitXml-TestLogger-3.0.124 /ResultsDirectory:vs-test-results /Logger:junit ./x64/Debug/HelloStaticLibTests.dll

to produce

Could not find a test logger with AssemblyQualifiedName, URI or FriendlyName 'junit'.

Is dotnet test simply entirely different from vstest.console.exe, and thus my usage simply isn't supported?

Wrong number of tests in GitLab

After I integrated the library, I noticed the following oddities:

I get only 2 tests displayed instead of 13.

If I use the default --logger "trx;LogFileName=testreport.trx" statement and convert the result with trx2junit (https://github.com/gfoidl/trx2junit), I get the correct result shown in GitLab.

junit.testlogger

gitlab pipeline


junit.testlogger result file

<?xml version="1.0" encoding="utf-8"?>
<testsuites name="foo.common.test.dll" tests="13" failures="0" time="0.098">
  <testsuite name="foo.common.test.dll" tests="13" skipped="0" failures="0" errors="0" time="0.098" timestamp="2019-12-03T 08:02:10Z" hostname="executor://xunit/VsTestRunner2/netcoreapp">
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.011" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.001" />
    <testcase classname="Haprotec.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Foo.Model.Workstation.WorkstationProcessTests.ConvertToAndFromJsonTest" file="D:\Repositories\gitlab.haprotec\customers\foo\foo-zls\src\foo.common.test\bin\Debug\netcoreapp2.2\foo.common.test.dll" time="0.076" />
  </testsuite>
</testsuites>

trx logger in combination with trx2junit (https://github.com/gfoidl/trx2junit)

gitlab pipeline


trx output file

<?xml version="1.0" encoding="utf-8"?>
<TestRun id="1df4ac78-d1d2-4772-a94d-a78f9467af02" name="fu-cisrv@VM-SRV-BUILD01 2019-12-03 09:31:10" runUser="HAPROTEC\fu-cisrv" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Times creation="2019-12-03T09:31:10.0478303+01:00" queuing="2019-12-03T09:31:10.0478306+01:00" start="2019-12-03T09:31:07.7056010+01:00" finish="2019-12-03T09:31:10.1095046+01:00" />
  <TestSettings name="default" id="2faaea12-30ce-407d-930f-0868268b16de">
    <Deployment runDeploymentRoot="fu-cisrv_VM-SRV-BUILD01_2019-12-03_09_31_10" />
  </TestSettings>
  <Results>
    <UnitTestResult executionId="f760fce6-8fd2-4e1e-90eb-48dc84ce7ea3" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program is empty string&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9875331+01:00" endTime="2019-12-03T09:31:09.9875333+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="f760fce6-8fd2-4e1e-90eb-48dc84ce7ea3" />
    <UnitTestResult executionId="4f76818f-f0fe-44ee-ad45-ae4e37a914c8" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertOccursMoreThanOnce, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;2 insert steps&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9232688+01:00" endTime="2019-12-03T09:31:09.9232691+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="4f76818f-f0fe-44ee-ad45-ae4e37a914c8" />
    <UnitTestResult executionId="2c92b5f0-c338-4902-afa3-0712cd8ab05d" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = Valid, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;valid 1&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0170000" startTime="2019-12-03T09:31:09.9152102+01:00" endTime="2019-12-03T09:31:09.9152201+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="2c92b5f0-c338-4902-afa3-0712cd8ab05d" />
    <UnitTestResult executionId="2cedac3b-2c09-4ce5-8357-7fa97e7ed9e7" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderOccursMoreThanOnce, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;2 ApplyDownholder steps&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9233742+01:00" endTime="2019-12-03T09:31:09.9233745+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="2cedac3b-2c09-4ce5-8357-7fa97e7ed9e7" />
    <UnitTestResult executionId="e4640910-8bdd-4bdd-af80-b913af980cf6" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program missing 1&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0230000" startTime="2019-12-03T09:31:09.9234467+01:00" endTime="2019-12-03T09:31:09.9234470+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="e4640910-8bdd-4bdd-af80-b913af980cf6" />
    <UnitTestResult executionId="9f45dfe1-0987-4eb9-91f2-a915b245dcab" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program missing 2&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9874781+01:00" endTime="2019-12-03T09:31:09.9874791+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="9f45dfe1-0987-4eb9-91f2-a915b245dcab" />
    <UnitTestResult executionId="51a48048-15c5-45fb-96ce-b5fbf893219a" testId="5ff569a7-252c-f194-6e94-793560722f95" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ConvertToAndFromJsonTest" computerName="VM-SRV-BUILD01" duration="00:00:00.0900000" startTime="2019-12-03T09:31:10.0030089+01:00" endTime="2019-12-03T09:31:10.0030102+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="51a48048-15c5-45fb-96ce-b5fbf893219a" />
    <UnitTestResult executionId="115f246c-ba4b-4617-8b4b-a4106d5a5eb7" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;ApplyDownholder missing&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9233393+01:00" endTime="2019-12-03T09:31:09.9233395+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="115f246c-ba4b-4617-8b4b-a4106d5a5eb7" />
    <UnitTestResult executionId="6f8d731a-8c04-43cb-b9fa-0586fb3409a2" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMustBeFirst, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;insert not first&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9233047+01:00" endTime="2019-12-03T09:31:09.9233050+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="6f8d731a-8c04-43cb-b9fa-0586fb3409a2" />
    <UnitTestResult executionId="82d92887-fc56-46f9-bc78-dece1a070903" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderMustBeLast, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;ApplyDownholder not last&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9234087+01:00" endTime="2019-12-03T09:31:09.9234089+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="82d92887-fc56-46f9-bc78-dece1a070903" />
    <UnitTestResult executionId="67705551-1e29-4b32-85eb-15daf020d0cb" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;missing insert&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9232303+01:00" endTime="2019-12-03T09:31:09.9232306+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="67705551-1e29-4b32-85eb-15daf020d0cb" />
    <UnitTestResult executionId="97768a77-c095-4f6f-9cb8-9115e9129ff2" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMissing, Process = WorkstationProcess { NumberOfStations = 1, Process = [...] }, TestName = &quot;empty object&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9231691+01:00" endTime="2019-12-03T09:31:09.9231698+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="97768a77-c095-4f6f-9cb8-9115e9129ff2" />
    <UnitTestResult executionId="3358e12f-9256-4700-b575-eaaedced0965" testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" testName="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = Valid, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;valid 2&quot; })" computerName="VM-SRV-BUILD01" duration="00:00:00.0010000" startTime="2019-12-03T09:31:09.9227473+01:00" endTime="2019-12-03T09:31:09.9227479+01:00" testType="13cdc9d9-ddb5-4fa4-a97d-d965ccfc6d4b" outcome="Passed" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" relativeResultsDirectory="3358e12f-9256-4700-b575-eaaedced0965" />
  </Results>
  <TestDefinitions>
    <UnitTest name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ConvertToAndFromJsonTest" storage="d:\gitlabrunnerservice\builds\fc40b836\0\customers\Foo\Foo-zls\src\Foo.common.test\bin\release\netcoreapp2.2\Foo.common.test.dll" id="5ff569a7-252c-f194-6e94-793560722f95">
      <Execution id="51a48048-15c5-45fb-96ce-b5fbf893219a" />
      <TestMethod codeBase="D:\GitlabRunnerService\builds\fc40b836\0\customers\Foo\Foo-zls\src\Foo.common.test\bin\Release\netcoreapp2.2\Foo.common.test.dll" adapterTypeName="executor://xunit/VsTestRunner2/netcoreapp" className="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ConvertToAndFromJsonTest" />
    </UnitTest>
    <UnitTest name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" storage="d:\gitlabrunnerservice\builds\fc40b836\0\customers\Foo\Foo-zls\src\Foo.common.test\bin\release\netcoreapp2.2\Foo.common.test.dll" id="e0fd19bc-4a34-f8f6-22c4-4053401d6283">
      <Execution id="2c92b5f0-c338-4902-afa3-0712cd8ab05d" />
      <TestMethod codeBase="D:\GitlabRunnerService\builds\fc40b836\0\customers\Foo\Foo-zls\src\Foo.common.test\bin\Release\netcoreapp2.2\Foo.common.test.dll" adapterTypeName="executor://xunit/VsTestRunner2/netcoreapp" className="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest" />
    </UnitTest>
  </TestDefinitions>
  <TestEntries>
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="f760fce6-8fd2-4e1e-90eb-48dc84ce7ea3" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="4f76818f-f0fe-44ee-ad45-ae4e37a914c8" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="2c92b5f0-c338-4902-afa3-0712cd8ab05d" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="2cedac3b-2c09-4ce5-8357-7fa97e7ed9e7" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="e4640910-8bdd-4bdd-af80-b913af980cf6" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="9f45dfe1-0987-4eb9-91f2-a915b245dcab" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="5ff569a7-252c-f194-6e94-793560722f95" executionId="51a48048-15c5-45fb-96ce-b5fbf893219a" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="115f246c-ba4b-4617-8b4b-a4106d5a5eb7" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="6f8d731a-8c04-43cb-b9fa-0586fb3409a2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="82d92887-fc56-46f9-bc78-dece1a070903" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="67705551-1e29-4b32-85eb-15daf020d0cb" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="97768a77-c095-4f6f-9cb8-9115e9129ff2" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestEntry testId="e0fd19bc-4a34-f8f6-22c4-4053401d6283" executionId="3358e12f-9256-4700-b575-eaaedced0965" testListId="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
  </TestEntries>
  <TestLists>
    <TestList name="Ergebnisse nicht in einer Liste" id="8c84fa94-04c1-424b-9868-57a2d4851a1d" />
    <TestList name="Alle geladenen Ergebnisse" id="19431567-8539-422a-85d7-44ee4e166bda" />
  </TestLists>
  <ResultSummary outcome="Completed">
    <Counters total="13" executed="13" passed="13" failed="0" error="0" timeout="0" aborted="0" inconclusive="0" passedButRunAborted="0" notRunnable="0" notExecuted="0" disconnected="0" warning="0" completed="0" inProgress="0" pending="0" />
    <Output>
      <StdOut>[xUnit.net 00:00:00.00] xUnit.net VSTest Adapter v2.4.1 (64-bit .NET Core 4.6.28008.02)&#xD;
[xUnit.net 00:00:00.85]   Discovering: Foo.common.test&#xD;
[xUnit.net 00:00:00.94]   Discovered:  Foo.common.test&#xD;
[xUnit.net 00:00:00.94]   Starting:    Foo.common.test&#xD;
[xUnit.net 00:00:01.29]   Finished:    Foo.common.test&#xD;
</StdOut>
    </Output>
  </ResultSummary>
</TestRun>

Converted XML file from trx2junit:

<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" hostname="VM-SRV-BUILD01" package=".NET Core" id="0" tests="13" failures="0" errors="0" skipped="0" time="0.140" timestamp="2019-12-03T09:31:10">
    <properties />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ConvertToAndFromJsonTest" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.090" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program is empty string&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertOccursMoreThanOnce, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;2 insert steps&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = Valid, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;valid 1&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.017" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderOccursMoreThanOnce, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;2 ApplyDownholder steps&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program missing 1&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.023" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = RequiredProgramMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;required program missing 2&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;ApplyDownholder missing&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMustBeFirst, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;insert not first&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = DownholderMustBeLast, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;ApplyDownholder not last&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMissing, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;missing insert&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = InsertMissing, Process = WorkstationProcess { NumberOfStations = 1, Process = [...] }, TestName = &quot;empty object&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <testcase name="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests.ValidateTest(testData: ValidationTestData { ExpectedResult = Valid, Process = WorkstationProcess { NumberOfStations = 2, Process = [...] }, TestName = &quot;valid 2&quot; })" classname="Haprotec.Customer.Foo.Model.Workstation.WorkstationProcessTests" time="0.001" />
    <system-out />
    <system-err />
  </testsuite>
</testsuites>

More Than One Test Project Per Solution

My team has several solutions with multiple test projects each. It seems that JUnitTestLogger only outputs the results for the last test project run, not for all of them. This means that many of our tests are not reported, which skews our test results charts.

Length cannot be less than zero

I am getting an exception when using this logger.
My tests are written in xUnit, my command looks like:

dotnet test tests/UnitTests --logger:"junit;LogFilePath=..\UnitTests.xml;MethodFormat=Class;FailureBodyFormat=Verbose" --test-adapter-path:.

I am getting the following error:

JunitXML Logger: Threw an unhandeled exception.
Length cannot be less than zero.
Parameter name: length
System.Private.CoreLib


Is there a way to get more information that might actually be helpful?

Version: 2.1.10

No test result file when outputting an escape sequence

While investigating why my test result was not created for my projects, I realized that it was due to some tests that were outputting escape sequences.

To reproduce the issue, you can just use this test:

[Test]
public void Test()
{
    Console.WriteLine("test\0");
}

Unable to parse test names when used with GoogleTestAdapter

I'm using googletest in my C++ project, where parameterized tests have names of the form
${TestCaseName}/${TestClassName}.${TestMethodName}/${TestNumber} [${TestParameters}...]
junit.testlogger apparently fails to parse these kinds of test names:

Test Logger: Unable to parse one or more test names provided by your test runner. These will be logged using Namespace='UnknownNamespace', Type='UnknownType'. The entire test name will be used as the Method name. Please open a ticket so we can review this issue

The expected outcome is that the test method name is everything after the dot.
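The splitting described above can be sketched as follows. This is an illustrative parser, not the logger's actual implementation; the name format and the fallback values ("UnknownType") mirror the warning message quoted earlier, but the helper name and regex are assumptions.

```python
import re

def split_gtest_name(full_name: str):
    """Split a GoogleTestAdapter-style name into (type, method).

    Assumes the form TestCaseName/TestClassName.TestMethodName/TestNumber
    followed by an optional " [params...]" suffix; the method is taken as
    everything after the last '.', per the expected outcome above.
    """
    # Drop a trailing " [params...]" block if present.
    name = re.sub(r"\s*\[.*\]$", "", full_name)
    type_part, sep, method = name.rpartition(".")
    if not sep:
        # No dot at all: fall back the way the logger's warning describes.
        return ("UnknownType", name)
    return (type_part, method)

print(split_gtest_name("Suite/MyClass.MyMethod/0  [param1]"))
# -> ('Suite/MyClass', 'MyMethod/0')
```

Note that the type part still contains the leading `TestCaseName/` prefix; a real fix would also have to decide where that prefix belongs.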

Warning about configuration 'TargetFramework'

Every time I run dotnet test --logger:"junit" I get this warning in the output:

Starting test execution, please wait...
JunitXML Logger: The provided configuration item 'TargetFramework' is not valid and will be ignored. Note, names are case sensitive.

using dotnet 3.1.402

Syncromatics JUnitTestLogger

@codito Not sure if you have seen https://github.com/syncromatics/JUnitTestLogger; it's an older port of the xunit logger. I had not seen it before about a week ago, when the gitlab team asked me about the differences between that logger and this one.

I noticed that #12 was opened there as well, and that they have several issues open which have been resolved in this logger. It doesn't look to be under development, so I'm wondering if we should reach out to them about the best option going forward.

Gitlab CI number of tests is wrong

Version="2.1.81"

Edit: I updated to the latest version and have the same issue.

Tests:
  image: mcr.microsoft.com/dotnet/core/sdk:3.1
  only:
    - merge_requests
    - branches
  script:
    - dotnet test --test-adapter-path:. --logger:"junit;LogFilePath=../artifacts/{assembly}-result.xml;MethodFormat=Class;FailureBodyFormat=Verbose"
  stage: build
  artifacts:
    when: always
    reports:
      junit:
        - $CI_PROJECT_DIR/artifacts/*-result.xml

This is my test step for GitLab CI, and I have tests set up with MSTest that use parameters.

        [DataTestMethod]
        [DataRow(true)]
        [DataRow(false)]
        public async Task Some_Test_Method(bool result)
        {
            ...
        }

The results, however, don't show the parameters that are passed in, so they get combined into one test by Gitlab.

    <testcase classname="My.Test.Class.Name" name="name.Some_Test_Method" time="0.0147381" />
    <testcase classname="My.Test.Class.Name" name="name.Some_Test_Method" time="0.0118888" />
    <testcase classname="My.Test.Class.Name" name="name.Some_Test_Method" time="0.0003662" />

Expected Result: The name of the testcase will include the parameter values like the GitLab CI Documentation states.

Do I have something misconfigured?

Run tests from VS TestExplorer

Hello!

I'm using your library via a .runsettings file in VS 2022 to specify the Logger.
If I run the Unit Tests by using command "dotnet vstest" and specifying the settings file everything runs perfectly, but if I use TestExplorer to run the same tests (after setting the same runsettings file from menu "Test->Configure Run Settings") I have some problems with the report, e.g.: the timestamp is always "0001-01-01T00:00:00"; the {assembly} attribute for the log file name does not work; etc.

Thank You

Capture StdOut from Console.WriteLine calls

Context

At work we've recently switched a Selenium project from .NET Framework to .NET Core and are using dotnet test to execute the tests. Since we're using GitLab CI we needed a junit report, so we used this package's logger, and in theory it works great.

The only problem, and the reason we've "had to" switch to a combination of the trx logger plus trx2junit, is that we use Console.WriteLine(...) quite a lot to print information during the tests, and sadly this logger does not seem to capture that output.

Request

It would be great to have the values sent to Console.WriteLine(...)

  • be shown in the stdout from the dotnet test --logger junit-call (that way the logging would show up in the job's log) OR / AND
  • be added to the report XML, so we can just download the report-artifact if we need to look at the logging

As an aside: thanks a lot for this project, and (I think) for contributing to the gitlab documentation!

junitxml.testlogger is missing NuGet package README file

We've noticed that your package on NuGet.org is missing a README file.

Why READMEs are Important

Our customer research indicates that one of the top problems that package consumers face is insufficient package documentation such as README files. Adding a README file to your package will help users quickly understand what the package is and what it does. Since your README will be the first impression for users when they view your package on NuGet.org, it is crucial for authors to write and include high-quality READMEs for their packages.

Get Started with READMEs

How to Add a README

If you're new to NuGet READMEs, follow guidance on how to add one to your package.

How to write high-quality READMEs

Follow a blog post with README best practices and a template file to help you get started.

We Value Your Feedback

We value your feedback. If you encounter any issues or have suggestions, please reply to this issue.

Thank you for your contribution to the NuGet community.

Indicate framework in produced xml file

I have an xUnit test project with multiple TargetFrameworks (netcoreapp3.1 and net471), and I'm generating multiple test result files using LogFilePath={assembly}-{framework}.xml.

However, when importing these files using Jenkins JUnit plugin, the framework information is lost and we're unable to differentiate between test execution frameworks.

Do you know of any workaround? Maybe an option that makes it possible to specify/modify the testsuite name in the produced xml?
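One possible workaround for the question above, until an option exists, is to post-process the generated file and append the framework to each testsuite name. This is a hypothetical sketch; the function name, report snippet, and framework string are examples, not options of the logger itself.

```python
import xml.etree.ElementTree as ET

def tag_testsuites_with_framework(xml_text: str, framework: str) -> str:
    """Append the target framework to every <testsuite> name attribute.

    Run once per result file (e.g. per {assembly}-{framework}.xml) so a
    consumer like the Jenkins JUnit plugin can tell the runs apart.
    """
    root = ET.fromstring(xml_text)
    for suite in root.iter("testsuite"):
        suite.set("name", f"{suite.get('name')} ({framework})")
    return ET.tostring(root, encoding="unicode")

report = '<testsuites><testsuite name="MyTests" tests="1" /></testsuites>'
print(tag_testsuites_with_framework(report, "netcoreapp3.1"))
```

A small script like this could run as an extra CI step between `dotnet test` and the report upload.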

NUnit Description option for Junit test reports

Hiii, I have the following question:
Is it possible to have the NUnit descriptions available in a Junit test report as name?
This would be cool to have as an extra option in the CLI.

Example of a description: (screenshot omitted)

Illegal xml characters in data to be logged block log creation

Hello, I'm trying to use the junit test logger for building and uploading results from inside a dotnet/sdk:6.0 container.

When I run test command

dotnet test --logger:"junit;LogFilePath=test-result.xml"

Tests are run but no test-result.xml file is created.

It works fine if run from Windows.

When I run it with -d diag.log and inspect the log file, I notice a strange exception related to the junit logger:

TpTrace Error: 0 : 2176, 4, 2022/11/01, 01:40:56.788, 10717012278100, vstest.console.dll, MulticastDelegateUtilities.SafeInvoke: 1: Invoking callback 2/Spekt.TestLogger.Core.TestRunBuilder for <Subscribe>b__5_3.InternalTestLoggerEvents.SendTestRunComplete, failed after 10 ms with: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation.
 ---> System.ArgumentException: '\u001B', hexadecimal value 0x1B, is an invalid character.
   at System.Xml.XmlEncodedRawTextWriter.WriteElementTextBlock(Char* pSrc, Char* pSrcEnd)
   at System.Xml.XmlEncodedRawTextWriter.WriteString(String text)
   at System.Xml.XmlEncodedRawTextWriterIndent.WriteString(String text)
   at System.Xml.XmlWellFormedWriter.WriteString(String text)
   at System.Xml.Linq.ElementWriter.WriteElement(XElement e)
   at System.Xml.Linq.XElement.WriteTo(XmlWriter writer)
   at System.Xml.Linq.XContainer.WriteContentTo(XmlWriter writer)
   at System.Xml.Linq.XNode.GetXmlString(SaveOptions o)
   at System.Xml.Linq.XNode.ToString()
   at Microsoft.VisualStudio.TestPlatform.Extension.Junit.Xml.TestLogger.JunitXmlSerializer.Serialize(LoggerConfiguration loggerConfiguration, TestRunConfiguration runConfiguration, List`1 results, List`1 messages)
   at Spekt.TestLogger.Core.TestRunCompleteWorkflow.Complete(ITestRun testRun, TestRunCompleteEventArgs completeEvent) in /home/runner/work/testlogger/testlogger/src/TestLogger/Core/TestRunCompleteWorkflow.cs:line 38
   at Spekt.TestLogger.Core.TestRunBuilder.<Subscribe>b__5_3(Object _, TestRunCompleteEventArgs eventArgs) in /home/runner/work/testlogger/testlogger/src/TestLogger/Core/TestRunBuilder.cs:line 55
   --- End of inner exception stack trace ---
   at System.RuntimeMethodHandle.InvokeMethod(Object target, Span`1& arguments, Signature sig, Boolean constructor, Boolean wrapExceptions)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
   at System.Delegate.DynamicInvokeImpl(Object[] args)
   at System.Delegate.DynamicInvoke(Object[] args)
   at Microsoft.VisualStudio.TestPlatform.Utilities.MulticastDelegateUtilities.SafeInvoke(Delegate delegates, Object sender, Object args, String traceDisplayName).
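The crash above happens because XML 1.0 forbids most control characters: only #x9, #xA, #xD, and code points from #x20 upward (excluding surrogates and #xFFFE/#xFFFF) are legal, so an ESC (0x1B) or NUL (0x00) written by a test reaches the serializer and throws. The kind of sanitization that avoids this can be sketched as follows; this is an illustrative filter, not the logger's actual fix.

```python
def strip_invalid_xml_chars(text: str) -> str:
    """Remove characters that are not valid in an XML 1.0 document.

    Keeps tab, newline, carriage return, and the legal Unicode ranges;
    drops control characters such as ESC (0x1B) and NUL (0x00).
    """
    def valid(ch: str) -> bool:
        cp = ord(ch)
        return (cp in (0x9, 0xA, 0xD)
                or 0x20 <= cp <= 0xD7FF
                or 0xE000 <= cp <= 0xFFFD
                or 0x10000 <= cp <= 0x10FFFF)
    return "".join(ch for ch in text if valid(ch))

# An ANSI escape sequence and a NUL byte, as a test might emit them:
print(strip_invalid_xml_chars("progress\x1b[0K done\0"))
# -> "progress[0K done"
```

Applying a filter like this to captured output before serialization lets the report file be written even when tests emit terminal escape sequences.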

My csproj packages

  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Mvc.Testing" Version="6.0.8" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.InMemory" Version="6.0.8" />
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.1.0" />
    <PackageReference Include="MSTest.TestAdapter" Version="2.2.8" />
    <PackageReference Include="MSTest.TestFramework" Version="2.2.8" />
    <PackageReference Include="coverlet.collector" Version="3.1.2" />
    <PackageReference Include="JunitXml.TestLogger" Version="3.0.114" />
  </ItemGroup>
