Question

We have a few thousand native and .NET unit tests. In Visual Studio 2012, I can run and see the results, grouped by the C++/C# project.

I'd like to get something like this view, preferably grouped by solution (product) and then by project (.dll), in front of the business people. At the bare minimum I'd like to have the number of tests run and failed per solution.

Is there any proper way to do this with TFS?

I've looked everywhere and keep running into walls:

  • TFS build test results don't seem to store any information about the test categories, so I can't use those to group by solution

  • .vsmdi lists and .testsettings files have been phased out in VS 2012 and TFS 2012. We had separate lists for each solution before...now it's just *test*.dll

  • Test Plans and Custom SSRS reports seem to be completely useless for this much granularity of test results (why?). TfsTestWarehouse has barely anything - just enough for total tests passed/failed per build.

  • Parsing TRX files and writing HTML reports seems to work best using tools like trx2html, but I still can't run tests by solution.

Solution 2

I ended up solving this by adding the Project and Solution information to a custom assembly attribute (i.e., on the test .dll) at build time, through a custom MSBuild task. Here are roughly the steps I followed (from memory).

First, I created the custom Attribute:

[AttributeUsage(AttributeTargets.Assembly)]
public class ProjectAttribute : Attribute
{
    public string Project { get; set; }
    public string Solution { get; set; }

    public ProjectAttribute(string project, string solution)
    {
        this.Project = project;
        this.Solution = solution;
    }
}

This custom attribute was defined in an Assembly that was referenced by all unit test projects.

I then created a very simple, rudimentary inline MSBuild task, CreateProjectAttribCs, that would dynamically create an extra C# file containing a single line. Something like:

[assembly: ProjectAttribute("$(ProjectName)", "$(SolutionName)")]
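The original task isn't shown, but an inline task like this can be declared with MSBuild 4.0's CodeTaskFactory. The following is only a sketch of that approach (the parameter names and the task body are my assumptions, not the original code):

```xml
<UsingTask TaskName="CreateProjectAttribCs"
           TaskFactory="CodeTaskFactory"
           AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <File ParameterType="System.String" Required="true" />
    <Project ParameterType="System.String" Required="true" />
    <Solution ParameterType="System.String" Required="true" />
  </ParameterGroup>
  <Task>
    <Code Type="Fragment" Language="cs"><![CDATA[
      // Write a one-line C# file applying the assembly-level attribute,
      // using the project/solution names passed in from MSBuild properties.
      System.IO.File.WriteAllText(File,
          string.Format("[assembly: ProjectAttribute(\"{0}\", \"{1}\")]",
                        Project, Solution));
    ]]></Code>
  </Task>
</UsingTask>
```

The target that invokes the task would then pass the MSBuild properties along, e.g. `Project="$(ProjectName)" Solution="$(SolutionName)"`.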

I then added the generated file to the <Compile> item group, inside a custom MSBuild target that runs before Compile (again, just going from memory):

<Target Name="CreateProjectAttribCs" BeforeTargets="Compile">
    <CreateProjectAttribCs File="ProjectAttribute.cs" Project="$(ProjectName)" Solution="$(SolutionName)" />
    <ItemGroup>
        <Compile Include="ProjectAttribute.cs" />
    </ItemGroup>
</Target>
<Target Name="CleanupProjectAttribCs" AfterTargets="Compile">
    <Delete Files="ProjectAttribute.cs" />
</Target>

For C++ projects, I added the Project and Solution info to a String Table resource, "injected" in a similar way to the ProjectAttribute.cs file.

One major annoyance with all of this was that developers would have to add this custom MSBuild .targets file (which contained the custom targets and the assembly reference) by hand-editing the .csproj or .vcxproj.

To circumvent that a bit, I also created a custom Visual Studio Project Template for our team's unit tests so that everything was already added and my fellow devs would never have to see the innards of an MSBuild project.

The hardest part was adding the Project/Solution info in the first place. Once I had that, it was easy to read the custom attribute from the test assemblies (or the String Table resource from a native .dll), attach that info to the data parsed from the test results, and load everything into a custom test result database and report.
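Reading the attribute back from a managed test assembly is straightforward with reflection. A minimal sketch, assuming `ProjectAttribute` (as defined above) is resolvable and the assembly path is passed on the command line:

```csharp
using System;
using System.Reflection;

class ReadProjectInfo
{
    static void Main(string[] args)
    {
        // Load the test assembly whose path was passed as the first argument.
        Assembly testAssembly = Assembly.LoadFrom(args[0]);

        // Find the assembly-level ProjectAttribute applied at build time.
        var attr = (ProjectAttribute)Attribute.GetCustomAttribute(
            testAssembly, typeof(ProjectAttribute));

        if (attr != null)
            Console.WriteLine("{0} / {1}", attr.Solution, attr.Project);
        else
            Console.WriteLine("No ProjectAttribute found.");
    }
}
```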

OTHER TIPS

TRX files are just XML, so there's no need to write a parser from scratch. You can write an XSLT transformation to present the data in the format you need. A nice thing about XSLT is that it has built-in aggregation, grouping, and sorting capabilities.
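As a starting point, here is a minimal XSLT sketch that pulls the aggregate pass/fail counters out of a TRX file. The element and attribute names follow the VS 2010/2012 TRX namespace; verify them against your own files:

```xml
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:t="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <xsl:output method="html"/>
  <xsl:template match="/t:TestRun">
    <html><body>
      <!-- ResultSummary/Counters carries the run's aggregate totals. -->
      <p>
        Total: <xsl:value-of select="t:ResultSummary/t:Counters/@total"/>,
        Passed: <xsl:value-of select="t:ResultSummary/t:Counters/@passed"/>,
        Failed: <xsl:value-of select="t:ResultSummary/t:Counters/@failed"/>
      </p>
    </body></html>
  </xsl:template>
</xsl:stylesheet>
```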

Since the TRX files themselves likely do not contain solution information, you'll have to do two-stage report generation: first prepare the data, then generate the report.

The preparation would be a relatively simple command-line tool that goes over your .sln files and builds a map of which projects belong to which solutions (search the web; I bet there are already a bunch of scripts for that).
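The mapping step can be sketched as follows, assuming the standard .sln text format where each project entry looks like `Project("{GUID}") = "Name", "Path\Name.csproj", "{GUID}"`:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.RegularExpressions;

class SolutionMap
{
    // Matches lines like:
    // Project("{FAE04EC0-...}") = "MyTests", "Tests\MyTests.csproj", "{1234-...}"
    static readonly Regex ProjectLine =
        new Regex("^Project\\(\"\\{[^}]+\\}\"\\) = \"([^\"]+)\", \"([^\"]+)\"");

    // Returns a map of project name -> solution name for one .sln file.
    static Dictionary<string, string> MapProjectsToSolution(string slnPath)
    {
        var map = new Dictionary<string, string>();
        string solution = Path.GetFileNameWithoutExtension(slnPath);
        foreach (string line in File.ReadAllLines(slnPath))
        {
            Match m = ProjectLine.Match(line);
            if (m.Success)
                map[m.Groups[1].Value] = solution;
        }
        return map;
    }
}
```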

The generation part would then take that mapping as an argument to the transformation, so the report can properly aggregate the data by solution.

I know it's a bit of a generic response, but I hope it helps at least a bit.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow