Change the Bug Status report to reflect additional states to the Bug

by Ewald Hofman 16. July 2010 13:42

The bug work item has 3 states (Agile: Active, Resolved, Closed) or 4 states (CMMI: Proposed, Active, Resolved, Closed). Those states are also used in the reporting. When you use additional states, you will see that their colors do not match the scheme of the default states, and the states might not appear in the correct order. The last requirement I have in this case (because this bug work item has 10+ states) is to aggregate some states into categories, so fewer states are shown.

This post shows you how you can achieve these requirements.


In order to support the new states in the report, you have to change the rdl.

  1. First find out how you can start up a new Visual Studio project to modify the report
  2. Now download the Bug Status report (you can find instructions on how to do that in Reflect the rename of a state for the Bug Work Item)
  3. And add the Bug Status report to the project
  4. Open the report by double-clicking it. You will see something similar to this

  5. First we are going to solve the coloring
    1. Now right click on the orange / blue area to open the context menu for the data series

    2. Choose Series Properties
    3. And go to the Fill section

    4. Now click on the fx button to open the expression for the color

    5. You see the list of states and their colors. The color is specified as (Transparency, Red, Green, Blue). A transparency of 80 (a hex value, 128 decimal) means about 50%. Play around with the colors to get the result you want.
  6. Now we also need to change the sorting of the states
    1. To do that, click on the data series again and scroll all the way to the right

    2. You will see a window with the Chart Data.
    3. Click on the triangle in the Series Groups

    4. And choose the “Series Group Properties…”
    5. Go to the Sorting section

    6. Click on the fx button to change the sort by expression

    7. Again you can modify this expression to get the correct sorting. The !1, !2 and !3 are just textual values that ensure these states sort alphabetically before Active, Assigned and so on.
  7. The third modification you might want to make is to aggregate a few states. If you want to do that, change the value for the Series Groups from System_State to an expression.
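The three expressions above can be sketched as follows. These are illustrations only: the state names, colors and category names are examples, and the actual field and parameter names in your report may differ.

A fill-color expression (step 5.5), with each color written as transparency-red-green-blue in hex:

```
=Switch(
    Fields!System_State.Value = "Active",   "#80D59648",
    Fields!System_State.Value = "Assigned", "#803E7EB8",
    True,                                   "#80808080")
```

A sort-by expression (step 6.7), using the !1, !2 prefixes to force the order:

```
=Switch(
    Fields!System_State.Value = "Proposed", "!1",
    Fields!System_State.Value = "Active",   "!2",
    Fields!System_State.Value = "Assigned", "!3",
    True,                                   Fields!System_State.Value)
```

A series group expression (step 7) that aggregates states into categories:

```
=Switch(
    Fields!System_State.Value = "Proposed" Or Fields!System_State.Value = "Investigating", "Open",
    Fields!System_State.Value = "Active"   Or Fields!System_State.Value = "Assigned",      "In Progress",
    True,                                                                                  "Done")
```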


Reporting | VSTS 2010

Reflect the rename of a state for the Bug Work Item

by Ewald Hofman 16. July 2010 13:20

The bug work item has 3 states (Agile: Active, Resolved, Closed) or 4 states (CMMI: Proposed, Active, Resolved, Closed). Those states are also used in the reporting. When you rename a state, the change is not automatically picked up by the reporting. Luckily, Microsoft has foreseen that not everybody uses the same names for states and introduced a parameter.

To reflect the new state name in the report (in this post, I assume that you have renamed the state Active to Assigned), you have to change the value of the parameter. You can do that by following these steps (I use SQL Server 2008 R2):

  1. Go to the report site for the team project (you can find that by right clicking the Reports node in the team explorer)

  2. It will lead you to the following site

  3. Click on the Bugs. It will open all the reports that are in the Bugs folder
  4. Now hover over the Bug Status report and click on the triangle that pops up. This will open a menu of actions you can perform on the report

  5. Choose the Manage action, which leads to the following page

  6. Click on the Parameters section (on the left side). You will see the parameters that are specified in the report, including their default values

  7. In the list is also the parameter ActiveName, which defines the name of the state Active.
  8. Because we changed the Active state to Assigned, you have to change the Default Value to “Assigned”.
  9. Now scroll down until you see the Apply button at the bottom (don’t forget that one!)
  10. The report will now pick up the new state. However, there is a little bug in the report: there is one location in which the “Active” state is still hard-coded.
  11. Go back to the action menu that you saw in step 4 and now choose the Download action.
  12. Download it to your local hard drive.
  13. Open the rdl file in notepad.
  14. Replace all occurrences of “Active” (including quotes) with the value Parameters!ActiveName.Value
  15. Save the changes and go to the page you saw in step 5
  16. Now choose the Replace action and choose the local file

  17. And press OK.
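Schematically, the find-and-replace in step 14 turns an expression fragment like the first line below into the second. The exact expressions in your rdl may look different; the point is that the quoted literal is replaced by the parameter reference:

```
... Fields!System_State.Value = "Active" ...
... Fields!System_State.Value = Parameters!ActiveName.Value ...
```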


Reporting | VSTS 2010

The “Active bugs by Assignment” and “Resolved Bugs by Assignment” charts are not visible in the Bug Status report

by Ewald Hofman 16. July 2010 12:54

I have a TFS 2010 demo box on Windows Server 2008 R2 and SQL Server 2008.

While I was busy in a team project based on the default Agile or CMMI process template, I saw in the Bug Status report something strange. Although I had bugs assigned (of course it was a demo environment, because I never deliver bugs in real life :) ), the chart that should have been shown was not drawn.


So I went into the data warehouse to confirm that the data was correct and I even added a column to the tablix as a double check


And it confirmed what I already thought: the data is correct. So why is the report not showing?

First I updated my SQL Server to the latest version (SQL Server 2008 R2), but that did not make any difference.

Another few hours of trying and modifying did not solve it; I couldn’t get a grip on it. Until I got help from John Socha-Leialoha, who was able to solve the issue. What he did is the following:

  1. Download the report from the Reporting site
  2. Open the report in BIDS (the screenshots in this example are from BIDS for SQL Server 2008 R2, so they might be slightly different if you use another version)
  3. Select the chart by clicking on it
  4. Right click on the chart to open the context menu

  5. Choose “Change Chart Type” from the context menu
  6. You will see that the stacked bar is chosen

  7. Change the chart type to the Bar chart type

  8. And press OK
  9. Now change the chart type back to “Stacked bar”
  10. Follow the same steps for the Resolved Bugs chart.
  11. And upload the report to the reporting site again (be aware that when you look at the report on the Report site, it may be using a cached copy; just use other parameter values to see the new layout)

And here is the new report that shows the correct result.



Reporting | VSTS 2010

Refresh the warehouse automatically

by Ewald Hofman 7. July 2010 13:17

When you are modifying the reports in Team Foundation Server, you want to add new data (such as work items) to test the reports. I have written a blog post on how you can do that manually, but I wanted a little app to save time.

Attached are the source code and the exe (be sure to download the config too when you use the exe) with which you can achieve this.

You can use the application as follows: RefreshWarehouse TfsServerName Port Collection
For example: "RefreshWarehouse localhost 8080 defaultcollection" refreshes the warehouse on the local TFS server.

RefreshWarehouse.exe (31.50 kb)

RefreshWarehouse.exe.config (1.66 kb)


Reporting | VSTS 2010

TFS 2010 did not update the Warehouse and Cube

by Ewald Hofman 27. April 2010 14:30

After the upgrade from TFS 2008 to TFS 2010, the reporting was not working anymore. I got the following error message in the event log and from the GetProcessingStatus web service (you can find it by opening http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx on your application tier):

[Full Analysis Database Sync]: ---> Microsoft.TeamFoundation.Server.WarehouseException: TF221122: An error occurred running job Full Analysis Database Sync for team project collection or Team Foundation server TEAM FOUNDATION. ---> Microsoft.TeamFoundation.Framework.Server.AnalysisServiceConnectionException: Error encountered when creating connection to Analysis Services. Contact your Team Foundation Server administrator. ---> Microsoft.AnalysisServices.ConnectionException: A connection cannot be made. Ensure that the server is running. ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags) at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) --- End of inner exception stack trace --- at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) at System.IO.BufferedStream.Read(Byte[] array, Int32 offset, Int32 count) at Microsoft.AnalysisServices.DimeRecord.ForceRead(Stream stream, Byte[] buffer, Int32 length) at Microsoft.AnalysisServices.DimeRecord.ReadHeader() at Microsoft.AnalysisServices.DimeRecord..ctor(Stream stream) at Microsoft.AnalysisServices.DimeReader.ReadRecord() at Microsoft.AnalysisServices.TcpStream.GetResponseDataType() --- End of inner exception stack trace --- at Microsoft.AnalysisServices.XmlaClient.Connect(ConnectionInfo connectionInfo, Boolean beginSession) at Microsoft.AnalysisServices.Server.Connect(String connectionString, String sessionId) at Microsoft.AnalysisServices.Server.Connect(String connectionString) at Microsoft.TeamFoundation.Warehouse.AnalysisServicesUtil.Connect(Server server, String serverName) --- End of inner exception stack trace --- at 
Microsoft.TeamFoundation.Warehouse.AnalysisServicesUtil.Connect(Server server, String serverName) at Microsoft.TeamFoundation.Warehouse.AnalysisDatabaseSyncJobExtension.GetLastProcessedTimeStamp(String serverName, String dbName) at Microsoft.TeamFoundation.Warehouse.AnalysisDatabaseSyncJobExtension.DetermineProcessType(ControllerConfig controllerConfig, AnalysisDatabaseProcessingType desiredProcessingType, Boolean& lastProcessingFailed, Boolean& needCubeSchemaUpdate) at Microsoft.TeamFoundation.Warehouse.AnalysisDatabaseSyncJobExtension.RunInternal(TeamFoundationRequestContext requestContext, TeamFoundationJobDefinition jobDefinition, DateTime queueTime, String& resultMessage) at Microsoft.TeamFoundation.Warehouse.WarehouseJobExtension.Run(TeamFoundationRequestContext requestContext, TeamFoundationJobDefinition jobDefinition, DateTime queueTime, String& resultMessage) --- End of inner exception stack trace ---

I don’t know the root cause yet, but changing the service account for TFS from NETWORK SERVICE to a real account solved the issue. You can change the account in the TFS Admin Console, in the Application Tier node, which has an action to change the service account.


VSTS 2010 | Reporting

Disable the requirement on the secure connection for Reporting Services

by Ewald Hofman 22. April 2010 05:22

When I installed a brand new TFS 2010 instance, I encountered the following error message

   [ Reporting ] TF255455: SQL Server Reporting Services is configured to require a secure connection. However, no HTTPS URL is configured with a valid certificate. Use the Reporting Services Configuration Manager to configure or remove HTTPS support. For more information see

The link that was provided did not work for me (maybe it has already been fixed), but I found the solution on the MSDN Forums, provided by Ed Holloway:

This issue is happening due to the way SQL RS enables (and enforces) HTTPS on a machine.  If the SQL installer finds any certificates on the machine with the purpose set to * (anything) or server authentication it will set the SQL RS SecureConnectionLevel to 2, which requires HTTPS.  (This can happen in a few other scenarios too, but this is the common one)  So even though you may not have intended for RS to use SSL in this case, it is trying to enforce it.

These directions will disable SSL for SQL RS.

To work around this, open the following file:

C:\Program Files\Microsoft SQL Server\MSRS10.<ServerInstance>\Reporting Services\ReportServer\rsreportserver.config

and change the SecureConnectionLevel key from its current value to 0:

<Add Key="SecureConnectionLevel" Value="0"/>

Here is what the values mean:

  • 3 (Most secure): use SSL for absolutely everything.
  • 2 (Secure): use SSL for rendering and methods that pass credentials, but don't insist on it for all SOAP calls.
  • 1 (Basic security): accept http but reject any calls that might be involved in the passing of credentials.
  • 0 (Least secure): don't use SSL at all.


VSTS 2010 | Reporting

OLAP PivotTable Extensions – your MDX reporting help

by Ewald Hofman 7. December 2009 01:48

In TFS you can create your own reports based on the Analysis warehouse. To query the cube, you need to use the MDX language. This language is not very intuitive for someone who is used to querying transactional databases, since you now have to deal with cubes, slices, facts and measures.

When you use Excel, you can easily query the Analysis warehouse; Excel creates the MDX query for you. There is a nice tool on CodePlex, OLAP PivotTable Extensions, that gives you the ability to see the MDX query that Excel uses.
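As an illustration, a hand-written MDX query against the TFS cube has roughly this shape. The measure and dimension names here are examples from my environment; compare them with the query the tool shows you:

```
SELECT  { [Measures].[Current Work Item Count] }         ON COLUMNS,
        NON EMPTY { [Work Item].[System_State].Members } ON ROWS
FROM    [Team System]
```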

You now have the usability of Excel combined with the layout formatting of Reporting Services.

Go make a great impression on your boss!


Reporting | VSTS 2008

Refresh the TFS Warehouse manually

by Ewald Hofman 25. November 2009 08:18

Reporting (SSRS and Excel) in TFS makes use of SQL Server Analysis Services, the OLAP cube. This cube is by default refreshed every two hours, which is in most cases frequent enough to get correct information. You can however change this default behaviour: you can either change the interval of the warehouse refresh or refresh the cube manually.

Be aware that SSRS caches the reports, which means the data might not be refreshed in the SSRS report as you expect even after the cube is updated. The caching settings can be changed per report on the report site.

Change the interval

To change the interval, follow these steps

1. Go to the desktop of the Application Tier

2. Open Internet Explorer and type http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx in the address bar. You will get a list of all available web services.


3. To change the interval, choose the ChangeSetting webservice

4. You will see a new page where you can enter the setting and its new value.


Enter RunIntervalSeconds in the settingId field, and the number of seconds in the newValue field (3600 is one hour).

5. Click on Invoke to change the setting

Manually refresh the cube

You can also choose to refresh the cube once. To do that, follow these steps.

1. Open the list of web services again with http://localhost:8080/tfs/TeamFoundation/Administration/v3.0/WarehouseControlService.asmx

2. Select the ProcessWarehouse web service from the list


3. Enter in collectionName the name of the project collection you want to refresh. The default collection is named DefaultCollection; replace it with the name of the collection in use on your TFS environment.

4. Click on Invoke

5. Go back to the list of web services, click on the ProcessAnalysisDatabase web service, and enter the value Full in the processType textbox.


6. Click on Invoke

7. The data warehouse is now refreshed. This can take some time when you have a large database. To check the status of the warehouse update, you can use the GetProcessingStatus web service. You can leave its parameters blank to retrieve all information. The warehouse update is ready when all jobs have the “Idle” job status.


Reporting | VSTS 2010

Calculate Code Metrics automatically for all your applications and see the trend

by Ewald Hofman 7. October 2009 04:32

In Visual Studio 2008, Microsoft added code metrics, which give you information about the quality and maintainability of an application. Unfortunately, you cannot easily start the code metrics calculation automatically, and you cannot integrate it with Team Build.

There is NDepend, which has the ability to integrate with Team Build and gives you access to the same code metrics. But NDepend does not have a repository, so you cannot build a trend of the metrics for all the applications stored in your TFS source control.

So I created a solution which works around these problems.

The idea of the application is quite simple:

  1. Download all sources in TFS to the local drive
  2. One by one open the solution
  3. For each solution calculate the code metrics
  4. Store the results of the code metrics in a database for reporting and trend analysis

It is doable, but it was a long road to get a working solution, and I would like to share it with you. The solution has the following prerequisites:


  • EntLib 4.0
  • SQL Server 2005
  • Visual Studio 2008 Development Edition or Team Suite
  • Team Foundation Server 2008
  • Microsoft Excel 2003 (not tested on Excel 2007)

To start the application, do the following:

  1. Download the sources at the bottom of the page.

  2. Open the solution

  3. Deploy the CodeMetricsDatabase project to the SQL Server of your choice

  4. Set the connection string to the SQL Server the database project is deployed to

    1. Open the app.config

    2. Change the CodeMetrics connection string in the connectionStrings section

  5. Start the application with the following parameters:

    1. TFS server name (for example http://tfsserver:8080)

    2. Start node in TFS source control (for example $/MyTeamProject/MySolution)

    3. The local path where the sources are stored (for example c:\CodeMetricsCache)

    4. Optionally, you can pass the team project you are interested in
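The connection string change in step 4 might look like the fragment below. The server and database names are placeholders for your own environment; only the name "CodeMetrics" must match what the application expects:

```xml
<connectionStrings>
  <add name="CodeMetrics"
       connectionString="Data Source=MySqlServer;Initial Catalog=CodeMetrics;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```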


To see the trend of the metrics, I have created a report which you can install on the TFS reporting site. To install the report, you must add a data source called "CodeMetrics". This data source must contain the connection information for the database where you store the code metrics.

The second step is to upload the rdl below. You can then see the trend of the code metrics of all your applications.

Code Metrics Trend.rdl (34.71 kb)


Reporting | VSTS 2008

[Update] Time sheet

by Ewald Hofman 4. October 2009 01:08

A lot of software shops want to use TFS as their time tracking tool. Out of the box, TFS does not provide this information, because all time is entered cumulatively per work item. But you can find the information when you look at the transaction database table where all changes to the work items are stored.

This is exactly what I did: I created a report that reads from the transaction table and presents the information per user per day. This leads to a report like the one below.


This post shows you how you can find this information yourself. At the bottom of the post you can find the rdl.

The report is based on the transaction table of the work items, which is called "Work Item History". The tricky part of the query is to find, for each transaction record (which is a change to, or revision of, the work item), its previous revision. When you have that information, you can calculate the difference between the two records. This difference is the amount of time the developer added to the completed work, and thus the hours registered.

The Sql statement that gives you this information is the following.

FROM        dbo.[Work Item History]
INNER JOIN  dbo.[Work Item]
        ON  dbo.[Work Item History].[Work Item]         = dbo.[Work Item].__ID
LEFT JOIN ( dbo.[Work Item History] [Previous Work Item History]
            INNER JOIN dbo.[Work Item] [Previous Work Item]
                    ON [Previous Work Item History].[Work Item] = [Previous Work Item].__ID )
        ON  [Previous Work Item].System_Id              = [Work Item].System_Id
        AND [Previous Work Item].System_Rev             = [Work Item].System_Rev - 1
        AND [Previous Work Item History].[Record Count] = -1

For each transaction there is a negative record that sets the previous values to 0 and a positive record that sets the new values. For the previous revision we are only interested in the compensating record, which is marked with "Record Count" = -1.
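Because the compensating record stores the negated previous values, the hours entered in a revision come out as the sum of the current record and the previous compensating record. A sketch of the select list that goes with the join above; the completed-work column name is an assumption (it may be, for example, Microsoft_VSTS_Scheduling_CompletedWork), so verify it against your warehouse schema:

```sql
SELECT    Person       = [Work Item].System_ChangedBy
,         [Day]        = CONVERT(varchar(10), [Work Item].System_ChangedDate, 120)
,         HoursEntered = ISNULL([Work Item History].Microsoft_VSTS_Scheduling_CompletedWork, 0)
                       + ISNULL([Previous Work Item History].Microsoft_VSTS_Scheduling_CompletedWork, 0)
-- FROM / JOIN clause as shown above
```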

When you have this starting point, you can add more tables to support filtering the data, such as on team project, area path, iteration path or assignee. I added those four filters to the report. This resulted in the report that is attached to this blog post.

This report is view-only. Notion Solutions has created a nice product that adds the ability to enter your time sheet from within Visual Studio.

TimeSheet.rdl (29.97 kb)


Reporting | VSTS 2008 | Work items

Time Entry report

by Ewald Hofman 9. July 2009 00:55

A frequently heard complaint about using the time fields in the work item is that there is already another system in the organization for entering working hours. Using the time fields in the work item would mean a double time-registration system, which can get out of sync.

To work around this, I created a new report that reads from the work items how many hours a developer has entered for a specific day. The report is attached to this blog post.

Time Entry.rdl (18.21 kb)


Reporting | VSTS 2008

Upload SSRS report to all subfolders

by Ewald Hofman 16. June 2009 00:50

When you develop a new report for TFS and want to deploy it to all team projects, it takes a lot of work to upload the file and set the data sources for every project.

This post uses TFS 2008 and SQL Server 2005

There is an easy way to automate that.

1. Add a web reference to Reporting Services

Add a new web reference to the project. The url of the web service is http://<servername>/ReportServer/ReportService2005.asmx?wsdl

2. Add the library to upload the file

using System.Collections.ObjectModel;
using System.IO;
using System.Linq;
using System.Net;
using System.Xml;
using UploadLibrary.ReportService2005;

namespace UploadLibrary
{
    public class Report
    {
        public Report(string[] localFiles)
        {
            LocalFiles = localFiles;
        }

        public string[] LocalFiles { get; set; }

        public void Upload()
        {
            foreach (var subFolder in SubFolders())
                foreach (var localFile in LocalFiles)
                    Upload(subFolder, localFile);
        }

        private void Upload(string subFolder, string file)
        {
            var fi = new FileInfo(file);
            string reportName = fi.Name.Substring(0, fi.Name.Length - fi.Extension.Length);
            string reportPath = "/" + subFolder + "/" + reportName;
            var rs = new ReportingService2005 { Credentials = CredentialCache.DefaultCredentials };

            //Read the file from the file system
            byte[] bytes;
            using (FileStream stream = File.OpenRead(file))
            {
                bytes = new byte[stream.Length];
                stream.Read(bytes, 0, (int)stream.Length);
            }

            //Search for an existing item in the subfolders (recursively)
            CatalogItem[] items = rs.FindItems("/" + subFolder,
                BooleanOperatorEnum.And,
                new[]
                {
                    new SearchCondition
                    {
                        Name = "Name",
                        Condition = ConditionEnum.Equals,
                        Value = reportName
                    }
                });

            //Only add the report in the current subfolder
            if (items.Count(c => c.Path == reportPath) > 0)
            {
                //Report exists -> update the report
                rs.SetReportDefinition(reportPath, bytes);
            }
            else
            {
                //Report does not exist -> create a new report
                rs.CreateReport(reportName, "/" + subFolder, false, bytes, new Property[0]);
            }

            //Relocate the datasources to the parent folder
            rs.SetItemDataSources(reportPath, UsedDatasources(file));
        }

        private static string[] SubFolders()
        {
            var rs = new ReportingService2005 { Credentials = CredentialCache.DefaultCredentials };
            var items = rs.ListChildren("/", false);
            return items.Where(i => i.Type == ItemTypeEnum.Folder).Select(i => i.Name).ToArray();
        }

        /// <summary>
        /// Returns an array of all datasources that are in use by the report.
        /// </summary>
        private DataSource[] UsedDatasources(string file)
        {
            var datasources = new Collection<DataSource>();

            //Read the rdl file
            var doc = new XmlDocument();
            doc.Load(file);
            var nsmgr = new XmlNamespaceManager(doc.NameTable);
            nsmgr.AddNamespace("rdl", doc.DocumentElement.NamespaceURI);

            //Go to the datasources node in the rdl file
            var nodes = doc.SelectNodes("rdl:Report/rdl:DataSources/rdl:DataSource", nsmgr);
            if (nodes != null)
            {
                //Iterate through the datasources in the rdl file
                foreach (XmlNode node in nodes)
                {
                    //Read the name of the datasource
                    var datasourceName = node.Attributes["Name"].InnerText;

                    //Add it to the result
                    datasources.Add(new DataSource
                    {
                        Item = new DataSourceReference { Reference = "/" + datasourceName },
                        Name = datasourceName
                    });
                }
            }

            //Return the result
            return datasources.ToArray();
        }
    }
}

3. Add any UI you like

You can add a winform project to your solution. On the form add a button called uploadReport and a textbox called fileName

private void uploadReport_Click(object sender, EventArgs e)
{
    Cursor = Cursors.WaitCursor;
    try
    {
        if (!File.Exists(fileName.Text))
        {
            MessageBox.Show(this, "Invalid filename", "Validation", MessageBoxButtons.OK, MessageBoxIcon.Warning);
            return;
        }
        //Start upload
        var rpt = new Report(new[] { fileName.Text });
        rpt.Upload();
        //Successful upload.
        MessageBox.Show(this, "The report is uploaded to all folders on the server!", "Successful", MessageBoxButtons.OK, MessageBoxIcon.Information);
    }
    catch (Exception ex)
    {
        MessageBox.Show(this, ex.Message, "Exception", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
    finally { Cursor = Cursors.Default; }
}



Add SSRS report as dashboard to sharepoint

by Ewald Hofman 2. June 2009 06:58

It is possible to add an SSRS report to your Project Portal. In this way you can create a nice dashboard on your portal home page. This dashboard can consist of multiple reports, or can vary for the different roles in your team.

On the Site Actions menu, choose the Edit Page action

Click on Add a Web Part

Choose the Page Viewer Web Part, and click the Add button

Now a new web part is added to your page

Click on “open the tool pane”, and a nice tool window pops up

Now you have to add the link to the report. To specify the link you can use: _layouts/tfsredirect.aspx?IsReport=1&ReportName=Remaining+Work

Where Remaining+Work is the name of the report you want to show.

To further customize your report, you can check out the Appearance, Layout and Advanced property groups

Advanced options

When you want more options for showing the report, you can add the full URL to the report and use the URL Access Parameters of Reporting Services.

The base link to the report is http://MyTfsServer/reportserver?/MyTeamProject/MyReport.

For example, when you have the following values

TFS Server = TfsServer

Team Project = Demo

Report = Work Item Hierarchy

That will make the following url: http://TfsServer/reportserver?/Demo/Work+Item+Hierarchy.

The reportserver part may differ. You can find it by opening the report site from the Team menu in Visual Studio. When it says http://TfsServer/Reports_XXX, the link will be http://TfsServer/reportserver_XXX?/Demo/Work+Item+Hierarchy.

Now that you have access to the report, it is time to customize it. First you have to indicate what should be done with the report. By default you want to render (show) the report. For this you add &rs:Command=Render to the URL. This makes http://TfsServer/reportserver?/Demo/Work+Item+Hierarchy&rs:Command=Render

When adding a report as a dashboard to the portal, you want to hide the toolbar. To do this, add &rc:Toolbar=false to the URL. This makes http://TfsServer/reportserver?/Demo/Work+Item+Hierarchy&rs:Command=Render&rc:Toolbar=false

Of course you also want to tweak the report parameters to only show the information you are interested in. To do that, add &[Parameter name]=[Parameter value]. You can find the parameter names in the properties of the report itself.

So if you want to filter on the Area, then you have to specify AreaParam as parameter name. To filter on the Area \Demo\MyFirstArea, you will have to create the following URL: http://TfsServer/reportserver?/Demo/Work+Item+Hierarchy&rs:Command=Render&rc:Toolbar=false&AreaParam=\Demo\MyFirstArea

The Iteration path is a multi-select parameter. To pass multiple values, just add IterationParam twice to the URL. To filter on the iterations \Demo\Iteration0 and \Demo\Iteration1, use the following URL: http://TfsServer/reportserver?/Demo/Work+Item+Hierarchy&rs:Command=Render&rc:Toolbar=false&AreaParam=\Demo\MyFirstArea&IterationParam=\Demo\Iteration0&IterationParam=\Demo\Iteration1

[Added at 09-06-2009: update to Iteration and Area parameters for reports based on the OLAP cube like Remaining Work]

Many of the reports have an iteration and an area parameter. Adding a value to these hierarchical parameters is somewhat tricky. A blog post by Andrew Lynes explains how to solve this.

For reports that are based on the OLAP cube (such as the Remaining Work), the filtering is a bit different. It is out of scope of this post to describe how it works, but what you have to know is how you can add a filter based on iteration. The syntax is


The 156 is a node in the iteration hierarchy, which you can find in the Iteration table in the TfsWarehouse database. It is unfortunately not possible to have a nice-looking URL to filter on iteration. When you want to show Remaining Work with a filter on iteration, the URL is something like:



Report on Work breakdown in TFS

by Ewald Hofman 13. May 2009 15:32


In Team Foundation Server 2005 and 2008 it is possible to define links between work items. By doing this, you can imply a work breakdown for your project. This work breakdown is not explicit, since these links are not hierarchical. Therefore the only report in TFS that does something with the links is not very usable: it shows all work items that have some relation.

When work item A is related to work item B, the report shows you two lines:

  1. A is related to B
  2. B is related to A

When you make the agreement in a project that:

  • The highest level of the work breakdown is always a Scenario in the Agile process template, and a Change Request or Requirement in the CMMI process template,
  • The second level contains the tasks and the bugs that are related to the root level,
  • The third level contains the bugs that are related to a task,

you can create a report that shows you the implied hierarchy of your work breakdown.

When we take it a step further, it is also possible to create a roll-up of hours. On the second and third level, the hours are retrieved from the warehouse, and the hours at the root level are a summary of all related tasks and bugs. When you do this, you get insight into the progress of the project.


In TFS, there is the operational database, which you are not supported to query directly. The data from the operational database is aggregated into the TfsWarehouse database. The last data store of TFS is the SSAS cube, which is used for high-performance calculations on your data.

For this report, we need the TfsWarehouse database. To get the three levels of the work breakdown, we have to add the Work Item table three times to the query and link the tables together via the Related Work Item table. Then add some of the other tables to get the other information you need. For the CMMI process template, this might result in a query like the one below. To create the report for the Agile process template, you have to modify some parts, such as the Estimate field. I have attached the rdl file, which you can use as a base for your own report.

SELECT
    [Team Project]    = tp.[Team Project]
,    [Iteration Path]    = Iteration.[Iteration Path]
,    [Area Path]    = Area.[Area Path]
,    L0_WIType    = SUBSTRING(level0.System_WorkItemType, 1, 1)
,    L0_State        = level0.System_State
,    L0_Title        = level0.System_Title
,    L0_Estimate    = ISNULL(cw_level0._Microsoft_VSTS_CMMI_Estimate, 0)
,    L0_Completed    = ISNULL(cw_level0._Microsoft_VSTS_Scheduling_CompletedWork, 0)
,    L0_ETC        = CASE WHEN level0.System_State = 'Proposed' THEN ISNULL(cw_level0._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level0._Microsoft_VSTS_Scheduling_RemainingWork, 0) END
,    L0_EAC        = CASE WHEN level0.System_State = 'Proposed' THEN ISNULL(cw_level0._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level0._Microsoft_VSTS_Scheduling_RemainingWork, 0) + ISNULL(cw_level0._Microsoft_VSTS_Scheduling_CompletedWork, 0) END
,    L0_FinishDate    = cw_level0.Osellus_Iris_Activity_EndDate
,    L0_AssignedTo    = level0_assignedto.Person
,    L0_ResolvedBy    = level0.Microsoft_VSTS_Common_ResolvedBy
,    L0_ResolvedDate    = cw_level0.Microsoft_VSTS_Common_ResolvedDate
,    L0_ClosedBy    = level0.Microsoft_VSTS_Common_ClosedBy
,    L0_ClosedDate    = cw_level0.Microsoft_VSTS_Common_ClosedDate
,    L0_Reason    = level0.System_Reason
,    L0_WI        = level0.System_Id
,    L1_WIType    = SUBSTRING(level1.System_WorkItemType, 1, 1)
,    L1_State        = level1.System_State
,    L1_Title        = level1.System_Title
,    L1_Estimate    = ISNULL(cw_level1._Microsoft_VSTS_CMMI_Estimate, 0)
,    L1_Completed    = ISNULL(cw_level1._Microsoft_VSTS_Scheduling_CompletedWork, 0)
,    L1_ETC        = CASE WHEN level1.System_State = 'Proposed' THEN ISNULL(cw_level1._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level1._Microsoft_VSTS_Scheduling_RemainingWork, 0) END
,    L1_EAC        = CASE WHEN level1.System_State = 'Proposed' THEN ISNULL(cw_level1._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level1._Microsoft_VSTS_Scheduling_RemainingWork, 0) + ISNULL(cw_level1._Microsoft_VSTS_Scheduling_CompletedWork, 0) END
,    L1_FinishDate    = cw_level1.Osellus_Iris_Activity_EndDate
,    L1_AssignedTo    = level1_assignedto.Person
,    L1_ResolvedBy    = level1.Microsoft_VSTS_Common_ResolvedBy
,    L1_ResolvedDate    = cw_level1.Microsoft_VSTS_Common_ResolvedDate
,    L1_ClosedBy    = level1.Microsoft_VSTS_Common_ClosedBy
,    L1_ClosedDate    = cw_level1.Microsoft_VSTS_Common_ClosedDate
,    L1_Reason    = level1.System_Reason
,    L1_WI        = level1.System_Id
,    L1_FirstL2_WI    = (SELECT Min(w.System_Id) FROM [Related Current Work Items] r INNER JOIN [Current Work Item] c ON c.__TrackingId = r.[Current Work Item_TrackingIdLeft] INNER JOIN [Work Item] w ON w.__ID = c.[Work Item] AND w.System_WorkItemType IN (@Level2WIT) WHERE r.[Current Work Item_TrackingIdRight] = cw_level1.__TrackingId)
,    L2_WIType    = SUBSTRING(level2.System_WorkItemType, 1, 1)
,    L2_State        = level2.System_State
,    L2_Title        = level2.System_Title
,    L2_Estimate    = ISNULL(cw_level2._Microsoft_VSTS_CMMI_Estimate, 0)
,    L2_Completed    = ISNULL(cw_level2._Microsoft_VSTS_Scheduling_CompletedWork, 0)
,    L2_ETC        = CASE WHEN level2.System_State = 'Proposed' THEN ISNULL(cw_level2._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level2._Microsoft_VSTS_Scheduling_RemainingWork, 0) END
,    L2_EAC        = CASE WHEN level2.System_State = 'Proposed' THEN ISNULL(cw_level2._Microsoft_VSTS_CMMI_Estimate, 0) ELSE ISNULL(cw_level2._Microsoft_VSTS_Scheduling_RemainingWork, 0) + ISNULL(cw_level2._Microsoft_VSTS_Scheduling_CompletedWork, 0) END
,    L2_FinishDate    = cw_level2.Osellus_Iris_Activity_EndDate
,    L2_AssignedTo    = level2_assignedto.Person
,    L2_ResolvedBy    = level2.Microsoft_VSTS_Common_ResolvedBy
,    L2_ResolvedDate    = cw_level2.Microsoft_VSTS_Common_ResolvedDate
,    L2_ClosedBy    = level2.Microsoft_VSTS_Common_ClosedBy
,    L2_ClosedDate    = cw_level2.Microsoft_VSTS_Common_ClosedDate
,    L2_Reason    = level2.System_Reason
,    L2_WI        = level2.System_Id
FROM
            [Team Project]            tp
    INNER JOIN    [Current Work Item]        cw_level0
                ON    cw_level0.[Team Project]        = tp.__ID
    INNER JOIN    [Work Item]            level0
                ON    level0.__ID            = cw_level0.[Work Item]
                AND    level0.System_WorkItemType    IN (@RootWIT)
    INNER JOIN    [Person]                level0_assignedto
                ON    level0_assignedto.__ID        = cw_level0.[Assigned To]
    INNER JOIN    Iteration
                ON    Iteration.__ID            = cw_level0.Iteration
    INNER JOIN    Area
                ON    Area.__ID            = cw_level0.Area
    LEFT JOIN    (        [Related Current Work Items]        rel_level0_level1
            INNER JOIN    [Current Work Item]        cw_level1
                        ON    cw_level1.__TrackingId        = rel_level0_level1.[Current Work Item_TrackingIdLeft]
            INNER JOIN    [Work Item]            level1
                        ON    level1.__ID            = cw_level1.[Work Item]
                        AND    level1.System_WorkItemType     IN (@Level1WIT)
            LEFT JOIN    [Person]                level1_assignedto
                        ON    level1_assignedto.__ID        = cw_level1.[Assigned To]
            )
                ON    cw_level0.__TrackingId        = rel_level0_level1.[Current Work Item_TrackingIdRight]
    LEFT JOIN    (        [Related Current Work Items]        rel_level1_level2
            INNER JOIN    [Current Work Item]        cw_level2
                        ON    cw_level2.__TrackingId        = rel_level1_level2.[Current Work Item_TrackingIdLeft]
            INNER JOIN    [Work Item]            level2
                        ON    level2.__ID            = cw_level2.[Work Item]
                        AND    level2.System_WorkItemType    IN (@Level2WIT)
            LEFT JOIN    [Person]                level2_assignedto
                        ON    level2_assignedto.__ID        = cw_level2.[Assigned To]
            )
                ON    cw_level1.__TrackingId        = rel_level1_level2.[Current Work Item_TrackingIdRight]
WHERE
    tp.[Team Project]        = @Project
AND    Area.[Area Path]        LIKE @AreaParam + '%'
AND    Iteration.[Iteration Path]    IN (@IterationParam)
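Note the shape of the joins above: each lower level is attached with a LEFT JOIN around a parenthesized join group, so that a root work item still appears in the report even when it has no related tasks or bugs yet. A minimal sketch of that behavior (an in-memory SQLite toy with hypothetical `WorkItem` and `Related` tables that only mimic the shape of the real query):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical tables for illustration; not the TfsWarehouse schema.
    CREATE TABLE WorkItem (Id INTEGER PRIMARY KEY, Type TEXT, Title TEXT);
    CREATE TABLE Related (LeftId INTEGER, RightId INTEGER);
    INSERT INTO WorkItem VALUES
        (1, 'Requirement', 'Root with child'),
        (2, 'Task',        'Child task'),
        (3, 'Requirement', 'Root without children');
    INSERT INTO Related VALUES (2, 1);  -- task 2 is related to requirement 1
""")

# The LEFT JOINs keep requirement 3 in the result even though it has no
# related task; INNER JOINs would silently drop it from the report.
rows = conn.execute("""
    SELECT level0.Id, level1.Id
    FROM WorkItem level0
    LEFT JOIN Related rel ON rel.RightId = level0.Id
    LEFT JOIN WorkItem level1
        ON level1.Id = rel.LeftId AND level1.Type = 'Task'
    WHERE level0.Type = 'Requirement'
    ORDER BY level0.Id
""").fetchall()
print(rows)  # [(1, 2), (3, None)]
```

The type filter (`level1.Type = 'Task'` here, `@Level1WIT` / `@Level2WIT` in the real query) belongs in the join condition, not in the WHERE clause; moving it to WHERE would turn the LEFT JOIN back into an effective INNER JOIN and drop the childless roots.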

WorkBreakDown.rdl (164,69 kb)


