SSRS Deployment Automation

SQL Server reporting comes to life when SSRS (SQL Server Reporting Services) runs in SharePoint integrated mode.

Report deployment is a manual process from SQL Server Data Tools within Visual Studio. If a report has to go to multiple locations, it can be a hassle to track them all and ensure each copy is deployed and updated properly.

Companies often have a wide range of SSDT (SQL Server Data Tools) solutions that contain reports, typically deployed to SSRS running in native mode. We have developed a deployment solution that enables single-stage publishing of reports to the correct, secured SharePoint location for each report. The tool handles report deployments to multiple locations across multiple Visual Studio solutions. It is driven by a single database table that maps each Visual Studio report to every SharePoint location where that report may be published.

The report deployment tool can run from any location that can connect to both the destination SharePoint and the SQL Server housing the mapping table. The tool requires credentials with permission to publish to SharePoint.

The report deployment tool can be used to deploy and manage report publishing from multiple Visual Studio solutions to multiple SharePoint site collections:
1. Automated inventory of SharePoint locations
2. Automated inventory of all SSRS Visual Studio reports across solutions
3. A description can be associated with each SharePoint location, and with each report for publishing
4. Automatic creation of libraries, with views, metadata, and content types, for as many libraries as needed
5. Publishing of each report to more than one location; reports can be mapped in bulk
6. Updating of all reports in SharePoint
7. Automatic rebinding of reports to their data sources (reports published to SharePoint need to be rebound)
8. Control over the sequence in which reports appear in SharePoint
9. Automatic publishing of the Category and Title fields to SharePoint
10. Visual creation, within the tool, of libraries and folders fully configured for reports
11. Retraction or deletion of a report from all associated locations

The tool defaults to not overwriting data sources, to avoid clobbering the credentials stored in them. A checkbox is available to force data sources to be overwritten.
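
As a rough sketch, the mapping table described above amounts to one row per (report, destination) pair; the column names and URLs below are hypothetical, since the actual schema isn't shown:

```python
# Hypothetical shape of the mapping table: one row per (solution, report,
# destination) triple, so a single report can publish to many locations.
mappings = [
    ("FinanceSolution", "Budget.rdl", "http://portal/sites/finance/Reports"),
    ("FinanceSolution", "Budget.rdl", "http://portal/sites/exec/Dashboards"),
    ("HRSolution", "Headcount.rdl", "http://portal/sites/hr/Reports"),
]

def destinations_for(report):
    """All SharePoint locations a given report should be published to."""
    return [url for _, rdl, url in mappings if rdl == report]
```

A deployment run would walk this list per report, publishing (and rebinding data sources) at each destination.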

Using URL Parameters for an SSRS Report in SharePoint Integrated Mode

SSRS running in SharePoint integrated mode provides a great experience, but how can one pass parameters to a report (.rdl) published to SharePoint?

Here’s a URL that does the job:
http://WebAppName/sites/SiteName/WebName/_vti_bin/reportserver?http://WebAppName/sites/SiteName/WebName/LibName/ReportName.rdl&rc:Parameters=Collapsed

Here’s how to construct the URL:
1. Get the URL of the SPWeb; this is the lowest-level site containing the RDL report. In this case it is http://WebAppName/sites/SiteName/WebName, an SPWeb one level below the site collection.
2. Append “/_vti_bin/reportserver?”
3. Append the full URL of the RDL. You can get it from the library’s All Files view by clicking the ellipsis (“…”) and copying the full URL, or by right-clicking the RDL link and copying its URL.
4. Append your parameters, each prefixed with “&”.
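
The four steps above can be sketched in Python; the helper name is illustrative, and the example URLs mirror the placeholders used in this post:

```python
from urllib.parse import quote

def ssrs_url(web_url, rdl_url, **params):
    """Build a SharePoint integrated-mode SSRS viewer URL.

    web_url -- the SPWeb closest to the report,
               e.g. http://WebAppName/sites/SiteName/WebName
    rdl_url -- the full URL of the .rdl file
    params  -- viewer parameters, e.g. **{"rc:Parameters": "Collapsed"}
    """
    url = f"{web_url.rstrip('/')}/_vti_bin/reportserver?{rdl_url}"
    for name, value in params.items():
        # Values such as "Page Width" contain spaces and must be URL-encoded.
        url += f"&{name}={quote(value)}"
    return url
```

For example, `ssrs_url("http://WebAppName/sites/SiteName/WebName", "http://WebAppName/sites/SiteName/WebName/LibName/ReportName.rdl", **{"rc:Parameters": "Collapsed"})` reproduces the URL shown above.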

Parameter examples include:

Collapsing the parameter pane: &rc:Parameters=Collapsed

Zooming to whole page or page width: &rc:Zoom=Whole%20Page or &rc:Zoom=Page%20Width (the space must be URL-encoded)

Jumping to a page of the report: &rc:Section=4

Happy reporting!

Custom SQL Reporting in MS-Project Server 2013

It is easy to navigate the database schema in MS-Project Server to generate reports. The SQL can be embedded in an ODC file or used within PowerPivot. Joining Task and Project data, however, raises rollup challenges. The first is double-counting caused by summary tasks; the solution is to exclude them in the join by adding this condition:

WHERE TaskIsSummary = 0

The next source of double-counting is external tasks: tasks exposed through cross-project links. We can exclude both kinds this way:

WHERE TaskIsSummary = 0 AND TaskIsExternal = 0

The next problem is that when the Task and Project tables are merged, project-level values roll up incorrectly. Such numeric fields can instead be pro-rated against the task work, as long as divide-by-zero errors are avoided. Here’s how, referencing a custom field called “Budgeted Costs”; note how its value is made proportionate to the task work:

, CASE
    WHEN MSP_EpmProject_UserView.[Budgeted Costs] = 0 THEN 0
    WHEN MSP_EpmTask_UserView.TaskRegularWork = 0 THEN 0
    WHEN MSP_EpmProject_UserView.ProjectWork = 0 THEN 0
    ELSE MSP_EpmProject_UserView.[Budgeted Costs]
         * (MSP_EpmTask_UserView.TaskRegularWork / MSP_EpmProject_UserView.ProjectWork)
  END AS [Budgeted Costs]
FROM dbo.MSP_EpmProject_UserView
INNER JOIN dbo.MSP_EpmTask_UserView
  ON MSP_EpmProject_UserView.ProjectUID = MSP_EpmTask_UserView.ProjectUID
WHERE TaskIsSummary = 0 AND TaskIsExternal = 0
ORDER BY MSP_EpmProject_UserView.ProjectName, MSP_EpmTask_UserView.TaskIndex, MSP_EpmTask_UserView.TaskName
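
The guard logic in that CASE expression is easier to see in isolation. Here is a minimal Python rendering of the same pro-rating rule (the function name is illustrative):

```python
def prorate_project_value(project_value, task_work, project_work):
    """Mirror of the SQL CASE above: pro-rate a project-level value
    (e.g. the custom "Budgeted Costs" field) down to a single task,
    returning 0 whenever any input is zero to avoid divide-by-zero."""
    if project_value == 0 or task_work == 0 or project_work == 0:
        return 0
    return project_value * (task_work / project_work)
```

Because each task’s share is weighted by its portion of the total project work, the pro-rated values sum back to the project-level value across all non-summary tasks.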

Going one step further, we can do the same with task assignment data. Here’s what that looks like using the assignment work:

, CASE
    WHEN MSP_EpmProject_UserView.[Budgeted Costs] = 0 THEN 0
    WHEN MSP_EpmAssignment_UserView.AssignmentWork = 0 THEN 0
    WHEN MSP_EpmProject_UserView.ProjectWork = 0 THEN 0
    ELSE MSP_EpmProject_UserView.[Budgeted Costs]
         * (MSP_EpmAssignment_UserView.AssignmentWork / MSP_EpmProject_UserView.ProjectWork)
  END AS [Budgeted Costs]
, MSP_EpmResource_UserView.[Cost Type]
, MSP_EpmResource_UserView.[Resource Departments]
, MSP_EpmResource_UserView.[RBS]
, MSP_EpmResource_UserView.[Resource Title]
FROM dbo.MSP_EpmProject_UserView
INNER JOIN dbo.MSP_EpmTask_UserView
  ON MSP_EpmProject_UserView.ProjectUID = MSP_EpmTask_UserView.ProjectUID
LEFT OUTER JOIN dbo.MSP_EpmAssignment_UserView
  ON MSP_EpmTask_UserView.TaskUID = MSP_EpmAssignment_UserView.TaskUID
  AND MSP_EpmTask_UserView.ProjectUID = MSP_EpmAssignment_UserView.ProjectUID
LEFT OUTER JOIN dbo.MSP_EpmResource_UserView
  ON MSP_EpmAssignment_UserView.ResourceUID = MSP_EpmResource_UserView.ResourceUID
WHERE TaskIsSummary = 0 AND TaskIsExternal = 0
ORDER BY MSP_EpmProject_UserView.ProjectName, MSP_EpmTask_UserView.TaskIndex, MSP_EpmTask_UserView.TaskName

Metadata Warehouse of SharePoint Content

I often write custom reports to analyze large volumes of data from a SharePoint farm. They can answer questions such as:
– How many documents were updated per user, per month, over time?
– Which metadata fields and values are in use, and how frequently, by department?

I typically export as a CSV in pipe-delimited format (to avoid misinterpreting the ubiquitous comma) and import the result into a spreadsheet.

What happens when the dataset is too large for Excel? The data I analyze often pushes the limits of Excel or my machine. In that case, I import it into SQL Server. To make this repeatable, I create an import process that drops and recreates the target table, defines the fields and how to handle errors for each, then save it as an SSIS DTSX package.
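
As a minimal sketch of the parsing side, Python’s standard csv module can read the pipe-delimited export before it is handed to the import process (the file layout here is hypothetical):

```python
import csv

def read_pipe_delimited(path):
    """Read a pipe-delimited export; '|' avoids the commas that pervade
    SharePoint titles and metadata values."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f, delimiter="|")
        return list(reader)
```

The same delimiter setting carries over to the SSIS flat-file connection used by the import package.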

To run a DTSX package, I import it into Business Intelligence Development Studio as a package and run it from there.

Once the data is in SQL Server, one can handle larger datasets, with all the tools and techniques SQL offers for working with very large data.

Finally, the data can be exposed as an SSRS report.