Jul 23, 2012

A client with SQL Server 2005 asked if there was a good way to import and deploy packages automatically, rather than having to do each package manually every time. This particular client has packages deployed to SQL Server Integration Services. I have written a separate blog post on Automating Deployment of SSIS Packages in SQL 2005. Here is the solution for importing packages.

BIDS Solution:

Open (or create) the BIDS solution for the related packages which you want to import and deploy. The related packages should all reside in the same folder on Integration Services, otherwise the deployment piece will not work as it can’t deploy to multiple folders. You will need a separate BIDS solution for each SSIS folder, and you will need to create a separate _ImportPackages.dtsx package for each solution.

In order to make this somewhat scalable, the solution folder structures will need to incorporate the exact same folder name as the folder on SSIS, since the SSIS folder name will be used in the import query. For example, if your group of packages is deployed to a folder called “Maintenance” on SSIS, you will want your solution to be called something like “Maintenance ETL”. In this way, once you create your _ImportPackages.dtsx you will be able to copy it to the next solution and simply edit the folder name in the query and the Batch File connection manager. You’ll see what I mean.
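The naming convention above can be sketched in a couple of lines; this is a hypothetical illustration of how the SSIS folder name carries over into the solution name:

```python
# Sketch of the naming convention: each SSIS folder maps to a solution
# named "<folder> ETL", so the folder name can be reused in the import query.
ssis_folder = "Maintenance"           # folder on Integration Services (example)
solution_name = ssis_folder + " ETL"  # matching BIDS solution name
print(solution_name)
```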

Create SSIS Package to Import Deployed Packages:

Create a new SSIS package called _ImportPackages.dtsx. I put the underscore in the name so it will float to the top of the SSIS solution when you open it, thereby reminding you to run the package and get the latest copies of the packages deployed to SSIS. The package will look like this:




Create a connection to the msdb database on the server where the SSIS packages are stored. Although the packages are deployed out to Integration Services, they are actually stored in the msdb database, which is what you will write your source query against.




Create a Flat File Connection Manager called “Batch File”, pointing to a flat file called “_Import Packages.bat”. This .bat file will house the dtutil commands which import all the .dtsx packages from msdb. Choose a location within your BIDS SSIS solution folder, since you will have one of these batch files per BIDS SSIS solution.


There is one column called dtutil_command.


The column width is 4000 to accommodate the length of the dtutil command lines.


STEP 1 in the Package: Delete Batch File

Drag a File System task from the Toolbox into your Control Flow. Rename it “Delete Batch File”, and configure it with the operation “Delete file” and a Source Connection of the “Batch File” connection you created above.


STEP 2 in the Package – Populate Batch File

Drag a Data Flow task from the Toolbox into the Control Flow surface. Rename it to Populate Batch File. Connect the “Delete Batch File” task to the “Populate Batch File” task on success. Move to the Data Flow tab and drag an OLE DB Source on the Data Flow surface. Configure it as follows:


Here is that Query again, so you can cut and paste it:

‘dtutil /SQL "’
+ case
when len(FolderName) = 0 then ”
else FolderName + ”
+ [name]
+ ‘" /ENCRYPT FILE;"C:BI ProjectsETL’
+ case
when len(FolderName) = 0 then ”
else FolderName + ‘ ETL’ + FolderName + ‘ ETL’
+ [name]
+ ‘.dtsx";1’
+ ‘ /QUIET’ –suppresses prompt to delete if an older file with same name exists )
as nvarchar(4000))
as dtutil_command
from msdb.dbo.sysdtspackages90 pkg
join msdb.dbo.sysdtspackagefolders90 fld
on pkg.folderid = fld.folderid
WHERE foldername = ‘Maintenance’ —CHANGE VARIABLE
order by FolderName, [name]


The results of this query will look like this:

dtutil /SQL "MaintenancePackage1" /ENCRYPT FILE;"C:BI ProjectsETLMaintenance ETLMaintenance ETLPackage1.dtsx";1 /QUIET
dtutil /SQL "MaintenancePackage2" /ENCRYPT FILE;"C:BI ProjectsETLMaintenance ETLMaintenance ETLPackage2.dtsx";1 /QUIET

Notice that the folder name from SSIS is also used in the folder structure on the file system. You will need to adjust the query to suit your naming convention for your BIDS solutions.
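If it helps to see the string assembly outside of T-SQL, here is a minimal Python sketch of the same composition; the folder name, package name, and local root path are illustrative assumptions:

```python
# Sketch: compose a dtutil import command the same way the SQL query does.
# Folder, package, and root path are illustrative, not real deployments.
def dtutil_command(folder, package, root="C:\\BI Projects\\ETL"):
    # Path of the package in the SQL store (folder\name)
    sql_path = folder + "\\" + package if folder else package
    # Destination on the file system, reusing the folder name per the convention
    file_path = "{}\\{} ETL\\{} ETL\\{}.dtsx".format(root, folder, folder, package)
    return 'dtutil /SQL "{}" /ENCRYPT FILE;"{}";1 /QUIET'.format(sql_path, file_path)

print(dtutil_command("Maintenance", "Package1"))
```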

Drag a Flat File Destination onto your Data Flow tab and connect the OLE DB source to the Flat File destination. Configure the Flat File destination to the Batch File connection.


Step 3 in the Package: Run Batch file

Drag an Execute Process task from the Toolbox onto the Control Flow tab and rename it “Run Batch File”. Connect the “Populate Batch File” task to the “Run Batch File” task on success. Configure the Executable on the Process tab by navigating to the _Import Packages.bat you created earlier.


Now you can run the SSIS package you just created. It will import all the packages in the specified SSIS folder into the solution folder you specified in the SQL Query.

***Please note that this will not add any new packages to your BIDS solution. It will only refresh existing packages with any deployed changes. You will still need to add the packages within the BIDS solution to the SSIS Packages folder even though they are sitting in the file system. You don’t want to simply ‘Add Existing Package’ either, as you will end up with a duplicate file in your BIDS directory. I’ve blogged a simple way to Add Multiple Packages to a BIDS solution.

Jul 23, 2012

After much playing with SSIS packages I worked out how to Automate Import of Packages to File System using SQL to write out a dtutil batch file. Then I worked on how to deploy packages back out to the SQL store from the file system. I imagine it might have been accomplished in a similar fashion to the import, using a For Each File loop. Instead I came across a very simple solution for deploying packages to the SQL store. There is a free CodePlex download called BIDS Helper. It doesn’t even require an install; it’s just an add-in for Visual Studio.

Simply download the appropriate zip file version from here and unpack it into your My Documents\Visual Studio <version>\Addins folder. Close and reopen BIDS and the deploy functionality is there.


More detailed instructions on using it are here.

**Please note, if you have passwords embedded in your packages and want the ability to automatically deploy while maintaining the correct protection level you might want to use Package Configurations to maintain the passwords.  In this way the passwords won’t get stripped from the package on deployment since they will be stored in SQL Server or elsewhere.

Jul 23, 2012

A very simple way to add multiple new packages to a solution is to place them in a temporary folder other than the solution folder (otherwise they will be duplicated), then in Windows Explorer multi-select them, copy them, navigate to the SSIS Packages folder in your BIDS solution, and paste them.

If you used the package created in the Automate Import of SSIS Packages to File System post, then first move the packages out of the solution folder into another folder. I will temporarily move them one folder up, to C:\BI Projects\ETL\Maintenance ETL.


Now multi-select the packages you have moved and Copy them (CTRL+C – or- right click Copy).


Navigate to your BIDS solution and select the SSIS Packages folder. Paste the packages. (CTRL+V – or- right click Paste)


The packages are now part of the solution.


They have also been added to the solution folder.


Don’t forget to delete the packages from your temporary folder once they have been successfully added to your solution.

May 29, 2012

In a continuation of my last blog post on Finding Report Subscription Errors, there was a tricky little reason why the subscription was failing for some of the parameter values, even though the reports could all be run manually.  Just like the Current User Filter in SharePoint, it is CASE SENSITIVE.

I got this error in the trace log:

library!WindowsService_113!308!05/29/2012-10:31:17:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: , Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Manager' is not a valid value.;

But all of the values being fed to my subscription were valid values which worked when manually running the report.  Why did some of the reports in the subscription render successfully and others not?  It turns out the underlying query in SSRS for the Manager parameter dropdown had a mix of either lowercase or the first two letters capitalized. The query I was using in my data-driven subscription was all lowercase.  So only those in the report dropdown which were lowercase were running successfully. 

I solved the problem by changing the underlying query feeding the SSRS Report Parameter to lowercase, and ensuring that my subscription query was also lowercase.  This resolved all the errors.
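The failure mode is easy to reproduce in a few lines; this sketch uses made-up manager values to show why only the lowercase ones matched:

```python
# Sketch: the parameter's available values are matched case sensitively,
# so a lowercase subscription value only matches lowercase dropdown values.
# The names below are made up for illustration.
dropdown_values = ["smith", "JOnes", "BRown"]
subscription_value = "jones"

exact_match = subscription_value in dropdown_values                       # fails: case differs
lowered_match = subscription_value in [v.lower() for v in dropdown_values]  # succeeds after lowercasing
print(exact_match, lowered_match)
```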

May 29, 2012

I had an issue trying to find what was causing a data-driven report subscription error. The SSRS report is deployed to SharePoint, and Reporting Services is in SharePoint Integrated mode. The subscription was showing last results as “Done: 15 processed of 15 total; 7 errors.” It took a while to find the pieces I needed to figure out what was causing the error.

I found some information about looking at Report History in SharePoint and creating a New Snapshot, which would give me the last known error.  However, since some of the reports had run successfully this did not work. 

Next I tried looking at the Report Server database in the ExecutionLog tables.  I isolated the query results to just the one report subscription by writing a query like this:

use ReportServer
select * from ExecutionLog3
where RequestType = 'Subscription' and TimeStart > '2012-05-29 11:30:00.000'

This only returned the successes, not the failures.

Finally, I tried looking in the Report Server Trace Log file.  There was very little in the log file and nothing to do with my subscription.  I knew that the Trace Log file should hold the information I needed.  After much poking around I realized that I had made a fundamental error in my assumption about the architecture. The Trace Log file resides on the SharePoint server, not the Report Services database server. 

Find the Trace Log

The Trace log files can be found on the SharePoint server, usually here: C:\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles

Find the ScheduleID for your Subscription

Find the Schedule ID of the most recent subscription by querying the ReportServer database:

select
'SubnDesc' = s.Description,
'SubnOwner' = us.UserName,
'LastStatus' = s.LastStatus,
'LastRun' = s.LastRunTime,
'ReportPath' = c.Path,
'ReportModifiedBy' = uc.UserName,
'ScheduleId' = rs.ScheduleId,
'SubscriptionId' = s.SubscriptionID
from ReportServer.dbo.Subscriptions s
join ReportServer.dbo.Catalog c on c.ItemID = s.Report_OID
join ReportServer.dbo.ReportSchedule rs on rs.SubscriptionID = s.SubscriptionID
join ReportServer.dbo.Users uc on uc.UserID = c.ModifiedByID
join ReportServer.dbo.Users us on us.UserID = s.OwnerID

Search for the ScheduleID in your Trace Log and find the Error message

Open the appropriate log file based on the time stamp being the most recent after the subscription ran, and search for the ScheduleID.  Once you find the first entry for your ScheduleID, look for anything that starts with e ERROR
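The search procedure can be sketched programmatically; the schedule id and log lines below are illustrative stand-ins for real trace entries:

```python
# Sketch: find "e ERROR" entries at or after the first mention of a ScheduleID
# in trace log lines. The id and lines are illustrative, not real log content.
def errors_for_schedule(lines, schedule_id):
    seen = False
    hits = []
    for line in lines:
        if schedule_id in line:
            seen = True           # start watching once the ScheduleID appears
        if seen and "e ERROR" in line:
            hits.append(line)     # collect error entries from that point on
    return hits

log_lines = [
    "library!WindowsService_113!308!... i INFO: Handling event with data: ScheduleID = abc-123",
    "library!WindowsService_113!308!... e ERROR: Throwing InvalidReportParameterException",
]
print(len(errors_for_schedule(log_lines, "abc-123")))
```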

The error message can look like this:

library!WindowsService_113!308!05/29/2012-10:31:17:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: , Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Manager' is not a valid value.;


May 15, 2012

This is a neat little trick.  My client wanted to filter a report by employee manager, but didn’t have a good way to maintain the list of employees and their managers.  There were a few possibilities, but we wanted to use a system that already maintained the relationship between employee and manager.  We decided to pull the information out of Active Directory, using an SSIS package.

Addendum Sept. 12, 2012:  The approach outlined below works fine for a simple data set where the query will not return more records than the AD paging file size will allow (typically 1000 records), and you are only querying single value attributes.  However, I recommend you use this Script Task approach instead of using a Data Flow.  It is more powerful and flexible yet still simple method of importing any size data set from Active Directory, regardless of paging file size.  Read the step-by-step here:  Get around Active Directory Paging on SSIS import

Find the Active Directory Server

The first thing I needed to know was which server Active Directory resided on. I didn’t want to have to wait for I.T. to get back to me with that answer. There is a very easy way to find this out if you are logged into the same domain. Just run a command prompt and use the command


This will return the active directory server name and IP address.

Create the ADO.NET Connection in SSIS

Create your SSIS solution and your package.  Now create the connection to the Active Directory server.  There is a little trick to this.  In order to use the connection in a data flow you need to create an ADO.NET connection.  During the create process, in the Provider drop down, change the Provider to OLE DB Provider for Microsoft Directory Services.  Then enter your server name and test as normal.  If you were to create an OLE DB connection instead of ADO.NET and change the provider, it would not work for Data Flow connections.  It must be an ADO.NET connection type. 


Use ADO NET Source in your Data Flow

On the data flow tab, drag an ADO.NET Source onto the design surface.  Open up the ADO.NET Source Editor and choose the connection you created in the previous step.  Choose “SQL Command” for Data Access mode.  Now you will write a query to pull the desired information from Active Directory.  Your query will look something like this:

SELECT displayName, Mail, Title, physicalDeliveryOfficeName, telephoneNumber, Manager, name, sAMAccountName
FROM 'LDAP://yourservername'
WHERE objectClass='user'

or this, depending on how you like to structure your LDAP queries.

<LDAP://yourservername>; (objectClass=user); displayName, Mail, Title, physicalDeliveryOfficeName, telephoneNumber, Manager, name, sAMAccountName

You must put single quotes around the server.  For more information on what attributes are available to be imported from Active Directory I found this to be helpful:  http://www.kouti.com/tables/userattributes.htm

You can test your query in SSMS like this:

SELECT displayName, Mail, Title, physicalDeliveryOfficeName, telephoneNumber, Manager, name, sAMAccountName
FROM OPENQUERY (ADSI, 'SELECT displayName, Mail, Title, physicalDeliveryOfficeName, telephoneNumber, Manager, name, sAMAccountName
FROM "LDAP://yourservername" WHERE objectClass = ''user'' ')

For this SQL query to work you will need to add your AD server as a linked server on your SQL Server and name it ADSI. In SSMS open up Server Objects and right click on Linked Servers. Select New Linked Server. On the General tab select “OLE DB Provider for Microsoft Directory Services” as the Provider. Type in your AD server name in the Product Name and the Data Source. Leave the other fields blank and click OK.

When running your query, if you get an error like “Cannot fetch a row from OLE DB provider ‘ADsDSOObject’ for linked server ‘adsi’”, there is a good chance your query is returning more rows than the AD paging file size allows. In that case you can do one of two things: 1) filter your query further to return fewer rows (try filtering on a specific sAMAccountName to be sure that the query syntax isn’t the problem) or 2) use a C# script which is not affected by paging file size, as outlined here: Get around Active Directory Paging on SSIS import.

Data Formatting

Each field will be of NTEXT type.  You can leave them as NTEXT if this is an acceptable format for your destination, or you can add Derived Columns which convert the values in an expression, like this:





Add your Destination

Add your data flow destination and map the fields as you wish.  In this case I created a table in the data warehouse to receive the Active Directory data.

You can now finish your package and deploy it.  This is a simple but very powerful method.  I found it very useful and I know I’m going to be using it again.




Apr 24, 2012

This is an addendum to my previous blog post How to use a SharePoint list as a data source in your SSIS package.

If you are populating a SharePoint list with a field of date type, you need to format the date as text, in the following format, in order for SharePoint to accept the input from SSIS.

'yyyy-mm-dd'      or for one-digit months or days      'yyyy-m-d'

So if you’re using a SQL Query you can write something really ugly, like this:

CAST(DATEPART(Year, GETDATE()) AS varchar(4)) + '-' + CAST(DATEPART(Month, GETDATE()) AS varchar(2)) + '-' + CAST(DATEPART(Day, GETDATE()) AS varchar(2)) AS [Adjustment Date]
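For comparison, here is a small sketch of the same unpadded format in Python; the sample date is arbitrary:

```python
# Sketch of the date text SharePoint accepts: yyyy-m-d with no zero padding,
# the same result the CAST/DATEPART concatenation above produces.
from datetime import date

def sharepoint_date(d):
    # d.month and d.day are ints, so no leading zeros appear
    return "{}-{}-{}".format(d.year, d.month, d.day)

print(sharepoint_date(date(2012, 4, 2)))
```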

Apr 19, 2012

There are many times when my clients want to be able to refresh data on demand, without having to wait for a nightly data warehouse refresh.  Quite often this occurs when they are updating data in SharePoint lists which get fed into the data warehouse and from there into cubes and reports.  There is a very simple way to allow a user to trigger the data refresh of just their specific piece of data, using Nintex Workflow.  If you don’t know about Nintex, it is a third party product which adds a drag-and-drop workflow designer and advanced workflow features to SharePoint.  It is inexpensive, and a great way to empower users to manage their business processes.

In this example a user has updated a series of SharePoint lists with current vendor scorecard data and would like to see it reflected in the cube and reports.


You will need to create an SSIS package which triggers the data refresh desired by the user.  When building my SSIS packages which populate a data warehouse I like to keep it modular, with related actions in individual packages, and then use a master package to trigger the packages in the correct order.  In this way I can trigger smaller jobs simply by creating additional master packages to run the relevant packages which already exist.  This is best practice since I only have to maintain the logic in one package, which can then be used by multiple master packages.  I normally create one package for each of my dimension (attribute) tables and one for each of my fact (transactional) tables wherever possible, unless they are interdependent.  I try to keep each package as modular as possible so I can trigger them in different sequences where necessary.

Using a master package to trigger a sequence of packages has many advantages over doing multiple steps in SQL Server Agent.  I only trigger one master package using a SQL Server Agent Job.  It allows me to keep the order of running a sequence of packages in SSIS where I can add annotations to remind myself why certain packages need to be run before others. This helps me remember the interdependencies when I am making changes.  It also allows me to control the checkpoints, which gives me the ability to rollback multiple steps where necessary.  And it helps me when reviewing the SSIS logs.  I can see how long the master package takes to run from beginning to end, and compare that over time.


For this example I have a master package which populates the entire data warehouse and refreshes the data cube at the end. This is run nightly. I will add an additional master package, to be triggered by the user, which only refreshes the vendor scorecard data from the SharePoint lists into the data warehouse and refreshes the data cube. We’ll call it MASTER Vendor Scorecard Refresh.



Now create a SQL Server Agent Job with one step which triggers the SSIS package you just created.  Do not schedule this job, since it will be triggered manually by the user. 


You will want to add a notification to email you when the job completes while you are doing your testing. You could change the notification to when the job fails once you know it’s working correctly.


Create a SharePoint list, called Vendor Scorecard Data Refresh.  Add one item to the list.  You could name the item Trigger Vendor Scorecard Data Refresh Workflow.  In your List tool bar select Workflow Settings, Create a Workflow in Nintex Workflow.



Leave the template as Blank and click the Create button.


In the bottom left of your screen choose the workflow action type of Integration.


From the resulting actions which appear in the Workflow Actions pane, drag the Execute SQL action onto your workflow.


Double click in the center of the Execute SQL icon to configure the action.  Select OLEDB, key in your connection string. 

Data Source=yourserver;Initial Catalog=yourdatabase;Provider=SQLNCLI10.1;Integrated Security=SSPI;

Check the box for Use Windows authentication. Key in the username and password that has permissions to trigger the SQL Server Agent Job. Enter the query which will trigger the job.

EXEC msdb.dbo.sp_start_job N'Vendor Scorecard Refresh'



Click on Save to save the action.  Publish the workflow by clicking on the Publish button.



Navigate to your Vendor Scorecard Data Refresh SharePoint list.  Click on the dropdown beside your one item and select Workflows.


Select the workflow you want to start.  You can store more than one workflow on a SharePoint list, but this can get confusing to users so I recommend you keep it to one workflow per SharePoint list unless there is a very good reason to do otherwise.


Click the start button to trigger the workflow.


If you set up notifications on the SQL Server Agent job you will receive an email when the job completes.  The email will tell you if the job was successful. 


If you did not set up notifications you can check the history of the job. The history will not appear on the job until the job completes, so you will need to wait a few minutes for the steps in your SSIS package to complete before you can check if it ran successfully. In SSMS right click on the SQL Server Agent Job and select View History.  You will see a green checkmark beside the date and time if the job ran successfully.  If it fails you will see a red X.


That’s it, you’re done. Users can now trigger their own data updates as required.

Mar 29, 2012

It is not an uncommon requirement to be able to publish an SSRS report to SharePoint and then use it in a Web part filtered by the user viewing it. You include the domain\userid in your main report query and filter the query on this value using a report parameter where the user value can be typed in.

However, in many cases the report does double duty as a report stored in a Report library, where users could choose multiple or ALL users. In that case you want to have a list of Available Values in your parameter so that someone viewing the report in the library can pick from a list of users rather than have to know everyone’s user id. The tricky part is formatting the domain\userid in your parameter so SharePoint can use it, and adding yourself to the list of available values so you can test that the Web part works as expected.

These are easy to do, but it took me a little bit of time to figure it out, so I’m blogging it here.


The SharePoint Current User Filter expects the domain\userid in the parameter list of available values to be all lowercase. So when creating the SQL query for the available values, you need to convert the user ids to lowercase. You also want to add yourself to the list for testing purposes, otherwise you will get an error when looking at the Web part in SharePoint. You can do this using a UNION clause. Your query for the list of available values should look like this:

select 'mydomain\' + LOWER(myuserid) AS UserAccount,
       myUserName AS UserName
from myTable
union
select ' ALL' AS UserAccount,
       ' ALL' AS UserName
union
select 'mydomain\mydevuserid' AS UserAccount,
       'mydevusername' AS UserName
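The string handling can also be sketched outside of SQL; domain and user ids below are placeholders:

```python
# Sketch: build the lowercase domain\userid values the Current User Filter
# expects, plus an ALL row and the developer's own account for testing.
# "mydomain" and all user ids are placeholders.
users = [("JSmith", "John Smith"), ("AJones", "Amy Jones")]

available_values = [("mydomain\\" + uid.lower(), name) for uid, name in users]
available_values.append((" ALL", " ALL"))                      # the ALL option
available_values.append(("mydomain\\mydevuserid", "mydevusername"))  # yourself, for testing

print(available_values[0][0])
```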

I won’t go into the details here of how to use ALL in your report, but you can read more here http://thedataqueenblog.azurewebsites.net/2011/06/how-to-default-to-all-in-an-ssrs-multi-select-parameter/

Configure your user parameter as follows:




Publish your report to the relevant SharePoint library, then navigate to the library and test that the report and the parameter are working as expected.


Add the report to a web part as you normally would.  You would add a web page, and configure a web part to be a SQL Server Reporting Services Report Viewer.  Edit the web part and navigate to the report you created.  Open the Parameters section and click on the Load Parameters button.  You can leave the parameter default as “Use Report Default Value”.  Click Apply and OK.


Add a web part and choose the Current User Filter type from the Filters section.  It will say that it is not connected.


Go back to your Report Viewer web part and from the drop down choose Connections –> Get Report Parameters From –> Current User Filter


A dialog box will pop up where you can choose your User parameter and click on Finish.


Check in your changes and view the results.  You will be able to see your web page with the report filtered on your user name.  As a developer, if you do not have any values in this report you should see the report with no values returned, rather than getting an error.  This is because you added your userid to the list of available values at the beginning of this exercise.  You might want to remove yourself from the list once you have tested that the web part is working correctly.

Mar 16, 2012

This is one in a series of posts on using InfoPath. If you are new to SharePoint lists and InfoPath you might want to start with the Best Practices.

Best Practices for Cascading SharePoint Lists using InfoPath Forms

Use a SQL Server data source for an InfoPath pick list

Continuing with the same example I used in the Best Practices post, I’m going to make a change to an existing InfoPath form.  Navigate to the SharePoint list called Supply Chain Scorecard Metrics and add a new item. In the Section drop down, notice that I’ve included the Goal as part of the Section description (Goals are Improve, Maintain and Track). 


What I could do is have the user choose the goal, and then filter the Section list to show only those Sections which have that Goal.  Here’s how we can do that.


Navigate to your Supply Chain Scorecard Metrics list and in the List menu click on the List Settings.  Add a column called Goal_ID with type of Number.  Require that the column contains information.  When the user selects a Goal we will store the Goal ID in the Metrics list, and then use that to filter the drop down list of available Sections the user can choose from.



Navigate to your Supply Chain Scorecard Metrics list and in the List menu click on the Customize Form button to open up the InfoPath form.


Click OK and enter a password if necessary.  Click Yes when it asks if you want to update the fields from the SharePoint list.

Add a row to your InfoPath form above Section by highlighting the Section row, right click and select Insert –> Rows Above. 


From the Fields window, drag the Goal_ID field into the right hand cell in the row you just created.


Delete the label that says Goal_ID:, and type in Goal in the cell to the left of the Goal_ID textbox.


Right click on the Goal_ID text box and select Change Control –> Drop-Down List Box


Right click again and select Drop-Down List Box Properties.


Select Get choices from an external data source and then from the data source drop-down choose Supply Chain Scorecard Section.


Set Value = Goal_ID and Display name = Goal.  Check the box that says Show only entries with unique display names.  Click OK.



Right click on the Section drop down list box and select Drop-Down List Box Properties. Beside the Entries box, click on the tree icon.


Click on the Filter Data button.


Click on the Add button.


Now we will specify on what to filter the Section list.  In the first drop down select Goal ID. Leave the second drop down as “is equal to”.  In the third drop down select “Select a field or group”


Change the data connection in the drop down to Main. Navigate to dataFields and then to Goal_ID.


Click all of the OK buttons. Publish your InfoPath form.

Add a new item.  Select Improve as your Goal.  Notice that the Section drop down has only those Sections that have a Goal of Improve.


And that is how you filter a drop down from another value in your list.
