Report launcher to run SSRS report subscriptions on demand

My latest tip, Report launcher to run SSRS report subscriptions on demand, has just been published.

SSRS subscriptions are a great feature, but if you would like to run a subscription on demand, outside its predefined schedule, it can be tricky and you will quickly find yourself digging into the SSRS database and agent jobs.

The tip describes how you can easily create a set of simple SSRS reports which allow very easy on-demand launching of subscriptions.

Solving issue with first long starting report on SSRS 2008

As I wrote in my previous blog post First report and report after specific time interval starts a long time on MS SQL 2008 Reporting Services, the first report after a specific period of inactivity takes a very long time to start.

The issue is caused by the way SSRS works: it regularly restarts its application domain after a specific time period. After the application domain is restarted, the first request to SSRS has to load all the settings, which takes quite a long time.

There is no real solution to the issue other than increasing the interval between application domain restarts from the default 720 minutes to a value that meets your business needs more closely.

However, even after increasing the value, once the period is reached the application domain is restarted and again the first request will take a long time. It would be ideal to tune the interval so the app domain restart happens outside business hours, but even then the first report will take a long time.

Here is a possible workaround. It relies on the scheduler and a PowerShell script which stops and starts the SSRS service (this has the same effect as an application domain restart) and, after the restart, makes a request to the Report Manager URL, which forces Reporting Services to load all its configuration. All subsequent requests to SSRS are then immediate.

So if we set the RecycleTime in rsreportserver.config to a value over one day, let’s say 1500 minutes (25 hours), and schedule the execution of the PowerShell script outside business hours, each morning we will have SSRS ready without any delays. For details about modifying the RecycleTime, take a look at my previous post mentioned above.

So here is the PowerShell script:

# Restart the Reporting Services service of the default (MSSQLSERVER) instance
Stop-Service "SQL Server Reporting Services (MSSQLSERVER)"
Start-Service "SQL Server Reporting Services (MSSQLSERVER)"
# Request the Report Manager page to force SSRS to initialize
$wc = New-Object System.Net.WebClient
$cred = [System.Net.CredentialCache]::DefaultNetworkCredentials
$wc.Credentials = $cred
$src = $wc.DownloadString("http://localhost/Reports/Pages/Folder.aspx")

The script above first stops the SQL Server Reporting Services service of the default (MSSQLSERVER) instance and immediately starts it again (stopping and starting the service has the same effect as application domain recycling). Then a WebClient object is created and used to fetch the Report Manager page, which causes Reporting Services to load all its settings. The page is read as a string (it doesn’t matter how we read the page; what is important is to make a request that initializes Reporting Services) and this request will take a longer time, just like the first report start.

It is also important to get the DefaultNetworkCredentials of the user account under which the script is executed. Those credentials must be assigned to the WebClient so it can authenticate to Reporting Services.

It is also necessary to execute the script with elevated administrative privileges in order to stop and start the service.

You can create a scheduled task using the Scheduled Tasks GUI, or execute the command below to create the scheduled task from a command prompt. The command prompt needs to run with elevated administrative privileges.

schtasks /create /tn "SSRS Recycle" /ru UserName /rl highest /np /sc daily /sd 08/01/2011 /st 02:00 /tr "powershell.exe -noprofile -executionpolicy RemoteSigned -file c:\scripts\SSRSRecycle.ps1"

This command creates a new scheduled task named “SSRS Recycle”, which will run non-interactively with elevated rights as UserName. The task will be executed daily at 02:00 am starting from 1st of August 2011 and will run the PowerShell script SSRSRecycle.ps1 located in the C:\scripts folder.

For details about schtasks, take a look at Schtasks.exe on MSDN.

As mentioned at the beginning, this is not a real solution to the problem of recycled application domains, but it provides an acceptable workaround: every day you will have Reporting Services ready and available without any delays.

First report and report after specific time interval starts a long time on MS SQL 2008 Reporting Services

Today an interesting article was published on MS SQL Tips: How to Speed up MS SQL Reporting Services on First Run.

It’s about solving a problem with the slow loading of the first report (or even accessing Report Manager) in MS SQL 2005 Reporting Services. On SSRS 2005 the problem by default occurs after every 20 minutes of idle time.

The same problem also occurs on MS SQL 2008 and MS SQL 2008 R2 Reporting Services, but at 12-hour intervals. As Reporting Services 2008 and later don’t use the IIS server and use their own integrated web service, solving the problem is quite different.

On SQL Server Reporting Services 2008 the problem occurs when an application domain is recycled, which by default happens every 12 hours (720 minutes). After this time a new application domain is created and all new requests are redirected to it. Once the application domain is recycled, the new domain has to authenticate to the database server and load all necessary structures, and this prolongs the first report startup after the recycle.

If this default behavior (recycling every 12 hours) doesn’t meet your business needs, you can modify the Report Server configuration file and specify a value closer to your needs.

The configuration is stored in the rsreportserver.config file, which is located in the installation directory of the Report Server instance. E.g. for a default instance of SQL Server 2008 R2 Reporting Services it is located in the ”C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer” directory.

The part which interests us is in the Service node: the RecycleTime parameter.
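On a default installation the relevant fragment of rsreportserver.config looks roughly like this (a sketch with the other child elements of the Service node omitted; the default value of 720 minutes is mentioned below):

```xml
<Service>
  ...
  <RecycleTime>720</RecycleTime>
  ...
</Service>
```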


By modifying the RecycleTime you can shorten or prolong the recycling interval. The interval is specified in minutes and, as mentioned above, the default value is 720 minutes (12 hours).

Details on the application domains for report server applications can be found on MSDN. Details about the RSReportServer configuration file are also available on MSDN.

As mentioned in the documentation, MaxAppDomainUnloadTime is a time interval in minutes for which Reporting Services waits for any current operation to finish while recycling the application domain. As mentioned, during recycling a new application domain is created and all new requests are directed to it, while the current domain is left running for up to MaxAppDomainUnloadTime if there are still jobs running in it. If the jobs do not finish within MaxAppDomainUnloadTime, the application domain is restarted and all its jobs are terminated.

If you have some reports, especially scheduled ones, which could take longer than the default 30 minutes to process, you can also consider altering this parameter according to your needs. On the other hand, if a job runs for such a long time, it should be analyzed why it takes so long and whether it could be optimized in some way.
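For illustration, this parameter lives in the same Service node of rsreportserver.config; a fragment with the default value of 30 minutes might look like this (a sketch, not copied from an actual configuration file):

```xml
<Service>
  ...
  <MaxAppDomainUnloadTime>30</MaxAppDomainUnloadTime>
  ...
</Service>
```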

There are many other interesting parameters in this configuration file which could be useful, especially in scale-out SSRS deployment scenarios. I will focus on some of them in a future post.

For the overall workaround of this problem using PowerShell, see my later post Solving issue with first long starting report on SSRS 2008.

SSRS 2008 & Multi-Line email body in Data-Driven Subscription delivered by email


When you create a report in SSRS 2008, then create a Data-Driven Subscription to this report and want to deliver it by email, you can run into a problem with the message in the email body.

Let’s say we have a test report and create a Data-Driven Subscription to it. The query in the subscription will return the fields SubscriberEmail and EmailBody. In SubscriberEmail we will have the email address to which the report should be delivered, and in EmailBody the message we want to send in the email (e.g. an explanation of the report, how to use it, etc.)

SELECT
    '' AS SubscriberEmail,
    N'First Article of Email Body.
Second Article of Email Body.
Third Article of Email Body.' AS EmailBody

The settings could be as below (Comment is the email body). Also notice the edit box for the Comment static value: it’s a small single-line text box.

SSRS Data Driven Subscription Step 4

SSRS Data Driven Subscription - Email Body

A problem occurs when you want the email body to contain several articles separated by new lines. All the new lines in the delivered email will be removed and all the articles will be delivered as a single article:

First Article of Email Body.  Second Article of Email Body.  Third Article of Email Body.

In contrast, when you are creating a normal subscription, the Comment text box is multi-line and you will receive a correct multi-line email.

SSRS Standard Email Subscription

There is no way to send simple text as a multi-line email with several articles.


As I mentioned above, you are not able to send simple text as a multi-line email with several articles. However, there is a workaround: if the email client accepts HTML, you can put simple HTML source code in the email body. The email body will then be correctly rendered in the email client as multi-line text.

SELECT
    '' AS SubscriberEmail,
    N'<html>
    <meta content="text/html; charset=ISO-8859-1" http-equiv="content-type">
    <p>First Article of Email Body.</p>
    <p>Second Article of Email Body.</p>
    <p>Third Article of Email Body.</p>
</html>' AS EmailBody

As I was searching the Internet I found that people have the same problem with SSRS 2005 as well. The issue is fixed in SSRS 2008 R2, where it works correctly even without the HTML.

SQL Server Reporting Services – Report schedule not deleted in SSRS 2005 and 2008

If you delete, by mistake, the job which handles a particular SSRS schedule, it can happen that you will be unable to delete the report schedule.

You delete the schedule, and when you refresh the management interface, the schedule you wanted to delete appears again.

This is caused by the [dbo].[Schedule_DeleteAgentJob] trigger on the [dbo].[Schedule] table in the report server database, which doesn’t check whether the job exists before it tries to delete it. Therefore, if you deleted the job by mistake, an exception is raised in the trigger and the whole transaction, including the deletion of the schedule, is rolled back.

You can simply modify the trigger by adding an additional check for the existence of the job. Once the modification is applied, your SSRS schedules will work even if you delete the job for a schedule manually.

ALTER TRIGGER [dbo].[Schedule_DeleteAgentJob] ON [dbo].[Schedule]
AFTER DELETE
AS
DECLARE id_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ScheduleID FROM deleted
OPEN id_cursor
DECLARE @next_id uniqueidentifier
FETCH NEXT FROM id_cursor INTO @next_id
WHILE (@@FETCH_STATUS <> -1) -- -1 == FETCH statement failed or the row was beyond the result set.
BEGIN
    IF (@@FETCH_STATUS <> -2) -- -2 == Row fetched is missing.
    BEGIN  -- Check that the job exists before trying to delete it.
        IF (EXISTS(SELECT 1 FROM msdb.dbo.sysjobs WHERE name = CONVERT(nvarchar(128), @next_id)))
            EXEC msdb.dbo.sp_delete_job @job_name = @next_id -- delete the job for the schedule
    END
    FETCH NEXT FROM id_cursor INTO @next_id
END
CLOSE id_cursor
DEALLOCATE id_cursor

Both SSRS 2005 and SSRS 2008 have the same problem.

I also mention this in my previous blog post about scale-out deployment of SSRS on the Standard edition of SQL Server.

SQL Server 2005 Reporting Services Scale-out deployment (SSRS 2005 scale-out deployment)


SQL Server 2005 Reporting Services supports scale-out deployment in the Enterprise edition. Unfortunately, scale-out deployment is not supported in the Standard edition of SQL Server 2005.

Some time ago I was tasked with building an automatic failover solution for Reporting Services across 3 SSRS instances on 3 different machines. Load balancing was set up for those machines, and it was necessary to create a solution where all 3 instances are fully synchronized, so that if one of the servers fails, users are not affected. (Users must be able to run their reports and must receive all reports to which they subscribed. Also, when users create reports, the reports must be automatically deployed across all 3 nodes.)

When you run 2 or more instances of the SSRS Standard edition against one SSRS database, which is normal in the Enterprise edition, you will receive a message that scale-out deployment is not supported.

So you can achieve redundancy by creating 3 separate instances with 3 separate databases. Unfortunately, this solution is hard to maintain when you have hundreds of different reports on the server, new ones are often created and existing ones modified. It is also hard to maintain subscriptions for the reports.


I started thinking about a solution to this situation and investigated for some time.

Finally I played with replication, which is supported by the Standard edition, and it works. I’ve configured a fully automatic failover solution using the Standard edition of SSRS, which also correctly handles user subscriptions.

Setting up SSRS scale-out deployment on standard version of SQL Server 2005

To successfully set up a scale-out deployment using the Standard edition, you need to set up the primary node first and then all the secondary nodes. The term primary node here means the one whose database will be published for the other (secondary) nodes. It is also necessary that all nodes run the same version of SQL Server Reporting Services, including the same service packs and patches. This also assumes that the report server databases run on the same instance of SQL Server (or cluster).

Setup primary SSRS node

So we install and set up the first node of the SSRS services. As we will be using more nodes, we have to back up the encryption key of the primary SSRS instance, as it will also have to be used by all the other instances.

After the primary instance is fully set up, we create a Merge publication for the primary database. We publish only tables, and all of them except the [Keys], [Event], and [Notifications] tables. We publish only the report server database, not the temporary database.

We need to exclude the Keys table because each instance has its own key stored in the table. Even if we later restore the encryption key so it is the same among all instances, each instance has its own name and creates a new entry in the Keys table. And when there is more than one entry in the table, it is automatically detected as a scale-out deployment, which is supported only by the Enterprise edition.

The Event and Notifications tables are used for monitoring of subscriptions and events in SSRS. To support correct subscription processing, we cannot replicate those tables.

Merge replication is necessary because users are directed to the nodes automatically by the load balancer, and we need to automatically update all 3 nodes whenever there is a change on any node.
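As a sketch only (the publication name ReportServer_Merge is a placeholder, the database name assumes a default installation; verify the replication procedures and parameters against your SQL Server 2005 environment), the publication with the excluded tables could be created along these lines:

```sql
-- Run at the primary node: enable the report server database for merge publishing.
EXEC sp_replicationdboption
    @dbname = N'ReportServer', @optname = N'merge publish', @value = N'true';

-- Create the merge publication.
EXEC sp_addmergepublication
    @publication = N'ReportServer_Merge',
    @description = N'Merge publication of the SSRS report server database';

-- Add every user table except Keys, Event and Notifications as an article.
DECLARE @table sysname
DECLARE tbl_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.tables
    WHERE name NOT IN (N'Keys', N'Event', N'Notifications')
OPEN tbl_cursor
FETCH NEXT FROM tbl_cursor INTO @table
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_addmergearticle
        @publication = N'ReportServer_Merge',
        @article = @table,
        @source_owner = N'dbo',
        @source_object = @table
    FETCH NEXT FROM tbl_cursor INTO @table
END
CLOSE tbl_cursor
DEALLOCATE tbl_cursor
```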

Setup secondary SSRS nodes

Once the primary node is set up, we can set up all the secondary nodes; the steps are the same for each additional node.

So we install and set up the secondary node as a normal standalone installation of SSRS. Once the node is set up and configured, we restore the encryption key of the primary SSRS instance, to be able to share encrypted data among all instances.

After the encryption key is restored, we set up replication for the database of the secondary node. Then we are ready to subscribe to the publication of the primary node database. You can run the replication agent continuously or on a schedule, depending on how quickly you need changes to be synchronized among the nodes. I also suggest running the subscription as the Server subscription type, with each node set to a different priority, so you have automatic conflict resolution when multiple users eventually change the same report at the same time.
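Assuming the same hypothetical names as above (a ReportServer_Merge publication and ReportServer databases, with SECONDARYNODE1 standing in for a secondary server), a push subscription with the Server (global) type and a per-node priority could be sketched like this; verify the parameters against your environment:

```sql
-- Run at the publisher: subscribe a secondary node with its own conflict-resolution priority.
EXEC sp_addmergesubscription
    @publication = N'ReportServer_Merge',
    @subscriber = N'SECONDARYNODE1',   -- hypothetical secondary server name
    @subscriber_db = N'ReportServer',
    @subscription_type = N'push',
    @subscriber_type = N'global',      -- "Server" subscription type
    @subscription_priority = 75.0;     -- use a different priority for each node
```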

Metadata changes on databases

First we need to update the [dbo].[Schedule_DeleteAgentJob] trigger on [dbo].[Schedule]. We make this update on the primary node database and it will be automatically replicated to the secondary nodes. It is necessary because there is a bug in the trigger (even in SSRS 2008). For each subscription in SSRS a job is created, and if you delete the job first, you are unable to delete the subscription at all, as the trigger fails and the subscription delete is rolled back. The problem is that the existence of the job is not checked in the trigger before the deletion. And we need this check, because when the table is replicated, the trigger will try to delete the job again and will fail.

ALTER TRIGGER [dbo].[Schedule_DeleteAgentJob] ON [dbo].[Schedule]
AFTER DELETE
AS
DECLARE id_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ScheduleID FROM deleted
OPEN id_cursor
DECLARE @next_id uniqueidentifier
FETCH NEXT FROM id_cursor INTO @next_id
WHILE (@@FETCH_STATUS <> -1) -- -1 == FETCH statement failed or the row was beyond the result set.
BEGIN
    IF (@@FETCH_STATUS <> -2) -- -2 == Row fetched is missing.
    BEGIN  -- Check that the job exists before trying to delete it.
        IF (EXISTS(SELECT 1 FROM msdb.dbo.sysjobs WHERE name = CONVERT(nvarchar(128), @next_id)))
            EXEC msdb.dbo.sp_delete_job @job_name = @next_id -- delete the job for the schedule
    END
    FETCH NEXT FROM id_cursor INTO @next_id
END
CLOSE id_cursor
DEALLOCATE id_cursor

After updating the trigger, we need some metadata changes on all secondary databases. We need to drop the [dbo].[Event] and [dbo].[Notifications] tables and create synonyms (aliases) pointing to the primary server database instead. This is necessary for correct processing of subscriptions and other events. If we do not make this change, it will often happen that an event is processed by more than one node; you will receive exceptions, receive subscribed reports more than once, or the processing will never end. This change ensures that only one node processes each event, because when a node takes an event it puts a lock on the tables and no other node can process it.

-- Drop Event Table 
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Event]') AND type in (N'U')) 
    DROP TABLE [dbo].[Event] 

-- Create Alias for the Event table in the Publisher database 
CREATE SYNONYM [dbo].[Event] 
FOR [PrimaryReportServerDB].[dbo].[Event] 

-- Drop the Notifications table 
IF EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Notifications_Subscriptions]') AND parent_object_id = OBJECT_ID(N'[dbo].[Notifications]')) 
    ALTER TABLE [dbo].[Notifications] 
        DROP CONSTRAINT [FK_Notifications_Subscriptions] 

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Notifications]') AND type in (N'U'))
    DROP TABLE [dbo].[Notifications] 

-- Create Alias for the Notifications table in the Publisher database 
CREATE SYNONYM [dbo].[Notifications] 
FOR [PrimaryReportServerDB].[dbo].[Notifications] 

After these modifications you have fully working load balancing and failover of SSRS under SQL Server 2005 Standard.

Encryption Keys and Not Supported Scale out deployment feature error message

It is essential to have the same key on all nodes, to be able to read the encrypted data on any node. As mentioned, it is also essential not to replicate the Keys table: each node adds its key, under its instance and machine name, into the table, and this is used for the scale-out deployment of the Enterprise edition of SQL Server. Unfortunately, when the Standard edition finds more than one key in the table, it shows error messages about an unsupported scale-out deployment.

If this eventually happens on some or all nodes, it is necessary to delete all records from the table except the record with value -1 in the Client column. Then it is necessary to restore the encryption key again, to be able to access the encrypted data.
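As a sketch (run against the affected report server database, and back it up first; the table name assumes a default installation), the cleanup could look like:

```sql
-- Remove all instance keys except the master record (Client = -1),
-- then restore the encryption key so encrypted content is readable again.
DELETE FROM [dbo].[Keys]
WHERE Client <> -1;
```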