SQL Server 2016 Mobile Reports – Issue with Time Navigator for dynamic query parameters

SQL Server 2016 brings a new kind of report – Mobile Reports, which are the Datazen product integrated into SQL Server 2016.

It brings many new possibilities, but also some pitfalls a user must be aware of when developing reports.

Time Navigator Parameters Problem

One of those pitfalls is the Time Navigator, if you want to use it for passing dynamic query parameters. It works correctly if you have your local time zone set to UTC (but hardly anybody uses that time zone). The problem appears when you start to use it with different local time zones and need to pass the selected date ranges to a Shared Dataset for dynamic query processing.

The problem is that the Time Navigator provides the SelectedStartTime, SelectedEndTime, ViewportStartTime and ViewportEndTime as UTC date/time strings, in the yyyy-MM-ddThh:mm:ss.fffZ format, e.g. 2016-01-01T00:00:00.000Z. This by itself would not be a problem; the problem is that the report client automatically shifts the selected range by the current user's time zone offset, including the daylight saving time in effect on the selected date.

If your current time zone is UTC, everything works properly. For example, if you select the date 2016-01-01 in the Time Navigator, SelectedStartTime will provide 2016-01-01T00:00:00.000Z and SelectedEndTime will provide 2016-01-02T00:00:00.000Z. This is OK, as you process that information on the server as the interval <2016-01-01T00:00:00.000Z; 2016-01-02T00:00:00.000Z).

However, if you are in a different time zone, as mentioned above, the provided stamps are shifted by the current time zone offset, including daylight saving. This means that if you were in UTC+1, SelectedStartTime would provide 2015-12-31T23:00:00Z and SelectedEndTime would provide 2016-01-01T23:00:00Z.

In the case of UTC-6, SelectedStartTime will provide 2016-01-01T06:00:00Z and SelectedEndTime will provide 2016-01-02T06:00:00Z.
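Under the hood, the client effectively converts the selected local midnight to UTC before sending it. The shift can be reproduced directly in SQL Server 2016 with the AT TIME ZONE operator. This is only an illustrative sketch; the Windows time zone name is an example:

```sql
-- What the client effectively sends when a user in UTC+1
-- selects the date 2016-01-01: the selected local midnight
-- is converted to UTC.
DECLARE @selected datetime2 = '2016-01-01T00:00:00';

SELECT CAST(@selected
            AT TIME ZONE 'Central Europe Standard Time' -- interpret as local time
            AT TIME ZONE 'UTC'                          -- convert to UTC
       AS datetime2) AS SentToServer;
-- Returns 2015-12-31 23:00:00
```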

As you can see, if you are developing a report for users within a single time zone, this is not a big problem, as you can deal with it. However, if you are developing reports for enterprise users and the requests arrive from different time zones, you will always receive different time stamps. What is more, the time stamp carries no information about the originating time zone, as the parameter is sent as a plain string. The biggest issue is that, because of this time shift handled on the user side, you may receive different start and end dates from what the user selected.

Problem Demo

Let’s create an easy demonstration of the problem. Create a new Mobile Report, place a Time Navigator on the surface, and in the Time intervals include Years, Months, and Days.


Then we need to create a testing Data Set on the server. Let’s create a very simple Data Set which simply returns the passed Start Date and End Date back to the client, including information about the parameter.

DECLARE @type sql_variant = @StartDate;

SELECT
     @StartDate AS StartDate
    ,@EndDate AS EndDate
    ,@Unit AS TimeUnit
    ,SQL_VARIANT_PROPERTY(@type, 'BaseType') AS StartDateDataType

We are adding the @type sql_variant to see what kind of data type arrives at SQL Server. As we are creating a parameterized Data Set which will be used by a Mobile Report, you have to set default values for the parameters.


Once we have the Data Set ready, add it to the Mobile Report and set the Parameters Binding.



After that we can add a Data Grid to the design surface which will show us the passed parameters.


And bind it to our parameterized Data Set.


Testing the Parameters

Once we have the testing report ready, we can start playing with it to demonstrate the problem.


First let’s check how it looks when the report is being run in the (UTC) Coordinated Universal Time.

Head to the Settings and set the UTC Time Zone. Also note that UTC does not allow adjusting for daylight saving time.


And here is the result:


We can clearly see that for the whole year 2015 period, the Start Date was passed as 2015-01-01T00:00:00.000Z and the End Date was passed as 2016-01-01T00:00:00.000Z. These are the correct and expected values. We can also see that we received the parameter as nvarchar, which means a plain string.


Now head again to the settings and switch the Time Zone to one of the UTC+1 zones. Ensure that you select one with automatic daylight saving time adjustment.


And let’s run some tests.


Here we can see that for the whole year 2015, both the Start Date and End Date were shifted by the zone offset. So we received the Start Date as 2014-12-31T23:00:00.000Z and the End Date as 2015-12-31T23:00:00.000Z.

Now select a month in which daylight saving is in effect – for example, June.


Here we can see that the dates were shifted by the time zone offset as well as by the daylight saving amount, which means shifted by 2 hours. The result is that the Start Date was passed as 2015-05-31T22:00:00.000Z and the End Date was passed as 2015-06-30T22:00:00.000Z.

Let’s take a look at a different month, one during which the daylight saving change occurs – for example, March.


Here we can clearly see that the Start Date is shifted by one hour, as daylight saving was not in effect, while the End Date is shifted by 2 hours, as daylight saving was on during that time. So the results are Start Date 2015-02-28T23:00:00.000Z and End Date 2015-03-31T22:00:00.000Z.


We can see the same behavior on the day on which daylight saving went into effect. In the year 2015 it was March 29. In that case we receive Start Date 2015-03-28T23:00:00.000Z and End Date 2015-03-29T22:00:00.000Z.


Again head to the settings and adjust the time zone to UTC-6, e.g. Central Time.


And repeat the tests.


Here we can see that the dates were again shifted, this time the opposite way compared to the “plus” time zones. So for the whole year 2015 we have the Start Date as 2015-01-01T06:00:00.000Z and the End Date as 2016-01-01T06:00:00.000Z.

As mentioned, we would see the same behavior for all the other tests.

Dealing with the issue

As mentioned at the beginning of this post, if you develop a report which will be used only by users in one time zone, and you know those users are not travelling to different time zones when consuming the report, the problem is not so big, as you can adjust your parameter processing for your time zone.

However, if you develop an enterprise solution where the reports are used by users around the globe, from different time zones, you have to count on this. As is clearly visible, you can receive different dates for different time zones.

As we saw from the tests, for the whole year 2015 we receive:

UTC: Start Date 2015-01-01T00:00:00.000Z and End Date 2016-01-01T00:00:00.000Z
UTC+1: Start Date 2014-12-31T23:00:00.000Z and End Date 2015-12-31T23:00:00.000Z
UTC-6: Start Date 2015-01-01T06:00:00.000Z and End Date 2016-01-01T06:00:00.000Z

From that, if you focus on whole dates, you can for example shift the date by +1 day when the time part of the received Start/End date string is greater than or equal to 12:00:00, and leave the parameter untouched when the time part is below 12:00:00.
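One way to implement this heuristic is directly in the Shared Dataset query on the server. This is only a sketch of the idea, assuming all report users sit within roughly UTC-11 to UTC+12:

```sql
-- @StartDate arrives as a plain string, e.g. N'2014-12-31T23:00:00.000Z'.
DECLARE @StartDate nvarchar(30) = N'2014-12-31T23:00:00.000Z';

-- Strip the trailing 'Z' and parse the ISO 8601 value.
DECLARE @parsed datetime2 = CONVERT(datetime2, LEFT(@StartDate, 23), 126);

-- If the time part is 12:00:00 or later, the client shifted the selected
-- date backwards (a "plus" time zone), so move the date forward one day;
-- otherwise keep the date as-is.
DECLARE @NormalizedStart date =
    CASE WHEN CAST(@parsed AS time) >= '12:00:00'
         THEN DATEADD(DAY, 1, CAST(@parsed AS date))
         ELSE CAST(@parsed AS date)
    END;

SELECT @NormalizedStart AS NormalizedStartDate;  -- 2015-01-01
```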

No 100% bulletproof solution

This will work for the most common scenarios, but it is not 100% bulletproof if you have report users in time zones beyond +12. If you have users in the +13 or +14 time zone and at the same time users in the -11 time zone, then there is no solution for such a situation once you go down to the day level.

Let’s take a look at a situation where a user from the UTC+14 (Kiritimati Island) time zone selects the date 2015-01-05:


The Start Date is 2015-01-04T10:00:00.000Z and the End Date is 2015-01-05T10:00:00.000Z.

And now another user in UTC-10 (Hawaii) selects the date 2015-01-04:


Here we again receive Start Date 2015-01-04T10:00:00.000Z and End Date 2015-01-05T10:00:00.000Z.

Because no information about the originating time zone of the user running the report is passed to the data set on the report server or to the underlying database, we are not able to distinguish between those two different cases.

I can imagine a scenario in which this behavior would be useful, but in most cases it is not wanted functionality. There is no setting in the report or on the report server to enable/disable this behavior, so the report developer has to be very careful about it.


Report launcher to run SSRS report subscriptions on demand

On www.mssqltips.com you can find my latest tip Report launcher to run SSRS report subscriptions on demand.

SSRS Subscriptions are a great feature, but if you would like to run a subscription on demand, outside the predefined schedule, it can be tricky and you will end up digging into the SSRS database and agent jobs.

The tip describes how you can easily create a set of simple SSRS reports which allow very easy on-demand launching of subscriptions.

Solving issue with first long starting report on SSRS 2008

As I wrote in my previous blog post First report and report after specific time interval starts a long time on MS SQL 2008 Reporting Services, the first report after a specific idle time takes very long to start.

The issue is caused by the way SSRS works: SSRS regularly restarts its application domain after a specific time period. After the application domain is restarted, upon the first request SSRS needs to load all its settings, and this takes quite a long time.

There is no real solution to the issue except increasing the interval between application domain restarts from the default 720 minutes to a value which better meets your business needs.

However, even after increasing the value, once the period is reached the application domain is restarted and again the first request will take a long time. It would be ideal to tune the interval so the app domain restart happens outside business hours; however, even then the first report will take a long time.

Here is a possible workaround. It rests on the scheduler and the execution of a PowerShell script, which stops and starts the SSRS service (which has the same effect as the application domain restart), and after the restart makes a request to the Report Manager URL, which forces Reporting Services to load all its configuration. Then all subsequent requests to SSRS are immediate.

So if we set the RecycleTime in rsreportserver.config to a value which is over one day, let's say 1500 minutes (that is 25 hours), and schedule the execution of the PowerShell script outside business hours, each morning we will have SSRS ready without any delays. For details about modifying the RecycleTime, take a look at my previous post mentioned above.

So here is the PowerShell script:

Stop-Service "SQL Server Reporting Services (MSSQLSERVER)"
Start-Service "SQL Server Reporting Services (MSSQLSERVER)"
$wc = New-Object System.Net.WebClient
$cred = [System.Net.CredentialCache]::DefaultNetworkCredentials
$wc.Credentials = $cred
$src = $wc.DownloadString("http://localhost/Reports/Pages/Folder.aspx")

The script above first stops the SQL Server Reporting Services service of the default (MSSQLSERVER) instance and immediately starts it again (stopping and starting the service has the same effect as application domain recycling). Then a WebClient object is created, which is used to fetch the Report Manager page, which causes Reporting Services to load all its settings. The page is read as a string (it doesn't matter how we read the page; what matters is making a request that initializes Reporting Services), and this request will take a longer time (like the first report start).

It is also important to get the DefaultNetworkCredentials of the user account under which the script will be executed. It is necessary to assign those credentials to the web client so it can authenticate to the reporting services.

Also it is important to mention that it is necessary to execute the script with elevated administrative privileges to be able to stop and start the service.

You can create a scheduled task using the Scheduled Tasks GUI, or execute the command below to create the scheduled task from a command prompt. The command prompt needs to be running with elevated administrative privileges.

schtasks /create /tn "SSRS Recycle" /ru UserName /rl highest /np /sc daily /sd 08/01/2011 /st 02:00 /tr "powershell.exe -noprofile -executionpolicy RemoteSigned -file c:\scripts\SSRSRecycle.ps1"

This command creates a new scheduled task named “SSRS Recycle”, which will run non-interactively with elevated rights as UserName. The task will be executed daily at 02:00 am, starting from the 1st of August 2011, and will execute the PowerShell script SSRSRecycle.ps1 located in the folder C:\scripts.

For details about schtasks you can take a look at Schtasks.exe on MSDN.

As mentioned at the beginning, this is not a real solution to the problem with recycled application domains; however, it provides an acceptable workaround, and you will have Reporting Services ready and available every day without any delays.

First report and report after specific time interval starts a long time on MS SQL 2008 Reporting Services

Today an interesting article was published on MS SQL Tips: How to Speed up MS SQL Reporting Services on First Run.

It’s about solving a problem with the slow loading of the first report (or even accessing the Report Manager) in MS SQL 2005 Reporting Services. On SSRS 2005 the problem by default occurs after every 20 minutes of idle time.

The same problem also occurs on MS SQL 2008 and MS SQL 2008 R2 Reporting Services, but in 12-hour intervals. As Reporting Services 2008+ doesn't use the IIS server and uses its own integrated web service, solving the problem is quite different.

On SQL Server Reporting Services 2008 the problem occurs when an Application Domain is recycled, and by default that happens every 12 hours (720 minutes). After this time a new application domain is created and all new requests are redirected to it. Once the application domain is recycled, the new domain has to authenticate to the database server and load all the necessary structures, and this prolongs the first report startup after the domain is recycled.

If this default behavior (recycling every 12 hours) doesn't meet your business needs, you can modify the Report Server configuration file and specify a value which is closer to your needs.

The configuration is stored in the rsreportserver.config file, which is located in the installation directory of the Report Server instance. E.g. for the default instance of SQL Server 2008 R2 Reporting Services it is located in the "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer" directory.

The part which interests us is in the Service node, and it's the RecycleTime parameter:
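The relevant fragment of rsreportserver.config looks roughly like this (surrounding elements omitted for brevity; 720 and 30 are the defaults mentioned below):

```xml
<Service>
    <!-- ... other settings ... -->
    <RecycleTime>720</RecycleTime>
    <MaxAppDomainUnloadTime>30</MaxAppDomainUnloadTime>
    <!-- ... other settings ... -->
</Service>
```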


By modifying the RecycleTime, you can shorten or prolong the recycling interval. The interval is specified in minutes and as mentioned above the default value is 720 minutes (12 hours).

Details on the Application Domains for Report Server Applications can be found on MSDN. Details about the RSReportServer configuration file are also available on MSDN.

As mentioned in the documentation, MaxAppDomainUnloadTime is the time interval in minutes for which Reporting Services waits for any current operation to finish while recycling the Application Domain. As mentioned, during recycling a new Application Domain is created and all new requests are directed to it, while the current one is left running for up to MaxAppDomainUnloadTime if there are jobs currently running. If the jobs do not finish within MaxAppDomainUnloadTime, the Application Domain is restarted and all jobs are terminated.

If you have some reports, especially scheduled ones, which could take longer than the default 30 minutes to process, you can also consider altering this parameter according to your needs. On the other hand, if a job runs for such a long time, it should be analyzed why it takes so long and whether it could be optimized in some way.

There are a lot of other interesting parameters in this configuration file, which could be useful especially in scale-out SSRS deployment scenarios. I will focus on some of them in a future post.

For the overall workaround of this problem using PowerShell, see my later post Solving issue with first long starting report on SSRS 2008.

SSRS 2008 & Multi-Line email body in Data-Driven Subscription delivered by email


When you create a report in SSRS 2008 and then want to create a Data Driven Subscription to this report delivered by email, you can run into problems with the message in the email body.

Let’s say we have a test report and create a Data Driven Subscription to it. The query in the subscription will return the fields SubscriberEmail and EmailBody. In SubscriberEmail we will have the email address to which the report should be delivered, and in EmailBody we will have the message we want to send in the email (e.g. an explanation of the report, how to use it, etc.).

SELECT
    'somebody@domain.com' AS SubscriberEmail,
    N'First Article of Email Body.
Second Article of Email Body.
Third Article of Email Body.' AS EmailBody

The settings could look like below (the Comment is the email body). Also notice the edit box for the Comment static value. It's a small single-line text box.

SSRS Data Driven Subscription Step 4

SSRS Data Driven Subscription - Email Body

A problem occurs when you want the email body to contain several articles separated by new lines. All the new lines in the delivered email will be removed and all the articles will be delivered as a single article.

First Article of Email Body.  Second Article of Email Body.  Third Article of Email Body.

In contrast, when you are creating a normal subscription, the Comment text box is multi-line and you will receive a correct multi-line email.

SSRS Standard Email Subscription

There is no way to send simple text as a multi-line email with several articles.


As I mentioned above, you are not able to send simple text as a multi-line email with several articles. However, there is a workaround: if the email client accepts HTML, you can put simple HTML source code in the email body. The email body will then be correctly rendered in the email client as multi-line text.

SELECT
    'somebody@domain.com' AS SubscriberEmail,
    N'<html>
    <meta content="text/html; charset=ISO-8859-1" http-equiv="content-type">
    <p>First Article of Email Body.</p>
    <p>Second Article of Email Body.</p>
    <p>Third Article of Email Body.</p>
</html>' AS EmailBody

As I was searching the Internet, I found that people have the same problem with SSRS 2005. The issue is solved in SSRS 2008 R2, where it works correctly even without the HTML.

SQL Server Reporting Services – Report schedule not deleted in SSRS 2005 and 2008

If you delete, by mistake, a job which handles a particular SSRS schedule, it can happen that you will be unable to delete the report schedule.

You delete the schedule, and when you refresh the management console, the schedule you wanted to delete appears again.

This is caused by the [dbo].[Schedule_DeleteAgentJob] trigger on the [dbo].[Schedule] table in the report server database, which doesn't check whether the job exists before it tries to delete it. Therefore, if you deleted the job by mistake, an exception is raised in the trigger and the whole transaction deleting the schedule is rolled back.

You can simply modify the trigger by adding an additional check for the existence of the job. Once the modification is applied, your SSRS schedules will work even if you delete the job for the schedule manually.

ALTER TRIGGER [dbo].[Schedule_DeleteAgentJob] ON [dbo].[Schedule]
FOR DELETE
AS
DECLARE id_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ScheduleID FROM deleted
OPEN id_cursor
DECLARE @next_id uniqueidentifier
FETCH NEXT FROM id_cursor INTO @next_id
WHILE (@@FETCH_STATUS <> -1) -- -1 == FETCH statement failed or the row was beyond the result set.
BEGIN
    IF (@@FETCH_STATUS <> -2) -- -2 == Row fetched is missing.
    BEGIN  -- Check that the job exists before trying to delete it.
        IF (EXISTS(SELECT 1 FROM msdb.dbo.sysjobs WHERE name = CONVERT(nvarchar(128), @next_id)))
            exec msdb.dbo.sp_delete_job @job_name = @next_id -- delete the job for the schedule
    END
    FETCH NEXT FROM id_cursor INTO @next_id
END
CLOSE id_cursor
DEALLOCATE id_cursor

Both SSRS 2005 and SSRS 2008 have the same problem.

I’m mentioning this also in my previous blog item about scale-out deployment of SSRS on Standard edition of SQL Server.

SQL Server 2005 Reporting Services Scale-out deployment (SSRS 2005 scale-out deployment)


SQL Server 2005 Reporting Services supports scale-out deployment in the Enterprise edition. Unfortunately, scale-out deployment is not supported in the Standard edition of SQL Server 2005.

Some time ago I received a task to build an automatic failover solution for reporting services across 3 SQL Server Reporting Services instances on 3 different machines. Load balancing was set up for those machines, and it was necessary to create a solution in which all 3 instances are fully synchronized, so that if one of the servers fails, users are not affected. (Users must be able to run their reports and must receive all reports to which they subscribed. Also, when users create reports, they must be automatically deployed across all 3 nodes.)

When you run 2 or more instances of the SSRS Standard edition against one SSRS database, which is normal in the Enterprise edition, you will receive a message that scale-out deployment is not supported.

So you can achieve redundancy by creating 3 separate instances with 3 separate databases. Unfortunately, this solution is hard to maintain in a situation where you have hundreds of different reports on the server, new ones are often created, and existing ones are modified. It is also hard to maintain subscriptions for the reports.


I started to think about a solution for this situation and investigated for some time.

Finally, I played with replication, which is supported by the Standard edition – and it works. I configured a fully automatic failover solution using the Standard edition of SSRS, which also correctly handles user subscriptions.

Setting up SSRS scale-out deployment on standard version of SQL Server 2005

To successfully set up scale-out deployment using the Standard edition, you need to set up the primary node first and then all the secondary nodes. The term primary node here means the one whose database will be published to the other (secondary) nodes. It is also necessary that all nodes run the same version of SQL Server Reporting Services, including the same service packs and patches. This also assumes that the report server databases will be running on the same instance of SQL Server (or cluster).

Setup primary SSRS node

So we install and set up the first node of the SSRS services. As we will be using more nodes, we have to back up the encryption key of the primary SSRS instance, as it will have to be used by all the other instances as well.

After the primary instance is fully set up, we create a Merge publication for the primary database. We publish only tables – all tables except the [Keys], [Event], and [Notifications] tables. We publish only the report server database, not the temporary database.

We need to exclude the Keys table because each instance has its own key stored in the table. Even though we later restore the encryption key and it will be the same among all instances, each instance has its own name and creates a new entry in the Keys table. When there is more than one entry in the table, it is automatically detected as scale-out deployment, which is supported only by the Enterprise edition.

The Event and Notifications tables are used for monitoring subscriptions and events in SSRS. To support correct subscription processing, we cannot replicate those tables.

Merge replication is necessary because users are automatically directed to the nodes by the load balancer, and we need to automatically update all 3 nodes whenever there is a change on any node.
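Creating the publication can also be scripted with the merge replication stored procedures. The following is only a rough sketch – the publication name is illustrative, and in practice you would add one sp_addmergearticle call per replicated table, skipping [Keys], [Event], and [Notifications]:

```sql
USE [ReportServer];

-- Enable the report server database for merge publishing.
EXEC sp_replicationdboption
     @dbname  = N'ReportServer'
    ,@optname = N'merge publish'
    ,@value   = N'true';

-- Create the merge publication.
EXEC sp_addmergepublication
     @publication = N'ReportServerPub'
    ,@description = N'Merge publication of the SSRS catalog';

-- Add one article per table, e.g. the Catalog table holding the reports.
EXEC sp_addmergearticle
     @publication   = N'ReportServerPub'
    ,@article       = N'Catalog'
    ,@source_owner  = N'dbo'
    ,@source_object = N'Catalog';
```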

Setup secondary SSRS nodes

Once the primary node is set up, we can set up all the secondary nodes; the steps are the same for each additional node.

So we install and set up the secondary node as a normal standalone installation of SSRS. Once the node is set up and configured, we restore the encryption key of the primary SSRS instance, to be able to share encrypted data among all instances.

After the encryption key is restored, we set up replication for the database of the secondary node. Then we are ready to subscribe to the publication of the primary node database. You can run the replication agent continuously or on a schedule, depending on how quickly you need the changes to be synchronized among nodes. I also suggest running the subscription as the Server subscription type, with each node set to a different priority, so you have automatic conflict resolution when multiple users change the same report at the same time.
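The subscription for each secondary node can again be scripted. A sketch only – server and database names are placeholders, and the priority value is an example (give each node a different one):

```sql
-- Run on the publisher: register a global ("Server") merge subscription
-- for a secondary node with its own conflict-resolution priority.
EXEC sp_addmergesubscription
     @publication           = N'ReportServerPub'
    ,@subscriber            = N'SecondaryNode1'
    ,@subscriber_db         = N'ReportServer'
    ,@subscription_type     = N'push'
    ,@subscriber_type       = N'global'
    ,@subscription_priority = 75;
```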

Metadata changes on databases

First we need to update the [dbo].[Schedule_DeleteAgentJob] trigger on the [dbo].[Schedule] table. We make this update on the primary node database, and it will be automatically replicated to the secondary nodes. It is necessary because there is a bug in the trigger (even in SSRS 2008). For each subscription in SSRS, a job is created. If you delete the job first, you are unable to delete the subscription at all, as the trigger fails and the subscription delete is rolled back. The problem is that the existence of the job is not checked in the trigger before deletion. We need this check because, when the table is replicated, the trigger would try to delete the job again and would fail.

ALTER TRIGGER [dbo].[Schedule_DeleteAgentJob] ON [dbo].[Schedule]
FOR DELETE
AS
DECLARE id_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ScheduleID FROM deleted
OPEN id_cursor
DECLARE @next_id uniqueidentifier
FETCH NEXT FROM id_cursor INTO @next_id
WHILE (@@FETCH_STATUS <> -1) -- -1 == FETCH statement failed or the row was beyond the result set.
BEGIN
    IF (@@FETCH_STATUS <> -2) -- -2 == Row fetched is missing.
    BEGIN  -- Check that the job exists before trying to delete it.
        IF (EXISTS(SELECT 1 FROM msdb.dbo.sysjobs WHERE name = CONVERT(nvarchar(128), @next_id)))
            exec msdb.dbo.sp_delete_job @job_name = @next_id -- delete the job for the schedule
    END
    FETCH NEXT FROM id_cursor INTO @next_id
END
CLOSE id_cursor
DEALLOCATE id_cursor

After updating the trigger, we need some metadata changes on all secondary databases. We need to drop the [dbo].[Event] and [dbo].[Notifications] tables and create synonyms (aliases) pointing to the primary server database instead. This is necessary for correct processing of subscriptions and other events. If we do not make this change, it will often happen that an event is processed by more than one node: you will receive exceptions, receive subscribed reports more than once, or the processing will never end. This change ensures that only one node processes each event, because when a node takes an event it puts a lock on the tables and no other node can process it.

-- Drop Event Table 
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Event]') AND type in (N'U')) 
    DROP TABLE [dbo].[Event] 

-- Create Alias for the Event table in the Publisher database 
CREATE SYNONYM [dbo].[Event] 
FOR [PrimaryReportServerDB].[dbo].[Event] 

-- Drop the Notifications table 
IF EXISTS (SELECT * FROM sys.foreign_keys WHERE object_id = OBJECT_ID(N'[dbo].[FK_Notifications_Subscriptions]') AND parent_object_id = OBJECT_ID(N'[dbo].[Notifications]')) 
    ALTER TABLE [dbo].[Notifications] 
        DROP CONSTRAINT [FK_Notifications_Subscriptions] 

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Notifications]') AND type in (N'U'))
    DROP TABLE [dbo].[Notifications] 

-- Create Alias for the Notifications table in the Publisher database 
CREATE SYNONYM [dbo].[Notifications] 
FOR [PrimaryReportServerDB].[dbo].[Notifications] 

After these modifications you have fully working load balancing and failover of SSRS under SQL Server 2005 Standard.

Encryption Keys and the Unsupported Scale-out Deployment Error Message

It is essential to have the same key on all nodes, to be able to read the encrypted data on all of them. As mentioned, it is also essential not to replicate the Keys table: each node adds its key under its instance and machine name into the table, and this mechanism is used for scale-out deployment in the Enterprise edition of SQL Server. Unfortunately, when the Standard edition finds more than one key in the table, it shows an error message about unsupported scale-out deployment.

If this eventually happens on some or all nodes, it is necessary to delete all records from the Keys table except the record with the value -1 in the Client column. Then it is necessary to restore the encryption key again to be able to access the encrypted data.
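The cleanup itself is a single statement against the affected report server database (back up the database first; this is a sketch of the step described above, followed by restoring the backed-up encryption key through the Reporting Services Configuration tool):

```sql
-- Keep only the record with Client = -1 and remove the per-instance
-- records which trigger the scale-out deployment detection.
DELETE FROM [dbo].[Keys]
WHERE Client <> -1;
```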