18 July 2013

WCF and the Try-Catch-Abort Pattern

Proxy classes are used to talk to Windows Communication Foundation (WCF) services.
A communication channel is opened to the WCF service, and it must be closed after the calls to the service complete.
Closing the communication channel properly is very important.
The WCF proxy is usually generated using Visual Studio or the svcutil tool.
The generated proxy inherits from the base class System.ServiceModel.ClientBase, which implements the IDisposable interface.

The service can be called in a using block that automatically calls the Dispose() method.

using (MyService client = new MyService())
{
...
} // Dispose is called here
Dispose() calls the proxy's Close() method, which sends a message from the client to the service indicating that the connection session is no longer needed.
A problem arises with this approach if the Close() method throws an exception. This is why the using approach is not recommended when calling WCF services.

If the communication channel is in a faulted state, Abort() should be called instead of Close().
The recommended approach is the Try-Catch-Abort pattern.
This is a simple pattern where a try...catch block is used to call the service and close the connection, and in the catch of an exception the connection is aborted instead.

The recommendation from Microsoft on MSDN is:
 
try
{
    ...
    client.Close();
}
catch (CommunicationException e)
{
    ...
    client.Abort();
}
catch (TimeoutException e)
{
    ...
    client.Abort();
}
catch (Exception e)
{
    ...
    client.Abort();
    throw;
}

A generic class for creating and initializing WCF proxy clients can be created to implement the pattern and replace the using block:
 
public class WCFProxy
{
    public static void Using<T>(Action<T> action)
    {
        ChannelFactory<T> factory = new ChannelFactory<T>("*");

        T client = factory.CreateChannel();

        try
        {
            action(client);
            ((IClientChannel)client).Close();
            factory.Close();
        }
        catch (Exception)
        {
            IClientChannel clientInstance = (IClientChannel)client;
            if (clientInstance.State == System.ServiceModel.CommunicationState.Faulted)
            {
                clientInstance.Abort();
                factory.Abort();
            }
            else if (clientInstance.State != System.ServiceModel.CommunicationState.Closed)
            {
                clientInstance.Close();
                factory.Close();
            }
            throw; // rethrow without losing the original stack trace
        }
    }
}
Using the class is as simple as:
 
WCFProxy.Using(delegate(IMyService client)
{
    client.DoWork();
});
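The same pattern can also cover service operations that return a value. The overload below is only a sketch of that idea; the Func-based signature and the GetTotal operation are illustrative names, not part of any generated proxy:

```csharp
// Hypothetical result-returning variant of WCFProxy.Using (same Try-Catch-Abort logic)
public static TResult Using<T, TResult>(Func<T, TResult> func)
{
    ChannelFactory<T> factory = new ChannelFactory<T>("*");
    T client = factory.CreateChannel();
    try
    {
        TResult result = func(client);
        ((IClientChannel)client).Close();
        factory.Close();
        return result;
    }
    catch
    {
        IClientChannel clientInstance = (IClientChannel)client;
        if (clientInstance.State == System.ServiceModel.CommunicationState.Faulted)
        {
            clientInstance.Abort();  // faulted channel: Abort, never Close
            factory.Abort();
        }
        else if (clientInstance.State != System.ServiceModel.CommunicationState.Closed)
        {
            clientInstance.Close();
            factory.Close();
        }
        throw;
    }
}

// Usage (GetTotal is an assumed service operation):
// int total = WCFProxy.Using((IMyService client) => client.GetTotal());
```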

21 March 2013

Fixing SSDT unresolved reference to object error

The SQL Server Data Tools (SSDT) for Visual Studio 2012 are a great set of tools for database development but, like any tool, they have some undesirable "features".
One of those "features" is the "unresolved reference to object" error.
An "unresolved reference to object" in a stored procedure is a warning, but in a function it is an error. The error fails the build and, consequently, the schema compare and update of the database.
The error is normally caused by a query that references an object in another database.

One example of the error is:
Error 190 SQL71561: Function: [dbo].[myFunction] has an unresolved reference to object [myDatabase].[dbo].[Product]
In the example, the function [myFunction] uses the table [Product] from the database [myDatabase], which isn't the database managed by the database project that contains [myFunction].

A possible solution is to add a database reference to the database that has the missing object.
The reference needs a Data-tier Application (dacpac file), which can easily be generated from the solution that contains the database project with the missing object. Right-click the database project and select Snapshot Project. The dacpac file is created in the Snapshots folder.
The file should then be copied to a common folder for re-usability.

In the project with the error, right-click References and select Add Database Reference.
The Add Database Reference dialog appears:
1) Select the dacpac file
2) Select the database location. The most common option is "Different database, same server"
3) Confirm that the Database name field is as expected
4) Clear the "Database variable" field in the dialog. If this field has a value the queries must use this variable and not the database name.



Look at the "Example usage" text and verify that it looks as expected. Click "OK" to add the reference, and that should take care of the "unresolved reference" errors.
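For reference, the cross-database query inside the function might look as follows (the Product columns are illustrative). Note the effect of step 4: with the "Database variable" field cleared, the literal database name compiles, but if the field had kept a value, for example MyDatabaseVar, the query would have to use that SQLCMD variable instead:

```sql
-- "Database variable" field cleared: the literal database name compiles
SELECT p.ProductId, p.Name
FROM [myDatabase].[dbo].[Product] AS p;

-- "Database variable" set to MyDatabaseVar: the SQLCMD variable must be used
SELECT p.ProductId, p.Name
FROM [$(MyDatabaseVar)].[dbo].[Product] AS p;
```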

The database reference resolves the schema comparison issue, but trying to build the project produced the following error:
Error 408 SQL00208: Invalid object name 'db.schema.table'.

To fix this error, go to the project properties and uncheck "Enable extended Transact-SQL verification for common objects".



15 February 2013

Dropdown list autopostback not working when validators fire

The DropDownList server control in Web Forms can automatically post back by setting its AutoPostBack property to true and handling the selected index changed event.
It works as expected, except when the form has validators with client-side validation active. The post back causes the validators to fire, and they prevent the autopostback if the form is invalid.

If there is a need to post back even when the form is invalid, the Microsoft documentation states that the CausesValidation property must be set to false.
However, the issue still persists, and the client-side onchange event must execute the code Page_BlockSubmit = false;
Page_BlockSubmit is the client-side flag that controls whether the form is submitted or not.
Here is a full example:

<asp:dropdownlist autopostback="true" 
                 causesvalidation="false"
                 id="myDropDown" 
                 onchange="Page_BlockSubmit = false;"
                 onselectedindexchanged="myDropDown_SelectedIndexChanged"
                 runat="server">  
</asp:dropdownlist>

Note that, to follow best practices, the client-side onchange event should be set in the code behind:

myDropDown.Attributes.Add("onchange", "myDropDownOnchange();");

And the function defined on the page:

function myDropDownOnchange() {
    Page_BlockSubmit = false;
}

07 February 2013

Query Optimization and the SQL Server Cache

Query optimization is a very important task to ensure that server resources are not heavily consumed by a badly performing query.
One of the important steps when optimizing a query is to clean the Sql Server cache, so that the measurements are not influenced by the caching mechanism.

The T-SQL commands that clean the Sql Server cache can cause severe performance problems in a production environment, so they should be used for testing purposes on a development or staging environment only.
They can be executed on a production server only under special and controlled conditions. An example is removing the individual plan of a query that performs badly because of a bad cached plan.

This article explains the main concepts involved and then the possible options to clean the Sql Server cache.


Execution Plan

The execution plan is the result of the query optimizer's attempt to calculate the most efficient way to process the request represented by the SQL statement.
Sql Server has to build an execution plan for each Transact-SQL (T-SQL) statement it executes.
The execution plan defines how the T-SQL statement will be executed by Sql Server to produce the desired results.

The execution plan is built based on several considerations:
1) The tables it needs to join
2) The indexes to use
3) The sub-queries it has to execute
4) How the aggregations of GROUP BY are calculated
5) The estimated cost and load the operations place on the system
6) Other, even more complex, considerations

The execution plan is also known as query plan.

As is easy to understand, Sql Server has to put a lot of work into building an execution plan, so it caches the plan in memory to avoid doing the same work over and over again.
Sql Server uses the plan cache to reuse plans. In this way it avoids the overhead of calculating the execution plan for each T-SQL statement and speeds up the execution of queries.

 
Plan Cache

The plan cache is used by Sql Server to store the execution plans of the queries it has run.
The plan cache allows Sql Server to reuse execution plans for subsequent requests. It stores the plans and their associated information, for example metrics about the number of times a query was executed and the resources it used.

The plan cache can be flushed entirely in some situations, the principal ones being:
1) The Sql Server service restarts
2) The statistics of an object change
3) A database is restored
4) T-SQL commands are executed to clean it
5) Insufficient memory on the server causes memory pressure
6) A database is detached
7) Some T-SQL commands are run, for example RECONFIGURE, ALTER DATABASE ... MODIFY FILEGROUP, or modifying a collation using the ALTER DATABASE ... COLLATE command

The plan cache entries of a single database can also be flushed in some situations, the principal ones being:
1) Some operations, for example DROP DATABASE or ALTER DATABASE ... MODIFY NAME
2) The database auto-closes
3) The database is set online or offline


Recompilations

Sql Server checks a query plan for correctness and for optimality before executing it.
If one of the checks fails, the statement is compiled again and a new query plan is produced.
These compilations are called recompilations.

The recompilations are necessary to:
1) Ensure statement correctness
2) Obtain potentially better query execution plans as data changes

Recompilations can, however, have the side effect of slowing down executions considerably.
In that case it is necessary to reduce the number of recompilations.

Parameter sniffing

Parameter sniffing is a complex topic, but according to Microsoft:

"Parameter sniffing" refers to a process whereby SQL Server's execution environment "sniffs" the current parameter values during compilation or recompilation, and passes it along to the query optimizer so that they can be used to generate potentially faster query execution plans. The word "current" refers to the parameter values present in the statement call that caused a compilation or a recompilation.

The query optimizer uses the parameters passed to the statements to perform estimates when evaluating possible execution plan options. The final effect is that the plan is optimized for those specific parameter values.
This feature produces more efficient stored procedure execution plans in most cases. There is, however, an assumption that the parameter values are "typical".
The main problem with this approach arises when a parameter value is atypical. The asymmetry is usually in the distribution of the data values: when the values are distributed non-uniformly, the data is called skewed.
One example of skewed data is a table with two million records and a column of type BIT where only 1000 records have the value 0 and all the others have the value 1.
The LIKE clause is also very prone to this issue.

Parameter sniffing affects the performance of a query since the execution plan that is generated by the query optimizer depends on parameter sniffing.

Parameter values are sniffed for:
1) Stored Procedures
2) Queries executed using sp_executesql
3) Prepared queries

The following statements help to control parameter sniffing performance problems:
1) WITH RECOMPILE - the stored procedure is recompiled on every execution
2) Dummy local variables that are set equal to a parameter, so the parameter value is not sniffed
3) OPTION (RECOMPILE) - this query hint extends the behavior to individual queries (SELECT, INSERT, UPDATE, or DELETE). In this case, both the parameter values and the current values of local variables are sniffed.
4) OPTION (OPTIMIZE FOR (...)) - the plan is optimized for a specified parameter value

The detailed explanation of these options is outside the scope of this article, but you can search for one if you wish to learn more.
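Still, a minimal sketch helps to see where each option goes. The procedure dbo.GetProducts, the table dbo.Product, and the IsDiscontinued column below are made-up names:

```sql
-- 1) WITH RECOMPILE: the whole procedure is recompiled on every execution
CREATE PROCEDURE dbo.GetProducts @flag BIT
WITH RECOMPILE
AS
    SELECT * FROM dbo.Product WHERE IsDiscontinued = @flag;
GO

-- 2) Dummy local variable: the parameter value is not sniffed
CREATE PROCEDURE dbo.GetProducts2 @flag BIT
AS
    DECLARE @localFlag BIT = @flag;
    SELECT * FROM dbo.Product WHERE IsDiscontinued = @localFlag;
GO

-- 3) OPTION (RECOMPILE): only this statement is recompiled,
--    sniffing both parameters and local variables
CREATE PROCEDURE dbo.GetProducts3 @flag BIT
AS
    SELECT * FROM dbo.Product WHERE IsDiscontinued = @flag
    OPTION (RECOMPILE);
GO

-- 4) OPTIMIZE FOR: the plan is built for a fixed, typical value
CREATE PROCEDURE dbo.GetProducts4 @flag BIT
AS
    SELECT * FROM dbo.Product WHERE IsDiscontinued = @flag
    OPTION (OPTIMIZE FOR (@flag = 1));
GO
```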


Clean the buffers

Sql Server buffers data before it is written to disk, which can leave dirty pages in memory.
To guarantee that all the dirty pages for the current database are written to disk and the buffers are clean, use the CHECKPOINT statement.
CHECKPOINT forces all dirty pages for the current database to be written to disk. After that, the DBCC DROPCLEANBUFFERS command can be issued to remove all the clean buffers from the buffer pool.

The CHECKPOINT guarantees that, if you performed a write operation, or if you are in a collaborative environment where others are also testing, you don't end up with dirty pages still in the buffers.

Use DBCC DROPCLEANBUFFERS to test queries with a cold buffer cache without shutting down and restarting the server.  It serves to empty the data cache. Any data loaded into the buffer cache due to the prior execution of a query is removed.

Example:
CHECKPOINT
GO
DBCC DROPCLEANBUFFERS
GO


Clean the Plan Cache

There are several options to clean the plan cache. The next sections explain the principal ones, but don't forget to always clean the buffers first.


Clean the entire Plan Cache

The most drastic method is to clean the entire Sql Server plan cache using the DBCC FREEPROCCACHE statement.
This frees the entire plan cache and causes the recompilation of subsequent ad-hoc SQL statements and stored procedures.

Usage:
DBCC FREEPROCCACHE


Clean a Database Plan Cache

A less drastic option is to clear the entire plan cache of a specific database instead of the full Sql Server plan cache.

Usage:
DECLARE @dbId INTEGER
SELECT @dbId = dbid FROM master.dbo.sysdatabases WHERE name = 'myDatabase'
DBCC FLUSHPROCINDB (@dbId)

Where myDatabase is the database whose plan cache will be cleared.
The @dbId parameter is the number (database ID) of the database to be affected by the FLUSHPROCINDB command.


Clean a Compiled Plan

In some situations it is useful to clean only the compiled execution plan of a specific stored procedure or ad-hoc query.
One example is when queries are executed in production for testing or to Extract, Transform, Load (ETL) data and should be removed from the plan cache so that they don't occupy cache space.

Usage:
SELECT [text], CachedPlans.size_in_bytes, CachedPlans.plan_handle, CachedPlans.objtype, CachedPlans.usecounts
FROM sys.dm_exec_cached_plans AS CachedPlans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
WHERE CachedPlans.cacheobjtype = N'Compiled Plan'

This query returns all the compiled plans.
The text column identifies the T-SQL statement that was executed, the objtype column identifies its type (Proc for a stored procedure, Adhoc for an ad-hoc query), and the plan_handle can be passed to FREEPROCCACHE to remove the plan:

DBCC FREEPROCCACHE (plan_handle)


Clean  Stored procedure or Trigger Plan cache

The sp_recompile statement causes stored procedures and triggers to be recompiled the next time they are run. It drops the existing plan from the procedure cache.

Usage:
EXEC sp_recompile N'myObject';

Where myObject can be a stored procedure, trigger, table, or view in the current database.
If it is the name of a stored procedure or trigger, that stored procedure or trigger will be recompiled the next time it is executed.
If it is the name of a table or view, all the stored procedures and triggers that reference the table or view will be recompiled the next time they are executed.
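For example, marking a table (dbo.Product here is just an illustrative name) invalidates the plans of every stored procedure and trigger that references it:

```sql
EXEC sp_recompile N'dbo.Product';
-- The cached plans that reference dbo.Product are dropped,
-- and each referencing procedure or trigger is recompiled on its next execution.
```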