SQL: why is the query slower when a datetime parameter is used?

Question

Wednesday, May 1, 2013 5:16 PM

I have two sets of SQL queries that I ran against a database.

SET 1

select *
from transactions trn (nolock)
where trn.transaction_date between '2012-02-04' and '2012-02-05'
and trn.type_code='X'
and trn.transtype = 'Y'

SET 2

declare @startDate datetime, @endDate datetime

select @startDate='2012-02-04'
select @endDate='2012-02-05'

select *
from transactions trn (nolock)
where trn.transaction_date between @startDate and @endDate
and trn.type_code='X'
and trn.transtype = 'Y'

Results

For the 1st set, the query took close to 6 seconds to return 4,531 records.
For the 2nd set, the query took 8 minutes and 9 seconds to return the same 4,531 records.

The only difference I am seeing is that the start/end dates are passed in through datetime variables.

Can anyone say why query set 1 is faster than query set 2? Or am I missing something?

All replies (8)

Friday, May 3, 2013 8:42 PM ✅Answered | 1 vote

You have to compare the execution plans.

In the following example, the query with the hard-wired literal predicate is slower:

-- Relative cost 72%
SELECT * FROM Sales.SalesOrderHeader WHERE OrderDate = '2008-04-15 00:00:00.000'
GO
-- Relative cost 28%
DECLARE @dt datetime = '2008-04-15 00:00:00.000';
SELECT * FROM Sales.SalesOrderHeader WHERE OrderDate = @dt;
GO

Optimization article:

http://www.sqlusa.com/articles/query-optimization/

Kalman Toth Database & OLAP Architect sqlusa.com
New Book / Kindle: Pass SQL Exam 70-461 & Job Interview: Programming SQL Server 2012


Saturday, May 4, 2013 2:13 AM ✅Answered | 1 vote

In the following example, the query with the hard-wired literal predicate is slower:

I get the exact same plan and performance with both queries.  The estimate of the query with the literal is spot on (76 estimate and actual) whereas the estimate with the variable is 28 rows versus 76 actual.  The relative estimated cost is proportionally understated.

Literals should typically generate more accurate estimates because the statistics histogram can be used at compile time to estimate row counts for the actual value specified. In contrast, when a variable is specified, the actual value is unknown at compile time, so the overall average density value is used instead, which may estimate many more or many fewer rows than the specific value would. The downside of literals is compilation cost on each invocation of a non-trivial query, since each distinct literal value can trigger a fresh compile.
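One way to see both estimates for yourself is to inspect the statistics object on the date column. This is only a sketch; the statistics name here is an assumption, so substitute whatever index or statistics object actually covers OrderDate in your database:

```sql
-- Show the statistics the optimizer reads (hypothetical statistics name):
DBCC SHOW_STATISTICS ('Sales.SalesOrderHeader', IX_SalesOrderHeader_OrderDate);
-- With a literal, the optimizer looks up '2008-04-15' in the histogram steps.
-- With a variable, it cannot, so for an equality predicate it estimates roughly:
--   estimated rows = "All density" (from the density vector) * table row count
```

Comparing that density-based figure with the histogram step for the actual value shows why the variable query's estimate (28 rows) can differ from the actual count (76).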

Dan Guzman, SQL Server MVP, http://www.dbdelta.com


Saturday, May 4, 2013 2:29 AM ✅Answered | 1 vote

The only difference i am seeing is the start\end date is passed in through a datetime type variable.

Can anyone say why the query set 1 is faster than the query set 2 ? or am i missing something?

The title of this thread says parameter, but a local variable is not a parameter. Parameters are declared in a stored procedure, function, sp_executesql, or a client application; local variables are declared in a T-SQL batch with a DECLARE statement.

The important point is that the actual value is known at compile time when a literal value or parameter is used but not when a local variable is specified.  SQL Server can generate a better plan when it knows the actual values because the estimates are often more accurate.
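For comparison, the original query could be run with true parameters via sp_executesql, so the optimizer can sniff the values at compile time. This is only a sketch using the table and columns from the question:

```sql
-- @startDate/@endDate here are real parameters, not local variables,
-- so their values are known when the plan is compiled:
EXEC sp_executesql
    N'SELECT *
      FROM transactions trn WITH (NOLOCK)
      WHERE trn.transaction_date BETWEEN @startDate AND @endDate
        AND trn.type_code = ''X''
        AND trn.transtype = ''Y'';',
    N'@startDate datetime, @endDate datetime',
    @startDate = '2012-02-04',
    @endDate   = '2012-02-05';
```

The trade-off is parameter sniffing: the cached plan is built for the sniffed values and reused for later executions, which may or may not suit them.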

Dan Guzman, SQL Server MVP, http://www.dbdelta.com


Thursday, May 2, 2013 5:31 AM | 1 vote

You can refer to this thread: http://stackoverflow.com/questions/313935/why-sql-server-go-slow-when-using-variables

It discusses the same issue.

- Vishal

SqlAndMe.com


Thursday, May 2, 2013 6:01 AM | 1 vote

A detailed explanation of this behaviour is provided in the article below by Erland:

Slow in the Application, Fast in SSMS? Understanding Performance Mysteries

Krishnakumar S


Thursday, May 2, 2013 7:34 AM | 3 votes

If you add the hint, does it run faster?

declare @startDate datetime, @endDate datetime

select @startDate='2012-02-04'
select @endDate='2012-02-05'

select *
from transactions trn (nolock)
where trn.transaction_date between @startDate and @endDate
and trn.type_code='X'
and trn.transtype = 'Y'
OPTION (RECOMPILE)

Best Regards, Uri Dimant SQL Server MVP, http://sqlblog.com/blogs/uri_dimant/

MS SQL optimization: MS SQL Development and Optimization
MS SQL Blog: Large scale of database and data cleansing
Remote DBA Services: Improves MS SQL Database Performance


Thursday, May 2, 2013 7:46 AM | 1 vote

Krishnakumar S, I think that article has nothing to do with the problem :-) I did not read all of it, so maybe I am mistaken.

If I understand correctly, the question is not about running a query in SSMS vs. an application, but about running the same query in two different ways: with variables and without. The link that Vishal Gajjar posted discusses exactly this.

This blog post explains it in detail, but unfortunately it is in Hebrew. If needed I will translate it, but the code and the execution-plan images, with a bit of imagination or Google Translate, may help:
http://www.ariely.info/Blog/tabid/83/EntryId/99/Query-Efficiency-Using-parameter-VS-using-a-fixed-data.aspx

Anyway, just look at your execution plan and you will see the answer.

* There is also a difference between running the query in a stored procedure and running it directly. Do you use a stored procedure?


Thursday, May 2, 2013 8:03 AM | 1 vote

Hi pituach, I think the article is applicable to the OP's question as well. It clearly describes why a script with a constant value and the same script with a variable can produce different execution plans (read the section 'How SQL Server Generates the Query Plan'). Don't be confused by the article's heading (about SSMS); the underlying cause is the same.

Hope this helps.

Krishnakumar S