I'm trying to determine the relative performance of two different queries and have two ways of measuring this available to me:
1. Run both and time each query
2. Run both and get "Query Cost" from the actual execution plan
Here is the code I run to time the queries...
DBCC FREEPROCCACHE      -- clear cached plans so each procedure recompiles
GO
DBCC DROPCLEANBUFFERS   -- clear the buffer pool so each run starts with cold data
GO
DECLARE @start DATETIME
SET @start = GETDATE()
EXEC test_1a
SELECT GETDATE() - @start AS Execution_Time
GO
DBCC FREEPROCCACHE
GO
DBCC DROPCLEANBUFFERS
GO
DECLARE @start DATETIME
SET @start = GETDATE()
EXEC test_1b
SELECT GETDATE() - @start AS Execution_Time
GO
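(The date subtraction above returns a DATETIME, so just for reference, the same harness can report an integer number of milliseconds with DATEDIFF. This is a sketch only, reusing the same procedure names:)

-- Sketch: same cold-cache harness as above, but reporting elapsed time
-- as an integer number of milliseconds via DATEDIFF.
DBCC FREEPROCCACHE
GO
DBCC DROPCLEANBUFFERS
GO
DECLARE @start DATETIME
SET @start = GETDATE()
EXEC test_1a
SELECT DATEDIFF(ms, @start, GETDATE()) AS Execution_Time_ms
GO
-- (repeat the same block for test_1b)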
What I get is the following:
Stored_Proc   Execution_Time   Query Cost (Relative To Batch)
test_1a       1.673 seconds    17%
test_1b       1.033 seconds    83%
The execution times directly contradict the Query Cost figures, but I'm having difficulty determining what "Query Cost" actually means. My best guess is that it's an aggregate of reads, writes, CPU time, etc., so I have a couple of questions:
Is there a definitive source that explains what this measure means?
What other "Query Performance" metrics do people use, and what are their relative merits?
It may be important to note that this is a medium-sized SQL Server, running MS SQL Server 2005 on Windows Server 2003 Enterprise Edition, with multiple processors and 100+ concurrent users.
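For what it's worth, my working assumption (which is part of why I'm asking) is that the percentage is derived from the optimiser's estimated cost numbers rather than from anything measured at run time. Those estimates can be inspected directly with SET SHOWPLAN_ALL; a sketch only, reusing the procedure names above:

-- Sketch: return the estimated plan rows instead of executing the batch.
-- The TotalSubtreeCost column on each plan's top row is, as far as I can
-- tell, the figure the "Query cost (relative to the batch)" percentage is
-- based on; EstimateIO and EstimateCPU show how it breaks down per operator.
SET SHOWPLAN_ALL ON
GO
EXEC test_1a   -- not executed while SHOWPLAN_ALL is ON; only plan rows come back
EXEC test_1b
GO
SET SHOWPLAN_ALL OFF
GO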
EDIT:
After some bother I managed to get Profiler access on that SQL Server and can give some extra info (which supports Query Cost being related to system resources rather than to execution time itself):
Stored_Proc   CPU (ms)   Reads   Writes   Duration (ms)
test_1a       1313       3975    93       1386
test_1b       2297       49839   93       1207
Impressive that taking more CPU with MANY more Reads takes less time :)
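For anyone else who can't get Profiler access, I believe broadly the same counters (CPU time, elapsed time, and reads per table) can be printed per statement with SET STATISTICS TIME and SET STATISTICS IO; a sketch using the same procedures:

-- Sketch: per-statement CPU/elapsed time and I/O counts, written to the
-- Messages tab, as an alternative to capturing them in Profiler.
SET STATISTICS TIME ON   -- reports CPU time and elapsed time per statement
SET STATISTICS IO ON     -- reports logical/physical reads per table
GO
EXEC test_1a
GO
EXEC test_1b
GO
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
GO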