Showing posts with label efficient. Show all posts

Tuesday, March 27, 2012

Comparing Table Records

Hi everyone,

I’m looking for recommendations or ideas for making a task I currently run much more efficient. I have a process that takes data from an Excel worksheet and populates a SQL table. Each record in that table is then read by another system to perform workflow-related tasks. The problem is that the other system has to read every SQL record, every time, looking for delta changes. As you can imagine, this gets pretty time consuming when each record contains many column values and the table holds thousands of rows.

I’m using a DTS package to transform the data from Excel to SQL. The package is extremely dumb: all it does before the bulk import is run a “TRUNCATE TABLE [table]” command against the target to clear all the values before reloading.

I created a trigger on my primary table that listens for delta updates. When a record is updated, the trigger applies the change in the primary table and also writes the changed record to a separate Delta_Table in the database. The benefit is that my outside system only needs to read the delta table to pick up updates, rather than parsing through every record in the primary table one by one, all the time. This cuts the read time for my outside system by more than half.
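A minimal sketch of the kind of trigger described above. The table and column names (Primary_Table, Delta_Table, ID, Col1, Col2) are hypothetical placeholders, not the poster's actual schema:

```sql
-- Sketch: copy each updated row into a delta table for the outside system to read.
-- Schema names are assumptions for illustration only.
CREATE TRIGGER trg_Primary_Delta
ON Primary_Table
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- The virtual "inserted" table holds the new version of every updated row
    INSERT INTO Delta_Table (ID, Col1, Col2, ChangedAt)
    SELECT i.ID, i.Col1, i.Col2, GETDATE()
    FROM inserted AS i;
END;
```

The outside system can then poll (and clear) Delta_Table instead of scanning the primary table.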

I guess what I’d like to do is see if there is a better way to load the table without having to truncate all the records. Ideas?

Thanks Everyone!

Can you use Integration Services? If so, there is a Slowly Changing Dimension component that you can configure to compare the incoming data against the stored data and then decide what to do with each row.

Alternatively, you could load your data into a staging table and then do the following:

DELETE any records that exist in your target table that don't exist in the loaded table

UPDATE any records that exist but are different

INSERT any new records

You can then have a trigger that populates your diff table.
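The three steps above can be sketched as follows, assuming a hypothetical Target_Table and Staging_Table keyed on ID (all names are placeholders):

```sql
-- 1. DELETE rows from the target that no longer exist in the staged load
DELETE t
FROM Target_Table AS t
WHERE NOT EXISTS (SELECT 1 FROM Staging_Table AS s WHERE s.ID = t.ID);

-- 2. UPDATE rows that exist in both but whose values differ
UPDATE t
SET t.Col1 = s.Col1,
    t.Col2 = s.Col2
FROM Target_Table AS t
JOIN Staging_Table AS s ON s.ID = t.ID
WHERE t.Col1 <> s.Col1 OR t.Col2 <> s.Col2;

-- 3. INSERT rows that are new in the staged load
INSERT INTO Target_Table (ID, Col1, Col2)
SELECT s.ID, s.Col1, s.Col2
FROM Staging_Table AS s
WHERE NOT EXISTS (SELECT 1 FROM Target_Table AS t WHERE t.ID = s.ID);
```

Note that the inequality test in step 2 treats NULL as unequal to everything, so columns that allow NULL need extra handling.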

I prefer the Integration Services option, but I believe something similar is available in DTS.

Sunday, March 11, 2012

Compare Queries

I am creating an SP. I want to compare alternative ways of writing the query to see which one is the most efficient. I am trying to use the execution plan, but it doesn't give an overall cost for the query that I can compare against the alternative query; rather, I get a breakdown of the various steps (this query uses functions which produce aggregates, and those steps are included in the execution plan). I want to explore the steps later; at this point I just want to compare the two queries as a whole. What is the best way of going about this?
Regards, Chris.

What I'd do to start is to fire up SQL Server Profiler, turn on the
SQL:BatchCompleted event, and collect the Reads, Writes, CPU, and Duration
columns. Run your queries and compare the output...
Adam Machanic
SQL Server MVP
Author, "Expert SQL Server 2005 Development"
http://www.apress.com/book/bookDisplay.html?bID=10220
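For the same per-query numbers without Profiler, turning on I/O and timing statistics in the query window is a common alternative (this is an addition for illustration, not part of the reply above; the two SELECTs are placeholders for the candidate queries):

```sql
-- Print logical reads, CPU time, and elapsed time per statement
-- in the Messages tab of the query window.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT COUNT(*) FROM sys.objects;  -- placeholder for candidate query 1
SELECT COUNT(*) FROM sys.columns;  -- placeholder for candidate query 2

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```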

Chris,
Put both queries in the same query window and run them. Then the cost percentages will reflect both queries: if query1 and its functions use 25% and query2 and its functions use 75%, that suggests query1 is more efficient. However, UDFs (and some other functions) do degrade the reliability of these numbers, so treat this as a good first take. Running both queries repeatedly and collecting a set of actual execution times for each will, in the final analysis, be a more accurate measure of
the queries.
RLF

The wording is a bit unclear. My SP could potentially contain a variety of variables obtained from subqueries, etc. This means there will be more than just two queries to compare. How would you deal with a more complex scenario, with SPs that use temp tables or subqueries?
Regards, Chris.

Make each version of your SP, then capture the SP:StatementCompleted event in Profiler with the Reads, Writes, Duration, and CPU columns. You need to clear the data and procedure caches before each execution of the SP to get accurate values:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
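The cache-clearing steps above can be sketched as a small test script. The procedure names and parameter are placeholders, and DBCC DROPCLEANBUFFERS should only be run on a test server, since it flushes the buffer pool for the whole instance:

```sql
-- Clear the buffer pool and plan cache so each run starts cold (test servers only!)
DBCC DROPCLEANBUFFERS;
DBCC FREEPROCCACHE;
GO

-- Run version 1 of the procedure while Profiler captures SP:StatementCompleted
EXEC dbo.MyProc_v1 @Param = 1;
GO

DBCC DROPCLEANBUFFERS;
DBCC FREEPROCCACHE;
GO

-- Run version 2 under the same cold-cache conditions
EXEC dbo.MyProc_v2 @Param = 1;
GO
```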
