
SQL Tuning Dot.net Perspective

1st Oct 2011

Bob Duffy
Database/Solution Architect
Prodata SQL Centre of Excellence

Speaker Bio – Bob Duffy
• Database/Solution Architect at Prodata SQL Centre of Excellence
• One of about 25 MCAs for SQL Server globally (aka SQL Rangers)
• MCM on SQL 2005 and 2008
• SQL Server MVP 2009+
• 20 years in the database sector, 250+ projects
• Senior Consultant with Microsoft 2005-2008
• Regular speaker for TechNet, MSDN, user groups, and Irish and UK technology conferences
• *new* Completed the SSAS Maestro course

Session Goals
Learn to improve performance and scalability by making changes to your .NET application.

The target audience is developers who work with SQL Server, or DBAs who work in the .NET world.

We will avoid typical SQL query tuning and physical tuning (covered elsewhere).

What is an Application?
• Pure OLTP
• OLTP with mixed workload (CRM/ERP)
• Departmental (< 200 users)
• Enterprise (> 250 users, or strict performance SLAs)
• Logging/Auditing
• Reporting
• Batch Processing
• Data Processing
• Web-Scale OLTP

1. Connecting to SQL Server
• Connection Pooling
• Connection Authentication

Connection Pooling – Overview
• Connections are slow and expensive: (3 * network_packet_size) + 94k each
• There should be FEWER connections than users
• General guidance: acquire late, release early
• The connection string sets the pool size (default Min of 1 and Max of 100)
• One pool is created per:
  • process
  • application domain
  • connection string
  • Windows identity (when using integrated security)
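As a sketch of the point above, pool bounds are set directly in the connection string; the server, database, and sizes below are placeholders, not recommendations:

```csharp
using System.Data.SqlClient;

class PoolingDemo
{
    static void Main()
    {
        // Pool bounds live in the connection string; ADO.NET creates
        // one pool per distinct connection string (and per app domain,
        // and per Windows identity under integrated security).
        var connStr = "Data Source=MyServer;Initial Catalog=MyDb;" +
                      "Integrated Security=SSPI;" +
                      "Min Pool Size=5;Max Pool Size=100";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            // ... do work, then let Dispose return the connection
            // to the pool rather than tearing it down ...
        }
    }
}
```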

Connection Authentication
• Decide on trusted v delegated
• Decide on Windows v SQL authentication
• If Windows, decide on Kerberos v NTLM

To check Kerberos
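The check itself is not reproduced in the transcript; a common way to verify it from a client, offered here as a hedged sketch (server name is a placeholder), is to query the current connection's auth_scheme:

```csharp
using System;
using System.Data.SqlClient;

class KerberosCheck
{
    static void Main()
    {
        // auth_scheme in sys.dm_exec_connections reports KERBEROS,
        // NTLM, or SQL for the current session.
        using (var conn = new SqlConnection(
            "Data Source=MyServer;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand(
            "SELECT auth_scheme FROM sys.dm_exec_connections " +
            "WHERE session_id = @@SPID", conn))
        {
            conn.Open();
            Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}
```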

Good or Bad Connections?
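The slide's code examples are not in the transcript; the contrast it draws is typically the one sketched below, with illustrative names:

```csharp
using System.Data.SqlClient;

class ConnectionPatterns
{
    // Bad: a connection held for the object's lifetime is never
    // returned to the pool, starving other callers.
    static SqlConnection _heldOpenForever =
        new SqlConnection("placeholder connection string");

    // Good: acquire late, release early. Dispose returns the
    // underlying connection to the pool for reuse.
    static int CountCustomers(string connStr)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT COUNT(*) FROM Customers", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}
```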

2. Parameterisation
• You must parameterise
• Optimising parameterisation
• How to parameterise

Why you MUST Parameterise
Better performance and security.

Performance:
• Fewer plans in the cache
• Fewer compiles (should be close to zero)
• Opens the door to prepared statements

Security:
• Helps prevent SQL injection

Parameter Tips and Pitfalls
• Avoid using concatenation instead of proper parameters
• Do specify Length with variable-length types; otherwise you get one plan per input data length, resulting in procedure cache bloat and more compiles
• Should we use ".AddWithValue"? No – the data size is not specified
• For optimal performance, cache the command object when making repeated calls to the same statement
• Use prepared statements (discussed later)
• Monitor for parameter sniffing issues
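A minimal sketch of the sizing tip above, using an explicitly typed and sized parameter instead of AddWithValue (table and column names are illustrative):

```csharp
using System.Data;
using System.Data.SqlClient;

class SizedParameters
{
    static string GetName(SqlConnection conn, string customerCode)
    {
        using (var cmd = new SqlCommand(
            "SELECT Name FROM Customers WHERE Code = @Code", conn))
        {
            // Explicit type AND length: every call, regardless of the
            // input string's length, reuses the same cached plan.
            cmd.Parameters.Add("@Code", SqlDbType.VarChar, 20)
               .Value = customerCode;
            return (string)cmd.ExecuteScalar();
        }
    }
}
```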

How to Parameterise
Inline SQL:

  SELECT * FROM Customers WHERE CustomerID = @ID

Use the Parameters collection:

  cmd.Parameters.Add("@ID", SqlDbType.VarChar, length).Value = variable;

Ad Hoc v Parameterised

Making the Most of a Bad Deal
What if your application is in production and has no parameterisation (e.g. Siebel)?
• Updating the application's data access is best, but may not be possible
• Simple queries may be "ok" due to auto-parameterisation
• The "Forced Parameterisation" database setting may help
• The server setting "Optimise for Ad Hoc" is the second-best option: it will solve proc cache bloat, but will not resolve compile issues (high CPU usage)

Check your proc cache for offending queries.

Look for single-use queries and total size:

  SELECT c.cacheobjtype,
         c.objtype,
         c.usecounts,
         c.size_in_bytes,
         t.dbid,
         t.text
  FROM sys.dm_exec_cached_plans AS c
  CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS t
  WHERE c.cacheobjtype = 'Compiled Plan'
  ORDER BY c.usecounts DESC
  --ORDER BY t.text

3. SqlCommand Tips
• Parameterise those queries (last section)
• Use RPC calls: set CommandType = StoredProcedure
• Do NOT "stuff" statements (causes compiles)
• Cache commands where frequently called; be careful of overzealous "stateless" architecture
• Consider preparation for batches of similar commands
• Make good use of the methods with no resultsets: ExecuteNonQuery and ExecuteScalar
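A sketch of the RPC-style call described above; the procedure name and parameter are assumptions for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

class RpcCall
{
    static void DeactivateCustomer(SqlConnection conn, int id)
    {
        // CommandType.StoredProcedure sends a true RPC call rather
        // than an ad hoc "EXEC ..." string, so no statement text
        // has to be parsed per call.
        using (var cmd = new SqlCommand("dbo.DeactivateCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@CustomerID", SqlDbType.Int).Value = id;

            // No resultset is needed, so ExecuteNonQuery avoids the
            // overhead of constructing a reader.
            cmd.ExecuteNonQuery();
        }
    }
}
```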

4. Prepared Commands
• Chatty interfaces may send the same request with different parameters – a waste of bandwidth and other resources
• Use SqlCommand.Prepare to optimise this:
  • Sends the SQL only once over the network
  • Calls sp_prepexec in SQL Server
  • Calls sp_execute with each set of parameters
• Greater startup cost, so avoid it for single calls
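The flow above can be sketched as follows, with an illustrative insert statement (Prepare requires sized parameters, as recommended earlier):

```csharp
using System.Data;
using System.Data.SqlClient;

class PreparedInsert
{
    static void InsertLogRows(SqlConnection conn, string[] messages)
    {
        using (var cmd = new SqlCommand(
            "INSERT INTO EventLog (Message) VALUES (@Msg)", conn))
        {
            var p = cmd.Parameters.Add("@Msg", SqlDbType.VarChar, 200);

            // Prepare ships the statement once (sp_prepexec); each
            // subsequent execute only sends parameters (sp_execute).
            cmd.Prepare();

            foreach (var msg in messages)
            {
                p.Value = msg;
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```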

5. The Chunky v Chatty Debate
We consider two types of chunkiness:
• Data transferred: one row or one table
• One call per statement, or one per batch

Affects the type of .NET calls:
• GetAllCustomers()
• GetCustomer(CustomerID as int)

Affects guidance on ADO.NET:
• Use of stored procedures
• Use of transactions
• Use of data caching

Some Test Results

  Test Name           (Seconds)
  AdHoc SQL              126.03
  Connection Pooling      75.56
  Parameterized           21.55
  Connection Pooling      19.16
  Cached Connection       18.04
  Prepared                15.33
  DataReader              13.67
  Chunky                   7.43

6. Optimising Resultsets
• Data Readers v Data Tables
• Ordinal Indexes

SqlDataReader – Overview
• Streaming access on a row-by-row basis
• Data is buffered in the background
• Connected while reading:
  • Avoid passing the reader around
  • Avoid delaying the reader, or you will see ASYNC_NETWORK_IO waits
• Very fast but inflexible

SqlDataReader – Batching
• Avoid making multiple round trips when you can make one
• If you need two distinct sets of data, send both requests in a single batch
• Use SqlDataReader.NextResult to move to the next table
• This is forward only

Warning – while this will improve performance, we find it hurts maintainability if not used with good helper functions.
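The batching pattern above, sketched with placeholder table names:

```csharp
using System.Data.SqlClient;

class BatchedReads
{
    static void ReadTwoSets(SqlConnection conn)
    {
        // One round trip carries both statements.
        using (var cmd = new SqlCommand(
            "SELECT Id, Name FROM Customers; " +
            "SELECT Id, Total FROM Orders;", conn))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read()) { /* consume customer rows */ }

            // Forward only: advance to the second resultset.
            reader.NextResult();
            while (reader.Read()) { /* consume order rows */ }
        }
    }
}
```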

DataTable – Overview
• Cached, in-memory access to all of the rows
• Everything is pulled across the wire BEFORE reading starts
• A DataAdapter fills the DataTable:
  • Uses a reader behind the scenes
  • So it is a double pass over the data to read it!
• Disconnected access: can be cached and passed around
• Supports limited searching and sorting

Bottom line: ultimate flexibility, with some performance sacrificed.

What are Ordinal Indexes?
Both the reader and the table offer access by column name or by ordinal index. Access by column name calls GetOrdinal under the covers.

Recommendations:
• Align enumerations with the output column list
• Call GetOrdinal in advance and store the ordinal positions to avoid multiple calls

C#

  int productIdOrdinal = reader.GetOrdinal("productId");
  while (reader.Read())
      productIds.Add((int)reader[productIdOrdinal]);

Some Test Results

7. SqlBulkCopy Overview
• .NET class that exposes the functionality of SQL Server's Bulk Copy API
• Sends rows to SQL Server using TDS
• Enables minimally-logged inserts
• Accepts data input via either DataTable or IDataReader
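A minimal sketch of the API described above; the destination table name and batch size are placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkLoad
{
    static void Load(string connStr, DataTable rows)
    {
        // TableLock requests TABLOCK, one of the prerequisites for
        // minimal logging discussed below.
        using (var bulk = new SqlBulkCopy(
            connStr, SqlBulkCopyOptions.TableLock))
        {
            bulk.DestinationTableName = "dbo.StagingCustomers";
            bulk.BatchSize = 10000;
            bulk.WriteToServer(rows);  // also accepts an IDataReader
        }
    }
}
```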

Requirements for Minimal Logging
• Table not replicated
• TABLOCK
• Table is a heap
• If the table has a clustered index, pages are fully logged unless the table is empty
• Trace flag 610 allows minimal logging on non-empty tables with clustered indexes

8. Table Valued Parameters
• Table-Valued Parameters are a SQL Server 2008 feature, available from ADO.NET 3.5
• Allow creation of typed table parameters to stored procedures
• Table types support CHECK, DEFAULT, PRIMARY KEY, and UNIQUE constraints
• Input only; must specify the READONLY option on the stored procedure parameter
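A sketch of a TVP call from ADO.NET; the table type and procedure named in the comments are assumptions for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

class TvpCall
{
    // Assumes server-side objects roughly like:
    //   CREATE TYPE dbo.IdList AS TABLE (Id int PRIMARY KEY);
    //   CREATE PROC dbo.DeleteCustomers @Ids dbo.IdList READONLY ...
    static void DeleteCustomers(SqlConnection conn, int[] ids)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        foreach (var id in ids) table.Rows.Add(id);

        using (var cmd = new SqlCommand("dbo.DeleteCustomers", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            // Structured + TypeName marks this as a table-valued
            // parameter of the named SQL table type.
            var p = cmd.Parameters.Add("@Ids", SqlDbType.Structured);
            p.TypeName = "dbo.IdList";
            p.Value = table;

            cmd.ExecuteNonQuery();
        }
    }
}
```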

Some Test Results – 1 million row insert

  Test                           Secs
  Row by Row (Prepared)        245.47
  SqlBulkCopy - Minimal Logged   2.4
  TVP                            6.2
  SqlBulkCopy - Fully Logged     4.5

9. Data Caching Considerations
Proper use of caching can be the key to application scalability.

Consider:
• Should cache granularity be user-level or application-level?
• Should the cache live on the server or at the client?
• How should cache expiration and update be handled?
• What form of data or objects should be cached?
• How do we distribute the cache across servers?
• Do we cache reads, writes, or both?

10. Some ORM Guidance
• Check the health of the plan cache (queries provided earlier)
• Log and monitor the SQL being generated: RML or the DMVs are good here
• If a query is not getting optimised properly, consider migrating it to a stored procedure

Coming up…

#SQLBITS

Speaker – Title – Room
• Quest – Trivia Quiz: Test Your SQL Server and IT Knowledge and Win Prizes – Aintree
• SQL Server Community – SQL Server Community presents: The A to Z of SQL Nuggets – Lancaster
• SQLSentry – Real Time and Historical Performance Troubleshooting with SQL Sentry – Pearce
• Attunity – Data Replication Redefined: best practices for replicating data to SQL Server – Empire

11. Stored Proc or T-SQL Not Optimal
• Too much IO in the query plan
• Too many joins
• Lack of covering indexes
• Non-SARGable predicates
• Use of UDFs
• Too much data in columns or rows
• Data model design not optimal (de-normalised)
• Abuse of temp variables
• Procedural or math-heavy code in stored procs
• Abuse of DISTINCT (in conjunction with UDFs or non-SARGable predicates)
• Abuse of UNION
• Use of triggers
• Implicit conversions

Resources
• SQL Server Connection Pooling
  http://msdn.microsoft.com/en-us/library/8xx3tyca.aspx
• SET Options That Affect Results
  http://msdn.microsoft.com/en-us/library/ms175088.aspx
• How to Use Impersonation and Delegation
  http://msdn.microsoft.com/en-us/library/ff647404.aspx
• Beginner's Guide to ADO.NET Entity Framework
  http://msdn.microsoft.com/en-us/data/aa937723

Query Notification
How does it work?

Thank You
