Apr 23 2015

PASS Board Update: April 2015

It’s been a pretty interesting month on the board.

First, we did have a little problem. I’m sure one or two of you may have noticed that the SQL Saturday website was down for a moment. Well, three days. Joking aside, this was a serious problem, but my involvement was largely peripheral since I’m in charge of Chapters. I tried to help out a little, offering what support I could and, if nothing else, supplying a willing ear. Quite a few people took advantage of that. I communicated all their information up the chain to HQ and the other board members. Nothing has been held back.

Next, we’ve started the budgeting process. That’s fascinating. As well as a giant pain in the… well, anyway. Thankfully the people at HQ are helping shepherd me through the process.

We’ve had a number of leadership changes at different PASS Chapters. A couple of new groups have been created. We’re making some progress there.

Also this month I went to the PASS Business Analytics Conference.

Let’s talk about this.

The event was really well put together. The keynotes were amazing. You can read about them here and here. The venue was great. The speakers I saw seemed good (for the most part). We hosted a number of focus group discussions with attendees and speakers to get feedback on the event. Overall, attendees were extremely happy with it. They also provided a lot of great suggestions to help improve the event if we do it again.

And there is the question. Do we do this again?

I’m not sure.

If you saw my tweets during the event, I was very supportive. I was also actively promoting the event in the weeks leading up to it. All this is part of my responsibilities as a board member. We were committed to an event and I’m going to help the organization ensure that event is successful. Period.

However, if you had asked me before the event whether I would support doing this again next year, I would have said no. Now, I’m not sure. The fact is, there’s a split between the perfect BAC audience member and the perfect Summit audience member. That’s not to say there’s no crossover. There very much is. But my concern is: can we, the PASS organization, properly support a business-focused community using the tools we have for supporting a technology-focused community? Should we?

From all the feedback, we supplied an event that people liked and cared about and, most importantly, would suggest to others. So we probably can support a business-focused community. Should we?

For myself, and all of this is just me reporting to you, I think the big question is: does this help our existing membership? Prior to this event, I would have said absolutely not. I would have argued that there was no path from DBA to analyst. After talking to lots and lots of people at the event, I found that a healthy number of our peers have moved from being a DBA or BI person to being a data analyst. It goes something like this: “Hey! You know that data stuff, right? Well, we need someone to do this data analysis thingymabob and you just volunteered.”

In short, many data analysts are accidental data analysts and they come from our peers, our community, in fact, our #sqlfamily. Now, should we do this thing? I’m on the fence still, but I don’t mind admitting, I’m wavering towards “Hell, Yes.”

Apr 21 2015

Query Hash Formula Changes Between Versions

A question came up on SQL Server Central: someone wanted to track queries on their 2008R2 instance and on their 2014 instance in order to validate performance after an upgrade. The plan was to use query hash values. I was curious, so I set up a quick test. First, I spun up two VMs in Azure, both A1, both Enterprise, one running 2008R2 and one running 2014. Then I grabbed a generic query that runs against DMVs:

SELECT TOP 10
       -- Extract the individual statement from the full batch text
       SUBSTRING(dest.text, (deqs.statement_start_offset / 2) + 1,
                 ((CASE deqs.statement_end_offset
                       -- -1 means the statement runs to the end of the batch
                       WHEN -1 THEN DATALENGTH(dest.text)
                       ELSE deqs.statement_end_offset
                   END - deqs.statement_start_offset) / 2) + 1) AS querystatement,
       deqp.query_plan,
       deqs.query_hash,
       deqs.execution_count,
       deqs.last_elapsed_time,
       deqs.last_logical_reads,
       deqs.last_logical_writes,
       deqs.last_worker_time,
       deqs.max_elapsed_time,
       deqs.max_logical_reads,
       deqs.max_logical_writes,
       deqs.max_worker_time,
       deqs.total_elapsed_time,
       deqs.total_logical_reads,
       deqs.total_logical_writes,
       deqs.total_worker_time
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_query_plan(deqs.plan_handle) AS deqp
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
ORDER BY deqs.total_elapsed_time DESC;

I ran the query on both servers and captured the query_hash values for it. No configuration changes, just defaults. The output:

0x40B8F425DDC3D692 -- 2014
0x958B3949E630C004 -- 2008R2

I don’t have access to the code at Microsoft, but I think it’s pretty safe to say that the formula for creating the hash value for a given query has changed between versions. This isn’t surprising, but it is a little disappointing. A consistent hash would have meant an easy way to compare performance between queries on two different systems. Now, you’ll have to compare based on the T-SQL text, which just isn’t as easy and clean.
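If you need an identifier that does match across versions, one workaround is to hash the statement text yourself. Here’s a minimal sketch of the idea, assuming your statements fit within the 8,000-byte input limit HASHBYTES has on both of these versions (longer statements would need to be chunked):

SELECT deqs.execution_count,
       deqs.total_elapsed_time,
       -- Hash the statement text directly so the value is the same on any
       -- version; whitespace or case differences will still change it.
       HASHBYTES('SHA1',
                 CONVERT(NVARCHAR(4000),
                         SUBSTRING(dest.text, (deqs.statement_start_offset / 2) + 1,
                                   ((CASE deqs.statement_end_offset
                                         WHEN -1 THEN DATALENGTH(dest.text)
                                         ELSE deqs.statement_end_offset
                                     END - deqs.statement_start_offset) / 2) + 1))) AS text_hash
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest;

It’s cruder than query_hash, which is derived from the query tree rather than the raw text, but at least it’s stable between servers.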

Apr 16 2015

Azure SQL Database v12 and SQL Magazine

I spend many of my evenings researching and writing. Sometimes it’s writing new books. Sometimes it’s fixing and rewriting old books. Occasionally it’s a blog post like this one. Lately, it’s been a series of articles for SQL Magazine all about the new functionality available in Azure SQL Database v12, which was released in February for most data centers in Azure. It’s a whole new ball game. Check out my introductory article for v12 and the one on DMVs in Azure. I have more coming up on CLR in Azure, getting started, PowerShell, DBCC, T-SQL enhancements, Premium Tier and more. I’ll also get into “old” functionality like backup and restore. And I’ll explore new functionality (Azure is nothing if not dynamic) as it becomes widely available.

I know a lot of you have been putting off exploring Azure, especially Azure SQL Database until it was “ready.” Guess what? It’s that time. Feel free to lean on me here, and over at SQL Mag, for questions, suggestions, thoughts, concerns, anything you have about Azure.

Apr 13 2015

Azure SQL Database Firewall Settings

The new portal for managing Azure is pretty. I’m not sure I’m in love with it, but it’s pretty.

However, one thing that I have to access regularly is the firewall settings for my Azure SQL Database. I do demos from all over the place. I never know what my IP address is going to be. Rather than expose everything, I just set up whatever IP address I’m on and then remove it later. The old portal made this easy. The new one… not so much.

So, let’s get this down real quick. Assuming you connect to the new portal and go straight to your database, you’ll see this image showing you the database and the server it’s on:

[Screenshot: DatabaseAndServer]

You won’t see anything else that suggests FIREWALL. But, you can click on the server. When you do, you’ll see another panel open up to the right of the one you’re on. It’ll have an image like this:

[Screenshot: Server]

Still nothing that screams FIREWALL, but if you click on the little gear icon that says SETTINGS you’ll get yet another panel opening to the right that resembles this image:

[Screenshot: ServerSettings]

There she blows, Firewall. Click on that and you’ll see the standard screen for editing your IP address access into the firewall:

[Screenshot: ServerSettingsFirewall]

Just remember that after adding a new IP address to your firewall you must hit the save icon at the top of the screen, or you still won’t be able to access your database.
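One more option, in case the portal hunt gets old: you can manage these rules in T-SQL while connected to the master database on your Azure SQL Database server. A quick sketch (the rule name and addresses here are made up; substitute your own):

-- Run against the master database of your Azure SQL Database server.
-- Add (or update) a rule for whatever IP address you're on today...
EXECUTE sp_set_firewall_rule @name = N'DemoLocation',
    @start_ip_address = '131.107.0.1',
    @end_ip_address = '131.107.0.1';

-- ...and clean it up when the demo is done.
EXECUTE sp_delete_firewall_rule @name = N'DemoLocation';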

Apr 10 2015

I Am Grateful For

I decided in January that I would write regularly about people that I’m grateful for. Now it’s April. Oops.

The concept is simple and science-based. Positive emotions help your clarity of thought. Further, your moods and actions extend out through your network to the third layer. I want to think more clearly, and since the most direct path to positive thoughts and emotions is gratitude, I’m attempting to focus and publicize my gratitude by sharing it through these blog posts (in short, I’m trying to tune my brain like I would tune a query).

I am grateful for Tim Ford (b|t).

Tim is a great guy. Further, Tim can be patient with thick-headed dorks (raising my hand). For example, among all the other stuff Tim does (Board of Directors for PASS, Editor at SQL Mag, track organizer at Connections), he organizes and runs SQL Cruise. Yeah, yeah, cue the laugh track. You don’t get it. SQL Cruise changes people’s lives. I’ve seen it. Now, the first time I went, I had a blast, but, I’m a bit thick, so I didn’t understand everything I was seeing. What I knew was that I didn’t tweet or write a blog post for a week; therefore, I must not be doing my job. So, I asked Red Gate not to send me on any more cruises. We skipped them for a while. Last year, with a lot of prodding from Kevin Kline (another person I’m grateful for, but one blog post at a time), I went on the cruise again. Yeah, sun, rum, beaches… Oh, and people doing intense networking, redirecting their careers, going through intense, long-form sessions with great speakers (and me)… Whoa! I spent some time talking with Tim and his wife Amy and the light finally dawned. Tim isn’t ONLY finding a way to get a bunch of us out on a boat so we can have fun (yeah, that’s part of it), he’s literally building and engineering opportunities for people to recharge and reset in preparation for radically modifying their careers (not their jobs; people after SQL Cruise have careers). And it’s done the same thing for me. I’ve been personally reinvigorated by the work that Tim has done and I’m immensely grateful for that. But it’s not just SQL Cruise. Tim has helped me personally and professionally because he’s just that sort of person. He’s one of a long list of people that I’m regularly gobsmacked are willing to hang out with me.

Thank you, Tim.

Apr 07 2015

Error: Unknown Property ismemoryoptimized

If you’re starting the process of moving your Azure SQL Database databases to v12, you need to do one thing. Make sure you have SQL Server 2014 CU5 or better installed on the machine running Management Studio (current list of updates). Otherwise, like me, you’ll be looking at the above error.
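If you’re not sure which build you have, here’s a quick check. This is just a sketch: I believe SQL Server 2014 CU5 corresponds to build 12.0.2456, but verify against the current list of updates, and remember it’s the tools on the machine running Management Studio that matter here.

-- Check the build of the local 2014 instance; the cumulative updates that
-- patch the engine also patch the shared client tools on that machine.
SELECT SERVERPROPERTY('ProductVersion') AS Build,
       SERVERPROPERTY('ProductLevel') AS ProductLevel;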

Just a quick blog post to help out. I saw this error, did a search, and didn’t hit a single entry telling me what to do. I started the install of CU6 (I needed to catch up on cumulative updates anyway). While that was happening, I went to Twitter and posted to #sqlhelp to see if anyone else had hit this. I got a response from Amit Banerjee pointing me to this MSDN blog on the topic, reinforcing the decision I had already made. I just wish they had posted the error message along with everything else in the blog post. It would make things easier to find.

Apr 06 2015

Constraints and SELECT Statements

I’ve posted previously about how a foreign key constraint can change how a SELECT query behaves. Logically, that just makes sense. But other types of constraints don’t affect execution plans, do they?

Yes.

Let’s take this constraint as an example:

ALTER TABLE Sales.SalesOrderDetail WITH CHECK
ADD CONSTRAINT CK_SalesOrderDetail_UnitPrice
CHECK (UnitPrice >= 0.00);

That will ensure that no values less than zero can slip in there. We can even validate it:

INSERT Sales.SalesOrderDetail
        (SalesOrderID,
         CarrierTrackingNumber,
         OrderQty,
         ProductID,
         SpecialOfferID,
         UnitPrice,
         UnitPriceDiscount,
         rowguid,
         ModifiedDate
        )
VALUES  (60176, -- SalesOrderID - int
         N'XYZ123', -- CarrierTrackingNumber - nvarchar(25)
         1, -- OrderQty - smallint
         873, -- ProductID - int
         1, -- SpecialOfferID - int
         -22, -- UnitPrice - money
         0.0, -- UnitPriceDiscount - money
         NEWID(), -- rowguid - uniqueidentifier
         GETDATE()  -- ModifiedDate - datetime
        );

That INSERT will give me an error:

Msg 547, Level 16, State 0, Line 470
The INSERT statement conflicted with the CHECK constraint "CK_SalesOrderDetail_UnitPrice". The conflict occurred in database "AdventureWorks2014", table "Sales.SalesOrderDetail", column 'UnitPrice'.

Let’s look at a SELECT query now. If we run this:

SELECT  soh.OrderDate,
        soh.ShipDate,
        sod.OrderQty,
        sod.UnitPrice,
        p.Name AS ProductName
FROM    Sales.SalesOrderHeader AS soh
JOIN    Sales.SalesOrderDetail AS sod
        ON sod.SalesOrderID = soh.SalesOrderID
JOIN    Production.Product AS p
        ON p.ProductID = sod.ProductID
WHERE   p.Name = 'Water Bottle - 30 oz.';

The resulting execution plan looks like this:

[Screenshot: Constraint]

But, if I modify the query to look like this, adding an additional AND filter on the constrained UnitPrice column:

SELECT  soh.OrderDate,
        soh.ShipDate,
        sod.OrderQty,
        sod.UnitPrice,
        p.Name AS ProductName
FROM    Sales.SalesOrderHeader AS soh
JOIN    Sales.SalesOrderDetail AS sod
        ON sod.SalesOrderID = soh.SalesOrderID
JOIN    Production.Product AS p
        ON p.ProductID = sod.ProductID
WHERE   p.Name = 'Water Bottle - 30 oz.'
        AND sod.UnitPrice > $0.0;

You know what happens to the execution plan? Nothing. It stays exactly the same. The optimizer knows that, in order to satisfy the query, it can safely ignore the additional predicate in the WHERE clause. In fact, you can look at the SELECT operator properties for the two different plans and note that while the Query Hash value changes, the Plan Hash value stays the same. The plans are identical.
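If you want to check this without opening the plan properties, you can pull both hashes straight out of the cache. A quick sketch, assuming you just ran both versions of the query:

-- Two rows, two different query_hash values, one shared query_plan_hash.
-- The LIKE filter will also catch this probe query itself; ignore that row.
SELECT dest.text,
       deqs.query_hash,
       deqs.query_plan_hash
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
WHERE dest.text LIKE '%Water Bottle%';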

With that knowledge, I’m going to modify the query to look like this, reversing the UnitPrice reference to look for data that violates the constraint:

SELECT  soh.OrderDate,
        soh.ShipDate,
        sod.OrderQty,
        sod.UnitPrice,
        p.Name AS ProductName
FROM    Sales.SalesOrderHeader AS soh
JOIN    Sales.SalesOrderDetail AS sod
        ON sod.SalesOrderID = soh.SalesOrderID
JOIN    Production.Product AS p
        ON p.ProductID = sod.ProductID
WHERE   p.Name = 'Water Bottle - 30 oz.'
        AND sod.UnitPrice < $0.0;

And now we have a new execution plan:

[Screenshot: Constraint_scan]

The optimizer recognized that there is no way any data can be returned by the WHERE clause above because there is an enforced constraint (note the use of the WITH CHECK clause on the constraint). This completely changes the execution plan in every possible way. Now, instead of attempting to access the data, a Constant Scan operator is put in as a placeholder for an empty result set.

To sum up: yes, constraints absolutely affect the choices made by the optimizer when those constraints would have an effect on the plan, even for a SELECT query. I would also argue that this means the use of enforced constraints can be a performance enhancer, since the optimizer can make more intelligent choices about how a given query is dealt with.

UPDATE: Fixed a typo that said the constraint prevented data that was equal to or less than zero. It’s only for data less than zero.

Apr 03 2015

Speaker of the Month: April 2015

One of my favorite events of the year is the SQL Saturday in Silicon Valley. They’ve had four of them and I’ve gone to three (had to miss last year, scheduling conflict). It’s a fantastic event and Mark Ginnebaugh (b|t) does a great job putting it together. In fact, this year, we got to listen to Ross Mistry and T.K. Rengarajan have a “fireside chat” for the keynote. For those who don’t know, Mr. Rengarajan is just a VP at Microsoft. Yeah, he simply runs the ENTIRE FLIPPING AZURE DATA PLATFORM. That’s all. They had a few demos and showed us unreleased code and new versions of SSMS not yet available publicly (including functionality around Query Store and execution plans, my little heart was going pitter-pat). Anyway, if you missed it, you missed it… So… Speaker of the Month.

I am choosing Vicky Harp (b|t) and her presentation, Care and Feeding of Your System Databases.

This was a very good presentation and Vicky is a very good presenter. She presented without a microphone, even though one was available, and, sitting in the back corner, I could hear her very clearly. She made excellent eye contact throughout the presentation. She used her slides as guides for what she was going to talk about. She used ZoomIt appropriately and frequently. I learned a couple of things during the presentation, and Vicky demonstrated excellent knowledge of the subject matter, fielding every question. Largely, I’d say this was almost a class in how to present properly. And I think it’s a good choice because system databases are kind of easy to ignore, yet there’s important stuff there that Vicky addresses well.

Feedback I’d give Vicky: number one, repeat the question. She handled all the questions so well; she really knows her stuff. But she didn’t repeat the questions, and some of the people asking didn’t have booming voices, so despite it being a small room, I was guessing context based on Vicky’s answer (which was easy, since her answers were so good). I also thought some of the examples were overly simplistic. We might have covered more of the material in more depth by skipping a couple of the really simple examples.

I understand Vicky will be recording this for SQL Server World Wide Users Group. Not sure how to get that, but it’s absolutely worth a listen.

Mar 31 2015

PASS Board Update: March 2015

Hello all,

It’s been a while since my last update. Sorry. I’ve just been traveling and presenting and working on books and this slipped through the cracks a bit. I’ll try to do better in the future.

Nothing major to report. As a board member I’ve been working primarily on two things. First, I’ve been polishing my goals for the Chapters Portfolio. I’ll publish those below. Next, I’m working on the budget for Chapters for next year. In between, a lot of what I do is work with Regional Mentors (RM) and Chapter Leaders (CL) on an individual basis. As much as PASS as an organization is involved with technology, the organization itself, especially Chapters, is all about people. And you know how people are. I’ve already had online meetings with the RMs and CLs in which I let them know what was going on with the Board and with my goals. The one point I made that I want to share is my own focus. I told everyone, and I’m telling you, that I see my role as a means of figuring out what the PASS organization can do for the Chapters rather than telling them what they need to do for the organization. Other than that, at these meetings, I’ve listened and taken notes. I want to know what the CLs and the RMs need to do their work as volunteers, and, where possible, to make that work easier.

I also had the opportunity to meet with the user groups of the UK (whether they were PASS Chapters or not). This was a chance to listen to what their needs are and to understand whether or not PASS is meeting them. One of the most interesting things I learned revolved around Chapter Leaders and the Summit. There are a number of requirements for a Chapter to be in good standing with the PASS organization, and if the Chapter meets all those requirements, it can receive one free registration for Summit. Cool beans… unless you’re overseas, where the cost of getting to Summit is much higher than here in the States. Suddenly, this seemingly HUGE benefit just isn’t that big. I’ll see what I can do about this one. Not sure what yet, but I’m thinking about it.

Here is a rough draft of the goals I’ve been working on:

PASS Growth

  • Create a pilot program that allows a regional email to be sent to PASS membership that highlights user group events within that region
    • Purpose is to measure the work involved in order to understand the needs for expanding this to all regions
    • Success is arriving at a measure of work that allows for creating a plan for next steps
    • Run for three months to arrive at measure
  • Increase communication between Director for Chapters, HQ and Chapter Leaders by hosting quarterly meetings

Global Growth

  • Create interface to connect Speakers with Chapters
    • Purpose is to provide a way for Chapters to get in touch with Speakers using our existing local speaker list in SQL Saturday
    • Success is mechanism for Chapter Leaders to find Speakers within their region
    • 3 months to arrive at minimal functionality
  • Complete definition of Regional Mentor Annual Rating process
    • Purpose is to provide Chapters with mechanism for reviewing RMs
    • Success is a completed document that has been reviewed by CL and RM representatives
    • Time frame requires completion by EOY 2015
  • Continue efforts to establish chapters in countries with no or few existing chapters
    • Targeting Baltics, Mexico, Saudi Arabia, Argentina

That’s about it for now.

Coming up I’ll be going to the PASS Business Analytics Conference (use the code BFFGF, that’s my Friends & Family discount, to get some money off the registration). I’ll write something up about that right after it’s over.

Mar 30 2015

Does the New Cardinality Estimator Reduce Bad Parameter Sniffing?

No. Next question.

Although, that answer can be slightly, ever so slightly, nuanced…

Parameter sniffing is a good thing. But, like a good wine, parameter sniffing can go bad. It always comes down to your statistics. With a very accurate set of statistics, very little data skew (some values that have radically more or less data than other values), and a very even distribution (most values have approximately similar cardinality), parameter sniffing is your bestest buddy on the planet (next to a tested backup). But introduce some data skew, let the stats get wildly out of date, or suffer from seriously uneven distribution, and suddenly your best friend is doing unspeakable things to your performance (kind of like multi-statement table-valued user-defined functions).
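To make that concrete, here’s the classic shape of the problem, sketched against AdventureWorks (the two ProductID values are just illustrative picks for a rare value and a common one):

-- A plan compiled for a rarely sold product gets reused for a heavily
-- sold one (or the other way around), courtesy of parameter sniffing.
CREATE PROCEDURE dbo.OrdersByProduct (@ProductID INT)
AS
SELECT sod.SalesOrderID,
       sod.OrderQty,
       sod.UnitPrice
FROM Sales.SalesOrderDetail AS sod
WHERE sod.ProductID = @ProductID;
GO

EXEC dbo.OrdersByProduct @ProductID = 897; -- very few rows; plan optimized for a tiny result set
EXEC dbo.OrdersByProduct @ProductID = 870; -- thousands of rows; same cached plan gets reused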

SQL Server 2014 has the first major upgrade to the cardinality estimator since SQL Server 7. To get the best possible understanding of what that means and how it works, I can only recommend Joseph Sack’s white paper. It’s THE source of good information. The new functionality within the estimator will, in most situations, lead to more accurate row estimates. This means, in theory, less bad parameter sniffing. But if you are benefiting from the old estimation engine, and in some cases you absolutely are, then you may see an increase in bad parameter sniffing. It comes right back around to your statistics and your data distribution.
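On 2014 you can also test a given query under each estimator and compare the row estimates for yourself. The trace flags below are documented (Joe Sack’s paper covers them); just note that QUERYTRACEON requires sysadmin, and the query here is only an example:

SELECT soh.OrderDate,
       sod.OrderQty
FROM Sales.SalesOrderHeader AS soh
JOIN Sales.SalesOrderDetail AS sod
    ON sod.SalesOrderID = soh.SalesOrderID
WHERE soh.OrderDate = '2011-05-31'
OPTION (QUERYTRACEON 9481); -- force the legacy (pre-2014) cardinality estimator

SELECT soh.OrderDate,
       sod.OrderQty
FROM Sales.SalesOrderHeader AS soh
JOIN Sales.SalesOrderDetail AS sod
    ON sod.SalesOrderID = soh.SalesOrderID
WHERE soh.OrderDate = '2011-05-31'
OPTION (QUERYTRACEON 2312); -- force the new 2014 cardinality estimator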

So, does the new cardinality estimator lead to a reduction in bad parameter sniffing? It still mostly depends on your statistics, so I can’t say with certainty that the new cardinality estimator in SQL Server 2014 literally reduces bad parameter sniffing. In many instances, probably even most, the changes to the model are going to result in better row estimates, which means better choices by the optimizer. In some instances this will reverse and you may see pain directly related to the changes in the cardinality estimator. But overall, the fundamental behavior of parameter sniffing, again, a very positive thing, is unchanged. So, to a large degree, at least in my mind, no: the new cardinality estimator doesn’t reduce bad parameter sniffing, because most of the time bad sniffing comes from the three causes mentioned before, out of date statistics, data skew, and uneven data distribution. The new cardinality estimator doesn’t change those fundamentals.