Aug 12 2013

You should NOT attend the PASS Summit

If you had asked me, prior to today, whether I would type or say those words, I would have laughed right at you.

But then I saw this question on Ask SQL Server Central. It's from a college student, not yet twenty-one, who was considering paying his own way to the Summit (assuming it's a guy, since their handle is 'Eagle Scout') and wondered if it would be worth it. It pains me to say it, but I suggested that he not go.

Don’t get me wrong. I think the PASS Summit is probably the single greatest resource you have to advance your career. Where else can you go to get that broad a choice in training? Where else can you go to get that many of the leaders of our industry, specializing in all aspects of SQL Server and the Microsoft stack, sharing their knowledge? Where else can you expect to have extended networking opportunities with those same leaders and all our peers? Easy answer right? Nowhere. It’s a unique place and a unique opportunity. So I’m nuts for telling this guy not to go, right? I don’t think so.

This case is unique. We're talking about someone who is not yet employed in this industry, still in school, paying his own way out to this event. Not just paying for the event, but for the travel, the lodging, the food… It's not a small expenditure. In fact, it carries a pretty hefty price tag. For someone in his situation, I couldn't, in good conscience, recommend it.

What about the rest of you? I’ve heard people say that it costs too much, that it’s not worth it, that you can get the same information for free online, that there are better teachers elsewhere. So you don’t really need to spend the money either, right?

Right. Don’t go. Don’t spend that money. Seriously…

That way, those of us who recognize the unique value this conference offers will have a major leg up on the rest of you in the job market. Not that there aren't jobs enough for everyone. But the better jobs, the exciting and interesting jobs, those are going to go to people who are investing in their future by learning more, networking more, and striving to achieve more. In short, the PASS Summit attendees who show up and take part. I can't tell you how many people I know who are currently working at what they consider their dream job and who point to the PASS Summit as the single biggest factor in landing it. I do.

Stay home. Read a few blog posts. Don't bother networking through events like SQL Saturday or your local user group. Certainly don't travel to Charlotte and sit through sessions by industry leaders in order to ask them engaging and pertinent questions that will immediately help you improve your company's bottom line by improving the speed, safety, or accessibility of its data. Save that money. Save that time. After all, you're already in your dream job. Right?

Aug 07 2013

24 Hours of PASS Streaming Available

If you missed the 24 Hours of PASS Summit 2013 Preview, you missed some excellent sessions. I watched a few, but not all. But now, thanks to the wonders of modern technology, I can go back and catch the ones I missed. You can too.

I’d like to call out my session, Be a Successful DBA in the World of Cloud and On-Premises Data. I know that my fellow DBAs and database developers are largely dismissive of Azure. I get it. But I really think you’re missing out on this. It’s another excellent tool in your toolbox that you need to start taking advantage of. Little things like getting a quick and easy installation of SQL Server 2014 or Windows Server 2012 R2 up and running. Helpful things like the ability to quickly prototype applications and databases. The possibilities just keep expanding in and around this tool set. You can take advantage of the tools you’re working on mastering too.

Please, check out my presentation from 24HOP. I think I address many of the issues that are keeping people from exploring and understanding this new and growing technology. And, if you want to get a leg up on learning it, I’m putting on an all day pre-conference seminar at this year’s PASS Summit in Charlotte. Check out and register for Thriving as a DBA in the World of Cloud and On-Premise Data.

Jul 01 2013

Getting Started With SQL Server 2014 the Easy Way

You know you want to at least take a look at the new Community Technology Preview (CTP) of SQL Server 2014. I don't blame you either. I want to spend hours swimming through it too. But, you're thinking to yourself, "Heck, I'd have to download the silly thing, provision a new VM, walk through the install… Nah. Too much work." I don't blame you. I found myself on the road the day the software was released, so I was going to attempt to do all that work on a hotel wireless system. In short, I was going to have to wait; no options. Or were there? Actually, there is a much easier option: Azure Virtual Machines.

And no, it's not just that I can get a Windows Azure VM ready to go faster than I can a local one (although, depending on just how I set up and maintain my local servers, that might be true). No, it's that I can immediately get a copy of SQL Server 2014, no download required. It's that I can, within about five minutes, have a server up and running with SQL Server 2014 installed and ready to go. How? Microsoft maintains a gallery of images for quick setups of Azure Virtual Machines, and a couple of those images include SQL Server 2014.


To get started on this, and not pay a penny, you need to make sure that you meet the MSDN requirements listed at that link. I know that some people won't, and I'm sorry. However, once you get your MSDN subscription set up and linked to an Azure account, you're ready to go. Throughout this post I'll refer to paying for Azure; if you're running through MSDN, just substitute "using up my credits" for "paying" and it should all make sense.

First, click on the Virtual Machines icon.

This will show a list of VMs on your account, if any. We're going to add one, so click on the little plus sign in the lower left corner of your screen.

Clicking on the New button gives you options. Reading the screen, you can tell that you have a list of different services that you can add: Compute, Data Services, App Services, Networks, and Store. By default, if you've opened this listing from the VM list, you'll already have Compute selected. That provides a second list of options: Web Site, Virtual Machine, Mobile Service, and Cloud Service. Again, if you've opened these options from the VM list, Virtual Machine will already be selected; if not, make sure it is. The final two options are Quick Create and From Gallery. For our purposes we're going to use the Gallery, but let me first explain the difference. Your licenses for SQL Server, Windows Server, and most Microsoft products (so far as I know) are transferable between Azure and your on-premises machines. This means you can create an empty virtual machine on Azure and then load your software onto it, and you don't pay additional licensing fees. But you can also use the images in the Gallery. Here you can set up a VM for whatever is listed, and you get those machines and their software for an additional cost, but with no additional license required. In short, you pay a little bit more to get access to SQL Server, or what have you, without having to buy an additional license. It's a great deal.


Worry about paying for it all later. We're going to click on the From Gallery selection. This opens a new window showing all the different possibilities you have for your VMs. You can install anything from Ubuntu to SharePoint to several different flavors of SQL Server. You can even add your own Hyper-V images to this listing (although that does mean paying for licensing on any created VMs). Scroll down until you see SQL Server 2014 CTP1. On my listing there are currently two copies: one that runs on Windows Server 2012 and one that runs on Windows Server 2012 R2. If you want a Start button on your screen, pick the second one. You'll then be walked through the wizard to get this thing created. Click on the right arrow at the bottom of the screen after selecting a VM.


Now you need to supply a machine name. It needs to be unique within your account. You'll also have to pick the size of machine you want. This, along with the size of the data you store, is what you pay for. You'll need to decide how you want to test 2014, small or large. For my simple purposes, exploring 2014, I'm going with Medium. That currently means 2 cores and 3.5GB of memory. You can go all the way up to 8 cores and 56GB of memory, but you will be paying for that, just so we're clear. You also have to create a user and password for the system. Strict password rules are enforced, so you'll need a special character and a number in addition to letters.


You need to configure how this machine will behave on the network. You need to supply it with a DNS name, your storage account, and your region. I would strongly recommend making sure that your servers and your storage are all configured for exactly the same region. Otherwise, you'll pay extra for the data moving between regions, and you may also see somewhat slower performance.


Finally, you can, if you want to, add this server to an Availability Group. For our test purposes we'll just leave that set to None. But you can make this part of an AG in Azure, or take a mixed, hybrid approach with it acting as an async secondary for your on-premises servers. Oh yes, the capabilities are pretty slick. I would also suggest leaving PowerShell remoting enabled so that you can take advantage of all it offers in terms of managing your VMs and the processes running within them.


Click on the check mark and you're done. You'll go back to the VM window, and at the bottom of the screen you'll see a little green icon indicating activity. It will take about five minutes for your VM to be created. While it's running you can, if you choose, watch the process, but it's a bit like watching paint dry. You'll see the steps it takes to create your machine and provision it with the OS and SQL Server version you chose.

Once it's completed, you'll have a VM with a single disk, ready to go. But you need to connect to it. Remember that user name and password? We're going to use them to create a Remote Desktop connection to the server. When the process is completed, the server will be in a Running state. Click on that server in the Management Portal, then click on the Dashboard selection at the top of the screen. This will show you some performance metrics about the machine and, at the bottom, give you some control over what is happening. The main thing we're looking for is the Connect button.

Click on that button and you will download an RDP file from the Azure server. Open that file (and yes, your system may give you security warnings; click past them) and you'll arrive at a login screen configured for your Azure account. That's not what you want. Instead, click on "Use another account." Then, in that window, type in your machine name and user name along with the password. Once you click OK, you'll be in an RDP session on your SQL Server 2014 CTP1 VM. Have fun!


Remember, you can stop the VM when you're not using it, and you stop paying for it (or using up your MSDN credits). Just go to the dashboard and use the "Shut Down" option at the bottom of your screen.

If you found this useful and you’d like to learn a lot more about the capabilities of using Azure within your environment, I’d like to recommend you sign up for my all day pre-conference seminar at PASS 2013 in Charlotte. I’ll cover this sort of thing and one heck of a lot more about the future of being a DBA working in the hybrid environment of Azure and on-premises servers.

Nov 09 2012

PASS Summit Day 3: Dr. David DeWitt

Two quick points: I'm putting this blog together using the Surface… ooh… and this isn't a keynote, but a spotlight session at the Summit. Still, I thought I would live blog my thoughts, because I've done it every time Dr. DeWitt has spoken at the Summit.

Right off, he has a slide with a little brain character representing himself.

But we're talking PolyBase, and futures. This is basically a way to combine Hadoop's unstructured NoSQL data with structured storage within SQL Server. Mostly this is within the new Parallel Data Warehouse, but it's coming to all of SQL Server, so we need to learn this. The information ties directly back to what was presented at yesterday's keynote.

HDFS is the file system. On top of that sits a framework for executing distributed, fault-tolerant algorithms. Hive and Pig are the query languages. Sqoop is the package for moving data, and Dr. DeWitt says it's awful and he's going to tell us why.

HDFS was based on the Google File System. It supports thousands of nodes and it assumes hardware failure. It's aimed at small numbers of large files: write once, read multiple times. Its limitations are caused by the replication of the files, which makes querying the information from a data warehouse more difficult. He covers all the types of nodes that manage HDFS.

MapReduce is used as a framework for accessing the data. It splits the big problem into several small problems and puts the work out onto the nodes. That's Map. Then the partial results from all the nodes are combined back together through Reduce. MapReduce uses a master, the JobTracker, and slaves, multiple TaskTrackers.

Hive is a data warehouse solution for Hadoop that supports SQL-like queries. It has somewhat performant queries; by "somewhat" he means that the PDW is ten times faster.
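For a sense of what "SQL-like" means here, this is the general shape of a Hive query. The table and column names are invented for illustration, not taken from the session:

```sql
-- Hypothetical Hive table declared over raw tab-delimited files in HDFS
CREATE EXTERNAL TABLE clicks (
    user_id STRING,
    url     STRING,
    hit_ts  BIGINT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/clicks';

-- Looks like ordinary SQL, but Hive compiles it into MapReduce jobs
SELECT user_id, COUNT(*) AS hits
FROM clicks
GROUP BY user_id;
```

The familiar syntax is the draw; the MapReduce jobs it compiles down to are where the "somewhat" in "somewhat performant" comes from.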

Sqoop is the library and framework for moving data between HDFS and a relational DBMS, and it serializes access to Hadoop. That's the purpose of PolyBase: to get parallel execution across the Hadoop HDFS instead. Sqoop breaks up a query through the Map process. Then Sqoop runs two queries: a count, and then a reworked version of the original query, a pretty scary one including an ORDER BY clause. This causes multiple scans against the tables.
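To make that concrete, here's a rough sketch of the pattern he described. The actual SQL Sqoop generates varies by database, and these table and column names are made up:

```sql
-- Step 1: Sqoop first counts the qualifying rows...
SELECT COUNT(*) FROM dbo.Sales WHERE SaleYear = 2012;

-- Step 2: ...then each map task re-runs the original query, fully
-- sorted, and carves out its own slice of the ordered results.
-- Every task repeats the scan and the sort.
SELECT SaleID, CustomerID, Amount
FROM dbo.Sales
WHERE SaleYear = 2012
ORDER BY SaleID
OFFSET 1000000 ROWS FETCH NEXT 500000 ROWS ONLY;
```

One full sort per map task, multiplied across all the tasks, and you can see why he calls it awful.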

Dr. DeWitt talks through the choices for figuring out how to put together the two data sets, structured and unstructured. The approach taken by PolyBase is to work directly against HDFS, ignoring where the nodes are stored. Because it's all going through their own code, they're also setting up to handle text and other data streams.

They're parallelizing access to HDFS and supporting multiple file types. Further, they're putting "structure" on "unstructured" data.

By the way, I’m trying to capture some of this information, but I have to pay attention. This is great stuff.

The DMS, the component Microsoft uses to manage the jump between HDFS and SQL Server, is just flat-out complicated. But the goal was to address the issues above, and it does.

He's showing the direction they're heading in. You can create nodes, and objects within the nodes, through SQL-like syntax. Same thing with the queries; they'll be using the PDW optimizer. Phase 2 modifies the methods used.
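The syntax on the slides looked roughly like this. Treat it as a sketch of the direction rather than the shipping feature; the object names and WITH options here are invented:

```sql
-- Declare a table whose data actually lives out in HDFS
CREATE EXTERNAL TABLE dbo.WebLogs (
    LogDate DATETIME,
    Url     VARCHAR(500)
)
WITH (
    LOCATION = 'hdfs://headnode:8020/logs/web/',
    FORMAT   = 'DELIMITED'
);

-- Then query it with plain T-SQL; the PDW optimizer decides how
-- the Hadoop side of the work actually gets carried out
SELECT TOP 10 Url, COUNT(*) AS Hits
FROM dbo.WebLogs
GROUP BY Url
ORDER BY Hits DESC;
```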

I’m frankly having a little trouble keeping up.

It's pretty clear that the PDW in combination with HDFS allows for throwing lots and lots of machines at the problem. If I were in the situation of needing to collect and process seriously huge data, I'd be checking this out. The concept is to use MapReduce directly, but without requiring the user to do that work; instead they use T-SQL. It's seriously slick.

By the way, this is also making yesterday’s keynote more exciting. That did get a bad rap yesterday, but I’m convinced it was a great presentation spoiled by some weak presentation skills.

All the work in Phase 1 is done on PDW. Phase 2 moves the work, optionally, to HDFS directly, but still allows for that to be through a query.

Dr. DeWitt's explanation of how the queries are moved in and out of PDW and HDFS is almost understandable, not because he's not explaining it well, but because I'm not understanding it well. But seeing how the structures logically handle the information does make me more comfortable with what's going on over there in HDFS.

I’m starting to wonder if I’m making buggy whips and this is an automobile driving by. The problem is, how on earth do you get your hands on PDW to start learning this?



Nov 08 2012

PASS Summit 2012 Day 2: Keynote

Welcome to Day 2 of the PASS Summit!

It’s been a very exciting event so far. Today I’m presenting two sessions, one on tuning queries by fixing bad parameter sniffing and one on reading execution plans. Please stop by, or watch the one on execution plans on TV as PASS is livestreaming events all day long on SQL TV (which is what I used to call Profiler).

The intro video, which can be good or goofy, was really good this year. They had people from all over the world talking in their native languages, making the point that PASS is a global community. It really is.

Doug McDowell is giving us the finance and governance information for the PASS organization. I find this boring and vital at the same time. We need to know how this organization is managed if we care about it. And, let's be honest, this organization has changed many of our lives for the better, through the family we've met, the jobs we've gained, and the knowledge that has been shared with us. PASS has doubled its expenses in two years in order to support all the stuff it does: SQL Saturday, Rally, 24 Hours of PASS, etc. It's amazing.

We have three new board members: Wendy Pastrick, James Rowland-Jones, and Sri Sridharan. Congrats, guys. You're crazy for taking part, but thanks for everything you do.

Next up is Tom LaRock, another board member and a good friend. The PASSion awards are great; they go to the people who are doing crazy, sick work for the community. Mentions go to Amy Lewis and Jesus Gil, but the award went to Jen Stirrup. Well deserved. She is so active and so passionate; it's amazing. Congrats, Jen, and thanks for all you do.

PASS Board members are gathering feedback from the community. If you have an idea, talk to a board member.

Don’t forget to attend the Women in Technology Luncheon. Men and women can attend.

Quentin Clark is now up for the Microsoft part of the keynote. We're seeing a bunch of people talk about how great SQL Server 2012 is. It really is great. He's taking off on the concept of the data lifecycle, which is a pretty interesting topic. He's talking about how big data is getting both really, really cool and absolutely frightening: hotels tracking guests within their building, coupons and ads based on the person standing in the supermarket, things like that. People are actually to the point where we can do things like this. It's really cool. But wow, that is going to build out some seriously large data sets. The idea is to make gathering, interpreting, and sharing data easy, simple, and very, very fast.

We're starting off with data management. The combination of SQL Server and Hadoop is pretty slick. It's PolyBase, the new technology announced yesterday. But please, presenters, don't leave teeny-tiny fonts up on screen while you talk. Zoom in; the room can't see it. However, that information was very interesting. I like seeing how you can put these things together. Next up is discovering and refining data. We're going straight into Excel. That's the bad news. The good news: Access is dying. YAY!

So the demo was poorly delivered, but very well structured. We got a good idea of how exactly we can do this with the new technology. There is a lot of setup in the management area and in Excel to prep for what they're calling the "ah ha" moment. In other words, this is making your data more and more available, but the work to set it up is absolutely non-trivial. The structures get built out in really interesting ways, especially all the model work you'll be doing in SSAS in order to prep this data. They're showing how the Azure Marketplace hooks in. Once all of it is put together, an incredibly difficult task, you can really poke at the data with these new tools. It's exciting stuff. It's a shame that the presenters sucked all the life out of it.

Nov 08 2012

PASS Summit 2012 Day 2: KILT DAY!

Welcome to the fourth Kilt Day at the SQL PASS Summit. It might be a little silly, but it’s fun. It’s also Women in Technology day with the WIT Luncheon. Guys are invited.

A short word about the bloggers' table. Last year we were… a little loud. So this year we were cautioned… well, more like told to be quiet or they'd take away our toys. I agree with the intent of the message: please keep it down. But the delivery… it hurt PASS at the bloggers' table and upset people. As I was reminded last night by a dear, dear friend whom I accidentally hurt, how you deliver a message is as important as the message you deliver.

But, that’s OK. Let’s learn from our mistakes, grow & move on.


Last night, I attended karaoke with the fine people from PragmaticWorks. Thanks guys for a great event and for letting me in the door.

And did I mention, IT'S KILT DAY!

Nov 07 2012

PASS Summit 2012 Day 1

We’re off and running here at the PASS Summit.

New this year is live streaming all day.

Bill Graziano is introducing the Summit. More importantly, he's introducing PASS. Further, he's introducing speakers to everyone; he doesn't mean just speakers at the Summit, but anyone who has spoken at a SQL Saturday or a user group, and it was a scarily large group of people. PASS has created a new web site to make it easier to find local Chapters. Track one down. On the one hand, it's weird that we're sitting at the PASS Summit and introducing the PASS organization, but I think they're right to do it. It's a great organization and I'm always surprised at how many people don't know about it.

Bill's big announcement is the all-new PASS Business Analytics Conference, which will take place in Chicago in April 2013. More and more people have gotten good at collecting data, but we really do need to work harder on making use of that data.

There are 300 Microsoft engineers this year at the PASS Summit. That’s a serious amount of brain power. No wonder it’s been so warm here in Seattle. That much brain power is going to warm things up considerably. I’m planning on going and talking to these guys. You should too.

Nice work Bill.

Ted Kummert is up for the keynote. He’s showing the team from SQL Server 2012 at their release party. That’s a large group of extremely smart people. I appreciate all they’ve done. SQL Server 2012 is an excellent product. Nice job kids. Don’t get cocky.

The message: big data. Because let's face it, we're getting bigger and bigger data all the time. It's the harder problem and the sexier problem. However, a lot of us are still working with small data sets and struggling; don't forget about us. But the new in-memory database that they're putting out is pretty slick. That's going to move things very quickly. Plus, as a geek, it gives us more to learn. Wonderful. The new functionality is going to be released with the next version of SQL Server, and it's going to be a part of the system. That's pretty cool. Of course, it'll probably be Enterprise only, but if you need it, it'll be worth it. This is going to make a big difference in performance tuning. It'll open up additional opportunities.

Oooh. Management Studio looks radically different. He's working through a web page. Fair warning: it's not the finished experience, but it's very interesting that this might be the direction they're heading in. It also resembles Windows 8 a little. The demos look pretty cool. He improved performance thirty-fold by simply moving everything into memory. I/O latching and locks just go away, and performance shoots through the roof. I need to get my hands on that. We all do, probably.

Seeing how columnstore works within this type of hardware and software is pretty amazing too. Plus, in the upcoming release you can update it and you can cluster it. We're moving into a new world, people.

But, don’t forget, this requires HUGE hardware, so it won’t be cheap, at all. Plus, it’s not magic. You’ll still be able to completely mess it up. You’ll still be able to write horrible queries or make poor choices in where to apply indexes. TANSTAAFL always applies.

HDInsight is the new non-relational storage engine, based on Hadoop. This is some cool stuff. Plus we're seeing a newer and bigger Parallel Data Warehouse. I love how things are expanding out so quickly.

The most interesting thing I saw was a new UI for managing SQL Server that was web based. It’s pretty slick, but I’m wondering where that’s going to go.

They also introduced a new thing called PolyBase. It provides a common interface to allow queries across Hadoop and structured data from a single query. That's going to be a big deal, but, like everything else, it suggests a lowest-common-denominator approach. Performance? As Brent Ozar tweeted, if you liked linked servers, you'll love PolyBase. However, since it's only in the Parallel Data Warehouse, the hardware may just make any problems go away.

I just can’t get excited about PowerPivot. I agree that it’s a cool thing for business people, but I just can’t get into it. My failing. I know. However, the spatial data display within Excel… that’s slick stuff.

But this has been an interesting and exciting keynote. The new technology coming up from Microsoft is really cool. I think we’re getting a lot of new opportunities to do new things with our data.

Nov 05 2012

PASS Summit 2012: Day -3

The Summit proper starts on Wednesday, but the Summit really starts at registration. I left a little early from work setting up for SQL in the City: Seattle in order to run up the hill and get to the convention center around the time that it opened. Why? Because I get to meet my SQL Family for the first time this week. Lots of people were there, and it really is like a family reunion: smiles, hugs, catching up, stories. It's the best way to launch the event. Not a lot to report, but I just had to share. I love my SQL Family.

Oct 17 2012

SQL In The City: Seattle

If you missed all the great speakers on the five city tour of SQL in the City, don’t despair. Many of the same people will be back at SQL in the City in Seattle. It’s scheduled on Monday before the PASS Summit proper starts, so if you’re looking to get your learn on early and you can’t sign up for a pre-con, this is a great, free, opportunity to pick up some additional instruction. Check out the list of speakers. It’s going to be an event worth attending.

I’ve seen the early drafts of the feedback forms from the prior five events. People really seem to enjoy this slightly different approach. In short, Red Gate puts on a heck of a show.

During the five city tour, I was able to do three different presentations, two focused on improving database development processes and one on picking up some of the more obscure monitoring metrics. Based on the feedback, these went over well.

One of the biggest hits is my, for want of a better term, sales pitch for a sandbox development process. From talking to people and reading through the feedback forms, it seems that large numbers of people are hitting some pretty common issues while attempting to develop databases. DBAs, well intentioned, and right, though we may be, are, to a degree, standing in the way of developers going as fast as they can during development. In this session I spend an hour outlining how I think we, the DBAs, can fix that problem. It may sound like a developer-focused session, but I'm really hoping to get a room full of DBAs and convince them all to be on the side of developers and the development process, in order to help deliver more code faster into our production environments, but to do it in a safe and secure manner. Developers are welcome as long as they bring everything they learn back to their DBA team. But it all starts at the sandbox. Come to my session and I promise to explain it in full.

And, of course, this is another chance to meet, talk to, interact with, and have some laughs with your #sqlfamily. If you’re in Seattle anyway, stop by, learn something, talk to someone, have a little fun.

Sep 24 2012

Interviewing a DBA

I'm not a fan of trivia-style interview questions. Yes, I ask a few, because you have to in order to immediately eliminate the completely unqualified applicants. Even those types of questions, in my opinion, need to be focused on concepts and not syntax. The reason we have Books Online with SQL Server is that you shouldn't have to memorize every possible command along with all its parameters. Want to know how to write a MERGE query? Look it up. What does a MERGE query do? That you ought to know. I think concepts are important. Questions about the recovery models within SQL Server aren't trivia about the system; they're trying to get at your understanding of how point-in-time recovery works.
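To be clear about the distinction: I don't care whether you can recite this syntax from memory, but you should be able to tell me that a statement like the one below synchronizes a target table with a source in a single pass, inserting, updating, and deleting as needed (the table names here are obviously just for illustration):

```sql
MERGE dbo.Customers AS tgt
USING dbo.Customers_Staging AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN                 -- row exists in both: update it
    UPDATE SET tgt.Name = src.Name
WHEN NOT MATCHED BY TARGET THEN   -- row only in source: insert it
    INSERT (CustomerID, Name)
    VALUES (src.CustomerID, src.Name)
WHEN NOT MATCHED BY SOURCE THEN   -- row only in target: remove it
    DELETE;
```

Knowing what each clause does, and when a MERGE is appropriate at all, is the concept; the exact punctuation is what Books Online is for.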

I don’t really like posting interview questions. And most of the time when I’ve seen interview questions posted (even mine), they’re pretty trivial stuff that doesn’t really get to whether or not the person you’re trying to hire is a good fit for the position and your team. I also don’t like posting interview questions because some people will try to use them to study up and attempt to BS their way into a position they frankly don’t deserve and haven’t earned. SQL Server knowledge and experience comes from using it to solve problems out in the world and protecting the information generated by a business.

That’s why I love this question. And I don’t mind sharing it with you because you can’t really memorize an answer to it:

You get a call from one of the business people. They tell you that the database is running slow. What do you do?

This is completely and utterly open-ended. It can go anywhere; in fact, it's going to go where you lead it. For example, you could say, "I first look at the Windows server error logs." OK, that's fine (several people I've interviewed started there). What indications would you find there that the server is running slow, or what would you find there to show why the server is running slow? Suddenly, maybe you don't want to look at the server's error logs any more, or maybe you do. But you get the idea. There is no single correct answer here. There are, however, lots of very problematic paths, and I'm going to let you go down them. I had one guy insisting that the very first thing he needed to do after the phone call was look at the application code to see the method used to make the call to the database. We spent quite a bit of time exploring why this seemed to be the best approach to him. Was it? I'm not saying; no hints on this one. Your answer to this question is your answer, and that's why I love it.

Further, as we explore this question, and I’ve spent anywhere from 10 minutes up to an hour working on it as part of an interview, I’m also getting to see how you deal with problematic situations, what your logic chain looks like, what your understanding of SQL Server is, and, most importantly, how you fit into the team. Because with an open-ended question like this, we get to talk. We’re way beyond silly trivia contests now.

Before you think this is unfair to people who aren’t performance experts, fine, let’s talk about what happens when you get an alert that the server is offline. Not a systems person? OK, we just got an alert that a database consistency check failed, now what? See, the point is to go on an adventure where we explore your knowledge and approach. I just have to work hard to make sure we stay somewhat on topic so that I can assess your knowledge and skill level.

Now, if I approach any of these questions and your response is to reject them out of hand, something I’ve run into, then we’re done. I’m not going to focus on trivia, which is how lots of people prep for interviews. I expect you to have concepts, process, logic, and methods available from your time studying and learning. So if we interview, be ready for this exploration, not a trivia contest. And the only way to really prepare is to get experience and knowledge by actually working with SQL Server.

Oh, and sometimes, I ask questions or make statements that are wrong. Sometimes it’s on purpose. Other times, it’s because I screwed up or was ignorant. But you can’t sit there agreeing with me. You better be paying attention because I might be testing you further.

This type of question is just too perfect for understanding how much you know about SQL Server.

Want to start to prepare for answering this kind of question? I’ve got an opportunity for you. At the PASS Summit 2012 this year, I’ll be running an all-day pre-conference seminar called Query Performance Tuning: Start to Finish. In it, I’ll cover quite a bit of what might make it possible for you to answer this question should you be presented it in an interview. No, I’m not guaranteeing you’ll answer it correctly. I’m just offering a chance to prepare. Sign up for the Summit today. There’s still a discount in place that can help you offset the cost of the seminar until the 30th of September.