Aug 21 2013

Speaking this Fall

I’ve been at home through the whole month of August, not resting but working hard, and now it’s time to get on the road again.

First up, it’s off to San Diego for two events. On Friday, September 20th, Red Gate (my Lord and Master) is hosting a free afternoon session of training. Steve Jones (b|t) and I will be walking through several sessions oriented toward DBAs and the skill sets they need to get things done. It’s a free event, training done the Red Gate way, so register here, get some free training and a little networking, and finish up with a frothy beverage.

Then it’s off to SQL Saturday San Diego the next day, Saturday, September 21st. As I write this, they haven’t posted the schedule, so I’m not sure what session I’ll be doing, but come on down anyway, especially if you’re in the area for SQL in the City the day before (or vice-versa).

In October, things heat up quite a bit. We’re taking SQL in the City on tour through the US again. Events will be in Pasadena on the 9th, Atlanta on the 11th, and Charlotte on the 14th. These are all-day events covering a number of different topics relating to development, deployment and DBA work. And, it’s all done the Red Gate way. These events regularly fill up and we have to put people on waiting lists. So, two things: if you want to go, register now; if you’re registered and can’t go, please let us know so we can give your ticket to someone else. I love these events. They’re great for networking and learning, and we get a lot of input from people who are using, or even just interested in, Red Gate tools. I’m presenting three sessions: one on backups, one on database deployment, and another one… that I’m still working on, OK? If you’re in Charlotte for the PASS Summit, we’ll be there Monday, so swing on by.

That leads directly into the PASS Summit 2013, October 15-18. What to say about PASS? Biggest, best SQL event. Period. SQL Family, SQL Kilt, pre-cons, sessions, karaoke, networking, parties, vendors. You know the drill. I have a full-day pre-con on the Tuesday before the Summit begins, all about Azure and relational databases. I’m presenting three times during the Summit: a spotlight session on Tuning Windows Azure SQL Databases, a session with Dandy Weyn (b|t) about being the DBA of the Future, and a group session with a bunch of dear friends on Lessons on Working from Home. I know this is an expensive proposition, but if you can go, you should. It really does build your skill set and your network.

Things slow down a little in November. Right now I’m speaking twice in Dallas: once at the SQL Saturday there and once at a pre-conference seminar for that SQL Saturday event. This time we’re talking all about query tuning. If you’re in the area, please register.

And that’s not all. There’s more to come.

Aug 15 2013

Azure and Your MSDN Account

I’ve heard over and over again that the reason people don’t want to learn Azure, to explore it, to understand where it’s applicable to them and their business (and where it’s not, because it does have well-defined limitations), is that they don’t want to pay for it.

Do you have an MSDN account? What? You do? Then you have access to a credit in Azure and an account with a spending limit. Your credit level depends on what level of MSDN you currently subscribe to, but who cares? You not only will NOT pay anything to play in Azure, you will NEVER pay anything. You can’t go beyond your limit.

Want some more incentive (not that you should need any now)? Cool. Set up your Azure account with MSDN and you can win an Aston-Martin V-8. Well, you could, but I’m entered too, so I’m going to win (I work for a UK-based company, the car is red, I’ve been working with Azure for a long time… come on, it’s mine). Although, you never know.

If you do get registered for Azure and then want to learn more, faster, especially the relational aspects of Windows Azure SQL Database and SQL Server on Azure VMs, I am putting on an all day pre-conference seminar on exactly those topics at this year’s PASS Summit. Register here.

We can look at pictures of my new Aston-Martin.

Aug 12 2013

You should NOT attend the PASS Summit

If you had asked me, prior to today, whether I would ever type or say those words, I would have laughed right at you.

But then I saw this question on Ask SQL Server Central. It’s from a college student, not yet twenty-one, who was considering paying his own way to the Summit (I’m assuming it’s a guy, since the handle is ‘Eagle Scout’) and wondered if it would be worth it. It pains me to say that I suggested he not do it.

Don’t get me wrong. I think the PASS Summit is probably the single greatest resource you have to advance your career. Where else can you go to get that broad a choice in training? Where else can you go to get that many of the leaders of our industry, specializing in all aspects of SQL Server and the Microsoft stack, sharing their knowledge? Where else can you expect to have extended networking opportunities with those same leaders and all our peers? Easy answer, right? Nowhere. It’s a unique place and a unique opportunity. So I’m nuts for telling this guy not to go, right? I don’t think so.

This case is unique. We’re talking about someone who is not yet employed in this industry, still in school, paying his own way out to this event. Not just paying for the event, but for the travel, the lodging, the food… It’s not a small expenditure. In fact, it carries a pretty hefty price tag. For someone in his situation, I couldn’t, in good conscience, recommend it.

What about the rest of you? I’ve heard people say that it costs too much, that it’s not worth it, that you can get the same information for free online, that there are better teachers elsewhere. So you don’t really need to spend the money either, right?

Right. Don’t go. Don’t spend that money. Seriously…

That way, those of us who recognize the unique value this conference offers will have a major leg up on the rest of you in the job market. Not that there aren’t jobs enough for everyone. But the better jobs, the exciting and interesting jobs, those are going to go to people who are investing in their future by learning more, networking more and striving to achieve more. In short, the PASS Summit attendees who show up and take part. I can’t tell you how many people I know who are currently working at what they consider to be their dream job and who point to the PASS Summit as the single biggest factor in landing that job. I do.

Stay home. Read a few blog posts. Don’t bother networking through events like SQL Saturday or your local user group. Certainly don’t travel to Charlotte and sit through sessions by industry leaders in order to ask them engaging and pertinent questions that will immediately help you improve on your company’s bottom line by improving the speed, safety or accessibility of its data. Save that money. Save that time. After all, you’re already in your dream job. Right?

Aug 07 2013

24 Hours of PASS Streaming Available

If you missed the 24 Hours of PASS Summit 2013 Preview, you missed some excellent sessions. I watched a few, but not all. But now, thanks to the wonders of modern technology, I can go back and catch the ones I missed. You can too.

I’d like to call out my session, Be a Successful DBA in the World of Cloud and On-Premises Data. I know that my fellow DBAs and database developers are largely dismissive of Azure. I get it. But I really think you’re missing out on this. It’s another excellent tool in your toolbox that you need to start taking advantage of. Little things like getting a quick and easy installation of SQL Server 2014 or Windows Server 2012 R2 up and running. Helpful things like the ability to quickly prototype applications and databases. The possibilities just keep expanding in and around this tool set. You can take advantage of the tools you’re working on mastering too.

Please, check out my presentation from 24HOP. I think I address many of the issues that are keeping people from exploring and understanding this new and growing technology. And, if you want to get a leg up on learning it, I’m putting on an all-day pre-conference seminar at this year’s PASS Summit in Charlotte. Check out and register for Thriving as a DBA in the World of Cloud and On-Premises Data.

Jul 01 2013

Getting Started With SQL Server 2014 the Easy Way

You know you want to at least take a look at the new Community Technology Preview (CTP) of SQL Server 2014. I don’t blame you either. I want to spend hours swimming through it too. But, you’re thinking to yourself, “Heck, I’d have to download the silly thing, provision a new VM, walk through the install… Nah. Too much work.” I don’t blame you. I found myself on the road the day the software was released, so I was going to attempt to do all that work on a hotel wireless system. In short, I was going to have to wait, no options. Or was there? Actually, there is a much easier option: Azure Virtual Machines.

And no, it’s not just that I can get a Windows Azure VM ready to go faster than I can a local one (although, depending on just how I set up and maintain my local servers, that might be true). No, it’s that I can immediately get a copy of SQL Server 2014, no download required. It’s that I can, within about five minutes, have a server up and running with SQL Server 2014 installed and ready to go. How? Microsoft maintains a gallery of images for quick setups of Azure Virtual Machines. A couple of those images include SQL Server 2014.

To get started on this, and not pay a penny, you need to make sure that you meet the MSDN requirements listed at that link. I know that some people won’t, and I’m sorry. However, get your MSDN subscription set up and link it to an Azure account, and you’re ready to go. Throughout this post I’ll refer to paying for Azure; if you’re running through MSDN, just substitute “using up my credits” for “paying” and it should all make sense.

First, click on the Virtual Machines icon.

This will show a list of VMs on your account, if any. We’re going to add one, so we’ll click on the little plus sign in the lower left corner of your screen.

Clicking on the New button gives you options. Reading the screen, you can tell that you have a list of different services that you can add: Compute, Data Services, App Services, Networks and Store. By default, if you’ve opened this listing from the VM list, you’re going to have Compute selected already. That provides a second list of options: Web Site, Virtual Machine, Mobile Service and Cloud Service. Again, if you’ve opened these options from the VM list, you’re going to have Virtual Machine selected. If not, make sure that is what gets selected. The final two options you have are Quick Create and From Gallery. For our purposes we’re going to use the Gallery, but let me first tell you what the difference is. Your licenses for SQL Server, Windows Server, and most Microsoft products (so far as I know) are transferable between Azure and your on-premises machines. This means you can create an empty virtual machine on Azure and then load your software onto it. You don’t pay additional licensing fees. But you can also use the images in the Gallery. Here you can set up a VM for whatever is listed, and you get those machines and their software for an additional cost, but with no additional license required. In short, you can pay a little bit more to get access to SQL Server, or what have you, without having to buy an additional license. It’s a great deal.

Worry about paying for it all later. We’re going to click on the From Gallery selection. This opens up a new window showing all the different possibilities you have for your VMs. You can install anything from Ubuntu to SharePoint to several different flavors of SQL Server. You can even add your own Hyper-V images to this listing (although that does mean paying for licensing on any created VMs). Scroll down until you see SQL Server 2014 CTP1. On my listing currently, there are two copies: one that runs on Windows Server 2012 and one that runs on Windows Server 2012 R2. If you want a Start button on your screen, pick the second one. You’ll then be walked through the wizard to get this thing created. Click on the right arrow at the bottom of the screen after selecting a VM.

Now you need to supply a machine name. It needs to be unique within your account. You’ll also have to pick the size of machine you want. This, along with the size of the data you store, is what you pay for. You’ll need to decide how you want to test 2014, small or large. For my simple purposes, exploring 2014, I’m going with Medium. That currently means 2 cores and 3.5GB of memory. You can go all the way up to 8 cores and 56GB of memory, but you will be paying for that, just so we’re clear. You also have to create a user and password for the system. Strict password rules are enforced, so you’ll need a special character and a number in addition to your string.

You need to configure how this machine will behave on the network. You need to supply it with a DNS name, your storage account, and your region. I would strongly recommend making sure that your servers and your storage are all configured for exactly the same region; otherwise, you pay extra for moving data between regions, and you may see somewhat slower performance.

Finally, you can, if you want to, add this server to an Availability Group. For our test purposes we’ll just leave that set to None. But you can make this a part of an AG in Azure, or take a mixed, hybrid approach with the VM as an async secondary for your on-premises servers. Oh yes, the capabilities are pretty slick. I would also suggest leaving PowerShell remoting enabled so that you can take advantage of all that it offers in terms of managing your VMs and the processes running within them.

Click on the check mark and you’re done. You’ll go back to the VM window, and at the bottom of the screen you’ll see a little green icon indicating activity. It will take about five minutes for your VM to be created. While it’s running you can, if you choose, watch the process, but it’s a bit like watching paint dry. You’ll see the steps it takes to create your machine and provision it with the OS and SQL Server version you chose.

Once it’s completed, you’ll have a VM with a single disk, ready to go. But you need to connect to it. Remember that user name and password? We’re going to use them to create a Remote Desktop connection to the server. When the process is completed, the server will be in a Running state. Click on that server in the Management Portal and click on the Dashboard selection at the top of the screen. This will show you some performance metrics about the machine and, at the bottom, give you some control over what is happening. The main thing we’re looking for is the Connect button.

Click on that button and you will download an RDP file from the Azure server. Open that file (and yes, your system may give you security warnings; click past them) and you’ll arrive at a login screen configured for your Azure account. That’s not what you want. Instead, you’re going to click on “Use another account.” Then, in that window, type in your machine name and user name along with the password. Once you click OK, you’ll be in an RDP session on your SQL Server 2014 CTP1 VM. Have fun!
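Once you’re connected, a quick sanity check from SSMS on the VM confirms exactly what you’re running. This is just standard T-SQL, nothing specific to Azure:

```sql
-- Confirm the version on the freshly provisioned VM.
-- The version string should report SQL Server 2014 CTP1.
SELECT @@VERSION AS VersionString,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('Edition') AS Edition;
```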

Remember, you can stop the VM when you’re not using it, and you stop paying for it (or using up your MSDN credits). Just go to the dashboard and use the “Shut Down” option at the bottom of your screen.

If you found this useful and you’d like to learn a lot more about the capabilities of using Azure within your environment, I’d like to recommend you sign up for my all-day pre-conference seminar at PASS 2013 in Charlotte. I’ll cover this sort of thing and one heck of a lot more about the future of being a DBA working in the hybrid environment of Azure and on-premises servers.

Nov 09 2012

PASS Summit Day 3: Dr. David DeWitt

Two quick points: I’m putting this blog together using the Surface… ooh… and this isn’t a keynote, but a spotlight session at the Summit. Still, I thought I would live blog my thoughts, because I’ve done it every time Dr. DeWitt has spoken at the Summit.

Right off, he has a slide with a little brain character representing himself.

But we’re talking PolyBase, and futures. This is basically a way to combine Hadoop’s unstructured NoSQL data with structured storage within SQL Server. Mostly this is within the new Parallel Data Warehouse, but it’s coming to all of SQL Server, so we need to learn this. The information ties directly back to what was presented at yesterday’s keynote.

HDFS is the file system. On top of that sits a framework for executing distributed, fault-tolerant algorithms. Hive & Pig are the SQL-like query languages. Sqoop is the package for moving data, and Dr. DeWitt says it’s awful and he’s going to tell us why.

HDFS was based on the Google File System. It supports thousands of nodes and it assumes hardware failure. It’s aimed at small numbers of large files: write once, read multiple times. Its limitations are caused by the replication of the files, which makes querying the information from a data warehouse more difficult. He covers all the types of nodes that manage HDFS.

MapReduce is used as a framework for accessing the data. It splits the big problem into several small problems and puts the work out onto the nodes. That’s Map. Then the partial results from all the nodes are combined back together through Reduce. MapReduce uses a master, the JobTracker, and slaves, multiple TaskTrackers.

Hive is a data warehouse solution for Hadoop. It supports SQL-like queries, and those queries are somewhat performant. By somewhat, he means the PDW is 10 times faster.
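Purely as an illustration of that SQL-like flavor (the table and column names here are made up, not from the session), a HiveQL aggregate over log files stored in HDFS looks almost indistinguishable from T-SQL:

```sql
-- HiveQL: SQL-like syntax that Hive compiles into MapReduce
-- jobs behind the scenes. Table and columns are hypothetical.
SELECT page_url, COUNT(*) AS hits
FROM   web_logs
GROUP  BY page_url
ORDER  BY hits DESC
LIMIT  10;
```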

Sqoop is the library and framework for moving data between HDFS and a relational DBMS. It serializes access to Hadoop; that’s the purpose of PolyBase, to get parallel execution across the Hadoop HDFS nodes. Sqoop breaks up a query through the Map process. It runs two queries: a count, and then a reworked version of the original query, a pretty scary one including an ORDER BY statement. This causes multiple scans against the tables.
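To make the complaint concrete, here’s a hypothetical sketch of the pattern Dr. DeWitt described, not Sqoop’s literal generated SQL (the table, key, and split sizes are all made up):

```sql
-- Step 1: a count, so Sqoop knows how to split the work
-- across Map tasks.
SELECT COUNT(*) FROM Sales.Orders;

-- Step 2: each Map task pulls its own slice, forced into order
-- so the slices don't overlap. Every task re-scans and re-sorts
-- the table to find its window, hence the multiple scans.
SELECT *
FROM   Sales.Orders
ORDER  BY OrderID
OFFSET 100000 ROWS FETCH NEXT 100000 ROWS ONLY;
```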

Dr. DeWitt talks through the choices for figuring out how to put together the two data sets, structured and unstructured. The approach taken by PolyBase is to work directly against HDFS, ignoring where the nodes are stored. Because it’s all going through their own code, they’re also setting up to support text and other data streams.

They’re parallelizing access to HDFS and supporting multiple file types. Further, they’re putting “structure” on “unstructured” data.

By the way, I’m trying to capture some of this information, but I have to pay attention. This is great stuff.

The DMS, the component Microsoft uses to manage the jump between HDFS and SQL Server, is just flat-out complicated. But the goal was to address the issues above, and it does.

He’s showing the direction they’re heading in. You can create nodes, and objects within the nodes, through SQL-like syntax. Same thing with the queries. They’ll be using the PDW optimizer. Phase 2 modifies the methods used.
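A hedged sketch of what that SQL-like direction could look like; this is my own illustration of the idea, not the exact PDW syntax shown on screen, and the table, column, and path names are invented:

```sql
-- Declare HDFS data as an external table (hypothetical names).
CREATE EXTERNAL TABLE dbo.WebClicks
(
    ClickTime DATETIME2,
    UserId    BIGINT,
    Url       VARCHAR(900)
)
WITH
(
    LOCATION = '/logs/clicks/'  -- a directory of files in HDFS
);

-- Once declared, the HDFS data joins to relational tables
-- through plain queries, with the PDW optimizer doing the work.
SELECT c.CustomerName, COUNT(*) AS Clicks
FROM   dbo.WebClicks AS w
JOIN   dbo.Customers AS c ON c.CustomerId = w.UserId
GROUP  BY c.CustomerName;
```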

I’m frankly having a little trouble keeping up.

It’s pretty clear that the PDW in combination with HDFS allows for throwing lots and lots of machines at the problem. If I were in the situation of needing to collect & process seriously huge data, I’d be checking this out. The concept is to use MapReduce directly, but without requiring the user to do that work; instead, they use T-SQL. It’s seriously slick.

By the way, this is also making yesterday’s keynote more exciting. That keynote got a bad rap yesterday, but I’m convinced it was great content spoiled by some weak presentation skills.

All the work in Phase 1 is done on PDW. Phase 2 moves the work, optionally, to HDFS directly, but still allows for that to be through a query.

Dr. DeWitt’s explanations of how the queries are moved in and out of PDW and HDFS are only almost understandable, not because he’s not explaining it well, but because I’m not understanding it well. But seeing how the structures logically handle the information does make me more comfortable with what’s going on over there in HDFS.

I’m starting to wonder if I’m making buggy whips and this is an automobile driving by. The problem is, how on earth do you get your hands on PDW to start learning this?

Nov 08 2012

PASS Summit 2012 Day 2: Keynote

Welcome to Day 2 of the PASS Summit!

It’s been a very exciting event so far. Today I’m presenting two sessions: one on tuning queries by fixing bad parameter sniffing and one on reading execution plans. Please stop by, or watch the one on execution plans on TV, as PASS is livestreaming events all day long on SQL TV (which is what I used to call Profiler).

The intro video, which can be good or goofy, was really good this year. They had people from all over the world talking in their native languages, making the point that the PASS organization is a global community. It really is.

Doug McDowell is giving us the finance and governance information for the PASS organization. I find this boring and vital at the same time. We need to know how this organization is managed, if we care about the organization. And let’s be honest, this organization has changed many of our lives for the better: through the family we’ve met, the jobs we’ve gained, and just the knowledge that has been shared with us. PASS has doubled its expenses in two years in order to support all the stuff they do: SQL Saturday, Rally, 24 Hours of PASS, etc. It’s amazing.

We have three new board members: Wendy Pastrick, James Rowland-Jones and Sri Sridharan. Congrats, guys. You’re crazy for taking part, but thanks for everything you do.

Next up is Tom LaRock, another board member and a good friend. The PASSion awards are great. They go to the people who are doing crazy, sick work for the community. Mentions go to Amy Lewis and Jesus Gil, but the award went to Jen Stirrup. Well deserved. She is so active and so passionate. It’s amazing. Congrats, Jen, and thanks for all you do.

PASS Board members are gathering feedback from the community. If you have an idea, talk to a board member.

Don’t forget to attend the Women in Technology Luncheon. Men and women can attend.

Quentin Clark is now up for the Microsoft part of the keynote. We’re seeing a bunch of people talk about how great SQL Server 2012 is. It really is great. He’s taking off on the concept of the data lifecycle. That’s a pretty interesting topic. He’s talking about how big data is getting both really, really cool and absolutely frightening: hotels tracking guests within their building, coupons & ads based on the person standing in the supermarket, things like that. People are actually at the point where we can do things like this. It’s really cool. But wow, that is going to build out some seriously large data sets. The idea is to make gathering, interpreting, and sharing data easy, simple and very, very fast.

We’re starting off with data management. The combination between SQL Server and Hadoop is pretty slick. It’s PolyBase, the new technology announced yesterday. But please, presenters, don’t leave teeny tiny fonts up on screen while you talk. Zoom in. The room can’t see it. However, that information was very interesting. I like seeing how you can put these things together. Next up is discovering and refining data. We’re going straight into Excel. That’s the bad news. The good news: Access is dying. YAY!

So the demo was poorly delivered, but very well structured. We got a good idea of how exactly we can do this with the new technology. There’s a lot of setup in the management area and in Excel to prep for what they’re calling the ‘ah-ha’ moment. In other words, this is making your data more and more available, but the work to set it up is absolutely non-trivial. The structures get built out in really interesting ways, especially all the model work you’ll be doing in SSAS in order to prep this data. They’re showing how the Azure Marketplace hooks in. Once all of it is put together, an incredibly difficult task, you can really poke at the data with these new tools. It’s exciting stuff. It’s a shame that the presenters sucked all the life out of it.

Nov 08 2012

PASS Summit 2012 Day 2: KILT DAY!

Welcome to the fourth Kilt Day at the SQL PASS Summit. It might be a little silly, but it’s fun. It’s also Women in Technology day with the WIT Luncheon. Guys are invited.

A short word about the bloggers’ table. Last year we were… a little loud. So this year we were cautioned… well, more like told to be quiet or they’d take away our toys. I agree with the intent of the message: please keep it down. But the delivery… it hurt PASS at the bloggers’ table and upset people. As I was reminded last night by a dear, dear friend whom I accidentally hurt, how you deliver a message is as important as the message you deliver.

But, that’s OK. Let’s learn from our mistakes, grow & move on.

Last night, I attended karaoke with the fine people from PragmaticWorks. Thanks guys for a great event and for letting me in the door.

And did I mention, IT’S KILT DAY!

Nov 07 2012

PASS Summit 2012 Day 1

We’re off and running here at the PASS Summit.

New this year is live streaming all day.

Bill Graziano is introducing the Summit. More importantly, he’s introducing PASS. Further, he’s introducing speakers to everyone. He doesn’t mean just speakers at the Summit, but anyone who has spoken at a SQL Saturday or a user group, and it was a scarily large group of people. PASS has created a new web site to make it easier to find local Chapters. Track one down. On the one hand, it’s weird that we’re sitting at the PASS Summit introducing the PASS organization, but I think they’re right to do it. It’s a great organization, and I’m always surprised at how many people don’t know about it.

Bill’s big announcement is the all-new PASS Business Analytics Conference, which will take place in Chicago in April 2013. More and more people have gotten good at collecting data, but we really do need to work harder on making use of that data.

There are 300 Microsoft engineers this year at the PASS Summit. That’s a serious amount of brain power. No wonder it’s been so warm here in Seattle. That much brain power is going to warm things up considerably. I’m planning on going and talking to these guys. You should too.

Nice work Bill.

Ted Kummert is up for the keynote. He’s showing the team from SQL Server 2012 at their release party. That’s a large group of extremely smart people. I appreciate all they’ve done. SQL Server 2012 is an excellent product. Nice job kids. Don’t get cocky.

The message: big data. Because let’s face it, we’re getting bigger and bigger data all the time. It’s the harder problem and the sexier problem. However, a lot of us are still working with small data sets and struggling. Don’t forget about us. But the new in-memory database that they’re putting out is pretty slick. That’s going to move things very quickly. Plus, as a geek, it gives us more to learn. Wonderful. The new functionality is going to be released with the next version of SQL Server, and it’s going to be a part of the system. That’s pretty cool. Of course, it’ll probably be Enterprise only, but if you need it, it’ll be worth it. This is going to make a big difference in performance tuning. It’ll open up additional opportunities.

Oooh. Management Studio looks radically different. He’s working through a web page. Fair warning: it’s not the finished experience, but it’s very interesting that it might be the direction they’re heading in. It also resembles Windows 8 a little. The demos look pretty cool. He’s improved performance 30 times by simply moving everything into memory. IO latching and locks just go away. Performance shoots through the roof. I need to get my hands on that. We all do, probably.

Seeing how columnstore works within this type of hardware and software is pretty amazing too. Plus, in the upcoming release, you can update it and you can cluster it. We’re moving into a new world, people.
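Based on what was announced, a clustered, updateable columnstore in the next release could look something like this. This is a hedged sketch from the announcement, not a demo from the keynote, and the table name is made up:

```sql
-- A clustered columnstore index: the announcement says the next
-- release makes columnstore both clustered and updateable.
CREATE CLUSTERED COLUMNSTORE INDEX ccsi_FactSales
    ON dbo.FactSales;

-- Updateable means ordinary DML would now work against it.
UPDATE dbo.FactSales
SET    SalesAmount = SalesAmount * 1.1
WHERE  SalesYear = 2012;
```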

But, don’t forget, this requires HUGE hardware, so it won’t be cheap, at all. Plus, it’s not magic. You’ll still be able to completely mess it up. You’ll still be able to write horrible queries or make poor choices in where to apply indexes. TANSTAAFL always applies.

HDInsight is the new non-relational storage engine, based on Hadoop. This is some cool stuff. Plus, we’re seeing a newer and bigger Parallel Data Warehouse. I love how things are expanding out so quickly.

The most interesting thing I saw was a new UI for managing SQL Server that was web based. It’s pretty slick, but I’m wondering where that’s going to go.

They also introduced a new thing called PolyBase. It provides a common interface to allow queries across Hadoop and structured data from a single query. That’s going to be a big deal, but… like everything else, it suggests a lowest-common-denominator approach. Performance? As Brent Ozar tweeted, if you liked linked servers, you’ll love PolyBase. However, since it’s only in the Parallel Data Warehouse, the hardware may just make any problems go away.

I just can’t get excited about PowerPivot. I agree that it’s a cool thing for business people, but I just can’t get into it. My failing. I know. However, the spatial data display within Excel… that’s slick stuff.

But this has been an interesting and exciting keynote. The new technology coming up from Microsoft is really cool. I think we’re getting a lot of new opportunities to do new things with our data.

Nov 05 2012

PASS Summit 2012: Day -3

The Summit proper starts on Wednesday, but the Summit really starts at registration. I left a little early from setting up for SQL in the City: Seattle in order to run up the hill and get to the convention center around the time that it opened. Why? ’Cause I get to meet my SQL Family for the first time this week. Lots of people are there, and it really is like a family reunion: smiles, hugs, catching up, stories. It’s the best way to launch the event. Not a lot to report, but I just had to share. I love my SQL Family.