Apr 23 2014

Azure Automation

I introduced Azure Automation in a previous post. I’ve spent some more time exploring it.

There’s a set of documentation available as I noted before. Unfortunately, having read through the full set, I have some criticisms to offer. The documentation walks through “Common runbook tasks,” laying things out more or less as I did (inadvertently, I assure you) in my previous blog post. The problem, as I found in that post, is that administering runbooks seems fairly straightforward from the screens. But you can’t do a darned thing with any of it until you have a runbook. Further, you can’t have a runbook until that thing has some code in it. And the documentation doesn’t cover the code. Instead, we just get a page with a list of samples, but no links to that code, nor an indication of where it might be. The scripts are located here. But man, that ought to be in the documentation. There’s also no clearly documented method for getting started with development. It’s not strictly necessary, since the GUI leads you inevitably to the Draft screen we saw in my other post. But documentation is generally supposed to tell you what to do, where to look, and so on.

There is another set of documentation just on authoring runbooks. Lots and lots more meat there. I’ll go through it and follow up further.

Enough criticism, let’s play with some code.

I’m going to start with the “Hello World” code set. It’s supposed to be an introduction to how everything works. You can’t open it from the Azure Portal. Instead you have to download it to your machine and then either upload it into a new runbook or copy and paste it into the Draft editor window. Presumably this is so you can do the coding locally using the PowerShell ISE or other tools. Documentation for the script is clear. Its description:

If you are brand new to Automation in Azure, you can use this runbook to explore testing and publishing capabilities.


Well, let’s just say that’s a little grandiose for what is, literally, a “Hello $Name” example. But it’ll get your feet wet. I took the script and pasted it into my “RunningScare” runbook. From there, I have the capacity to Save, Test, or Publish. Being a good paranoid type, I ran a test first. It popped up a window to input the parameter and then showed the output in the Output Pane (which I hadn’t actually noticed before):

OutputPane
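For context, the sample itself boils down to something like this. This is a sketch from memory rather than the published sample, and the parameter name is my stand-in:

```powershell
workflow RunningScare
{
    # A runbook is a PowerShell Workflow; the workflow name must match the runbook name.
    param
    (
        # The name to greet; the Portal prompts for this when you test or start the job.
        [string]$Name = "World"
    )

    "Hello $Name!"
}
```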

I can’t tell you why it output multiple times, but it did from one test of the script. To see the rest of the functionality, scheduling, etc., I went ahead and hit Publish. That moved it from Draft to Published, where all I can see is a faded outline of the actual script and a Start button at the bottom of the screen. I went ahead and ran it from there. It actually takes a surprisingly long time for such a silly small script to complete. There’s even the ability to view the Job as it’s running:

JobSummary

So that works. Next up, scheduling. It’s pretty straightforward to walk through the GUI in the Portal (although now I want to see if I can programmatically control the Automation interface, more to explore). I’m going to try to run this script once an hour. So I’ll give the schedule the name, unique to my account, Hourly (imagination knows no bounds). And then things get weird. I can only schedule this for a “One Time” run or “Daily.” No other options are available:

Schedule

Nothing in the core documentation covers the details of scheduling. Checking the authoring doc (which has tons of stuff in it), there is a PowerShell command for directly controlling this (oh yes, much more to explore), Set-SmaSchedule. But it’s not clear whether the command supports anything other than a day interval. I’ll have to test it out to see. The Portal recognized that parameters were necessary, so I put one in and scheduled my runbook. Worked great.
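For the record, a call to that command looks something like the following. I haven’t verified these parameters against the Azure side yet, so treat the parameter names and the endpoint as assumptions to be tested:

```powershell
# Untested sketch: assumes the SMA Set-SmaSchedule parameter set and a reachable endpoint.
Set-SmaSchedule -Name "Hourly" `
    -ScheduleType "DailySchedule" `
    -StartTime (Get-Date).AddHours(1) `
    -ExpiryTime (Get-Date).AddYears(1) `
    -DayInterval 1 `
    -WebServiceEndpoint "https://sma-server"
```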

With that, I have my first runbook set up, tested, and scheduled. So far, this is looking really interesting.


Apr 16 2014

Microsoft Azure Automation

Automation

Microsoft just announced a new mechanism for managing your Azure resources: Automation. You can check out the documentation on it here. It’s a mechanism for creating runbooks using PowerShell, runbooks you can then combine with other runbooks inside yet another runbook, and so on. Let’s check it out. I’m doing everything you see here without consulting the documentation. I want to see how easy it is to put this stuff together. First, because it’s still in preview, you have to sign up. Once you’re accepted into the program, you get a new icon in your Management Portal.

Next, you’ll have to create an automation account. That’s pretty straightforward. It’s just a name, your selected region, and the subscription you’re putting it under. There’s no immediate guidance on where, when, or if the region matters that much:

AutomationAccount

When you get into the Account, nothing is there. No default runbooks or anything. Now, I know you can go and get sample runbooks, and I suspect there will even be a clearinghouse through GitHub or somewhere for runbooks. But right now we’re flying by the seat of our pants, so let’s just create our own runbook. It’s Azure, so the New button is right there in the lower left. Clicking on it, we can do a quick create for runbooks, with the code to be completed later:

Runbook

Nice and easy, although so far this thing doesn’t do anything. So now I have a runbook. If I open it up, it shows a dashboard with several tabs: Jobs, Author, Schedule, and Configure. Clicking on Jobs, I don’t see anything interesting displayed. Obviously I should click on Author next, but where is the fun in that? So I click on Schedule. That brings up a message that “You must publish this runbook before you can add a schedule. Click AUTHOR to author and publish this runbook.” So I go ahead and click on Configure (yeah, I’m that guy). It’s not that interesting. So let’s go back to where we should have started: Author. Clicking there, I get this:

Published

OK, fine. Let’s click on Draft. Which brings me to, well, I think it’s a fascinating screen:

Script

It’s a script. And that’s PowerShell it expects me to type in there, but I really don’t know what kind of commands I should be using, so… Here endeth the beginning of my exploration of Azure Automation. I need to actually go and read the docs. More to come.


Jun 03 2013

Using PowerShell to move files to Azure Storage

I was searching all over the place trying to find out how to move files into Azure Storage. Most of the examples I could track down were API calls through C#. There were very few examples using PowerShell, and most of those were for older versions of Azure. But the one thing I’ve learned about Azure is that it’s a fast-moving target. Sure enough, in March a new set of PowerShell blob cmdlets was released. This is great news. I was able to get this to work:

[System.Reflection.Assembly]::LoadFrom("C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\2012-10\bin\Microsoft.WindowsAzure.StorageClient.dll")

$account = [Microsoft.WindowsAzure.CloudStorageAccount]::Parse("DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=mykey")
$client = [Microsoft.WindowsAzure.StorageClient.CloudStorageAccountStorageClientExtensions]::CreateCloudBlobClient($account)
$blob = $client.GetBlockBlobReference('container/adw.bacpac')
$blob.UploadFile("c:\bu\adw.bacpac")

But let’s face it, this code makes more sense and is a heck of lot easier to write:

$context = New-AzureStorageContext -StorageAccountName mystorage -StorageAccountKey mykey 
Set-AzureStorageBlobContent -Blob "adw.bacpac" -Container container -File "c:\bu\adw.bacpac" -Context $context -Force

The problem is, since this stuff has only been out since March, anyone with working code is likely to be using the first sample. In fact, if you search for New-AzureStorageContext and put it inside quotes so you only get exactly that, you won’t even see the blog post from Microsoft that I linked above until the fourth entry, let alone documentation. Heck, if you search MSDN for New-AzureStorageContext there are only two entries. So I’m getting this blog post out in order to help spread the word. Here’s one other source of information on the new Azure Blob Storage commands in PowerShell from Microsoft.
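For completeness, the same cmdlet family covers pulling a blob back down as well. Reusing the same context, something like this should do it (an untested sketch; the destination path is just my example):

```powershell
# Build the storage context once, then reuse it for uploads and downloads.
$context = New-AzureStorageContext -StorageAccountName mystorage -StorageAccountKey mykey

# Download the blob to a local file; -Force overwrites an existing file.
Get-AzureStorageBlobContent -Blob "adw.bacpac" -Container container `
    -Destination "c:\bu\adw_restored.bacpac" -Context $context -Force
```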

There is excellent stuff coming out from the Azure team in and around PowerShell. If you’re not using PowerShell to automate your local servers, you’re crazy, but the good news is it’s getting easier and easier to administer Azure from PowerShell as well.

May 28 2013

SMO vs Invoke-Sqlcmd in Azure

I’m working on a series of PowerShell scripts for the pre-conference seminars on Windows Azure SQL Database that I’m going to be giving at TechEd and the PASS Summit. One of the things I want to do is script out a full backup process (one that’s transactionally consistent), which means I need to create a database copy. There’s a nice, neat T-SQL command that does this:

CREATE DATABASE ADW_COPY
AS COPY OF nmhdhskcew.AdventureWorks2012;

So, being the lazy PowerShell freak that I am, I figured I’d just pass it in with Invoke-Sqlcmd as a single line like this:

Invoke-Sqlcmd -ServerInstance $server -Database $db -Username $user -Password $password -Query $sql

Which resulted in an unexpected error:

Invoke-Sqlcmd : The CREATE DATABASE statement must be the only statement in the batch.

Well, now I’m curious. I’m passing in a single command. It’s not part of any other commands or statements or batches. What if I did the same thing in SMO?

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

$sql = "CREATE DATABASE ADW_COPY AS COPY OF nmhdhskcew.AdventureWorks2012;"
$cn = 'Server=tcp:server.database.windows.net;User ID=CloudAdmin;Password=@password;Trusted_Connection=False;'
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = $cn

$srv = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $conn

$db = $srv.Databases["master"]

$db.ExecuteNonQuery($sql)

It worked flawlessly. Hmmm… Maybe there’s an extra command being passed in from Invoke-Sqlcmd? Not sure. But it’s interesting.
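One more data point: you can sidestep both tools entirely and issue the statement through a raw ADO.NET SqlCommand. This sketch reuses the connection string from above and assumes $sql still holds the CREATE DATABASE statement:

```powershell
# Plain ADO.NET: one command, one batch, no helper cmdlets in the way.
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = $cn
$cmd = New-Object System.Data.SqlClient.SqlCommand($sql, $conn)

$conn.Open()
$cmd.ExecuteNonQuery() | Out-Null
$conn.Close()
```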

May 20 2013

Azure does Powershell too

Or, I guess it might be more appropriate to say that PowerShell does Azure. Regardless, there is a set of cmdlets for Azure and Azure SQL Database. Here’s the link to get you started, along with the basic documentation. After that, it’s a little tricky to know for sure what’s required. Follow the instructions at the link to get the basic setup done, and then I’ll show you just a little bit about how you can manage your Windows Azure SQL Database through PowerShell.

First up, you need to set up a context, which is basically a connection to your Azure server. This requires very specific objects. The code below outlines what you need:

$SQLLogin = new-object System.Management.Automation.PSCredential("UserName", 
('12345' | ConvertTo-SecureString -asPlainText -Force))

$context = New-AzureSqlDatabaseServerContext –ServerName 'MyAzureServer' -Credential $SQLLogin

Nothing to it, really. You create the PSCredential, your WASD SQL login, providing your user name and password (and no, that’s not my user name, and I have a much more secure password, going all the way to 8). Then you set the context, passing it your server name and the PSCredential you just created. That’s it. Now you’re connected to your Azure account and can get up to all sorts of deviltry. Well, currently you can mainly manage the system: adding and dropping databases, servers, and firewall settings. For example, this will create a new database:

$db = New-AzureSqlDatabase -Context $context -DatabaseName 'MyNewDB' -Edition Web

Nothing to it. And this will delete it:

Remove-AzureSqlDatabase -Context $context -Database $db -Force

Easy stuff. Just don’t forget the -Force parameter or you’ll have a window popping up asking “Are you sure, Private Pyle? Which side was that, Private Pyle?” You can also pull down information about the database:

$db.CreationDate
$db.Edition
$db.MaxSizeGB

Now, if you want to run queries and other stuff, you’re back to relying on Invoke-Sqlcmd, but that works too. There’s no reason you can’t start combining these commands. PowerShell on Azure quickly starts to open up possibilities for automating your management. Further, like all things Azure, Microsoft is constantly working on making it better and better.
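As a small example of combining them, something like this (an untested sketch reusing the same $context) would list just the Web edition databases on the server:

```powershell
# Pull the databases through the context, then filter and shape them with the pipe.
Get-AzureSqlDatabase -Context $context |
    Where-Object { $_.Edition -eq 'Web' } |
    Select-Object Name, Edition, MaxSizeGB, CreationDate
```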

For more Powershell, WASD, database management, Azure VMs, tuning, high availability and the rest, be sure to register for the precon I’ll be doing with Dandy Weyn (b|t) and Thomas LaRock (b|t) at TechEd 2013 North America and TechEd 2013 Europe in just a very few weeks. It’s going to be fun stuff and we might teach you a little too.

Dec 03 2012

HDInsight, Finally

See this:

That’s right. The install worked. All I had to do was get a completely clean server set up. No domain controller. Now to get my learn on.

Microsoft has a web site with a number of introductory samples. I’ll start there and work through them. The very first example gets me set up with some data that it builds by running a PowerShell script, importdata.ps1. But I’m not going to just blindly follow along. I want to see what the heck is happening so I can start understanding this stuff. By the way, thank you, Microsoft, for making the samples in PowerShell and not forcing me to relearn Python or something else. That would have been frustrating.

The script is really simple. It has two scenarios you can pass it, w3c or RunTraffic. They just change directory and run another PowerShell script, import.ps1, from two different directories. I’ll bet the scripts are different. I’m running the w3c scenario, so let’s see what that script is doing.

Ah, now things are getting interesting. There are two functions: one for data generation, which uses an executable to make up test data, and the other a mechanism for calling Hadoop. Basically it uses two objects, System.Diagnostics.ProcessStartInfo and System.Diagnostics.Process. The ProcessStartInfo defines startup information for a process that you then create using the Process class. In this case it’s setting the location of Hadoop:

$pinfo.FileName = $env:HADOOP_HOME + "\bin\hadoop.cmd";

Then it sets up arguments, if any. The actual calls to this from the code use a command, dfs, with different settings, -mkdir and -copyFromLocal. From what I can tell, it’s creating a storage location within Hadoop and then moving the generated data over. I’m good with all the scripts I can see, except for knowing where this dfs thing comes from.
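Putting the pieces together, the pattern in the script looks roughly like this. The dfs paths here are illustrative, not copied from the sample (and as far as I can tell, dfs is Hadoop’s file system shell, which would be where -mkdir and -copyFromLocal come from):

```powershell
# Describe the process to launch: hadoop.cmd with dfs arguments.
$pinfo = New-Object System.Diagnostics.ProcessStartInfo
$pinfo.FileName = $env:HADOOP_HOME + "\bin\hadoop.cmd"
$pinfo.Arguments = "dfs -copyFromLocal c:\data\w3c.log /w3c/input"
$pinfo.UseShellExecute = $false
$pinfo.RedirectStandardOutput = $true

# Attach the start info to a Process object, run it, and wait for it to finish.
$p = New-Object System.Diagnostics.Process
$p.StartInfo = $pinfo
$p.Start() | Out-Null
$p.WaitForExit()
$p.StandardOutput.ReadToEnd()
```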

Data load ran just fine:

Data loaded, time to test out a Map/Reduce job. Again, there’s a PowerShell script included for running a simple job, so I check it out. First run: fails. Great. More stuff to troubleshoot in order to see this work. This is not going to be easy.

Stepping through and running the scripts might not be the best way to learn this. So, I’m going to now start reading the Big Data Jumpstart Guide. I’ll post more as I learn it.


Nov 02 2012

More Surface

I’m continuing to work with the Surface.

BOOM

I just had my first crash. I was in IE when I tried to switch tabs and the pull down window would not stay open. So I tried closing IE. That didn’t work at all. I tried switching to the Start window or switching apps. No joy. I then did a hard restart, holding down the power button for about 10 seconds. On restarting, the system started updating itself, which was weird since I had made sure to do all the updates when I started. The update finished, I logged in, and noticed that the wifi symbol looked funny (wish I had a screen capture, ooh, is there a utility for that?). No big deal until I tried connecting to the wifi service, which just wouldn’t happen. In fact, it said there were no wifi services available. So I went to the settings. Wifi was gone. Not even an option.

I’m sitting there, heart in my mouth; my expensive new toy is toast. I’m just boarding an airplane when this happens, so I’m not going to be able to spend time troubleshooting. I finally decide to do another reboot, just because (have you tried turning it off and on again?). It reboots while I board the plane. Just before takeoff I check, and wifi is back. Whew! But what caused the weird lockup?

I went to the desktop, which is a weird and funny place on a tablet like this. It’s Windows… mostly. I checked through the event logs, which look exactly like you’d expect. They’re also just about as useful as usual. I could see that I had an unexpected reboot. But I didn’t see any errors prior to the reboot that would indicate what had gone wrong. In fact, everything looked perfectly normal. More fun with the Surface.

POSH

Next up, PowerShell. Yeah, it’s Windows 8. This means we have a direct method for controlling the device. What’s more, since you can do Remoting from PowerShell, I can use this device to take charge of others. If you’re already using Windows 8, you know that the Start screen is actually in two layers, the pretty one, and the one with everything else. Go to the one with everything else and you get a whole bunch of apps that are immediately evident, including PowerShell. This opens inside the desktop.

I ran a little command, just to see what would come back: dir env:

Worked great. I can see the four processors, information about the machine, and my login. Just like a real computer running PowerShell on a real operating system. Which pretty much means I am running on a real computer with a real OS. This absolutely opens up possibilities. I’m not sure what to do with it yet, but it’s something I’m going to keep in the back of my head as a mechanism for making this tablet more functional.

TWEETS

Part of my job is community engagement. Lots of that is done through web apps, posting on news groups, posts on G+ or Facebook, all through the browser. But, I also tweet. And tweeting through just the browser? Blech! I need an app. I’ve gone through four apps so far.

ROWI: Pretty. Very pretty. As a matter of fact, if most of the tweets I received had pictures, this would be a great app. Well, until you start tweeting yourself. There’s no picture attachment capability at all that I could find, no completion when typing people’s Twitter handles, and no lookup for handles either. I couldn’t find a way to add search columns. While this is a very pretty interface, it’s a pretty useless tool.

MetroTweet: This one was pretty good. It’s not as pretty as some of the other apps. Instead, it’s functional. It works really well. You can do most of what you would expect from a good Twitter app. Tweets allow you to add location and pictures. You can set up searches and save them as new columns. I really liked this app but for one thing: it doesn’t allow for lookups on users’ handles. I’m sorry, but I’m not smart enough to remember the exact spelling of everyone’s Twitter handle. So this is a contender, but not my preferred app on my Surface.

Tweeterlight: This is a clean and simple interface. It has to be, because it gives you very little control over the environment. Basically, you have to set up searches in Twitter itself; then they’ll be available in this app. Otherwise, nope. Tweeting is OK, but weak. You can add pictures, but not take them. No name lookup at all. I’d rather use MetroTweet if these were my only choices.

Tweetro: My current winner. This app is very pretty. They really went after following the new RT styles. But it’s still quite functional. You can add searches directly through the interface and move between columns by scrolling around. Tweets let you add pictures or take them. You get name lookups and searches directly from tweets. It doesn’t offer location on tweets, but that’s not something I use much. It even has a column just for the pictures people are tweeting, like ROWI. Tweetro is a solid, well-put-together Twitter client that does most of what you need and does it with some style.

Apr 28 2011

SQL University: SQL Azure & PowerShell

SQL-University-Shield-268x300

Welcome once more to the Miskatonic branch of SQL University. I see that most of you survived our last encounter… uh, class. Most of you even appear somewhat sane… somewhat. Today we’re going to talk about PowerShell and its use with SQL Azure. Which sounds something like an eldritch horror breaking in from some outer dimension… hmmm… that could explain certain things… So, back to PowerShell and SQL Azure.

You can’t run Powershell from SQL Azure. Thank you for attending and enjoy your trip home.

Still here? I did try. Let’s clarify. You can’t run PowerShell from SQL Azure, but I didn’t say that you couldn’t use SQL Azure as a target for PowerShell. It is possible to connect to your SQL Azure databases running PowerShell from a local desktop or server. Let’s examine the basic connections.

The strength of PowerShell, at least in my mind, is in the pipe. The ability to pass information from one command to the next through the pipe is how PowerShell stands out from being just another scripting language. SQL Server 2008 R2, and by extension SQL Azure, has very limited pipe capability (we’re all holding out hope for Denali). What little you can do comes through Invoke-Sqlcmd (and yes, invoking things at Miskatonic is usually frowned upon).

Connecting to a SQL Azure database from PowerShell is just a question of configuring the connection appropriately. But appropriately isn’t that hard:

[sourcecode language="powershell"]Invoke-Sqlcmd -Query "SELECT * FROM sys.dm_exec_requests;" -ServerInstance "myAzureDB.database.windows.net" -Username "MyUserName" -Password "APassword"[/sourcecode]

Yeah, it’s that easy. This will connect up and run the query. I’d show you the output, but it’s not all that thrilling.

The other way to use PowerShell is to connect to stuff through SMO. Clearly, if I can connect through Invoke-Sqlcmd, then making an SMO connection should be just as easy, and it is:

[sourcecode language="powershell"]
# Load SMO so the Server type is available.
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | out-null

$cn = "Server=tcp:myAzureDB.database.windows.net;Database=master;User ID=MyUserName;Password=APassword;Trusted_Connection=False;"
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = $cn

$srv = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $conn
$db = $srv.Databases["master"]

$db.ExecuteWithResults("SELECT * FROM sys.dm_exec_requests;")
[/sourcecode]

Well, OK, maybe not as easy, but it works. I’m not a fan of SMO.

The key to remember, any functionality that is not supported in SQL Azure, will not be supported through any PowerShell calls. But once you’re in, you’re in. You can begin making use of the abilities of PowerShell to perform automation on your SQL Azure database. Be aware though, you’re still paying for data movement charges. There’s nothing magic here. Also, since PowerShell doesn’t yet run on the SQL Azure server itself, remoting is right out.

Thanks for coming to my class. I hope you begin to employ PowerShell with your SQL Azure implementations. I also hope you all get home. The building’s beginning to shake. Since we’re nowhere near an active fault line, it usually means yet another visitation from beyond space and time. Why can’t they pick on Harvard occasionally, or go down to Princeton? In New Jersey, who’d notice? I’m heading for my bunker… uh, I mean office. See you next time, maybe.

Jan 31 2011

Powershell to Automate SQL Compare

A question came up on the SQL Server Central Forums, how could you use Red Gate SQL Compare to automate creating scripts for all your databases at once. What’s the first thing that popped into my head? Yep, Powershell.

SQL Compare Pro comes with a command line utility that allows you to automate just about everything you can do with the GUI. I like to use it when I’m setting up deployment mechanisms for systems because it makes one click deployments possible. I won’t go into all the methods of that type of automation here. For a lot more information on that topic, check out the Team-Based Development book.

If you’re interested in exploring how you can use the command line, just type this:

[sourcecode language="sql"]sqlcompare /?[/sourcecode]

Now is a good time to note that I’ve put SQL Compare’s location in my path, because I use it so often from various locations on the disk that having to type the full path every time I want it is just too much.

Here is the PowerShell script:

[sourcecode language="powershell"]$Server = "grant-red1\gfr1"
$Path = "c:\dbscripts"

Invoke-Sqlcmd -Query "sp_databases" -ServerInstance $Server | ForEach-Object {Invoke-Expression "sqlcompare.exe /s1:$Server /db1:$($_.DATABASE_NAME) /mkscr:$($Path)\$($_.DATABASE_NAME) /q" }[/sourcecode]

It’s just a single line of code, not counting setting the variables. I’ll be making it into a full-blown script and posting it on the Red Gate site (location to be determined) so you can just call it, passing variables. In the meantime, let me explain what I’ve done.

First, I use the Invoke-Sqlcmd cmdlet to pass the query sp_databases to the server. This retrieves a result set consisting of DATABASE_NAME, DATABASE_SIZE, and REMARKS for all user databases on the instance, plus any databases available through a gateway on that instance. From there I simply use the pipe, “|”, to pass the results to a ForEach-Object loop. Yes, I’d prefer not to use a loop, but SQL Compare is a DOS command line utility, not a PowerShell cmdlet, which means it’s not aware of the beauties of the pipe. Invoke-Expression is used to call sqlcompare.exe for each row in the result set. I use the abbreviation for the /Server1 parameter, /s1, to set the instance through the variable $Server. I then use the abbreviation for the /Database1 parameter, /db1, with the database name coming from the current object’s column, $_.DATABASE_NAME. Then the magic: the /MakeScript parameter, again abbreviated to /mkscr, along with a path that includes the database name in order to separate out each database, is all that’s needed to complete the command.

In short, three PowerShell commands and one call to SQL Compare’s command line utility are all that’s needed.
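Folded into the full-blown script I mentioned, it might look something like this (a sketch; the parameter defaults here are just my examples, not the version I’ll publish):

```powershell
param(
    # Instance to script out; the default is illustrative.
    [string]$Server = "MyServer\MyInstance",
    # Root folder that will get one subfolder per database.
    [string]$Path = "c:\dbscripts"
)

# One sqlcompare.exe call per database returned by sp_databases.
Invoke-Sqlcmd -Query "sp_databases" -ServerInstance $Server |
    ForEach-Object {
        Invoke-Expression "sqlcompare.exe /s1:$Server /db1:$($_.DATABASE_NAME) /mkscr:$($Path)\$($_.DATABASE_NAME) /q"
    }
```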

When you run this script, you’ll get a folder for each database and inside that will be a complete set of scripts for recreating the database. You can use this to script out your databases for inclusion in a source control system, or for whatever else you might need.

This example is just a small taste of what’s possible to automate using the command line utility in SQL Compare. Further, you can see that it’s very possible, easy even, to integrate SQL Compare with the primary means of server automation within Microsoft today, PowerShell.


PS: I won’t normally be posting much Red Gate stuff on my personal blog. I’ll be doing that inside the official Red Gate area, where there are a lot more eyeballs looking at it. But until we get that all set up, I really wanted to share this little tidbit.

Dec 10 2010

12 Days of SQL, On the 2nd Day of SQL…

Microsoft gave to me, an excellent new management language. Yeah, so it doesn’t rhyme or match the song in any way, but as far as gifts of the season go, PowerShell is it. There are a couple of problems with PowerShell, first, it’s not installed everywhere, and second, DBAs just haven’t quite latched on to this new language as a management tool.

That second problem is absolutely not one you could lay at the feet of Aaron Nelson (blog|twitter), our next 12 Days of SQL blogger. Aaron is one of the leading lights out there educating DBAs, and everyone else for that matter, about the strengths and capabilities of PowerShell as a management language for SQL Server. I’ve said it twice now, and I hope you noticed, that I didn’t say scripting language, because PowerShell isn’t one. It’s a management language. It’s a way to automate the management of your servers, and that’s what Aaron’s work is all about. He blogs and tweets and presents quite a lot about PowerShell. If you’re trying to learn this excellent tool, you should be reading his stuff.

In particular, the post that has me the most excited, out of all the work that Aaron has done this year, is this excellent explanation of remoting with PowerShell. Why this post in particular? Because remoting, combined with asynchronous calls means you can send a PowerShell script to any or all servers in your environment, at the same time. It means you can perform serious, enterprise level management tasks in an automated and repeatable fashion and you can do it to all your servers at the same time, easily. Yeah, I said easily. That’s because it is easy. Read Aaron’s excellent explanation and you’ll agree. Once you understand how to call all your servers remotely, it’s even easier to then pass them a script, call a stored proc, send a DBCC command, or just about anything else. This is why I call PowerShell a management language, because we’re not talking about scripting here, we’re talking about managing your servers, and that’s exciting.
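To make that concrete, the remoting-plus-async pattern looks roughly like this. It’s a sketch: the server names are placeholders, and it assumes remoting is already enabled on the targets:

```powershell
# Placeholder server names; replace with your own.
$servers = "Server1", "Server2", "Server3"

# Fire the same command at every server asynchronously, then collect the results.
Invoke-Command -ComputerName $servers -AsJob -ScriptBlock {
    Invoke-Sqlcmd -Query "DBCC CHECKDB();" -ServerInstance "localhost"
}
Get-Job | Wait-Job | Receive-Job
```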

On the 3rd Day of SQL, Microsoft gave to me, 3 excited screams… Sorry, but you’ll have to wait until Monday for the next installment; the wait, I assure you, will be worth it. That’s because David Stein (blog|twitter) will be shouting a post in your direction that will absolutely be something you want to hear (and you’ll hear it, loud and clear). Now, I’m teasing Dave, and you can ask him why, but I mean it in good fun. David’s a FreeCon alum, a nice guy, and a talented individual. If you’re not checking his stuff regularly, you really should be.