Update Rollup 9 for SCOM 2012 R2 now available

Yesterday Microsoft published Update Rollup 9 for System Center Operations Manager 2012 R2. Overall, this update rollup focuses on significant fixes to Application Performance Monitoring (APM) in SCOM.

Fixes for the following issues are included in this update.

  • SharePoint workflows fail with an access violation under APM
  • Application Pool worker process crashes under APM with heap corruption
  • Some Application Pool worker processes become unresponsive if many applications are started under APM at the same time
  • MOMAgent cannot validate RunAs Account if only RODC is available
  • Missing event monitor does not warn within the specified time range in SCOM 2012 R2 the first time after restart
  • SCOM cannot verify the User Account / Password expiration date if it is set by using Password Setting object
  • SLO Detail report displays histogram incorrectly
  • APM Agent Modules workflows fail during workflow shutdown with a Null Reference Exception
  • AEM Data fills up SCOM Operational database and is never groomed out
  • The DownTime report from the Availability report does not handle the Business Hours settings
  • Adding a decimal sign in an SLT Collection Rule SLO in the ENU Console on a non-ENU OS does not work
  • SCOM Agent issue while logging Operations Management Suite (OMS) communication failure

This update rollup introduces APM support for IIS 10 and Windows Server 2016. This requires an additional management pack, Microsoft.SystemCenter.Apm.Web.IIS10.mp, which can be found along with its dependencies in “%SystemDrive%\Program Files\System Center 2012 R2\Operations Manager\Server\Management Packs for Update Rollups” once you install the update rollup.

Important

The Microsoft.Windows.InternetInformationServices.2016.mp management pack, which is a dependency, is not included in the UR9 package, so you should download and install it separately before you enable APM for IIS 10 and Windows Server 2016.
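Once both packs are on disk, the import can be scripted with the Operations Manager PowerShell module. A minimal sketch, assuming the default install path quoted above; the dependency path is a placeholder for wherever you saved the separate download:

```powershell
# Sketch: importing the IIS 10 APM management pack after installing UR9.
Import-Module OperationsManager

# Path shipped by the update rollup (adjust the drive letter to your install).
$mpPath = "C:\Program Files\System Center 2012 R2\Operations Manager\Server\Management Packs for Update Rollups"

# Import the separately downloaded IIS 2016 dependency first (placeholder path),
# then the APM IIS 10 pack shipped with UR9.
Import-SCOMManagementPack "C:\Downloads\Microsoft.Windows.InternetInformationServices.2016.mp"
Import-SCOMManagementPack "$mpPath\Microsoft.SystemCenter.Apm.Web.IIS10.mp"
```

Importing the dependency first avoids a failed-dependency error on the APM pack.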

You can download the binaries from here.

Savision Free Whitepaper – Born in the cloud | Monitoring Linux Workloads with OMS

I have recently authored a whitepaper titled “Born in the Cloud: Monitoring Linux workloads with OMS”, published by Savision. This whitepaper focuses on the Linux workload monitoring capabilities of Microsoft OMS, a born-in-the-cloud management suite capable of managing and protecting heterogeneous on-premises, cloud and hybrid data centers.

The following are the key areas discussed in the whitepaper.

  • What Microsoft Operations Management Suite is and how it can simplify data center management.
  • Leveraging OMS Log Analytics to analyze, predict and protect your Linux workloads.
  • Integrating System Center Operations Manager with OMS for extended monitoring.
  • Harnessing the power of Business Service Monitoring of Savision Live Maps Unity in Microsoft OMS.

You can download this FREE whitepaper from here.

About Savision

Savision is the market leader in business service and cloud management solutions for Microsoft System Center. Savision’s monitoring and visualization capabilities bridge the gap between IT and business by transforming IT data into predictive, actionable and relevant information about the entire cloud and datacenter infrastructure. You can visit www.savision.com for more information about their product portfolio.

Salvaging SCVMM 2012 R2 with an existing database

Recently I was working on an SCVMM 2012 R2 deployment project and came across a DEFCON 1 situation. I was ready to uninstall everything and re-deploy VMM from scratch, but I wanted to minimize the post-installation configuration tasks afterwards. I came across a great post by the SCVMM team on how to achieve this with two SQL stored procedures, and here is how I managed to save a couple of hours of deployment time with it.

Backup First

Even if you know exactly what you are doing, it is wise to back up the VMM database first. Just in case you manage to wreck the entire database, you can always restore from a copy.

The Process

The entire process is clearly explained in this TechNet article, so I’m going to skip the lecture. But there are a few things I did, based on my gut feeling, to make it work on the first attempt.

  • I stopped the VMM Server service before executing the first stored procedure and backing up the VMM database.
  • I restored the backup of the VMM database under a dummy name and tested the stored procedures against it first, to see whether any exceptions were thrown during execution. Luckily it was successful. This is optional, but it doesn’t hurt to try.
  • After installing the secondary VMM server, I made sure to install the same UR version that was installed on the old VMM instance. This is critical; otherwise the database will not be usable at all.

Now if you have a highly available VMM environment, things might look a little scary (the VMM service fails most of the time), but the article explains how you can safely use the existing database by stopping and starting the VMM service manually before and after proceeding with the setup.
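The precautionary steps above can be sketched in PowerShell. The stored procedure names themselves are in the TechNet article, so this only covers stopping the service and rehearsing against a restored copy; server, database, and path names are placeholders for your own environment:

```powershell
# Sketch of the precautionary steps: stop the service, back up, rehearse on a copy.
# Assumes the SQL Server PowerShell module (Invoke-Sqlcmd) is available.

# 1. Stop the VMM server service before touching the database.
Stop-Service -Name SCVMMService -Force

# 2. Back up the VMM database (server and database names may differ in your deployment).
Invoke-Sqlcmd -ServerInstance "SQL01" -Query `
  "BACKUP DATABASE [VirtualManagerDB] TO DISK = N'D:\Backup\VirtualManagerDB.bak'"

# 3. Restore the backup under a dummy name so the stored procedures from the
#    TechNet article can be tested without risking the real database.
Invoke-Sqlcmd -ServerInstance "SQL01" -Query `
  "RESTORE DATABASE [VirtualManagerDB_Test] FROM DISK = N'D:\Backup\VirtualManagerDB.bak'
   WITH MOVE 'VirtualManagerDB' TO N'D:\Data\VMMTest.mdf',
        MOVE 'VirtualManagerDB_log' TO N'D:\Data\VMMTest_log.ldf'"
```

The logical file names in the `MOVE` clauses are assumptions; check them with `RESTORE FILELISTONLY` against your own backup first.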


Azure Automation PowerShell ISE add-on is now GA

Last week the Azure Automation team announced the general availability of the PowerShell ISE add-on for Azure Automation. With this add-on it is easier to author your Azure Automation runbooks using the familiar PowerShell ISE. Below are some of the notable features of this add-on.

  • Use Automation activities (Get-AutomationVariable, Get-AutomationPSCredential, etc) in local PowerShell Workflows and scripts
  • Create and edit Automation assets locally
  • Easily track local changes to runbooks and assets vs the state of these items in an Azure Automation account
  • Sync runbook / asset changes between a local runbook authoring environment and an Azure Automation account
  • Test PowerShell workflows and scripts locally in the ISE and in the automation service

Installing the Azure Automation add-on for PowerShell ISE is straightforward. Although you can install the add-on from the GitHub source, Microsoft recommends that you install it from the PowerShell Gallery.

  • In an elevated PowerShell window, execute the cmdlet below. This installs the add-on for the current user only.

Install-Module AzureAutomationAuthoringToolkit -Scope CurrentUser

  • To automatically load the Azure Automation ISE add-on every time you open the PowerShell ISE, execute the cmdlet below.

Install-AzureAutomationIseAddOn

  • Alternatively, to load the add-on ad hoc only when you need it, execute the cmdlet below in the PowerShell ISE.

Import-Module AzureAutomationAuthoringToolkit
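With the add-on loaded, local authoring looks like any other ISE session. A minimal runbook sketch to try it with; the workflow, variable, and credential names here are hypothetical and must exist as Automation assets (local or in your Automation account):

```powershell
# Hypothetical test runbook: the asset names 'StorageAccountName' and
# 'AdminCredential' are placeholders you would create yourself.
workflow Test-LocalAuthoring
{
    # Resolved from your local assets when run in the ISE,
    # and from the Automation account once published.
    $storageAccount = Get-AutomationVariable -Name 'StorageAccountName'
    $cred = Get-AutomationPSCredential -Name 'AdminCredential'

    Write-Output "Using storage account $storageAccount as $($cred.UserName)"
}
```

Because the same `Get-Automation*` activities resolve locally and in the service, the runbook can be tested in the ISE and then synced to the Automation account unchanged.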

Managing Cloud Storage with Microsoft Azure Storage Explorer

Today you might be using various third-party tools to perform management operations on your Azure storage accounts. CloudXplorer and CloudBerry are good candidates, but they are not free (as in beer). For developers using Visual Studio 2013/2015 the built-in Cloud Explorer is a perfect tool, but what about IT professionals like us? Do we have a good, free alternative?

Microsoft has introduced a standalone version of Microsoft Azure Storage Explorer (Preview) with the Azure SDK 2.8 release. This tool lets you quickly create blob containers, upload file content into blob containers, download files, set properties and metadata, and even create and obtain SAS keys to control access. You can also quickly search for containers and individual blobs, and inspect things like metadata and properties on the blobs.

Features in Storage Explorer

  • Mac OS X, Windows, and Linux versions (New in v0.7.20160107)
  • Sign in to view your Storage Accounts – use your Org Account, Microsoft Account, 2FA, etc
  • Add Storage Accounts by account name and key, as well as custom endpoints (New in v0.7.20160107)
  • Add Storage Accounts for Azure China (New in v0.7.20160107)
  • Add blob containers with SAS key (New in v0.7.20160107)
  • Local development storage (Windows-only)
  • ARM and Classic resource support
  • Create and delete blobs, queues, or tables
  • Search for specific blobs, queues, or tables
  • Explore the contents of blob containers
  • View and navigate through directories
  • Upload, download, and delete blobs and folders
  • Open and view the contents of text and picture blobs (New in v0.7.20160107)
  • View and edit blob properties and metadata
  • Generate SAS keys
  • Manage and create Stored Access Policies
  • Search for blobs by prefix
  • Drag ‘n drop files to upload or download

This tool currently supports blob operations only; according to Microsoft, support for tables and queues is coming soon.

Let’s take a look at this tool and see how we can manage Azure Storage with it. First you need to log into your Azure subscription.

Storage-Explorer-1.png

Once you are signed into your Azure subscription you can immediately start navigating through all of your storage accounts.

Storage-Explorer-3.png

You can perform the following blob operations by right-clicking a storage blob.

Storage-Explorer-4.png

Attaching Storage

If you want to connect to storage accounts in a different Azure subscription, Azure China storage accounts, or any publicly available storage service of which you are not an administrator, you can right-click the Storage node and select Attach External Storage. Here you can provide the Account Name and Access Key to connect to those external storage accounts.

Storage-Explorer-6.png

It is also possible to connect to a blob container using a Shared Access Signature (SAS) key; to do so, the SAS key must grant List permission on that particular container.

Storage-Explorer-7.png
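If you administer the source account, such a SAS can be generated with the Azure PowerShell storage cmdlets. A sketch, with placeholder account, key, and container names:

```powershell
# Sketch: generating a container SAS URI that Storage Explorer can attach to.
# Account name, key, and container name below are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" `
                               -StorageAccountKey "<account-key>"

# List (l) permission is the minimum needed to browse the container;
# read (r) is added so blobs can be downloaded as well.
New-AzureStorageContainerSASToken -Name "mycontainer" -Context $ctx `
    -Permission rl -ExpiryTime (Get-Date).AddDays(7) -FullUri
```

The `-FullUri` switch returns the complete container URI with the SAS token appended, which is the form you paste into Storage Explorer.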

You can download this tool from storageexplorer.com.

The curious case of Microsoft Azure Stack

A lot of people have been asking me what Microsoft Azure Stack will mean for their cloud journey in 2016. As the product itself is still invisible to us, just before Christmas 2015 Microsoft released some guidance notes about the hardware specifications needed to run a PoC lab for the Azure Stack Technical Preview.

To run a POC of Azure Stack on a single server, Microsoft suggests the following minimum and recommended configuration.

| Component | Minimum | Recommended |
| --- | --- | --- |
| Compute: CPU | Dual-socket: 12 physical cores | Dual-socket: 16 physical cores |
| Compute: Memory | 96 GB RAM | 128 GB RAM |
| Compute: BIOS | Hyper-V enabled (with SLAT support) | Hyper-V enabled (with SLAT support) |
| Network: NIC | Windows Server 2012 R2 certification required for NIC; no specialized features required | Windows Server 2012 R2 certification required for NIC; no specialized features required |
| Disk drives: Operating system | 1 OS disk with a minimum of 200 GB available for the system partition (SSD or HDD) | 1 OS disk with a minimum of 200 GB available for the system partition (SSD or HDD) |
| Disk drives: General Azure Stack POC data | 4 disks, each providing a minimum of 140 GB of capacity (SSD or HDD) | 4 disks, each providing a minimum of 250 GB of capacity |
| HW logo certification | Certified for Windows Server 2012 R2 | Certified for Windows Server 2012 R2 |

Storage considerations

Data disk drive configuration: all data drives must be of the same type (SAS or SATA) and capacity. If SAS disk drives are used, they must be attached via a single path (no MPIO or multi-path support is provided).

HBA configuration options:

  1. (Preferred) Simple HBA
  2. RAID HBA – adapter must be configured in “pass-through” mode
  3. RAID HBA – disks should be configured as Single-Disk, RAID-0
Supported bus and media type combinations

  • SATA HDD
  • SAS HDD
  • RAID HDD
  • RAID SSD (if the media type is unspecified/unknown*)
  • SATA SSD + SATA HDD**
  • SAS SSD + SAS HDD**

* RAID controllers without pass-through capability can’t recognize the media type. Such controllers will mark both HDD and SSD as Unspecified. In that case, the SSD will be used as persistent storage instead of caching devices. Therefore, you can deploy the Microsoft Azure Stack POC on those SSDs.

** For tiered storage, you must have at least 3 HDDs.

Example HBAs: LSI 9207-8i, LSI-9300-8i, or LSI-9265-8i in pass-through mode

Furthermore, Microsoft suggests that the Dell R630 and HPE DL360 Gen9 servers can be used for this effort, as both models have been on the market for some time, but you can always go for another vendor/model that fits the above specification.

Below you can listen to Jeffrey Snover himself explaining what is going on behind the scenes of Azure Stack development.

Microsoft Azure new SQL IaaS configuration experience

Happy New Year to all of my blog readers.

2016 is going to be an exciting year as we wait for the newest releases of Azure Stack, Windows Server & System Center from Microsoft. In my new year post I’m going to share some happy news for all Azure Ninjas out there working on IaaS.

If someone asks me which Microsoft product makes my Azure deployments most complex, I’d answer SQL Server. The reason is that SQL Server, awesome product though it is, needs extra DBA care. When you provision a SQL Server VM in Azure you need to think up front about IOPS, connectivity, backups and security, and about how to provide the same level of experience as in an on-premises data center.

The Microsoft Azure team understood the pain we system administrators face when it comes to SQL Server configuration, and came up with a set of new configuration options for SQL VMs in the Azure Resource Manager deployment model. To use the new configuration experience you need to create a VM in the new Resource Manager deployment model; it supports any version of SQL Server that the Azure Marketplace offers.

Simplified Connectivity

SQL IaaS in Azure (1)

In the classic model, to configure SQL Server connectivity from on-premises using SQL Server Management Studio (SSMS), you first had to Remote Desktop into the VM, open the SQL Server port in Windows Firewall, enable SQL Server authentication, and create a public Azure endpoint for the VM to allow inbound connectivity. The new experience lets you do all of that in the portal at provisioning time, and you can select whether this SQL Server can be contacted only from the VM itself or from within the virtual network.

Automated Patching & Backup

SQL IaaS in Azure (3)

Another pain point IT pros encounter with SQL Server is patching. The new automated patching capability allows administrators to define a maintenance window at their convenience during VM provisioning itself. So if your customers need to take the burden of patching their SQL VM instances in Azure off their shoulders, this is a lifesaver.

SQL IaaS in Azure (4)

What about backup? The new automated backup feature allows administrators to automatically back up all databases in SQL Server. It is not enabled by default, as different workloads can have different backup requirements. You can retain these backups for up to 30 days and even encrypt them.
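The same automated backup settings can be applied to an existing SQL VM from Azure PowerShell, as an alternative to the portal blade. A sketch, assuming the AzureRM SQL Server IaaS cmdlets are installed; resource group, VM, and storage account names are placeholders:

```powershell
# Sketch: enabling 30-day automated backup on an existing SQL Server VM.
# Resource names below are placeholders for your own environment.
$storage = Get-AzureRmStorageAccount -ResourceGroupName "rg-sql" -Name "sqlbackupstore"

# Build the auto-backup configuration: enabled, 30-day retention,
# backups written to the storage account above.
$backupConfig = New-AzureRmVMSqlServerAutoBackupConfig -Enable `
    -RetentionPeriodInDays 30 -StorageContext $storage.Context

# Push the configuration to the SQL Server IaaS extension on the VM.
Set-AzureRmVMSqlServerExtension -ResourceGroupName "rg-sql" -VMName "sqlvm01" `
    -AutoBackupSettings $backupConfig
```

Behind the scenes this configures the same SQL Server IaaS agent extension that the portal experience uses.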

Storage Optimizations

Be it on-premises or in the cloud, the most important thing for a SQL Server instance is storage. Previously, in the Azure classic deployment model, we had to manually attach the required number of data disks to provide the IOPS and throughput, and then either stripe the SQL files across the disks or create a storage pool to divide the IOPS and throughput across them. The new deployment model folds all of this into provisioning by letting us specify the required IOPS, throughput and VHD size within the allowable limits of the VM instance size. The cool thing is that when you tweak these settings, Azure automatically changes the number of data disks using Windows Storage Spaces, so you no longer have to worry about the calculations.
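To see roughly what the portal is calculating for you, here is a back-of-the-envelope sketch: given target IOPS and throughput, how many identical data disks must be striped into the pool? The per-disk figures are assumptions (roughly a premium P30 disk); check current Azure documentation for the real limits.

```powershell
# Hypothetical helper: how many striped data disks are needed to meet
# both an IOPS target and a throughput target? Per-disk limits are
# assumptions (approximately a premium P30 disk), not official figures.
function Get-DataDiskCount {
    param(
        [int]$TargetIops,
        [int]$TargetMBps,
        [int]$DiskIops = 5000,   # assumed IOPS per disk
        [int]$DiskMBps = 200     # assumed MB/s per disk
    )
    $byIops = [math]::Ceiling($TargetIops / $DiskIops)
    $byMBps = [math]::Ceiling($TargetMBps / $DiskMBps)
    # The pool must satisfy whichever target needs more disks.
    [math]::Max([math]::Max($byIops, $byMBps), 1)
}

Get-DataDiskCount -TargetIops 20000 -TargetMBps 300   # IOPS is the binding constraint -> 4
```

This is the same max-of-both-constraints logic the platform applies when it resizes the storage pool underneath your settings.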

You can also select one of the three storage optimization methods below for your SQL VM, depending on your workload.

  • General is the default setting and supports most workloads.
  • Transactional processing optimizes the storage for traditional database OLTP workloads.
  • Data warehousing optimizes the storage for analytic and reporting workloads.

SQL IaaS in Azure (2)

For the automation geeks, you can use Azure Resource Manager templates to automate this even further for larger deployments. Considering the amount of effort and time previously needed for SQL IaaS VM configuration in Azure, the new deployment experience offers a much more hassle-free, one-time setup for SQL workloads.