Microsoft Storage Spaces Direct — Let’s go!

In enterprise IT, every six or seven years hardware nears its end of life and it's time to research and replace it. OK, this may not be the case for everything, but for most servers from Dell, HPE, and Lenovo, seven years marks the end of support, after which you have to turn to third parties.

In my case, it's decision time for a big part of our infrastructure. For the past 10+ years we have taken the traditional storage approach of a big SAN to support our storage needs. Over the years we have relied on NetApp, EMC, and Dell solutions to serve block and file storage. We have also made a big investment in VMware, have taken a hard line of virtual-first, and even offer co-location for others within the business.

Back to the ticking time bomb. Our VMware cluster runs in HPE blade chassis and gets its storage from a NetApp SAN. The HPE hardware is nearing seven years, and the NetApp will be five years old next year – we purchased five years of maintenance up front, so I'm sure the renewal is going to be a small fortune.

Our NetApp serves several purposes:

  • Block storage over fiber channel for VMware
  • Block storage over fiber channel for physical Windows and Linux hosts
  • NFS file storage for an Oracle Virtual environment (Oracle Financials)
  • CIFS/SMB storage for client file shares

With so much going on in the cloud space, and more specifically Azure for me, we decided to rethink our need for traditional storage. And since we're rethinking storage, why not rethink virtualization platforms?

After some debate, we decided to move from VMware to Hyper-V and build out a hyper-converged infrastructure leveraging Microsoft Storage Spaces Direct (S2D). We're still working on sizing and selecting a hardware vendor, although Dell will most likely be our choice with their ready nodes.

While this takes care of one of the NetApp workloads, I still have three more to deal with! Fortunately, there are other projects in motion to shorten the list to one. We'll probably end up with a new, smaller traditional array to serve block storage to the few physical Windows and Linux hosts — and I'm OK with that.

New Azure Certifications – Changing Paths

Last week Microsoft announced a new certification: Microsoft Certified Azure Administrator. There are two exams in beta you'll need to take to obtain this certification.

The cert and exams are designed to eventually take the place of the 70-533 Implementing Microsoft Azure Infrastructure Solutions exam.

The new certification and exams are supposed to be more real-world and cover topics you’re more likely to use on the job. I recently started studying for 70-533 so I’ll start shifting my focus to the topics covered in AZ-100 so I can take it once it’s available in a few months.

One last thing. If you have already taken 70-533, there's a new transition exam to get you the Microsoft Certified Azure Administrator certification.

Azure Functions

Serverless computing has been all the rage recently. I’ve spent a little time playing around with AWS Lambda and found it interesting, but never really used it for more than tinkering.

Microsoft recently released Azure Serverless Computing Cookbook, a free ebook that walks you through creating a sample application using Azure Functions. The 300+ page book does a nice job of covering the basics and starts you down the path of using Azure Functions for more advanced uses.

If you don't know C#, no problem. For the most part you're copying and pasting code from the book. While that doesn't help you learn C#, it does get you familiar with Azure Functions capabilities, including integrating third-party apps/APIs like SendGrid and Twilio.
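To give a flavor of the model, here's a minimal sketch in Python of the kind of HTTP-triggered function the book walks you through. The handler name and the request/response shapes here are purely illustrative, not the book's actual code — a real Azure Function (Python) receives an `azure.functions.HttpRequest` from the platform's bindings rather than a plain dict:

```python
import json

def greet_handler(request: dict) -> dict:
    """Sketch of an HTTP-triggered function: read a 'name' query
    parameter from the incoming request and return a JSON reply.
    (Hypothetical shapes; the real runtime passes an HttpRequest.)"""
    name = request.get("params", {}).get("name", "stranger")
    body = json.dumps({"message": f"Hello, {name}!"})
    return {"status": 200, "body": body}

# Simulate the platform invoking the function on an incoming request.
response = greet_handler({"params": {"name": "Azure"}})
print(response["body"])
```

The appeal of serverless is that everything outside the handler — routing the HTTP request in, scaling, billing per execution — is the platform's job; you only write the function body.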