A good admin is a lazy admin.

You may have heard that quote from one of your geeky IT colleagues. It is one of my personal favorites as it encompasses the concept of scripting perfectly. There is no better feeling than watching your script doing your job while enjoying a cup of coffee.

PowerShell is present everywhere in Windows, and yet a large number of IT professionals have never worked with it. Sometimes that is justified, as not everyone needs it, but if your reason is “I don’t like CLIs, I prefer GUIs”… that’s not good enough.

Here are some important reasons that should convince you to learn PowerShell:

Spend your time on what really matters

A lot of people don’t realize it, but much of a sysadmin’s life is made up of repetitive manual tasks. I am not talking about long, brain-intensive tasks; I am talking about the small ones that don’t take more than 5 minutes of your time every morning. Not a big deal, right?

An engineer spending 5 minutes every working day on a task adds up to a hefty 18.5 hours per year (5 minutes × roughly 220 working days), which translates to almost two and a half days of work. If you take a minute or two to think about it, I am sure you will find one of these tasks in your daily routine. Some tasks can’t be automated with scripting, like moving a backup tape to a vault; but when a task can be, automating it will save you a lot of time.

Most of you will find a different application of what I am describing, but I am going to take the example of a trivial task of a VMware administrator. Every now and again you should check for snapshots lying around on virtual machines, as they are not supposed to remain for longer than a few days. Instead of going on a ghost hunt in the UI, why not run a simple “Get-VM | Get-Snapshot | Where Created -lt (Get-Date).AddDays(-3)”. Job done in 10 seconds.
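If you want a bit more detail than a raw list, the same pipeline can be extended into a small report. This is just a sketch using standard PowerCLI cmdlets and assumes an existing Connect-VIServer session:

    # List snapshots older than 3 days with their size, oldest first
    Get-VM | Get-Snapshot |
        Where-Object { $_.Created -lt (Get-Date).AddDays(-3) } |
        Select-Object VM, Name, Created, SizeGB |
        Sort-Object Created |
        Format-Table -AutoSize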

Now, this example was a bit simplistic, as not everything can be scripted with a one-liner. Some automation scripts take a lot of time to write properly, and some managers may even get frustrated about you working on a script instead of doing operational tasks.

Well, take the example above: even if writing the script takes you 2 full days (so 16 hours), the outcome will be the following:

  • 2.5 hours saved in the first year (the 18.5 hours of manual work minus the 16 hours spent writing the script), and far more if the daily task takes longer than 5 minutes or as the years go by
  • Improved scripting skills gained in the process of writing it, hence a more advanced skill set to offer the company
  • The rewarding feeling of creating something that improves your daily work and your colleagues’. It may seem silly, but it feels a lot nicer knowing that one of your scripts is doing your job well than going through the same repetitive clicks every single day
  • Extra time freed up to work on other projects or more important operational tasks that cannot be automated

Ensure consistency and avoid human error

Human error is the biggest plague of everything we do… as humans. If you tell a computer to print “Hello world” every day at 5 PM, it will do it. However, if you tell a human to do the same thing, they might end up printing it at 5:02 PM, or print “Hello wolrd”, simply because they were on autopilot, didn’t pay attention or got distracted. Things like that don’t happen if they are scripted, and if they do, it means you screwed up the script, which is worse…
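To illustrate the point, here is a minimal sketch of how you would hand that job over to the machine with a Windows PowerShell scheduled job (the output path is just an example, and registering the job requires an elevated session):

    # Runs at exactly 5 PM every day, with exactly the same output
    Register-ScheduledJob -Name DailyHelloWorld `
        -Trigger (New-JobTrigger -Daily -At '5:00 PM') `
        -ScriptBlock { 'Hello world' | Out-File -FilePath 'C:\Temp\hello.log' -Append }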

Now, nobody wants to print “Hello world” at 5 PM every day. My point is that a small human mistake like a typo can have big consequences in a production environment.

Take the example of a VMware administrator (again) who has to decommission a datastore. As per the procedure, he unmounts the datastore after making sure it is no longer used, then moves on to detaching the LUN from the host. He wrote down the LUN ID before unmounting the datastore and looks for it among the devices on the host: here it is, “naa:600100123456798” – click – Detach. 5 minutes later the monitoring starts going off with lots of red on the screen, services down and loads of VMs in an inaccessible state. He later realizes that the LUN to detach was in fact “naa:600100123456789”, and that the other LUN ID also existed and was hosting a bunch of VMs.

I’ll level with you, that’s a pretty grim and very unlucky scenario, but it happens to the best of us. Whereas if he had used PowerCLI to detach the LUN based on the datastore name, which in turn resolves the backing device automatically, it wouldn’t have happened. Once again, you could pick the wrong datastore here as well, but in that case you may want to take a break from critical production environments ;).
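For illustration, here is a rough sketch of that approach: resolve the backing device from the datastore object instead of copying the naa. ID by hand, then detach it on each host. The datastore name is a placeholder and this is not a full decommission procedure, so test it outside production first:

    # Resolve the canonical name of the device backing the datastore
    $ds  = Get-Datastore -Name 'DECOM-DS01'
    $naa = $ds.ExtensionData.Info.Vmfs.Extent[0].DiskName

    # Detach that exact device on every host that sees the datastore
    foreach ($esx in (Get-VMHost -Datastore $ds)) {
        $storSys = Get-View $esx.ExtensionData.ConfigManager.StorageSystem
        $lun     = $storSys.StorageDeviceInfo.ScsiLun | Where-Object { $_.CanonicalName -eq $naa }
        $storSys.DetachScsiLun($lun.Uuid)
    }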

Benefit from using Windows Server Core

Although not as obvious as the previous two, hardware resources and security can be indirectly impacted by your workforce’s ability to use PowerShell on a daily basis. A standard Windows Server install with a GUI is quite easy to use as long as you know what you are doing. The interface is similar to the desktop one, so anyone can jump on a server via RDP and start doing stuff. In contrast, Windows Server also comes with the option (which is now the default one) to not install the GUI, known as a “Core install”. The biggest caveat of Core installs is that everything is done via the command line, i.e. PowerShell. Meaning those who don’t know how to use it will be lost, or will rely on the availability of remote management tools (RSAT…), which don’t exist for everything.
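Day-to-day work on a Core install typically happens over PowerShell remoting rather than RDP. Here is a quick sketch; the server names are placeholders, and WinRM must be enabled (which it is by default on recent Windows Server versions):

    # Interactive shell on a Core server
    Enter-PSSession -ComputerName core-srv01

    # One-off query against several Core servers at once
    Invoke-Command -ComputerName core-srv01, core-srv02 -ScriptBlock {
        Get-Service -Name WinRM, Dnscache
    } | Select-Object PSComputerName, Name, Status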

Now, the benefits of Core installs are multiple and outweigh the drawbacks. The following points come from Microsoft themselves:

  • Reduced servicing: because Server Core installs only what is required for a manageable DHCP, File, DNS, Media Services, and Active Directory server, less servicing is required.
  • Reduced management: because less is installed on a Server Core-based server, less management is required.
  • Reduced attack surface: because there is less running on the server, there is less attack surface.
  • Less disk space required: Server Core requires only about 3.4 GB to install.

Not everything can be done in GUIs

Like in any other OS, the Windows UI is installed on top of the system, meaning you can only work with what the UI offers. Agreed, the Windows UI is probably the best out there, and you can probably do most of what you need with it. However, once you start getting into the nitty-gritty, you may end up facing a wall with no other option than firing up PowerShell and getting your hands dirty.

An example comes to mind that is not directly related to PowerShell but to PowerCLI: events.

If you have ever worked in vCenter and needed to go back a few weeks in the events of an object, I am fairly certain you had a few non-Christian thoughts about whoever decided not to implement a time selector in that view – I never understood that. Anyway, what would take you hours of scrolling in the UI takes no more than a one-liner and 15 seconds in PowerCLI: “Get-VMHost *esx10* | Get-VIEvent -Start (Get-Date “5/1/2019”) -Finish (Get-Date “6/1/2019”)”. I dare you to go back to the events of 08/01/2019 in the UI.
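And since the events come back as objects, you can just as easily keep the result for later, for example by selecting the interesting properties and exporting them to CSV. A small sketch of that (note that Get-VIEvent only returns 100 events by default, hence -MaxSamples):

    Get-VMHost *esx10* |
        Get-VIEvent -Start (Get-Date '5/1/2019') -Finish (Get-Date '6/1/2019') -MaxSamples 100000 |
        Select-Object CreatedTime, UserName, FullFormattedMessage |
        Export-Csv -Path 'esx10-events-may2019.csv' -NoTypeInformation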

Customize and improve reporting

Along with monitoring, reporting is one of those areas that is too often disregarded, which drives operational efficiency down. I want to point out that no PowerShell script, no matter how good, can replace a solid monitoring system. However, some things cannot be monitored easily with standard solutions like Nagios or Zabbix without resorting to convoluted and non-standard configurations. In such instances, it might be a good idea to use a PowerShell script running as a scheduled task on a server, producing a report that is sent to you via email or exported as HTML to an IIS server for on-demand viewing. As mentioned in the first point, VMware snapshots could be a candidate.
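Here is a hedged sketch of that kind of scheduled report: the snapshot inventory from the first point, converted to HTML and either dropped on a web server or mailed out. The paths, addresses and SMTP server are placeholders, and Send-MailMessage needs a reachable relay:

    # Build the report
    $report = Get-VM | Get-Snapshot |
        Where-Object { $_.Created -lt (Get-Date).AddDays(-3) } |
        Select-Object VM, Name, Created, SizeGB |
        ConvertTo-Html -Title 'Snapshot report'

    # Publish it on an IIS server for on-demand viewing...
    $report | Out-File '\\webserver\reports$\snapshots.html'

    # ...or send it by email
    Send-MailMessage -From 'reports@example.com' -To 'vmware-team@example.com' `
        -Subject 'Old snapshots' -Body ($report -join "`n") -BodyAsHtml `
        -SmtpServer 'smtp.example.com'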

Once you get the hang of running reports with PowerShell, you will realize that you can end up having visibility on a lot of components in your infrastructure. One of the benefits is that a recurring report won’t let you forget the checks you don’t perform often, like priority 1 DRS recommendations in my case.
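As a quick example of such a check (the cluster name is a placeholder):

    # Any priority 1 DRS recommendations waiting on the cluster?
    Get-DrsRecommendation -Cluster 'PROD-CLUSTER' | Where-Object { $_.Priority -eq 1 }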

Improve your skills

PowerShell is not going away anytime soon and will only become more important in the future. Hell, even Linux can run PowerShell Core now… Having this skill matters more and more as Microsoft builds it into most of its products and more third-party companies write modules to manage their own solutions with it, which is why a growing number of employers are looking for scripting skills. The learning curve of PowerShell is a little steep at the very beginning, but it is not as difficult as it appears to be, and the syntax is a lot more forgiving than most other scripting languages (looking at you, Bash…). Nobody becomes an expert by only reading the books. While they are a very good starting point, the best way to extend your learning is to have use cases and curiosity. At the end of the journey (which never really ends), the reward will be on the CV: the profile with PowerShell skills will come out on top of the one without, simple as that.

Conclusion

While this blog sounds a lot like a love letter to Microsoft, it doesn’t mean the thing is without flaws. There is still frustration creeping around the corner, like not finding cmdlets on an old server running PowerShell 2.0, or PowerShell remoting across domains. With that said, most technology experts agree that PowerShell skills have become a must-have in the IT industry, and not only for Windows administrators. As many vendors are working on PowerShell integration, this applies to all fields: cloud, storage, virtualization, middleware, databases.

Head over to the Microsoft docs to get started with PowerShell.
