Why automation is key to software-defined storage

You will have heard the term “software-defined”. It is now being applied to almost everything: compute, environments, storage, networking, data centres and WLANs. In every case, it refers to decoupling the control software from the underlying hardware.

iTWire, in collaboration with Craig Waters, Virtualisation Architect, APJ at Pure Storage, presented Part One of this discussion on software-defined storage (SDS).

We discussed SDS and the advantages it brings to organisations. Part Two explores the benefits from an automation perspective and how utilising SDS enables organisations to further simplify infrastructure management.

The conclusion was that SDS proves its value by saving IT-team time that can be redeployed back into the business, and that it supports the overall growth of the organisation. Software-defined is the future of infrastructure components, and we have not yet looked at how this affects orchestration and automation – that is the subject of this story, so read on.

Automation has been a hot topic in the storage industry over the past year. In its broadest sense, “automation” means using equipment to perform processes without human intervention; in the context of this article, it means programmatically controlling IT systems by bundling multiple tasks into repeatable workflows that simplify systems management.

While command line interfaces (CLIs) can be used to control storage tasks, IT admins typically use a graphical user interface (GUI) to deploy, provision and monitor systems. The question remains: what value do IT admins provide to an organisation by completing these repetitive tasks manually?

SDS concepts help simplify the management of infrastructure: these repetitive tasks can now be performed by scripts, which not only reduces the overhead of a given task but also removes human error.

Application programming interfaces (APIs) give IT admins the ability to control infrastructure components programmatically through scripting languages. Vendors opening up these APIs has made it possible to plug into most management products available on the market today.

This adoption of open standards has led to the use of RESTful APIs along with scripting languages for Windows (PowerShell) and Linux (Python) to programmatically control infrastructure components without the use of GUIs.
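As a rough sketch of what “programmatically controlling infrastructure without a GUI” looks like, the Python snippet below builds (but does not send) a RESTful request to create a volume. The endpoint path, token and field names are invented for illustration; a real array’s REST API will differ, so consult the vendor’s documentation.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- placeholders, not a real array's API.
ARRAY_URL = "https://array.example.com/api/v1/volumes"
API_TOKEN = "example-token"

def build_create_volume_request(name: str, size_gb: int) -> urllib.request.Request:
    """Build (but do not send) a REST request to create a volume."""
    body = json.dumps({"name": name, "size": size_gb * 1024**3}).encode("utf-8")
    return urllib.request.Request(
        ARRAY_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )

req = build_create_volume_request("billing-db-01", 500)
print(req.method, req.full_url)
```

Sending the request (with `urllib.request.urlopen` or an equivalent PowerShell `Invoke-RestMethod` call) is a one-liner once the payload is standardised like this, which is what makes the task scriptable and repeatable.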

Traditionally, closed APIs placed the responsibility for integration solely on the vendor. Each integration needed a long lead time to be developed, tested and made available to the customer base. Integration was therefore slow: the vendor chose which integrations were important, and customers were sidelined if theirs was not deemed to be.

Open standards address this challenge. By exposing APIs for a vendor’s technology, customers can use scripting tools to integrate with their preferred management platform without relying on the vendor to provide that feature as part of a software release. The adoption of RESTful APIs means that any configuration can be expressed in a standard data interchange format, such as JSON, that can be exchanged between different vendors’ software and hardware products. This provides a standard method of configuration, removing human error while integrating with an organisation’s existing management platform.
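The round trip below illustrates the point about a standard interchange format: a volume configuration expressed as plain data can be serialised to JSON, passed to any tool that speaks it, and parsed back without loss. The field names are invented for the example.

```python
import json

# A volume configuration expressed as plain data (hypothetical field names).
config = {
    "volume": "billing-db-01",
    "size_gb": 500,
    "protection": {"snapshots_per_day": 24, "replication": True},
}

# Serialise to JSON -- the interchange format a RESTful API typically accepts.
payload = json.dumps(config, sort_keys=True)

# Any other tool or vendor product can parse it back without loss.
restored = json.loads(payload)
assert restored == config
```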

This drastically reduces the complexity of integration and lets organisations plug infrastructure components modularly into an overall solution, without compatibility challenges or the need to obtain support from the vendor.

As a result, the IT workforce can spend less time managing components of a solution and configuring systems. By adopting open-standard APIs, scripting tools and management platforms, they simplify the operational management of infrastructure. This reduces the effort needed to monitor and maintain the overall solution, so more time can be redeployed further up the stack, at the application level.

Let’s look at storage as an example. If we can programmatically control the provisioning of storage, we can reduce the complexities of:

  • Selecting the location and size of a volume;
  • Creating the volume on a storage array;
  • Providing protection for the volume based on an agreed SLA;
  • Rescanning the compute hosts to provide visibility of the volume; and
  • Formatting the volume filesystem to support the underlying application.
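The five steps above can be sketched as a single scripted workflow. Every array and host call here is a stub with invented names; in a real script each stub would be replaced by the vendor’s REST API or PowerShell/Python toolkit calls.

```python
def select_placement(size_gb):
    # Step 1: pick an array/location with enough capacity (stubbed).
    return {"array": "array-01", "size_gb": size_gb}

def create_volume(name, placement):
    # Step 2: create the volume on the chosen array (stubbed).
    return {"name": name, **placement}

def apply_protection(volume, sla):
    # Step 3: attach a protection policy matching the agreed SLA (stubbed).
    volume["protection"] = {"gold": "hourly-snapshots",
                            "silver": "daily-snapshots"}[sla]
    return volume

def rescan_hosts(volume):
    # Step 4: rescan compute hosts so the new volume is visible (stubbed).
    volume["visible_on_hosts"] = True
    return volume

def format_volume(volume, filesystem="xfs"):
    # Step 5: format the volume for the underlying application (stubbed).
    volume["filesystem"] = filesystem
    return volume

def provision(name, size_gb, sla):
    """Bundle the five steps into one repeatable workflow."""
    vol = create_volume(name, select_placement(size_gb))
    return format_volume(rescan_hosts(apply_protection(vol, sla)))

vol = provision("billing-db-01", 500, "gold")
print(vol["protection"])  # hourly-snapshots
```

Because the steps are bundled into one function, the workflow runs the same way every time it is invoked, which is exactly where the removal of human error comes from.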

In abstracting these technical decisions, we can provide self-service capabilities to application owners to complete a provisioning task:

Q. What application will this volume be used for?

A. Billing system (taking the application requirements into consideration when creating the volume).

Q. What SLA does this volume require?

A. Gold (taking the SLA requirements into consideration when protecting the volume).

In this simplified example, we’ve taken business requirements and made technical decisions. The end user is no longer concerned with the complexities of provisioning storage. They only need to know the use case – a great example of self-service.
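The self-service mapping above can be sketched in a few lines: the application owner supplies only two business answers, and the script derives the technical parameters. The tier and application tables are invented for illustration.

```python
# Hypothetical lookup tables mapping business answers to technical decisions.
SLA_POLICIES = {
    "gold": {"snapshots_per_day": 24, "replicated": True},
    "silver": {"snapshots_per_day": 4, "replicated": False},
}

APP_PROFILES = {
    "billing": {"size_gb": 500, "filesystem": "xfs"},
    "web": {"size_gb": 100, "filesystem": "ext4"},
}

def self_service_request(application: str, sla: str) -> dict:
    """Turn two business answers into a full volume specification."""
    spec = dict(APP_PROFILES[application])
    spec["protection"] = SLA_POLICIES[sla]
    return spec

spec = self_service_request("billing", "gold")
```

The end user never sees sizes, filesystems or snapshot schedules; those decisions were made once, by the team that encoded the tables.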

There’s a mandate in most organisations to do more with less. That comes back to reducing the complexity of infrastructure and aligning IT more closely with the business. SDS coupled with automation enables the IT department to do more with fewer resources: less focus on infrastructure and operations, and more on providing value back to the business. Open standards and RESTful APIs are the answer for any organisation looking to better manage its increasingly complex internal IT estate.


Source: http://www.itwire.com/business-it-news/storage/75713-why-automation-is-key-to-software-defined-storage.html