The critical steps in transforming siloed assets into actionable data in the cloud

The rate of change in the cloud and data space is phenomenal. Driven by the pandemic and the constant evolution of technology, the expectations of executives and customers are pushing business units to adopt new tools and new ways of working. Demand is running hot.

I see a lot of organizations being pushed to adopt the cloud, whether by political pressure or the promise of better capabilities. Some organizations initially focus on the cost savings of cloud over on-premises infrastructure, only to find that they lack the necessary processes, tools, and governance. Without proper preparation, the business case for cloud adoption quickly erodes.

Whatever the reason for change, organizations should not be pressured into making hasty decisions.

Understanding your digital portfolio is key to choosing the “right” cloud platform to migrate to, backed by a robust decision framework. Cloud adoption without a transformation program will bring you quick wins, but ultimately a “lift and shift” doesn’t fix legacy architecture. It also does not allow organizations to maximize the value of the features built into cloud platforms.

Interestingly, data is often overlooked when onboarding a new platform, yet you need to be clear about its intended lifecycle. When planning a cloud migration, the focus is on the workload (application or virtual machine), but beyond data sovereignty and classification, I don’t see data assets given the priority they deserve.

The most important thing is to take a data portfolio approach. It’s essential to understand each of your data assets, whether that’s a large spreadsheet, a customer database, or a system of record you need to manage.

What’s really crucial is knowing where your data is, what it can do, and who has access to it. You also need to understand how other systems (perhaps secondary data systems) feed from or into your data. Then you need to apply the appropriate data protection and retention controls to whichever cloud-based system the data lives in.
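To make that concrete, here is a minimal sketch of what a data-asset register might capture before a migration. The `DataAsset` class, its field names, and the sample entries are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the fields below are assumptions about what a
# register might track, not a mandated or standard schema.
@dataclass
class DataAsset:
    name: str                    # e.g. "customer database"
    location: str                # where the data lives (on-prem, cloud region, SaaS)
    owner: str                   # accountable business owner
    classification: str          # e.g. "public", "internal", "personal"
    retention_years: int         # how long records must be kept
    consumers: list[str] = field(default_factory=list)  # systems feeding from it
    sources: list[str] = field(default_factory=list)    # systems feeding into it

# The register is then a catalogue you can query before moving a workload.
register = [
    DataAsset("customer database", "on-prem DC1", "sales", "personal", 7,
              consumers=["billing", "analytics"], sources=["web signup"]),
    DataAsset("records archive", "cloud-archive", "records team", "internal", 70),
]

# Example query: which assets hold personal data, and who depends on them?
for asset in register:
    if asset.classification == "personal":
        print(asset.name, "->", asset.consumers)
```

Even a register this simple answers the three questions above, where the data is, what it feeds, and who owns it, before any platform decision is made.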

Let’s go back five years – or even to just before the pandemic – when many organizations started testing the waters with the cloud. Everything used to be contained in one or two data centers, perhaps a few buildings. Now everything is spread across multiple service providers and multiple data centers.

Moreover, it is not just about understanding the value of your data in isolation, but collectively, as a department or organization. When evaluating how to turn siloed assets into actionable data in the cloud, the main challenge is balancing immediacy with good strategy.

The first thing you need to do is decide whether you want a departmental or a business-unit approach. This has a significant impact on your overall data strategy: where will your data reside, and what services can you provide based on the data you hold? This is especially true when certain departments need to prioritize cloud adoption and transformation, creating a multi-tiered approach.

You should also be aware of the capabilities or services that these data assets contribute to and the lifecycle around them.

We see many organizations moving to cloud-based platform-as-a-service (PaaS) or software-as-a-service (SaaS) models and finding that their backup or archive mechanisms are no longer fit for purpose. That is a major problem in government, where data retention policies can dictate that records be kept for 50, 60, or even 70 years. What if the only way to access government information from 60 years ago was through a legacy system accessed through Exchange 2000?
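A hedged illustration of the point: given a register like the one sketched earlier, you can flag assets whose retention obligation outlives the platform they sit on. The asset names, dates, and the `platform_end_of_support` field here are hypothetical, invented purely for the example.

```python
from datetime import date

# Hypothetical data: names, retention periods, and support dates are
# assumptions for illustration, not real records or product lifecycles.
assets = [
    {"name": "pension records", "created": date(1975, 1, 1),
     "retention_years": 70, "platform_end_of_support": date(2030, 1, 1)},
    {"name": "permit archive", "created": date(2010, 1, 1),
     "retention_years": 50, "platform_end_of_support": date(2028, 1, 1)},
]

for a in assets:
    must_keep_until = date(a["created"].year + a["retention_years"],
                           a["created"].month, a["created"].day)
    if must_keep_until > a["platform_end_of_support"]:
        # Retention outlives the platform: plan a migration or archive path now.
        print(f"{a['name']}: retain until {must_keep_until}, "
              f"platform support ends {a['platform_end_of_support']}")
```

The value is not in the code itself but in forcing the comparison: a 70-year retention obligation will outlive any single platform, so the exit path has to be designed in from the start.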

In fact, many organizations need to better understand their data capabilities and how to fully leverage them. What they did in a traditional tech stack doesn’t necessarily apply in a new world. Adopting feature-rich API platforms enables more integration, orchestration, and automation. This means you need to start thinking about how you’re changing the way your team works and what “good” looks like. Above all, you need to change the way you measure success.

It is also essential to find the right mix of workforce skills. Moving from a traditional network and data-center operation to a more scripted, DevOps-driven environment is no small feat, especially amid today’s skills shortage. Finding enough people with the right combination of abilities is difficult.

Last but not least, it’s about taking a security-by-design approach and understanding how you need to protect your digital and data assets.

We’ve seen what happens when organizations are unable to protect their data. Both primary and secondary data have been the subject of numerous cybercrime activities recently, with personal data in particular under constant attack. Additionally, many cybercriminals, through ransomware for example, target data backup repositories to minimize the victim’s ability to recover from an attack. Thus, backup strategies are now an essential line of defense in cyber warfare.

Citizens will continue to challenge organizations to prove that their data is safe. They will want to know how their data is used and how it is stored. An end-to-end focus on data security will be an ongoing challenge for public and private sector organizations.

The Essential Eight, the model put in place by the Australian federal government to combat cyber threats, offers a strong set of controls. Although quite cumbersome to set up, it provides a good level of security. But we often see customers struggle with the trade-off between richer functionality and working through the steps needed to ensure the data they hold is secure and trustworthy.

Yet many technology trends (and challenges) are driven by consumer demand. Customers set expectations for how they want to deal with organizations, especially government, and that creates tension when collecting the data needed to deliver services the way consumers want to consume them.

Government has no choice; it must embrace the change. But executives and procurement managers shouldn’t be hasty and jump into the wrong solution, or the most expensive one. A well-thought-out data strategy must come first.

Matthew Gooden is chief innovation and technology officer at Datacom.