As we move forward in this “new normal” work environment, it’s becoming more and more clear that organizations – especially mid-market businesses – need a new approach to managing data. They’re entering a new reality in which some employees are in the office, others are remote, and everyone needs to work together, which presents some challenges to say the least.
So it should come as no surprise that, according to a recent survey from IDG, 95.7% of organizations currently have an executive mandate to leverage cloud technology. The cloud represents a great way for mid-market organizations to transform data management. But how best to do it? There are private cloud platforms, which bring the cloud experience to physical environments. There are public cloud resources, which allow organizations to more elastically fulfill their application needs. The possibilities are endless.
In most cases, the best strategy is a hybrid one. Private cloud, public cloud, and on-prem data storage don’t need to be adversaries – in fact, they can all work together to build a data management dream team for your business.
In this eBook, we’ll take a look at 5 key factors you should consider as you build this dynamic new infrastructure.
Security tops the list of concerns: 76.4% of survey respondents named it the top issue. The bottom line is that, as workloads proliferate across multiple deployment models, some of them outsourced to third parties, it’s becoming more difficult than ever to ensure that all of an organization’s data and workloads are secure. It’s imperative for IT to keep security front and center.
How to ensure security depends on how workloads are allocated, of course. Typically with public cloud environments, security is a shared responsibility, meaning both the client and the public cloud provider need to play a role. Organizations may not necessarily be able to apply the same tools and practices they usually use in an environment that’s partially beyond their control.
But this is precisely the value of the hybrid cloud approach – it allows organizations to pick and choose which workloads are allocated where, with security top of mind. If there are certain portions of their data that are absolutely vital and can’t be entrusted to a third party, IT has the option to keep that data in house. As CIOs and other decision-makers weigh these options, they’ll need to balance their security needs with other strategic priorities for the organization.
So it’s only natural that 70% of IDG survey respondents rated performance as the No. 2 most important criterion influencing workload placement decisions.
It’s worth noting that “performance” can mean different things for different workloads. For some, it’s purely latency driven – users need to be able to access their information at all times with little or no delay. In other scenarios, gauging performance might be a matter of measuring throughput or evaluating data locality. For every corporate IT office, the challenge is to define what “performance” means for each workload when weighing different deployment models.
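The latency-versus-throughput distinction above can be made concrete with a little arithmetic. The sketch below uses entirely hypothetical numbers (not drawn from the IDG survey) to show how the same measurements yield two different performance metrics:

```python
# Illustrative sketch: the same workload can look "fast" or "slow"
# depending on which performance metric you measure.
# All numbers below are hypothetical, for illustration only.

def latency_ms(total_time_s: float, num_requests: int) -> float:
    """Average time per request, in milliseconds."""
    return total_time_s / num_requests * 1000

def throughput_rps(total_time_s: float, num_requests: int) -> float:
    """Requests served per second."""
    return num_requests / total_time_s

# A batch analytics job: 10,000 requests completed in 50 seconds.
batch = (50.0, 10_000)
# An interactive lookup service: 100 requests in 0.8 seconds.
interactive = (0.8, 100)

print(f"batch:       {latency_ms(*batch):.1f} ms/request, {throughput_rps(*batch):.0f} req/s")
print(f"interactive: {latency_ms(*interactive):.1f} ms/request, {throughput_rps(*interactive):.0f} req/s")
```

The batch job wins on per-request latency while the interactive service may still be the one users perceive as “fast,” which is why each workload needs its own definition of performance before a deployment model is chosen.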
Some organizations are better equipped for scaling up than others. Some can seamlessly add thousands of cores on demand in a public cloud, which serves their current IT needs well; others lack the infrastructure to take advantage of scale-out cloud architectures and must meet demanding hardware requirements first. There is no “one size fits all” strategy for maximizing cloud performance that works for every organization – the challenge is to find the approach that best suits you.
Simply put, you want to make sure IT has the level of control they need, when and where they need it, over the data that matters most.
It’s not always easy to make this happen. Every organization has its own set of management tools and workflows, all designed to address its specific business objectives. When it comes time to decide on cloud deployment models, organizations need to make difficult decisions about which data allocations will best accommodate the daily operational tasks that need doing – performance tuning, availability management, additional resource provisioning, data backup, disaster recovery, and so on. Each potential allocation of data across clouds could have a different impact on these workflows.
Major public cloud providers, such as Amazon and Google, each offer their own distinct cloud-native services and management interfaces. Each component of a hybrid cloud environment may demand its own approach, with its own dashboards and workflows for IT managers to learn and adapt to. A well-crafted hybrid cloud strategy won’t eliminate these disparities altogether, but it can give you the best possible plan to work around them and still get work done in an efficient manner.
In fact, 33.3% of IDG respondents said availability was the most important characteristic, ranking it even ahead of security or performance.
The gold standard in cloud data availability is the “four nines” standard – in other words, 99.99% guaranteed access to your files. Most public cloud providers are reluctant to write contracts that guarantee this level of availability, even if they generally do deliver it. This leaves many mid-market IT decision-makers to ponder: Is putting critical workloads in public cloud environments worth it, when they can’t quite be certain they’ll have the access they need? At times, they go with a private infrastructure they feel they can more reliably count on.
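What “four nines” actually buys you is easy to compute: each extra nine cuts the allowed downtime by a factor of ten. A quick arithmetic sketch (assuming a 365-day year):

```python
# What "the nines" mean in maximum allowed downtime per year.
# Simple arithmetic sketch; a 365-day year is assumed.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def max_downtime_minutes(availability_pct: float) -> float:
    """Maximum minutes of downtime per year at a given availability level."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99, 99.999):
    print(f"{pct}% availability -> up to {max_downtime_minutes(pct):.1f} min/year down")
```

At 99.99%, that works out to under an hour of downtime per year, which is why providers hesitate to put the figure in a contract even when they routinely achieve it.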
For some aspects of the business that are absolutely vital to its survival, availability that’s as close to 100% as possible is essential. The truth is that some workloads may require on-premises deployment, so as to ensure compliance with strict RPOs/RTOs (recovery point and recovery time objectives), and/or with country-specific regulations for data locality. It all depends on the specific needs of your organization.
Generally speaking, public cloud tends to be a cost-effective platform – it offers massive scalability, easy flexibility, and per-gigabyte costs that are lower than most. For most workloads, especially those that need to be managed over a long period of time, the investment more than pays for itself. But there are exceptions to every rule. Some components of your organization’s data are sure to require rapid access at unpredictable intervals, perhaps because your needs are seasonal or project-driven in nature. For those, you may find it more cost-effective to stay on premises.
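A back-of-the-envelope comparison shows how access patterns can flip the cost calculus. Every price below is a hypothetical placeholder, not a real vendor rate; the point is only the shape of the math, with egress charges standing in for the cost of frequent, heavy reads from the cloud:

```python
# Back-of-the-envelope monthly storage cost comparison.
# ALL prices here are hypothetical placeholders, NOT real vendor rates.
# Plug in actual quotes from your own providers.

def cloud_monthly_cost(gb_stored: float, gb_egress: float,
                       storage_per_gb: float, egress_per_gb: float) -> float:
    """Monthly public-cloud cost: storage plus data-egress charges."""
    return gb_stored * storage_per_gb + gb_egress * egress_per_gb

def on_prem_monthly_cost(gb_stored: float, amortized_per_gb_month: float) -> float:
    """On-premises cost per month, with hardware and ops amortized into one rate."""
    return gb_stored * amortized_per_gb_month

stored, egress = 10_000, 8_000  # GB stored, GB read out per month (egress-heavy)
cloud = cloud_monthly_cost(stored, egress, storage_per_gb=0.02, egress_per_gb=0.09)
onprem = on_prem_monthly_cost(stored, amortized_per_gb_month=0.05)
print(f"cloud:   ${cloud:,.2f}/month")
print(f"on-prem: ${onprem:,.2f}/month")
```

With these illustrative rates, an egress-heavy workload costs more in the cloud despite the lower per-gigabyte storage price, while a workload that is rarely read would tip the other way.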
When it comes to cost savings, public cloud has rightly been lauded as a source of significant competitive advantage. Many public cloud vendors offer both customer-managed “as a service” options and fully managed options that run on vendor-controlled infrastructure, and these typically deliver solid cost savings compared to other data management approaches. But for any organization with complex data needs, it’s important to pick the most cost-effective option on a case-by-case basis.