Sure, I can run my Mission Critical applications in the cloud, but should I?


Cloud computing is ubiquitous and remarkably comprehensive nowadays. The days when the public cloud was just a remote platform where developers could build fancy little applications are behind us. Today, all major cloud providers, such as AWS, Azure and GCP, offer scalable and reliable solutions that can run most Mission Critical applications, even those that require specific infrastructure certifications, like SAP HANA.

Moreover, many market-leading software vendors now favor public cloud deployments over traditional on-premise or even private cloud deployments; SAP, for example, with its RISE initiative for S/4HANA. No wonder it’s easy to get lost in the cloud buzz and conclude that public cloud is the best, and only, option for any application.

But...

“Just because you can do it doesn’t mean you should.” ― Hanya Yanagihara, A Little Life

Moving to the cloud can bring real benefits, and that’s why companies do it. Increased agility, faster time to value, flexible scalability, better resource allocation, pay-per-use pricing and exiting the data center real estate business are some of the well-known drivers. Reduced TCO was initially among the expected outcomes too, but by now many customers have realized that this is not always the case, and public cloud can be more expensive in the long run; the financial benefits of the cloud usually come from switching from a traditional CAPEX model to OPEX/pay per use, and from the positive impact this can have on the balance sheet.

But I’m not a finance specialist so let’s go back to the technical discussion.

In order to make an informed decision, the tradeoffs of moving to the public cloud should be considered too. In my opinion, these are the most relevant ones for business critical applications:

1 - System entanglement, i.e. integrations

Mission critical applications are neither standalone nor monolithic. They are deployed in a multi-tier fashion (DB – Application – Presentation) and exchange information with many other systems. Therefore, before moving just one of the 1000 pieces of the puzzle to the cloud, the implications need to be evaluated. How are the interfaces going to work if one system is in the public cloud and another one on-premise? The ERP in the cloud and BW on-premise? The DB tier in the cloud and the app tier on-premise? That is probably not going to work optimally: all the interfaces may need to be identified, adapted and tuned for the cloud, and additional network infrastructure may need to be provisioned to minimize latency. Service response times may suffer if a query has to travel from an on-premise app server to a cloud HANA DB and back on-premise to show the result in the presentation layer; worse still if other satellite systems need to be accessed besides the main database, for example a CRM system (these are what SAP architectures call RFCs, or Remote Function Calls, and their individual response times are a critical contributor to the overall dialog response time).
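To get a feel for why those round trips matter, here is a back-of-the-envelope sketch of how per-call latency accumulates in a dialog step. All figures (call counts, latencies, processing time) are illustrative assumptions, not measurements from any real system:

```python
# Rough model of how network latency accumulates in a hybrid deployment.
# All figures below are illustrative assumptions, not measurements.

def dialog_response_time(db_calls, rfc_calls, db_latency_ms,
                         rfc_latency_ms, processing_ms):
    """Estimate one dialog step: app-server processing time plus
    round trips to the database tier and to satellite systems (RFCs)."""
    return (processing_ms
            + db_calls * db_latency_ms
            + rfc_calls * rfc_latency_ms)

# Same data center: ~0.5 ms round trip to the DB, ~1 ms to a CRM system.
on_prem = dialog_response_time(db_calls=50, rfc_calls=5,
                               db_latency_ms=0.5, rfc_latency_ms=1.0,
                               processing_ms=200)

# App tier on-premise, DB and CRM in the cloud: ~20 ms round trips.
hybrid = dialog_response_time(db_calls=50, rfc_calls=5,
                              db_latency_ms=20.0, rfc_latency_ms=20.0,
                              processing_ms=200)

print(f"on-prem: {on_prem:.0f} ms per dialog step")  # 230 ms
print(f"hybrid:  {hybrid:.0f} ms per dialog step")   # 1300 ms
```

The point is not the exact numbers but the shape of the problem: latency is multiplied by the number of calls, so a chatty interface that was invisible on a local network can dominate response time once a WAN hop sits between the tiers.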

2 - Increased scope

Because of this entanglement, customers may find that moving a key application like the ERP to the cloud actually means moving the full environment as a whole, with all its tiers and integrations: the 1000-piece puzzle. This is not going to be simple or cheap, so even if savings are expected from cloud adoption, they can be offset by the wider project scope: more systems to migrate means more complexity and more spend. I have personally seen situations like this: customers who initially just needed to migrate SAP ERP from Oracle on-premise to the public cloud, with systems integrators wisely recommending that they migrate the full landscape (tens of SAP instances instead of just one, hundreds of VMs) in order to preserve the coherence and performance of the environment.

3 - Customization

Functional alignment with business requirements is another sticking point. Most customers rely on customizations that need to be kept because they serve the business perfectly. In the public cloud, those customizations may not even be possible, and deployments may need to start from scratch instead of carrying over the work and maturity achieved over the years. Sometimes this is actually desirable, as an opportunity to simplify, standardize and clean up old code that is no longer necessary. However, customers may lose some vital functionality, or may need to learn new ways to customize the application using cloud APIs and services. In the case of SAP, customers who move their ERP to the public cloud under the RISE umbrella will no longer be able to modify the code the old way; they will need to use BTP. That is a big change that needs to be evaluated in detail before moving to the public cloud. Also, SAP public cloud deployments are typically for brand new, from-scratch (i.e. Greenfield) implementations only; Brownfield migrations that carry over customizations are not possible.

4 - Service availability

When it comes to Mission Critical applications, SLAs are crucial. Most cloud providers offer less than three nines of availability by default, which is rarely enough for essential applications. Customers therefore need to implement HA and DR in the cloud, and request shorter RTO/RPO options, to reach at the very least three, if not four or more, nines of availability. HA and DR in the cloud are no different from on-premise in the sense that redundant zones, hardware and clustering are needed, and they will be reflected in the solution price. There are no savings from implementing HA/DR in the cloud vs. on-premise; HA/DR adds to the cost, and that can break the cloud TCO.
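The arithmetic behind the "nines" is worth spelling out, because it explains both why three nines is rarely enough and why multi-tier systems need extra redundancy. A minimal sketch (the per-tier availability figures are illustrative assumptions):

```python
# Convert availability percentages into annual downtime, and show how
# chaining dependent tiers in series degrades overall availability.
# The per-tier figures below are illustrative assumptions.

HOURS_PER_YEAR = 24 * 365  # 8760

def downtime_hours_per_year(availability):
    """Annual downtime implied by an availability fraction (e.g. 0.999)."""
    return (1 - availability) * HOURS_PER_YEAR

def serial_availability(*tiers):
    """Tiers in series: all must be up, so availabilities multiply."""
    result = 1.0
    for a in tiers:
        result *= a
    return result

print(f"99.9%  -> {downtime_hours_per_year(0.999):.1f} h/year")   # ~8.8 h
print(f"99.99% -> {downtime_hours_per_year(0.9999):.2f} h/year")  # ~0.88 h

# A three-tier landscape (DB - Application - Presentation) where each
# tier alone offers 99.9% ends up at roughly 99.7% overall:
combined = serial_availability(0.999, 0.999, 0.999)
print(f"combined: {combined:.4%}")
```

Two takeaways: three nines already means close to nine hours of downtime a year, and every additional dependent tier makes the composite number worse, which is exactly why HA/DR (redundancy within and across zones) has to be designed in, and paid for, explicitly.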

5 - Your business in someone else’s hands

Control is another major consideration. By moving mission critical applications to the cloud, customers are essentially handing their business and their precious data over to someone else, and even if there is a contract between the two entities, well… you never know… The hyperscaler may be hacked and lose your information. It may have a major outage and shut down your business for hours. It may then need to prioritize which services to restore when resources come back online, and if they come back in a degraded state and cannot serve every customer, you may not be on the priority list. And yes, the provider may pay you penalties in the form of credits if it breaks the SLA, but that is not going to recover your business, including the lost reputation and even legal consequences that can follow service unavailability or data loss. The chances of these situations may be slim, but public cloud outages have already made headlines several times.

Security is another concern, especially in this sense of control, as data will sit outside your organization and, in a typical multi-tenant public cloud deployment, alongside other customers’ data. Hyperscalers implement high levels of security, so protection will be strong; however, exposure will be high as well, and as a result the overall risk of leaks or attacks may be greater in the public cloud than when keeping your data on-premise or in a private cloud within your company boundaries. Risk is a function of exposure, not only of protection (ever wonder why UNIX systems are so rarely hacked? Among other reasons, because they are far less exposed than Linux and Windows ones).

To cloud or not to cloud?

If, after evaluating all of the above, an informed decision is made and a public cloud deployment for a mission critical application is deemed advantageous for the business… by all means, go for it! The cloud is ready and can support you.

Now, if you still have doubts, if you like the cloud but are unsure whether you can accept the trade-offs, consider a private or hybrid cloud deployment instead. Going back to the SAP RISE example, they do offer these alternatives too, called RISE Private Edition (hyperscaler private cloud) and RISE Private Edition – Customer Datacenter Option (customer-owned datacenter or co-location). SAP doesn’t promote these two options as much as the public cloud, so make sure you ask SAP about them if RISE is under consideration.

Alternatively, and for any mission critical workload, SAP or otherwise (Epic, Oracle, etc.), HPE GreenLake is an excellent solution for customers who love the cloud concept but are not ready for the tradeoffs. GreenLake brings the cloud to customers, either in their own datacenter or in a co-location facility for customers exiting the datacenter business. Furthermore, for hybrid deployments, the co-located datacenter can be adjacent to a public cloud datacenter, minimizing latencies and other potential issues.

It’s great to have options.

But just because you can do something doesn’t mean you necessarily have to do it.
