According to findings from Tariff Consultancy, the average cloud computing price for enterprises has dropped by two-thirds since 2014. Tariff found that an average entry-level cloud computing instance now costs about 12 cents per hour for Windows users, with cloud services employed by enterprises across a range of critical applications.
The cost of the public cloud appears to have stabilized. Amazon Web Services, Google, and Microsoft offer comparable entry-level compute-instance pricing, as do other providers.
Despite such low prices, I still hear complaints from IT about the cost of cloud.
The actual cost of the cloud is not the services themselves. That’s a small part. The real money goes to people, tools, time, and risk mitigation. IT shops that look only at the cost of AWS versus Microsoft are missing a huge part of the equation. And they’re the ones that get sticker shock when they see the entire cost of a cloud migration, including new development and operations, which has little to do with the price of compute instances.
I advise that you consider the cost holistically. That means working up a well-thought-out TCO (total cost of ownership) model that covers every aspect of moving to the cloud, such as people, migration, security, operations, and testing. You then need to balance that cost against the value of agility and time to market, which is typically huge for most enterprises.
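To make that holistic view concrete, here is a minimal sketch of such a TCO comparison in Python. Every line item and dollar figure below is a hypothetical placeholder chosen only to illustrate the structure; substitute your own estimates. The only number taken from the article is the 12-cents-per-hour instance price.

```python
# A minimal, illustrative TCO comparison. All line items and dollar
# figures are hypothetical placeholders -- plug in your own estimates.

def cloud_tco(years: int) -> float:
    """Total cloud cost over a planning horizon (hypothetical figures)."""
    # 50 entry-level instances at $0.12/hour, running around the clock
    compute = 0.12 * 24 * 365 * 50
    annual = {
        "compute_instances": compute,
        "staff_and_training": 250_000,      # people: cloud skills, ops staff
        "security_and_compliance": 60_000,
        "operations_and_tooling": 80_000,
        "testing": 40_000,
    }
    one_time_migration = 300_000            # migration plus new development
    return one_time_migration + years * sum(annual.values())

def on_prem_tco(years: int) -> float:
    """Total data-center cost over the same horizon (hypothetical figures)."""
    annual = {
        "hardware_refresh_amortized": 200_000,
        "staff": 350_000,
        "facilities_power_cooling": 120_000,
        "security_and_compliance": 60_000,
    }
    return years * sum(annual.values())

horizon = 5
print(f"Cloud {horizon}-yr TCO:   ${cloud_tco(horizon):,.0f}")
print(f"On-prem {horizon}-yr TCO: ${on_prem_tco(horizon):,.0f}")
```

Even with made-up numbers, the sketch makes the article’s point visible: the compute line is a small slice of the annual cloud total, while people, operations, and migration dominate.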
That analysis usually shows that moving to the cloud is more cost-effective than leaving the applications and data where they are, in your data center. But using the cloud is not as cheap as those compute prices make it seem.