Database Management Systems: Is the Future Really in the Cloud?

Published April 29, 2020
It’s been almost a year since Gartner declared that the cloud will become the de facto solution for database management systems (DBMS).

In that time, the technology community has been debating the relative merits of this approach versus traditional on-premises models. The broad conclusion seems to be that while the cloud does provide some advantages, by itself it does not fully support what the enterprise needs going forward. (Also read: 7 Reasons Why You Need a Database Management System.)

What’s missing?

According to some experts, the cloud provides a foundation for where DBMS needs to go, but only higher-level architectures will deliver the flexibility and advanced service levels that will define the enterprise in the age of connected digital services and the Internet of Things (IoT).

It is important to note that this view does not contradict anything Gartner has said about DBMS and the cloud; it simply holds that focusing strictly on cloud-based infrastructure does not tell the complete story.

Cloud, In Context

In a blog post last June entitled “The Future of the DBMS Market Is the Cloud,” Gartner’s Adam Ronthal notes that cloud-based DBMS adoption was already rising and will dominate the field in short order.

Mainly, this is because the cloud is where much of the innovation and cost optimization is taking place, while on-premises systems will soon be relegated to specialty applications and jobs that require legacy compatibility.

While this may be true up to a point, service providers like Altinity note that these facts need a little context.

For one thing, the idea that capex will drop while opex stays flat or rises only slightly is not panning out in the real world.
Cloud architectures, in fact, tend to hit opex pretty hard as environments scale.

In addition, while cloud innovation is impressive, the DBMS market is also being influenced by open source, Kubernetes, AI and other developments that can be deployed in the cloud or on-premises. (Read: How do companies use Kubernetes?)

Open source in particular has been a hotbed of DBMS innovation for at least two decades, according to Altinity. Many of the most disruptive data management technologies were developed within open source projects, and the continued inflow of venture capital into this market ensures that this will continue for some time. (Also read: Open Source: Is It Too Good to Be True?)

For anybody contemplating a change to their data management strategy, the cloud is certainly worthy of attention, but it will likely come up short without a healthy dose of open source.

Process Matters

In this light, DBMS is just like any other application: success is not determined simply by moving to the cloud but by how you do it.

Aiven’s Gilad David Maayan suggests the following five best practices for cloud-based DBMS:

1. Develop a strategy first, then migrate

The IT universe is littered with failed cloud initiatives because too much attention was paid to how things would work in the end rather than how to make the transition smoothly.

2. Encrypt and tokenize data at rest

Hackers prefer to target data at rest because no one is paying attention to it at the moment, yet it still contains valuable, privileged information like personal finances and trade secrets.
(Read: What benefit can real-time data provide that data at rest can’t?)

3. Secure with Identity and Access Management (IAM)

A standard authorization procedure is critical to ensure data and infrastructure protection, particularly in large environments where access rights must be updated constantly.

4. Protect data in transit with encryption and VPNs

Data in motion is subject to theft as well, so a secure, encrypted tunnel is a highly effective way to keep it safe.

5. Automate monitoring

Tedious but important jobs are best handled by automation, which can be configured around key aspects of the monitoring process like reporting, testing and integration.

A Cloud Management Nightmare?

Organizations should also be aware that the cloud can easily turn into a management nightmare without the proper controls, says Deloitte’s David Linthicum.

Many clouds, in fact, devolve into the same silo-laden infrastructure that currently inhibits workflows in the data center, particularly when individual business units are left to create and manage their own clouds without IT supervision.
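The "encrypt and tokenize data at rest" practice above can be illustrated with a minimal sketch of tokenization: sensitive values are swapped for random tokens, so what sits at rest carries no exploitable information. This is an illustrative, in-memory example only (the `TokenVault` class is hypothetical, not part of any library mentioned in the article); a production vault would be backed by encrypted, access-controlled storage.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault: swaps sensitive values for random tokens.

    Illustrative sketch only -- a real vault would encrypt its store,
    persist it securely, and sit behind strict access controls.
    """

    def __init__(self):
        self._by_token = {}   # token -> original value
        self._by_value = {}   # original value -> token (reuse token per value)

    def tokenize(self, value: str) -> str:
        if value in self._by_value:
            return self._by_value[value]
        token = "tok_" + secrets.token_urlsafe(16)  # unguessable random token
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # store the token at rest, not the card number
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

The point of the design is that a breach of the data store yields only opaque tokens; recovering the originals requires separately compromising the vault.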
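The "automate monitoring" practice can likewise be sketched as a simple automated check: a probe measures a key metric and raises a structured alert when it crosses a threshold. The function and the replication-lag metric here are hypothetical examples, not drawn from any product named in the article; real deployments would query the DBMS directly and feed results into their reporting and alerting pipeline.

```python
def check_replication_lag(get_lag_seconds, threshold=30):
    """Return an alert dict when replication lag exceeds the threshold, else None.

    get_lag_seconds is a callable supplied by your monitoring integration
    (hypothetical here); a real check would query the DBMS for the metric.
    """
    lag = get_lag_seconds()
    if lag > threshold:
        return {"alert": "replication_lag", "value": lag, "threshold": threshold}
    return None

# Simulated probes standing in for real DBMS queries
assert check_replication_lag(lambda: 5) is None          # healthy: no alert
alert = check_replication_lag(lambda: 120)               # lagging: alert raised
assert alert == {"alert": "replication_lag", "value": 120, "threshold": 30}
```

Scheduling such checks (via cron, a monitoring agent, or the cloud provider's alerting service) is what turns a tedious manual task into the configurable automation the article describes.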