
Hot and hybrid

Idle resources, visibility, hidden costs, security loose-ends, and complications of provisioning – can the hybrid cloud address such on-ground issues?


By Pratima Harigunani

When the cloud came in, it not only disrupted the IT industry but also dislocated and disoriented many of its parts. Soon the “whether or not” question changed into “which one”. Thankfully, the hybrid cloud resolved the big public-or-private dilemma by melting the ‘or’ into a sort of ‘and’.

The adoption appetite clearly mirrors the enthusiasm that this welding of two different worlds has generated. Hybrid/multi-cloud has emerged as the predominant strategic posture for managing digital-era IT and business transformation, with 62% of enterprises pursuing a hybrid IT strategy, as per 451 Research’s Voice of the Enterprise (VotE): Digital Pulse, Budgets and Outlook 2019. Similarly, Everest Group Research indicates that 58% of enterprise workloads are on, or are expected to move to, hybrid or private cloud. In fact, the hybrid cloud infrastructure market is projected to be worth as much as USD 128.01 billion (Mordor Intelligence).


OK, the hybrid cloud is hot. But it does have one distinctive issue to reckon with: idle resources and over-provisioning. And that is not just one issue – it trickles into many more problems.

Provisioning – playing Tetris

Organizations are, increasingly but not completely, becoming aware of the idle resources in their cloud infrastructure. According to Shrikant Navelkar, Director – Oracle Relationships, Clover Infotech, “They are realizing that these idle resources are a cause of unnecessary costs. Idle resources result from not having complete visibility into cloud utilization and hence procuring resources on the cloud well before they are required. Hence, organizations must have complete visibility on when the resources are required, and the IT teams must have the autonomy to decide and plan this well in advance to ensure a better return on cloud investments.”


 “Organizations must have complete visibility on when the resources are required, and the IT teams must have autonomy to decide and plan this well in advance.”

Shrikant Navelkar, Director – Oracle Relationships, Clover Infotech

Until recently, enterprises had not necessarily spent much time studying idle resources and costs in hybrid cloud environments. According to Kumara Raghavan, Director, SDI, HPC and AI, Lenovo Data Center Group, APAC, since cost has now become an issue, a lot of organizations are looking at tools that can help them drive financial accountability and deliver accurate visibility into resources and utilization. “Idle resources also present a security threat, as these resources might not get updated to the latest security protocols, causing security vulnerabilities. Once idle resources are identified, IT teams can manage and secure these resources easily,” he added.
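
In practice, that visibility often starts with something quite simple. The sketch below – a hypothetical example assuming an AWS environment, the boto3 SDK and configured credentials, and not a reference to any tool named in this article – flags running EC2 instances whose average CPU utilization over the past week falls below an arbitrary threshold, so that teams can review them as potentially idle or over-provisioned.

```python
# Hypothetical sketch: flag possibly idle EC2 instances by low average CPU.
# Assumes AWS, boto3 installed, and credentials/region configured.
from datetime import datetime, timedelta, timezone
import boto3

CPU_THRESHOLD = 5.0   # percent; what counts as "idle" is a policy decision
LOOKBACK_DAYS = 7

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
start = end - timedelta(days=LOOKBACK_DAYS)

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            instance_id = instance["InstanceId"]
            stats = cloudwatch.get_metric_statistics(
                Namespace="AWS/EC2",
                MetricName="CPUUtilization",
                Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
                StartTime=start,
                EndTime=end,
                Period=3600,            # hourly datapoints
                Statistics=["Average"],
            )
            points = stats["Datapoints"]
            if not points:
                continue
            avg_cpu = sum(p["Average"] for p in points) / len(points)
            if avg_cpu < CPU_THRESHOLD:
                print(f"{instance_id}: avg CPU {avg_cpu:.1f}% over "
                      f"{LOOKBACK_DAYS} days - review as possibly idle")
```

A report like this does not decommission anything by itself; as Raghavan notes, it is the identification step that lets IT teams then manage and secure those resources.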


Cost management is, and should be, a continuous effort, stressed Narendra Bhandari, Senior Vice President at Persistent Systems. He further explained that this will also help build effective policies on moving workloads around, as well as a case for modernizing traditional and custom workloads to take advantage of containers and for rearchitecting code using microservices.

The long tail – security and integration bumps

The word ‘idle’ can be spelled in many ways and, interestingly, one of them is ‘fragile’. There is a security implication to these extra machines sitting in the dugout. “Organizations clearly understand the need for strong cybersecurity and are quickly realizing the benefits of security-as-a-service. But, as companies migrate to the cloud, the attack surface also expands. This has led to a surge in cyber attacks, and many companies are struggling to prioritize projects and tools that can best protect their people and business,” stated Rohan Vaidya, Director of Sales – India, CyberArk.


Poor integration and weak deployment velocity of cloud investments also deal an unexpected blow to developers and security teams alike. Ask Vaidya and he holds a mirror to the not-so-pretty reality out there. “Quick and dirty is a well-worn term when it comes to IT professionals who want to get things done to support business demands. The business team is constantly under pressure to catch up with customer demands, adapt to the external environment, or respond to a changing competitive landscape,” he said, adding that their time-to-market these days has a high dependency on the technology teams that support their business applications.


“Visibility is the key aspect while managing security in an environment which spans outside the organization to cloud (hybrid or public).”

Murtaza Bhatia, National Manager – Vertical Solutions, NTT Ltd. (India)


“Cost Management will help build effective policies on moving workloads around and a case for modernizing traditional and custom workloads.”

Narendra Bhandari, Senior Vice President, Persistent Systems

“It’s a tough situation to always balance the velocity of deployment and security guidelines. The general perception is that non-critical applications or infrastructure may not need as much attention to security guidelines. The modern hacker has been exploiting these vulnerabilities, and emerging technologies give the hacker ample opportunities to exploit them effortlessly,” he said.


Among the companies surveyed in India for the Palo Alto Networks Asia-Pacific Cloud Security Study, conducted by Ovum Research, nearly half (47%) were found to operate more than 10 security tools within their infrastructure to secure their cloud. However, according to Riyaz Tambe, Director – Sales Engineering, India and SAARC, Palo Alto Networks, “Having numerous security tools creates a fragmented security posture, adding further complexity to managing security in the cloud, especially if the companies are operating in a multi-cloud environment.”

One more ripple – the developer side

The issue of weak integration or clumsy deployment is not restricted to a hybrid cloud environment alone, and Navelkar dismisses the idea of putting hybrid clouds in the spotlight here. “This can happen on other infrastructure as well.” He maintains, though, that the damage caused by such issues poses a heightened burden on developers and security teams. “For instance, if an application is migrated from on-premise to the cloud and it is not integrated well, then it will not yield the desired results in terms of performance, output, and strategic impact. The developers would then have to understand the root cause and impact areas and fix the issues. Such activities will consume their time, which could otherwise be channeled towards productive areas such as new product development and enhancements.”

What eventually happens is that poor integration and weak deployment increase the risk of data breaches, which means that security teams will face unprecedented challenges unless they closely guard deployment and integration and take appropriate action proactively.

During that first phase of cloud migration, as Richard Beckett, Senior Product Marketing Manager for Public Cloud at Sophos, described, you are likely to build that infrastructure manually in the cloud provider console – clicking through it to create your VPC, your network and your instances, to configure security groups, and so on.

“But this infrastructure can be hard to replicate exactly – so when a new development environment is required that mirrors the live production environment exactly, or the organization needs to replicate the infrastructure in another region, it is very difficult without a recipe to create that exact same infrastructure. And those slight variations in the configuration are bad news, not only because they slow deployment velocity, but because they also create bugs and security issues,” Beckett stated.


“Organizations clearly understand the need for strong cybersecurity and are quickly realizing the benefits of security-as-a-service.”

Rohan Vaidya, Director of Sales – India, CyberArk

This issue is compounded as you add more developers, each requiring their own environment. Organizations can end up with development, test, and production environments that are all different – different OS versions, different configuration settings; something will not be aligned. All of that leads to application bugs when each team merges its changes into the live system, and to a nightmare for security and operations teams who need to fix security and reliability issues across slightly different environments.

“The number of VMs available from Google, AWS and Microsoft increased 317% on average during the past four years.”

Cloud Price Index, 451 Research

According to Beckett, “To solve that problem, infrastructure-as-code templates allow development teams to describe infrastructure as a text file – a JSON file. And even better, they allow teams to update that file to make individual changes once the infrastructure is built, and to increase velocity.”
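
To make that idea concrete, here is a minimal, hypothetical sketch – using AWS CloudFormation via boto3 purely as an example, not as the tooling Beckett or Sophos prescribe – in which the ‘text file’ is a small JSON template held in version control, and the same recipe stamps out matching dev and production environments without any console clicking.

```python
# Hypothetical sketch of infrastructure as code: the environment is described
# in a version-controlled JSON template (here, AWS CloudFormation) and the same
# recipe is used to create mirror-image environments. Assumes boto3 and AWS
# credentials are configured; names and CIDR ranges are illustrative.
import json
import boto3

# The "text file": a tiny template declaring a VPC and a subnet.
TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Description": "Example network stack kept in version control",
    "Resources": {
        "AppVpc": {
            "Type": "AWS::EC2::VPC",
            "Properties": {"CidrBlock": "10.0.0.0/16"},
        },
        "AppSubnet": {
            "Type": "AWS::EC2::Subnet",
            "Properties": {
                "VpcId": {"Ref": "AppVpc"},
                "CidrBlock": "10.0.1.0/24",
            },
        },
    },
}

def create_environment(stack_name: str, region: str) -> None:
    """Create an environment from the same template, in any region."""
    cfn = boto3.client("cloudformation", region_name=region)
    cfn.create_stack(StackName=stack_name, TemplateBody=json.dumps(TEMPLATE))

if __name__ == "__main__":
    # The same recipe builds matching dev and production environments.
    create_environment("app-network-dev", "ap-south-1")
    create_environment("app-network-prod", "ap-south-1")
```

Because the definition lives in one file, a change is made once in the template and rolled out everywhere, rather than hand-applied – and inevitably mis-applied – in each console.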

Tambe suggests it is ideal for organizations to have a central console that uses technologies such as artificial intelligence to help prevent known and unknown malware threats, and to quickly remediate accidental data exposure when it arises. “Start automating threat intelligence with natively integrated, data-driven, analytics-based approaches (leveraging machine learning/artificial intelligence) to avoid human error.”

Experts like Murtaza Bhatia, National Manager – Vertical Solutions, NTT Ltd. (India), believe that an environment that is seamlessly integrated with security controls and visibility solutions that mutually share context enhances visibility. “It also provides rich data to make it much easier for the automation function to correlate with the information being generated. Integration plays a key role in exchanging context between the on-premise and cloud security controls so that uniform policies can be applied to infrastructure and services spanning on-premise and cloud,” Bhatia stated.

He further added that for this to happen, the application must be built on ‘secure by design’ principles, which requires developers to run the SDLC with security testing moved towards the left of the cycle. “This can lead to conflicts between security and development teams in moving the code to production because of testing at each phase of the cycle. However, this can be overcome with the use of modern security testing tools that automate testing processes on code check-in and reveal corresponding vulnerabilities. This provides IT and system integrators with the tools needed to account for each stage of the life cycle – from design and development to deployment and beyond.”
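
In its simplest form, that automation on check-in is a gate script wired into a pre-commit hook or a CI stage. The following is a hypothetical sketch, assuming a Python codebase and two open-source scanners (bandit for static analysis and pip-audit for dependencies with known CVEs) installed in the build environment; it is not the specific tooling Bhatia refers to.

```python
# Hypothetical shift-left gate: run security scanners on every check-in and
# fail the hook or pipeline stage if any of them reports a problem.
# Assumes the open-source tools bandit and pip-audit are installed.
import subprocess
import sys

CHECKS = [
    ["bandit", "-r", ".", "-ll"],   # static analysis, medium severity and above
    ["pip-audit"],                  # flag dependencies with known CVEs
]

def run_checks() -> int:
    worst = 0
    for cmd in CHECKS:
        print(f"--> running: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        worst = max(worst, result.returncode)
    return worst

if __name__ == "__main__":
    # A non-zero exit code stops the check-in before vulnerable
    # code moves further down the pipeline.
    sys.exit(run_checks())
```

Because the scan runs on every check-in, vulnerabilities surface while the change is still cheap to fix, which is the whole point of moving testing to the left.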

One for the road

These may be uncomfortable questions, but enterprises will have to anticipate them, pre-empt them and confront them.

Incidentally, 451 Research pointed out an unexpected drift catching on in the enterprise landscape. Enterprises may not be ‘avoiding’ complexity but actually ‘choosing’ it for the value it delivers in the form of differentiated offerings, more efficient applications, happier customers, and lower costs. They want to chase ‘optimization’ rather than ‘resolution’. It is not just a simplification of complexity that they are after but something else: they do not want to lose the value that complexity has created, and that is where ‘optimizing’ helps, because it lets complexity remain while ‘managing’ it.

Counterintuitive and strange, but when has IT been predictable and straight all these decades? Whether the car goes back into your own garage or a parking lot, a flat tire can still spoil a good day. What ultimately matters is keeping the toolbox around – and one that works for you.

feedbackvnd@cybermedia.co.in
