Vera Shulman.
Opinion

Learning to manage the cloud without losing control

"The key question for decision-makers is no longer which model writes better code, but whether their cloud infrastructure is ready for the day AI becomes central to business operations rather than a marginal tool," writes Vera Shulman, CEO of ProfiSea.

The race to adopt generative AI across organizations has created a strategic blind spot. While product and engineering teams focus on model capabilities, leadership is beginning to notice a troubling pattern: AI adoption is rapidly becoming a major driver of cloud spending, often without sufficient visibility or control.
The real challenge is no longer just which model to choose, but how to build a cloud infrastructure that can support AI without allowing costs and operational complexity to undermine organizational stability.
Advanced models, such as those developed by Anthropic, including the Claude 3.5 series, offer remarkable reasoning capabilities. However, they come with intensive resource demands. For organizations deploying these models via APIs or on their own cloud infrastructure, every query directly impacts the bottom line.
Vera Shulman. (Photo: Rami Zerenger)
According to Gartner, by the end of 2025, approximately 30% of GenAI projects will be abandoned after the proof-of-concept stage. The primary reason is not model performance, but the inability to control the cloud computing and storage costs that surround them.
The pace of technological change is so rapid that many organizations adopt advanced models quickly, only to later discover critical gaps: insecure processes, exposed organizational data, and runaway costs. As a result, many projects stall at the PoC stage or are implemented in an uncontrolled manner.
Using these tools effectively requires more than simply deploying a model; it requires managing how that model interacts with the cloud. Organizations must move away from trial-and-error approaches toward structured systems that define clear boundaries while maintaining flexibility in a rapidly evolving market.
Key Principles for Cloud Governance
To enable growth without losing control, organizations must adopt three core principles:
1. Token Economics (Tokenomics) Management
In traditional cloud environments, organizations measure and manage memory and compute usage. In the AI era, however, the fundamental unit of consumption is the token.
Effective management requires a dedicated monitoring layer that quantifies token usage and attributes it to specific projects or departments.
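Such a monitoring layer can be very simple in principle. The sketch below illustrates the idea of a per-department token ledger; the prices and department names are illustrative assumptions, not any provider's actual rates, and real systems would pull token counts from the API response metadata.

```python
from collections import defaultdict

class TokenLedger:
    """Minimal sketch: attribute LLM token usage (and estimated cost)
    to the project or department that triggered each call."""

    def __init__(self, price_per_1k_input, price_per_1k_output):
        self.price_in = price_per_1k_input
        self.price_out = price_per_1k_output
        self.usage = defaultdict(lambda: {"input": 0, "output": 0})

    def record(self, department, input_tokens, output_tokens):
        # In practice, token counts come from the provider's response metadata.
        self.usage[department]["input"] += input_tokens
        self.usage[department]["output"] += output_tokens

    def cost(self, department):
        u = self.usage[department]
        return (u["input"] / 1000) * self.price_in + \
               (u["output"] / 1000) * self.price_out

# Illustrative prices per 1,000 tokens (assumed, not a real price list).
ledger = TokenLedger(price_per_1k_input=0.003, price_per_1k_output=0.015)
ledger.record("marketing", input_tokens=12_000, output_tokens=3_000)
ledger.record("marketing", input_tokens=8_000, output_tokens=2_000)
print(round(ledger.cost("marketing"), 3))  # → 0.135
```

Once usage is attributed this way, token spend can be budgeted and reviewed per team just like any other cloud line item.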
One of the most effective ways to control these costs is through Retrieval-Augmented Generation (RAG). Instead of sending massive volumes of data to a model, RAG systems retrieve only the most relevant data for each query. This is not just a way to improve output accuracy; it is a critical economic strategy that significantly reduces data transfer and cloud costs.
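The economic logic can be seen in a toy example. The retriever below uses naive keyword overlap purely for illustration (production RAG systems use vector search over embeddings); the point is that only the top-ranked snippets reach the model, not the whole corpus.

```python
def retrieve_relevant(query_terms, documents, top_k=2):
    """Toy keyword-overlap retriever; real systems use vector similarity."""
    scored = sorted(documents,
                    key=lambda d: -len(query_terms & set(d.lower().split())))
    return scored[:top_k]

corpus = [
    "Refund policy: customers may return items within 30 days.",
    "Shipping times vary by region; express options are available.",
    "Warranty covers manufacturing defects for one year.",
    "Office locations and opening hours are listed on the website.",
]

query = "what is the refund window for returned items"
context = retrieve_relevant(set(query.split()), corpus)

# Only the retrieved snippets are sent to the model, not the full corpus:
prompt = "Answer using this context:\n" + "\n".join(context) + "\nQ: " + query
print(len(context))  # → 2 snippets instead of 4 documents
```

At the scale of real document stores, sending two retrieved passages instead of the entire knowledge base is the difference between a few hundred input tokens per query and many thousands.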
2. Multi-Cloud Strategy and Hybrid Architecture
Relying on a single model or cloud provider is a strategic risk organizations can no longer afford.
A flexible cloud infrastructure enables organizations to run complex workloads on high-performance models, such as those from Anthropic, in secure environments, while routing simpler tasks to lighter, more cost-efficient models.
The goal is to avoid vendor lock-in, where an organization becomes dependent on a single provider. Flexibility not only ensures business continuity in case of outages, but also strengthens negotiating power with cloud providers when managing pricing.
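Routing of this kind can be as simple as a policy function in front of the model endpoints. The sketch below is a hypothetical heuristic; the model names, task types, and token threshold are illustrative assumptions, not a real API.

```python
def choose_model(task):
    """Minimal sketch of complexity-based model routing.
    Names and thresholds here are illustrative, not a real service."""
    heavy = {"code_generation", "legal_analysis", "multi_step_reasoning"}
    if task["type"] in heavy or task.get("input_tokens", 0) > 50_000:
        return "frontier-model"      # high-capability, higher cost per token
    return "lightweight-model"       # cheaper model for routine tasks

print(choose_model({"type": "faq_answer", "input_tokens": 300}))
print(choose_model({"type": "legal_analysis", "input_tokens": 9_000}))
```

Because the routing logic lives in the organization's own layer rather than inside any one provider's SDK, swapping a model or provider behind either label becomes a configuration change, not a rewrite.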
3. Automation and Financial Management for AI
The dynamic nature of AI requires a shift toward autonomous infrastructure management. Manual management of AI-driven cloud resources is no longer viable.
Organizations must implement optimization tools capable of allocating resources in real time, identifying anomalies in usage, and automatically stopping inefficient processes.
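The anomaly-detection piece of such tooling can start from basic statistics. The sketch below flags hours whose spend deviates sharply from the historical mean; the spend figures and threshold are invented for illustration, and a real system would page an operator or pause the offending jobs rather than just return indices.

```python
import statistics

def flag_anomalies(hourly_spend, threshold_sigma=2.0):
    """Minimal sketch: flag hours whose cloud spend deviates sharply
    from the mean. A single large spike inflates the deviation, so a
    modest sigma threshold is used here."""
    mean = statistics.mean(hourly_spend)
    stdev = statistics.pstdev(hourly_spend)
    return [i for i, s in enumerate(hourly_spend)
            if stdev > 0 and abs(s - mean) > threshold_sigma * stdev]

spend = [40, 42, 39, 41, 40, 43, 400, 41]  # one runaway hour
print(flag_anomalies(spend))  # → [6]
```

Production FinOps tooling layers forecasting and per-service baselines on top of this idea, but the core loop is the same: measure continuously, compare against expectation, and act automatically when usage escapes its bounds.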
According to IDC, organizations that adopt dedicated monitoring and control tools for AI infrastructure achieve 25% greater system stability along with significant reductions in operational costs.
A Necessary Mindset Shift
Adopting advanced AI models without losing control requires a fundamental shift in perspective. AI is not just another application; it is a new and highly demanding workload on cloud infrastructure.
The key question for decision-makers is no longer which model writes better code, but whether their cloud infrastructure is ready for the day AI becomes central to business operations rather than a marginal tool.
Control and efficiency will not come from limiting the use of advanced models. Instead, they will come from intentionally designing infrastructure around efficiency, governance, and precision from the outset.
In a business environment where cloud spending is a major component of operational costs, organizations that optimize their cloud infrastructure alongside AI adoption will transform AI from a budget burden into a strategic asset that drives competitive advantage.
Vera Shulman is the CEO of ProfiSea, a company developing innovative solutions for monitoring, managing, and optimizing public cloud infrastructure for organizations of all sizes.