Key Takeaways
- BofA has restarted coverage of CoreWeave with a Buy recommendation and $100 price objective
- Lead analyst Tal Liani projects AI compute supply constraints will persist until at least 2029
- CoreWeave achieves approximately 2.5-month deployment cycles for new Nvidia hardware versus 4-6 months for traditional hyperscalers
- Long-term take-or-pay agreements help mitigate risks from customers who may become future competitors
- The company is transitioning toward debt structures backed by contracts with investment-grade clients
Shares of CoreWeave jumped 1.7% Tuesday following Bank of America’s decision to reinitiate coverage with a Buy recommendation and a $100 price objective. Trading closed at $83.37, adding to an impressive 14% year-to-date gain through Monday’s session.
Led by analyst Tal Liani, the research note highlighted CoreWeave’s strategic positioning within the rapidly expanding AI infrastructure-as-a-service sector, which BofA values at $79 billion.
Liani emphasized that the company stands to benefit from persistent compute demand, its purpose-built software optimized for AI applications, and strategic alliances with industry leaders like Nvidia and OpenAI.
While BofA recognized what it termed “inherent risks” associated with the investment thesis, the firm concluded these concerns are overshadowed by the growth potential.
CoreWeave’s competitive advantage lies significantly in execution velocity. The company can bring new Nvidia processors online in roughly 2.5 months on average, while larger and more diversified hyperscale operators typically require four to six months, per BofA’s analysis.
This timing differential carries substantial weight in today’s market environment. AI research organizations are aggressively pursuing compute resources, and CoreWeave has demonstrated an ability to satisfy this appetite more rapidly than established cloud infrastructure giants.
Competitive Threats Exist But Remain on the Horizon
A significant concern surrounding CoreWeave involves major customers—Meta Platforms among them—developing proprietary data center infrastructure. This evolution positions these clients as potential future competitors for compute capacity.
The situation presents a complex challenge. These large-scale customers represent substantial portions of CoreWeave’s revenue stream, making their eventual departure a meaningful risk factor.
However, BofA characterized this threat as non-imminent. Contractual arrangements featuring multiyear commitments with take-or-pay provisions secure revenue streams while CoreWeave expands infrastructure and diversifies its customer portfolio.
Liani also emphasized CoreWeave’s AI-focused orchestration platform as a differentiated asset that’s difficult to duplicate. “Hyperscalers will close part of the gap,” the analyst stated, “but the speed and slope of that convergence remain uncertain.”
Financing Strategy Attracts Market Attention
CoreWeave’s capital structure approach has generated considerable market discussion. The firm utilizes debt instruments to finance additional compute infrastructure, characterizing this as “success-based” capital deployment linked directly to customer agreements.
To address associated risks, CoreWeave is pivoting toward debt arrangements specifically collateralized by revenue commitments from investment-grade clients and the physical hardware assets. This approach effectively transfers portions of credit exposure to the customer base.
According to BofA’s assessment, successful execution of rapid capacity expansion could enable CoreWeave to achieve “hyperscale-style expansion without hyperscale balance-sheet strength.”
The primary vulnerability remains potential setbacks in construction timelines or facility conversion projects, which could negatively impact share performance.
Liani additionally noted that emerging agentic AI applications may amplify infrastructure requirements, potentially extending supply constraints beyond current market expectations.
Bank of America maintains that the supply-demand imbalance for AI compute resources will continue through at least 2029.