$42.1 million poured into startup offering energy-efficient solutions for costly and unwieldy operational data and AI workloads

Apr 22, 2025 | Technology


Hyperscale data warehouse vendor Ocient announced today that it has raised $42.1 million as the second extension of its Series B funding to accelerate the development and delivery of energy-efficient solutions for costly and unwieldy operational data and AI workloads.

The funding infusion doesn’t just add to the Chicago startup’s already hefty war chest; it sharpens a mission to make hyperscale analytics radically cheaper and greener at the very moment enterprises fear ballooning data‑center power bills. 

The new round increases the company’s total funding to $159.4 million. The latest round was led by climate-savvy backers such as Blue Bear Capital and Allstate Strategic Ventures — a signal that investors now view data-platform efficiency as a climate issue as much as a performance one. 

Ocient CEO Chris Gladwin told VentureBeat that Ocient’s architecture already delivers “ten‑to‑one price‑performance gains” on multi‑petabyte workloads, and plans are underway to carry that advantage into new verticals from automotive telemetry to climate modeling. The startup has doubled its revenues for three consecutive years and appointed Henry Marshall, formerly CFO at space-infrastructure firm Loft Orbital, to steer its financial operations, signaling that Ocient is entering a formal growth stage.

A funding round framed by climate economics

The $42.1 million top‑up follows the $49.4 million raise in March 2024 that lifted Ocient’s invested capital to $119 million and marked 109 percent year‑over‑year revenue growth. Alongside its new investors, the company retains support from Greycroft and OCA Ventures, with Buoyant Ventures backing the extension for its “differentiated approach to delivering energy‑efficient analytics.” Gladwin linked the round to a broader mission: “Enterprises are grappling with complex data ecosystems, energy availability, and the pressure to control costs while proving business value,” he said. 

Why hyperscale analytics hits a wall

Modern data warehouses thrive when datasets are measured in terabytes. Beyond that, network and storage I/O become the choke point, not raw CPU cycles. As Gladwin told VentureBeat, “When datasets get bigger, the flow of data from storage to processing units becomes the true limiting factor.” 
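Gladwin's point can be made concrete with some back-of-envelope arithmetic. The Python sketch below uses purely illustrative numbers (an assumed cluster size, storage bandwidths, and CPU filter rate, none of them Ocient's figures) to show how a full scan of a petabyte-scale table ends up bounded by how fast bytes move off storage, not by how fast processors can crunch them:

```python
# Back-of-envelope: why I/O, not CPU, bounds a full scan at hyperscale.
# All numbers below are illustrative assumptions, not figures from Ocient.

DATASET_BYTES = 1e15            # 1 PB table to scan
NODES = 100                     # assumed cluster size

# Assumed per-node throughputs, in GB/s
REMOTE_STORAGE_GBPS = 3         # pulling from disaggregated object storage
LOCAL_NVME_GBPS = 25            # reading NVMe attached next to compute
CPU_FILTER_GBPS = 50            # rate at which a node's cores can filter/aggregate

def scan_minutes(per_node_gbps: float) -> float:
    """Minutes to stream the whole dataset through the cluster at this rate."""
    cluster_bytes_per_sec = per_node_gbps * 1e9 * NODES
    return DATASET_BYTES / cluster_bytes_per_sec / 60

print(f"Bounded by remote storage I/O: {scan_minutes(REMOTE_STORAGE_GBPS):.0f} min")
print(f"Bounded by local NVMe I/O:     {scan_minutes(LOCAL_NVME_GBPS):.1f} min")
print(f"Bounded by CPU alone:          {scan_minutes(CPU_FILTER_GBPS):.1f} min")
```

Under these assumed numbers, the cluster could chew through the data in a few minutes of pure compute time but spends closer to an hour waiting on remote storage, which is exactly the flow-of-data ceiling Gladwin describes.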

In telco, ad‑tech and government deployments, query engines must scan trillions of records while simultaneously ingesting streams that keep pouring in. Traditional cloud architectures that separate compute and object storage …
