The deep learning revolution has a curious blind spot: the spreadsheet. While Large Language Models (LLMs) have mastered the nuances of human prose and image generators have conquered the digital canvas, the structured, relational data that underpins the global economy, the rows and columns of ERP systems, CRMs, and financial ledgers, has so far been treated as just another file format, like text or PDFs.

That has left enterprises forecasting business outcomes with the typical bespoke, labor-intensive data science process: manual feature engineering and classic machine learning algorithms that predate modern deep learning.

Now Fundamental, a San Francisco-based AI firm co-founded by DeepMind alumni, is launching today with $255 million in total funding to bridge this gap. Emerging from stealth, the company is debuting NEXUS, a Large Tabular Model (LTM) designed to treat business data not as a simple sequence of words, but as a complex web of non-linear relationships.

The tech: moving beyond sequential logic

Most current AI models are built on sequential logic: predicting the next word in a sentence or the next pixel in a frame. Enterprise data, however, is inherently non-sequential. A customer's churn risk isn't just a timeline; it's a multi-dimensional intersection of transaction frequency, support ticket sentiment, and regional economic shifts. Existing LLMs struggle with this because they are poorly suited to the size and dimensionality of enterprise-scale tables.

"The most valuable data in the world lives in tables, and until now there has been no good foundation model built specifically to understand it," said Jeremy Fraenkel, CEO and co-founder of Fundamental. In a recent interview with VentureBeat, Fraenkel emphasized t …