When Derek Waldron and his technical team at JPMorgan Chase first launched an LLM suite with personal assistants two-and-a-half years ago, they weren’t sure what to expect. That wasn’t long after the game-changing emergence of ChatGPT, and in the enterprise, skepticism was still high. Surprisingly, employees opted into the internal platform organically — and quickly. Within months, usage jumped from zero to 250,000 employees. Now, more than 60% of employees across sales, finance, technology, operations, and other departments use the continually evolving, continually connected suite.

“We were surprised by just how viral it was,” Waldron, JPMorgan’s chief analytics officer, explains in a new VB Beyond the Pilot podcast. Employees weren’t just designing prompts; they were building and customizing assistants with specific personas, instructions, and roles, and sharing their learnings on internal platforms.

The financial giant has pulled off what most enterprises still struggle to achieve: large-scale, voluntary employee adoption of AI. It wasn’t the result of mandates; rather, early adopters shared tangible use cases, and workers began feeding off each other’s enthusiasm. This bottom-up usage has ultimately produced an innovation flywheel.

“It’s this deep-rooted, innovative population,” Waldron says. “If we can continue to equip them with really easy-to-use, powerful capabilities, they can turbocharge the next evolution of this journey.”

Ubiquitous connectivity plugged into highly sophisticated systems of record

JPMorgan has taken a rare, forward-looking approach to its technical architecture. The company treats AI as core infrastructure rather than a novelty, operating from the early contrarian stance that the models themselves would become a commodity. Instead, it identified the connectivity around the system as the real challenge and defensible moat.

The financial giant investe …