AI agents run on file systems, using standard tools to navigate directories and read file paths. The challenge, however, is that much enterprise data lives in object storage systems, notably Amazon S3. Object stores serve data through API calls, not file paths. Bridging that gap has traditionally required a separate file system layer alongside S3, duplicated data, and sync pipelines to keep both aligned.

The rise of agentic AI makes that challenge even harder, and it was affecting Amazon's own ability to get things done. Engineering teams at AWS using tools like Kiro and Claude Code kept running into the same problem: agents defaulted to local file tools, but the data was in S3. Downloading it locally worked until the agent's context window compacted and the session state was lost.

Amazon's answer is S3 Files, which mounts any S3 bucket directly into an agent's local environment with a single command. The data stays in S3, with no migration required. Under the hood, AWS connects its Elastic File System (EFS) technology to S3 to deliver full file system semantics, not a workaround. S3 Files is available now in most AWS Regions.

"By making data in S3 immediately available, as if it's part of the local file system, we found that we had a really big acceleration with the ability of things like Kiro and Clau …
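The gap the article describes can be made concrete: reading from a file system is a path-based operation the OS resolves directly, while reading from S3 is an HTTP API call (GetObject) against a URL. The sketch below builds, but does not send, such a request; the bucket and key names are hypothetical, and real clients would also sign the request with SigV4, which is omitted here.

```python
from urllib.request import Request

# File system access: the OS resolves a path directly, e.g.
#   with open("/data/report.csv") as f: ...

def s3_get_object_request(bucket: str, key: str) -> Request:
    """Build (but do not send) the HTTP GET underlying S3's GetObject API.

    Uses the virtual-hosted-style URL form. Authentication (SigV4 signing)
    is omitted for brevity; this only illustrates the API-call shape.
    """
    url = f"https://{bucket}.s3.amazonaws.com/{key}"
    return Request(url, method="GET")

req = s3_get_object_request("example-bucket", "reports/q3.csv")
print(req.full_url)  # https://example-bucket.s3.amazonaws.com/reports/q3.csv
```

Tools that mount object storage as a file system sit exactly on this seam: they accept path-based reads from the application and translate them into API calls like the one above.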