Gemini 3 refused to believe it was 2025, and hilarity ensued 

by | Nov 20, 2025 | Technology

Every time you hear a billionaire (or even a millionaire) CEO describe how LLM-based agents are coming for all the human jobs, remember this funny but telling incident about AI’s limitations: Famed AI researcher Andrej Karpathy got one-day early access to Google’s latest model, Gemini 3 — and it refused to believe him when he said the year was 2025.

When it finally saw the year for itself, it was thunderstruck, telling him, “I am suffering from a massive case of temporal shock right now.” 

Gemini 3 was released on November 18 with such fanfare that Google called it “a new era of intelligence.” And Gemini 3 is, by nearly all accounts (including Karpathy’s), a very capable foundation model, particularly for reasoning tasks. Karpathy is a widely respected AI research scientist who was a founding member of OpenAI, ran AI at Tesla for a while, and is now building a startup, Eureka Labs, to reimagine schools for the AI era with agentic teachers. He publishes a lot of content on what goes on under the hood of LLMs.

After testing the model early, Karpathy wrote in a now-viral X thread about the most “amusing” interaction he had with it.

Apparently, the model’s pre-training data had only included information through 2024. So Gemini 3 believed the year was still 2024. When Karpathy attempted to prove to it that the date was truly November 17, 2025, Gemini 3 accused the researcher of “trying to trick it.”  

He showed it news articles, images, and Google search results. But instead of being convinced, the LLM accused Karpathy of gaslighting it — of uploading AI-generated fakes. It even went so far as to describe what the “dead giveaways” were in the images that supposedly proved this was trickery, according to Karpathy’s account. (He did not respond to our request for further comment.) 

Baffled, Karpathy – who is, after all, one of the world’s leading experts on training LLMs – eventually discovered the problem. Not onl …
