<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Large Context Window on The Coders Blog</title><link>https://thecodersblog.com/tag/large-context-window/</link><description>Recent content in Large Context Window on The Coders Blog</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Fri, 08 May 2026 17:37:29 +0000</lastBuildDate><atom:link href="https://thecodersblog.com/tag/large-context-window/index.xml" rel="self" type="application/rss+xml"/><item><title>Anthropic User's Long Context AI Experience</title><link>https://thecodersblog.com/anthropic-user-experience-with-long-context-models-2026/</link><pubDate>Fri, 08 May 2026 17:37:29 +0000</pubDate><guid>https://thecodersblog.com/anthropic-user-experience-with-long-context-models-2026/</guid><description>&lt;h2 id="when-ai-remembers-everything-my-deep-dive-into-anthropics-1-million-token-canvas"&gt;When AI Remembers Everything: My Deep Dive into Anthropic&amp;rsquo;s 1 Million Token Canvas&lt;/h2&gt;
&lt;p&gt;For years, the holy grail of AI interaction wasn&amp;rsquo;t just about generating a perfect sentence or a coherent paragraph. It was about the AI remembering. Truly remembering. Not just the last few lines of a conversation, but entire documents, complex codebases, and lengthy research papers. Anthropic, with its Claude models, has been aggressively pushing the boundaries of what &amp;ldquo;remembering&amp;rdquo; means for an AI. Initially, a 200K-token context window felt like a revelation. Now, with models like Claude Opus and Sonnet boasting an astounding 1-million-token capacity, the potential for seamless, deeply informed AI assistance feels within reach. This isn&amp;rsquo;t just about feeding in more data; it&amp;rsquo;s about unlocking entirely new workflows and interaction paradigms. But as with any bleeding-edge technology, the reality carries more nuance than the marketing hype. My journey with these expansive context windows has been a testament to their power, but also a stark reminder of the delicate art of managing and understanding what the AI &lt;em&gt;actually&lt;/em&gt; retains.&lt;/p&gt;</description></item></channel></rss>