🚀 AI Books MCP Server - Extend LLM context 15-60× via gravitational memory | Official MCP server for Claude Code & Anthropic
Updated
Feb 11, 2026 - TypeScript
Source code for a bachelor's thesis.
Production-ready test-time compute optimization framework for LLM inference. Implements Best-of-N, Sequential Revision, and Beam Search strategies. Validated with models up to 7B parameters.
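One of the strategies named above, Best-of-N, can be sketched in a few lines: sample N candidate completions, score each, and keep the best. The sketch below is illustrative only and does not reflect the framework's actual API; `generate` and `score` are hypothetical stand-ins for an LLM call and a reward model or verifier.

```typescript
// Hypothetical Best-of-N sketch. `generate` and `score` are placeholder
// functions, not the framework's real interface.

type Candidate = { text: string; score: number };

function generate(prompt: string, seed: number): string {
  // Stand-in for an LLM sampling call; deterministic here for illustration.
  return `${prompt} [completion ${seed}]`;
}

function score(text: string): number {
  // Stand-in for a reward model or verifier; uses length as a dummy metric.
  return text.length;
}

function bestOfN(prompt: string, n: number): Candidate {
  let best: Candidate | null = null;
  for (let i = 0; i < n; i++) {
    const text = generate(prompt, i);
    const s = score(text);
    // Keep the highest-scoring candidate seen so far.
    if (best === null || s > best.score) {
      best = { text, score: s };
    }
  }
  if (best === null) throw new Error("n must be >= 1");
  return best;
}
```

Sequential Revision and Beam Search differ in how they spend the same compute budget: revision iteratively improves one candidate, while beam search keeps the top-k partial candidates at each step.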