Llm9p: LLM as a Plan 9 file system

AI & ML · 2 min read · via Hacker News

Takeaways

  • LLM9P allows users to interact with Large Language Models (LLMs) through standard filesystem operations.
  • The project supports multiple backends, including Anthropic API and Claude Code CLI, with plans for local LLM support.
  • By leveraging the 9P filesystem protocol, LLM9P simplifies LLM access, making it universally scriptable and composable.

LLM9P: Bridging Large Language Models with the 9P Filesystem Protocol

Introduction to LLM9P

The LLM9P project enables interaction with Large Language Models (LLMs) through the 9P filesystem protocol. Users, scripts, and AI agents write prompts to files and read responses back as if they were working with local files: simple commands like cat and echo stand in for SDKs and HTTP APIs. This could change how developers integrate LLMs into their workflows.
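A minimal sketch of that idea follows; the mount point and file names are illustrative, not taken from the project's documentation:

```shell
# Assuming the LLM9P filesystem is mounted at /mnt/llm and exposes
# a prompt file to write and a response file to read (names assumed).
echo "Summarize the 9P protocol in one sentence." > /mnt/llm/prompt

# Reading the response file returns the model's reply.
cat /mnt/llm/response
```

Any language with file I/O could do the same, which is the point: no client library is required.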

Supported Backends and Architecture

LLM9P currently supports two backends, the Anthropic API and the Claude Code CLI, with local LLM support planned. The architecture is pluggable, making it straightforward to add new providers: Anthropic API users export their API key and launch the LLM9P server, while those with a Claude Max subscription can bypass API tokens entirely by routing through the Claude Code CLI.
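A launch sequence for each backend might look like the following sketch; the backend-selection flag is an assumption, not a documented option:

```shell
# Backend 1: Anthropic API. The server reads the key from the
# environment, as the article describes.
export ANTHROPIC_API_KEY="sk-ant-..."   # your real key here
llm9p

# Backend 2: Claude Code CLI. With a Claude Max subscription no API
# token is needed; the flag name below is illustrative only.
llm9p -backend claude-code
```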

The Power of 9P

But why use the 9P protocol? Originally developed for Plan 9 from Bell Labs, 9P is a lightweight network filesystem protocol that allows remote resources to be accessed as local files. This universality means that any programming language or tool capable of reading and writing files can interact with LLMs through LLM9P. Furthermore, the protocol’s simplicity enables users to chain LLM calls using standard Unix pipes and shell scripts, enhancing the composability of workflows. The lack of dependencies makes it an attractive option for developers who prefer minimalism.
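Because anything that reads and writes files can talk to the server, LLM calls compose with ordinary Unix plumbing. A hedged sketch, with illustrative paths:

```shell
# Feed a document to the model, then post-process the reply with
# standard tools -- no SDK, no HTTP client, just files and pipes.
cat report.txt > /mnt/llm/prompt
cat /mnt/llm/response | tee summary.txt | wc -w

# Chaining calls is just another redirect: a response can become
# the next prompt.
cat /mnt/llm/response > /mnt/llm/prompt
```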

Installation and Usage

Getting started with LLM9P is straightforward. Developers can either install it via Go or build it from source. Once set up, users can mount the filesystem using various methods, including Plan 9 from User Space or directly through Linux’s built-in 9P support. This flexibility allows for easy integration into existing environments. For example, a user can quickly switch models or adjust parameters like temperature and system prompts by writing to specific files, streamlining the interaction process.
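The mount step itself uses standard tooling. Linux's in-kernel 9P client can mount any 9P server over TCP, and Plan 9 from User Space provides 9pfuse; the address, port, and control-file names below are assumptions for illustration:

```shell
# Mount via Linux's built-in 9P support (server address/port assumed).
sudo mount -t 9p -o trans=tcp,port=5640 127.0.0.1 /mnt/llm

# Or via Plan 9 from User Space:
# 9pfuse 'tcp!127.0.0.1!5640' /mnt/llm

# Switch models or adjust parameters by writing to per-session
# files, per the article's description (file names illustrative).
echo "claude-sonnet-4" > /mnt/llm/model
echo "0.2" > /mnt/llm/temperature
echo "You are a terse assistant." > /mnt/llm/system
```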

Conclusion

LLM9P represents a significant step forward in making LLMs more accessible and easier to work with. By leveraging the 9P filesystem protocol, it not only simplifies LLM access but also enhances the scripting capabilities for developers. As more backends are added and local LLM support becomes available, LLM9P could become a go-to tool for engineers and AI practitioners looking to harness the power of large language models in their applications.