Cross-Model Support Growing
A growing Portability signal - the tooling layer is starting to decouple from the model vendor (a little).
Example #1: Claude Code + open-weight models via Ollama. Ollama now supports Anthropic’s Messages API, which means tools built for Claude’s interface (including Claude Code) can be pointed at local/open models instead of being locked to a single provider.
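A minimal sketch of what that compatibility means in practice: the request shape stays the same and only the base URL and model name change. (The exact endpoint path and model names below are illustrative assumptions, not copied from Ollama's docs.)

```python
# Build an Anthropic-Messages-style request. Because Ollama speaks the
# same API shape, the identical client code can target either a hosted
# provider or a local server -- only base_url and model differ.
# NOTE: the "/v1/messages" path and model names are assumptions for illustration.

def build_messages_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    url = f"{base_url}/v1/messages"
    payload = {
        "model": model,
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

# Hosted provider:
url_hosted, body = build_messages_request("https://api.anthropic.com", "claude-sonnet", "hi")
# Same payload shape, local open-weight model via Ollama:
url_local, _ = build_messages_request("http://localhost:11434", "some-local-model", "hi")

print(url_local)  # http://localhost:11434/v1/messages
```

The point isn't the helper itself - it's that nothing model-specific leaks into the payload, so "switching providers" collapses to swapping two strings.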
Example #2: Open Responses, an open-source spec/ecosystem built around a shared schema inspired by the OpenAI Responses API - explicitly aiming for multi-provider, interoperable LLM interfaces (call models, stream outputs, compose agentic workflows) without a hard dependency on any one platform.
> “LLM APIs have largely converged on similar building blocks—messages, tool calls, streaming, and multimodal inputs—but each provider encodes them differently.”
- https://www.openresponses.org/
- https://ollama.com/blog/claude
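To make the "shared schema" idea concrete, here's a hedged sketch of a Responses-style request built once and aimed at two different providers. Field names follow the public OpenAI Responses API shape; the base URLs and model names are placeholders I've invented for illustration.

```python
# One request schema, many providers: the migration cost is a base URL
# and a model string, not a rewrite of the calling code.
# NOTE: URLs and model names below are hypothetical examples.

def build_responses_request(base_url: str, model: str, text: str) -> tuple[str, dict]:
    url = f"{base_url}/v1/responses"
    body = {"model": model, "input": text, "stream": False}
    return url, body

url_a, body_a = build_responses_request("https://api.openai.com", "gpt-4.1", "Summarize this")
url_b, body_b = build_responses_request("http://localhost:8080", "local-model", "Summarize this")

# Identical schema either way -- that's the interoperability claim.
assert body_a.keys() == body_b.keys()
```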
This is upward pressure on the Portability dial - the more agent tooling speaks common APIs, the easier it becomes to switch models or providers (or run locally) without rewriting your entire workflow. Not mainstream yet (very developer-focused), but a real signal that lock-in isn't the only possible future.
> The interface layer is thickening. If you disagree with my interpretation, or you’ve spotted a better signal, reply and tell me.


