Metadata-Version: 2.4
Name: acatome-lambic
Version: 0.5.2
Summary: MCP-aware LLM shell with provider switching
License-Expression: GPL-3.0-or-later
License-File: LICENSE
Requires-Python: >=3.11
Requires-Dist: httpx>=0.27
Requires-Dist: mcp>=1.0
Requires-Dist: prompt-toolkit>=3.0
Requires-Dist: rich>=13.0
Provides-Extra: dev
Requires-Dist: black>=24.0; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.24; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Provides-Extra: litellm
Requires-Dist: litellm!=1.82.7,!=1.82.8,>=1.0; extra == 'litellm'
Description-Content-Type: text/markdown

# acatome-lambic

MCP-aware LLM shell with provider switching.

Connects to MCP servers over stdio, talks to LLMs (Ollama natively; OpenAI, Anthropic,
and others via the optional `litellm` extra), and provides a terminal chat interface
with tool calling.

## Usage

```python
from acatome_lambic.tui.app import Shell
from acatome_lambic.core.config import LlmConfig, McpServer, ShellConfig

config = ShellConfig(
    llm=LlmConfig(provider="ollama", model="qwen3.5:9b"),
    servers=[
        McpServer(name="acatome", cmd=["uv", "run", "acatome-mcp"]),
        McpServer(name="precis", cmd=["uv", "run", "precis"]),
    ],
    system_prompt="You are a research assistant.",
)
Shell(config).run()
```

## Commands

- `/model <provider/model>` — switch LLM
- `/think on|off` — toggle reasoning mode (default: on)
- `/tools` — list tools with on/off status
- `/tools off <pattern>` — disable tools matching pattern
- `/tools on <pattern>` — enable tools matching pattern
- `/expand <call_id>` — show full (untruncated) tool result
- `/status` — show session info
- `/clear` — clear message history
- `/help` — show command help
- `/quit` — exit

Applications can register their own slash commands via `ShellConfig`: custom
commands (`custom_commands`) are handled locally, while LLM-routed message
commands (`message_commands`) are turned into prompts and sent to the model.
All registered commands appear in tab autocomplete alongside the built-ins.
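
Registration might look like the sketch below. The source only names the
`custom_commands` and `message_commands` fields of `ShellConfig`, so the handler
signature and the mapping shapes shown here are assumptions, not the documented API.

```python
from acatome_lambic.core.config import LlmConfig, ShellConfig

# Hypothetical handler -- the real callback signature may differ.
def show_notes(args: str) -> str:
    """Handle `/notes <query>` locally, without involving the LLM."""
    return f"notes matching: {args!r}"

config = ShellConfig(
    llm=LlmConfig(provider="ollama", model="qwen3.5:9b"),
    # Assumed shape: command name -> callable, run locally on `/notes ...`.
    custom_commands={"notes": show_notes},
    # Assumed shape: command name -> prompt template; `/summarize <text>`
    # would be expanded into a prompt and routed to the LLM.
    message_commands={"summarize": "Summarize the following: {args}"},
)
```

If the shapes match, `/notes` and `/summarize` would then tab-complete like any
built-in command.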
