Metadata-Version: 2.4
Name: acatome-lambic
Version: 0.2.6
Summary: MCP-aware LLM shell with provider switching
License-Expression: GPL-3.0-or-later
License-File: LICENSE
Requires-Python: >=3.11
Requires-Dist: httpx>=0.27
Requires-Dist: mcp>=1.0
Requires-Dist: prompt-toolkit>=3.0
Requires-Dist: rich>=13.0
Provides-Extra: dev
Requires-Dist: black>=24.0; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.24; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Provides-Extra: litellm
Requires-Dist: litellm>=1.0; extra == 'litellm'
Description-Content-Type: text/markdown

# lambic

MCP-aware LLM shell with provider switching.

Connects to MCP servers via stdio, talks to LLMs (Ollama, plus OpenAI and Anthropic via the
optional litellm extra), and provides a terminal chat interface with tool calling.

## Usage

```python
from lambic import Shell, LlmConfig, McpServer

shell = Shell(
    model=LlmConfig(provider="ollama", model="qwen3.5:9b"),
    servers=[
        McpServer("acatome", cmd=["uv", "run", "acatome-mcp"]),
        McpServer("precis", cmd=["uv", "run", "precis"]),
    ],
    system_prompt="You are a research assistant.",
)
shell.run()
```

## CLI

```bash
lambic --config path/to/config.toml
```
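The config file schema isn't documented here; a plausible `config.toml` mirroring the Python example above might look like the following (all key and table names are assumptions, not confirmed API):

```toml
# Hypothetical config — key names are guesses mirroring the Python API above.
system_prompt = "You are a research assistant."

[model]
provider = "ollama"
model = "qwen3.5:9b"

[[servers]]
name = "acatome"
cmd = ["uv", "run", "acatome-mcp"]

[[servers]]
name = "precis"
cmd = ["uv", "run", "precis"]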

## Commands

- `/model <provider/model>` — switch LLM
- `/think on|off` — toggle reasoning mode (default: on)
- `/tools` — list tools with on/off status
- `/tools off <pattern>` — disable tools matching pattern
- `/tools on <pattern>` — enable tools matching pattern
- `/expand <call_id>` — show full (untruncated) tool result
- `/status` — show session info
- `/clear` — clear message history
- `/quit` — exit
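
The pattern syntax accepted by `/tools on|off <pattern>` isn't specified above; glob-style matching (as implemented by Python's stdlib `fnmatch`) is one plausible interpretation. A hypothetical sketch of such filtering:

```python
from fnmatch import fnmatch

def match_tools(tool_names: list[str], pattern: str) -> list[str]:
    """Return the tool names a glob pattern selects.

    Hypothetical helper — lambic's actual matching rules may differ
    (e.g. it could use regex or plain substring matching instead).
    """
    return [name for name in tool_names if fnmatch(name, pattern)]

tools = ["acatome.search", "acatome.fetch", "precis.summarize"]
print(match_tools(tools, "acatome.*"))    # both acatome tools
print(match_tools(tools, "*.summarize"))  # only precis.summarize
```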
