Crate overview

tento-ai

tento-ai collects model adapters, tool-calling helpers, and experimental memory pipeline code used by local AI features.

Public surface

  • gemini::GeminiModel wraps Google Gemini model access.
  • claude::ClaudeModel and the openai module hold alternate provider adapters.
  • generate_content::{ModelExecutor, LlmFunction} defines the typed structured-output call path.
  • memory::{ChatMemoryPipeline, TurnLog, KnowledgeStore} owns session memory assembly and fact extraction.
  • tools::{ToolRegistry, ToolContext} defines client action tools that model responses can request.
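To make the typed structured-output path concrete, here is a minimal sketch in the spirit of generate_content::{ModelExecutor, LlmFunction}. Everything beyond those two names is an assumption for illustration: the trait methods, the associated `Output` type, and the closure-based backend are not the crate's real signatures.

```rust
// Hypothetical sketch of a typed structured-output call path; the real
// tento-ai ModelExecutor/LlmFunction signatures may differ.

/// An LlmFunction pairs prompt construction with a typed output parser.
trait LlmFunction {
    type Output;
    fn prompt(&self, input: &str) -> String;
    fn parse(&self, raw: &str) -> Result<Self::Output, String>;
}

/// Toy function: classify text as positive or negative.
struct SentimentFn;

impl LlmFunction for SentimentFn {
    type Output = bool; // true = positive
    fn prompt(&self, input: &str) -> String {
        format!("Answer 'positive' or 'negative': {input}")
    }
    fn parse(&self, raw: &str) -> Result<bool, String> {
        match raw.trim() {
            "positive" => Ok(true),
            "negative" => Ok(false),
            other => Err(format!("unexpected model output: {other}")),
        }
    }
}

/// Executor stand-in: runs a function against any backend closure,
/// playing the role a provider adapter would in the real crate.
fn execute<F: LlmFunction>(
    f: &F,
    input: &str,
    model: impl Fn(&str) -> String,
) -> Result<F::Output, String> {
    f.parse(&model(&f.prompt(input)))
}

fn main() {
    // A canned "model" backend for demonstration.
    let fake_model = |_prompt: &str| "positive".to_string();
    let result = execute(&SentimentFn, "great release!", fake_model);
    assert_eq!(result, Ok(true));
}
```

The point of the pattern is that callers get a `Result<F::Output, _>` out of the model call rather than raw text, so parsing failures surface at one well-typed boundary.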

Boundaries

The crate may depend on provider SDKs such as tento-google, but callers should work through model and function abstractions rather than constructing raw provider requests in app code. Provider API keys are declared through tento-env and should stay out of command lines and persisted docs.
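A minimal illustration of keeping keys in the environment rather than on the command line. The lookup below uses plain `std::env`; the actual variable names and declaration mechanism belong to tento-env, so treat the names here as placeholders.

```rust
use std::env;

// Fetch a provider API key from the process environment instead of
// accepting it as a CLI argument, so the secret stays out of shell
// history and persisted docs. Variable names are hypothetical; in
// tento-ai they are declared through tento-env.
fn provider_key(var: &str) -> Result<String, String> {
    env::var(var)
        .map_err(|_| format!("{var} is not set; export it in your shell, do not pass it as an argument"))
}

fn main() {
    // PATH is used only to demonstrate the success path without a real secret.
    assert!(provider_key("PATH").is_ok());
    assert!(provider_key("TENTO_DEFINITELY_UNSET_KEY").is_err());
}
```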

Memory storage is intentionally trait-based. Production hosts should implement TurnLog and KnowledgeStore against their own durable storage; the bundled in-memory implementations are intended only for tests and demos.
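To show what that trait boundary might look like, here is a sketch with hypothetical shapes for memory::{TurnLog, KnowledgeStore}; the method names, signatures, and error handling are assumptions, not the crate's actual traits. The in-memory struct mirrors the test/demo role the paragraph above describes.

```rust
use std::collections::HashMap;

// Hypothetical trait shapes for TurnLog and KnowledgeStore; the real
// tento-ai traits may carry different methods and error types.
trait TurnLog {
    fn append_turn(&mut self, session: &str, turn: &str);
    fn turns(&self, session: &str) -> Vec<String>;
}

trait KnowledgeStore {
    fn put_fact(&mut self, key: &str, fact: &str);
    fn get_fact(&self, key: &str) -> Option<String>;
}

// In-memory implementation: fine for tests and demos. A production host
// would implement the same traits over durable storage (a database or
// KV store) so session memory survives process restarts.
#[derive(Default)]
struct InMemoryStore {
    turns: HashMap<String, Vec<String>>,
    facts: HashMap<String, String>,
}

impl TurnLog for InMemoryStore {
    fn append_turn(&mut self, session: &str, turn: &str) {
        self.turns
            .entry(session.to_string())
            .or_default()
            .push(turn.to_string());
    }
    fn turns(&self, session: &str) -> Vec<String> {
        self.turns.get(session).cloned().unwrap_or_default()
    }
}

impl KnowledgeStore for InMemoryStore {
    fn put_fact(&mut self, key: &str, fact: &str) {
        self.facts.insert(key.to_string(), fact.to_string());
    }
    fn get_fact(&self, key: &str) -> Option<String> {
        self.facts.get(key).cloned()
    }
}

fn main() {
    let mut store = InMemoryStore::default();
    store.append_turn("s1", "hello");
    store.put_fact("user.name", "Ada");
    assert_eq!(store.turns("s1"), vec!["hello".to_string()]);
    assert_eq!(store.get_fact("user.name"), Some("Ada".to_string()));
}
```

Because the pipeline only sees the traits, swapping the in-memory store for a durable one is a host-side decision that requires no changes to ChatMemoryPipeline itself.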

Scripts

The scripts module exposes model listing and test commands for local development. These commands are debugging surfaces, not the stable library API.