
# Analysis AI Integration

Version: 1.1.1
Last Updated: January 23, 2026

This file rehomes the AI integration content extracted from Technical/api/api.md, covering analysis-related AI augmentation (code review, docgen, warning explanations). Cloud and user-preference settings are out of scope; see the User-Guide settings documentation separately.

## AIProvider Protocol

```python
from typing import Optional, Protocol

class AIProvider(Protocol):
    @property
    def provider_id(self) -> str: ...
    @property
    def default_model(self) -> str: ...
    @property
    def is_local(self) -> bool: ...
    def chat(self, request: ChatRequest) -> ChatResponse: ...
    def validate_config(self) -> None: ...
    def estimate_cost(self, request: ChatRequest) -> Optional[float]: ...
```
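A minimal provider conforming to this protocol might look like the sketch below. The `EchoProvider` class and its behavior are illustrative assumptions, not part of the codebase; the model dataclasses are repeated in trimmed form so the snippet runs standalone.

```python
from dataclasses import dataclass
from typing import List, Literal, Optional

# Trimmed repeats of the models defined in this document, so the sketch
# is self-contained.
@dataclass
class Message:
    role: Literal["system", "user", "assistant"]
    content: str

@dataclass
class ChatRequest:
    messages: List[Message]
    model: Optional[str] = None

@dataclass
class ChatResponse:
    content: str
    model: str
    finish_reason: Literal["stop", "length", "content_filter", "error"]
    provider_id: str

class EchoProvider:
    """Hypothetical local provider that echoes the last user message."""

    @property
    def provider_id(self) -> str:
        return "echo"

    @property
    def default_model(self) -> str:
        return "echo-1"

    @property
    def is_local(self) -> bool:
        return True  # runs in-process, so it is safe_mode-compatible

    def validate_config(self) -> None:
        pass  # nothing to validate for a local echo provider

    def estimate_cost(self, request: ChatRequest) -> Optional[float]:
        return 0.0  # local providers incur no per-call cost

    def chat(self, request: ChatRequest) -> ChatResponse:
        # Echo the most recent user message back to the caller.
        last_user = next(
            (m.content for m in reversed(request.messages) if m.role == "user"),
            "",
        )
        return ChatResponse(
            content=last_user,
            model=request.model or self.default_model,
            finish_reason="stop",
            provider_id=self.provider_id,
        )

provider = EchoProvider()
resp = provider.chat(ChatRequest(messages=[Message(role="user", content="hi")]))
print(resp.content)  # -> hi
```

Because `AIProvider` is a `typing.Protocol`, conformance is structural: no inheritance is needed, only matching members.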

## Request/Response Models

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict, List, Literal, Optional

@dataclass
class Message:
    role: Literal["system", "user", "assistant"]
    content: str
    name: Optional[str] = None

@dataclass
class ChatRequest:
    messages: List[Message]
    model: Optional[str] = None
    temperature: float = 0.7
    max_tokens: Optional[int] = None
    top_p: float = 1.0
    frequency_penalty: float = 0.0
    presence_penalty: float = 0.0
    stop: Optional[List[str]] = None
    feature: Optional[str] = None  # e.g., code_review, docgen
    safe_mode: bool = False        # restrict to local providers
    redacted: bool = False         # true if safety layer redacted content
    extra: Optional[Dict[str, Any]] = None

@dataclass
class TokenUsage:
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int

@dataclass
class ChatResponse:
    content: str
    model: str
    finish_reason: Literal["stop", "length", "content_filter", "error"]
    provider_id: str
    usage: Optional[TokenUsage] = None
    provider_response_raw: Optional[Dict[str, Any]] = None
    created_at: Optional[datetime] = None
```
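As a quick illustration of constructing a request for an analysis feature, consider the sketch below. The dataclasses are repeated in trimmed form so the snippet runs standalone, and all field values are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Message:  # trimmed repeat of the model above
    role: str
    content: str

@dataclass
class ChatRequest:  # trimmed repeat of the model above
    messages: List[Message] = field(default_factory=list)
    model: Optional[str] = None
    temperature: float = 0.7
    feature: Optional[str] = None
    safe_mode: bool = False

# A code-review request that opts into safe_mode, so routing is
# restricted to local providers.
request = ChatRequest(
    messages=[
        Message(role="system", content="You are a code reviewer."),
        Message(role="user", content="Review this diff for bugs."),
    ],
    feature="code_review",
    safe_mode=True,
)
```

Fields not supplied keep their dataclass defaults (e.g. `temperature` stays at 0.7), so callers only name the knobs they actually care about.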

## Provider Manager

```python
manager = ProviderManager(config_path="ai_providers.toml")
provider = manager.get_provider("openai")
response = provider.chat(
    ChatRequest(
        messages=[Message(role="user", content="Explain this code")],
        feature="code_review",
    )
)
print(response.content)
```
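The manager's internals are not part of this extract. One plausible sketch, assuming providers are held in a dict keyed by `provider_id` and validated before being handed out (per the invariants below), is:

```python
from typing import Dict, Protocol

class AIProvider(Protocol):  # trimmed repeat of the protocol, enough for the sketch
    @property
    def provider_id(self) -> str: ...
    def validate_config(self) -> None: ...

class ProviderManager:
    """Hypothetical minimal manager: registers providers by id and
    validates each one before returning it."""

    def __init__(self) -> None:
        self._providers: Dict[str, AIProvider] = {}

    def register(self, provider: AIProvider) -> None:
        self._providers[provider.provider_id] = provider

    def get_provider(self, provider_id: str) -> AIProvider:
        try:
            provider = self._providers[provider_id]
        except KeyError:
            raise KeyError(f"unknown provider: {provider_id}") from None
        provider.validate_config()  # surface config errors to the caller
        return provider

class _StubProvider:
    """Illustrative stand-in; a class attribute satisfies the protocol
    structurally for this demo."""
    provider_id = "stub"
    def validate_config(self) -> None:
        pass

manager = ProviderManager()
manager.register(_StubProvider())
provider = manager.get_provider("stub")
```

Note that `validate_config()` is called on every `get_provider`, so a misconfigured provider fails at lookup time rather than mid-request.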

## Configuration (ai_providers.toml)

```toml
[openai]
enabled = true
api_key_env = "OPENAI_API_KEY"
default_model = "gpt-4"
base_url = "https://api.openai.com/v1"

[anthropic]
enabled = true
api_key_env = "ANTHROPIC_API_KEY"
default_model = "claude-3-opus-20240229"

[ollama]
enabled = true
default_model = "llama2"
base_url = "http://localhost:11434"
```

## Invariants

  • Analysis features must honor safe_mode and prefer local providers when requested.
  • Provider validation is required before use; errors surface to the caller rather than being hidden.
  • No AI calls occur automatically; every call requires explicit user or configuration initiation.