BranchPy works fully without Cloud or AI. All core features (Analyze, Flowchart, Pilot, Omega, Stats, Media Validation, Doctor) run locally and offline. Cloud and AI are strictly opt-in.

## What Cloud features do
Cloud features connect BranchPy to the BranchPy backend for:
- License verification: your account and tier are validated server-side.
- Telemetry sync: anonymous usage data (run counts, error categories) is uploaded for product improvement. Controlled separately under Telemetry Settings.
- Pre-release feature access: some features require a verified account.
Cloud connectivity is required only for license checks and telemetry. Nothing in your source files or analysis reports is ever sent to the cloud unless you explicitly export and share it.
## What AI features do
When AI is enabled, BranchPy can use a language model to:
- Explain analysis warnings in plain language (VS Code Problems panel).
- Suggest fixes for detected issues (AI Autofix command).
- Summarise narrative structure in the AI panel.
All AI requests go through a provider you configure (OpenAI, Anthropic, Gemini, Ollama, Venice.ai). No AI provider is contacted until you explicitly trigger an AI action.
## Enabling AI: what happens

AI features are off by default. When you toggle AI on in VS Code settings (`branchpy.ai.enabled = true`), a consent dialog appears:
“AI features require optional packages. Install them now?”
Confirming runs:
```shell
pip install "branchpy[ai]"
```
This installs the three provider client libraries:
| Package | Size | Purpose |
|---|---|---|
| `openai >=1.0.0` | ~1 MB | OpenAI + compatible APIs (Venice.ai) |
| `anthropic >=0.70.0` | ~2 MB | Anthropic Claude |
| `google-generativeai >=0.3.0` | ~3 MB | Google Gemini |
These packages are not installed with the base `pip install branchpy`. If you cancel the dialog, AI stays disabled and no packages are installed.

From the CLI:

```shell
pip install "branchpy[ai]"
```
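Whether the optional extras are installed can be detected without importing them. The following is a minimal sketch of that check; the function name and package tuple are illustrative, not BranchPy's actual internals:

```python
from importlib.util import find_spec

# The three optional provider clients installed by `pip install "branchpy[ai]"`.
AI_PACKAGES = ("openai", "anthropic", "google.generativeai")

def ai_available(packages: tuple[str, ...] = AI_PACKAGES) -> bool:
    """Return True only if every optional AI client library is importable."""
    try:
        return all(find_spec(name) is not None for name in packages)
    except ModuleNotFoundError:
        # find_spec("a.b") raises if the parent package "a" is absent.
        return False
```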
## Configuring your AI provider

Create `.branchpy/ai_providers.toml` (or `~/.branchpy/ai_providers.toml` for per-user config):
```toml
[ai]
default_provider = "openai"    # openai | anthropic | gemini | ollama | venice
safe_mode_default = true       # true = block cloud providers; local only
redact_sensitive_data = true   # strip PII before sending
max_context_tokens = 16000

[ai.providers.openai]
type = "openai"
api_key = "sk-..."
model = "gpt-4o"

[ai.providers.ollama]
type = "ollama"
base_url = "http://localhost:11434"
model = "llama3"
```
A full example with all supported providers is in `ai_providers.toml.example` in the BranchPy install directory.
## Privacy defaults
| Setting | Default | Meaning |
|---|---|---|
| `safe_mode_default` | `true` | Only local providers (Ollama, LM Studio) allowed by default |
| `redact_sensitive_data` | `true` | Project paths and variable names stripped before cloud requests |
| `log_all_requests` | `true` | Every AI request logged to `.branchpy/logs/` for audit |
With `safe_mode_default = true`, cloud providers (OpenAI, Anthropic, Gemini) are blocked even if configured. Set it to `false` to allow them.
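The safe-mode rule amounts to an allow-list gate over provider names. A minimal sketch, assuming the local provider set named in the table above (the function is illustrative, not BranchPy's API):

```python
# Providers that run entirely on the local machine.
LOCAL_PROVIDERS = {"ollama", "lmstudio"}

def provider_allowed(name: str, safe_mode: bool) -> bool:
    """Under safe mode, only local providers pass; otherwise any configured one does."""
    return (not safe_mode) or name in LOCAL_PROVIDERS
```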
## Disabling AI

Set `branchpy.ai.enabled = false` in VS Code settings, or simply leave `branchpy[ai]` uninstalled. If the optional packages are not present, all AI features silently deactivate: no errors, no impact on core analysis.
Learn more: AI Integration