Artificial Intelligence

Configuration

Choose an AI provider, configure the API key, and select a model.

Scene Cut's AI features require an external provider. The application does not include a built-in AI service — each user configures their own access in Preferences > AI.

Available Providers

  Provider     Recommended models                                   API key required
  Anthropic    Claude Sonnet 4.6, Claude Haiku 4.5                  Yes
  OpenAI       GPT-4o, GPT-4o mini                                  Yes
  Mistral      Mistral Large, Mistral Small                         Yes
  Groq         Llama 3.3 70B                                        Yes
  OpenRouter   Access to multiple models (Claude, GPT, Gemini...)   Yes
  Ollama       Llama 3.1, Mistral, Qwen 2.5                         No
  Custom       Model and URL of your choice                         Variable

Configuring a Provider

  1. Open Preferences > AI
  2. Choose the provider from the dropdown list
  3. Enter the API key (available on the provider's website)
  4. Select the desired model
  5. Click Test Connection to verify everything works
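If you prefer to check a key from Terminal before entering it, the sketch below sends a minimal request to Anthropic's Messages API. The key value is a placeholder and the model id is illustrative; other providers use their own endpoints and headers.

```shell
# Manual key check (sketch). Replace the placeholder with your real key;
# the model id below is illustrative, not a value Scene Cut requires.
ANTHROPIC_API_KEY="YOUR-KEY-HERE"
RESPONSE=$(curl -s --max-time 10 https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"claude-haiku-4-5","max_tokens":16,"messages":[{"role":"user","content":"ping"}]}' \
  || echo '{"error":"network unreachable"}')
# A valid key returns a message object; an invalid one returns an error object.
echo "$RESPONSE"
```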

The API key is stored securely in the macOS Keychain — it is never saved in plain text in preference files.

Scene Cut is designed to consume very few tokens per request: in practice, a few dollars cover many complete audits and dozens of suggestions, even on a feature film, so the cost of AI usage stays marginal compared to the time saved.

For regular professional use, Claude Sonnet offers the best balance of quality and relevance for subtitling. For occasional use or a limited budget, Claude Haiku and GPT-4o mini are more affordable alternatives.

Using a Local Model with Ollama

For those who prefer keeping data local — or who work without an internet connection — Scene Cut is compatible with Ollama, a local AI model server that runs directly on your machine.

  1. Install Ollama from ollama.com
  2. Download a model: ollama pull llama3.1 (or mistral, qwen2.5...)
  3. In Scene Cut, choose Ollama as the provider
  4. The default address (http://localhost:11434) works without modification
  5. No API key needed

Local models are generally less capable than cloud models for fine linguistic analysis, but they work entirely offline and do not transmit any data externally.
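To confirm the steps above worked, you can query the local server directly. Assuming the default address, Ollama's `/api/tags` endpoint lists the models you have pulled:

```shell
# Health check for a local Ollama server (default address from step 4).
OLLAMA_URL="http://localhost:11434"
if MODELS=$(curl -s --max-time 5 "$OLLAMA_URL/api/tags"); then
  echo "Ollama is reachable: $MODELS"
else
  echo "No Ollama server at $OLLAMA_URL — is it running?"
fi
```

If the check fails, launch the Ollama application (or run `ollama serve`) and try again.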

Using a Custom Endpoint

The Custom provider lets you connect to any service compatible with the OpenAI API format. You only need to provide the base URL and the model name. This is useful for companies that host their own models or use an API proxy.
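As a sketch of what "base URL plus model name" means in practice: an OpenAI-compatible service answers chat requests under the base URL at the standard completions path. The host and model name below are hypothetical placeholders, not defaults.

```shell
# Hypothetical custom endpoint — both values are placeholders you would
# replace with your company's own server and model name.
BASE_URL="https://llm.example.com/v1"
MODEL="my-hosted-model"
# An OpenAI-compatible service receives chat requests at:
ENDPOINT="$BASE_URL/chat/completions"
echo "$ENDPOINT"
```

Scene Cut only needs those two values; authentication, if required by your server, is handled through the API key field as with the other providers.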