# Artificial Intelligence
## Configuration
Choose an AI provider, configure the API key, and select a model.
Scene Cut's AI features require an external provider. The application does not include a built-in AI service — each user configures their own access in Preferences > AI.
## Available Providers
| Provider | Recommended models | API key required |
|---|---|---|
| Anthropic | Claude Sonnet 4.6, Claude Haiku 4.5 | Yes |
| OpenAI | GPT-4o, GPT-4o mini | Yes |
| Mistral | Mistral Large, Mistral Small | Yes |
| Groq | Llama 3.3 70B | Yes |
| OpenRouter | Access to multiple models (Claude, GPT, Gemini...) | Yes |
| Ollama | Llama 3.1, Mistral, Qwen 2.5 | No |
| Custom | Model and URL of your choice | Variable |
## Configuring a Provider
1. Open Preferences > AI
2. Choose the provider from the dropdown list
3. Enter the API key (available on the provider's website)
4. Select the desired model
5. Click Test Connection to verify everything works
The API key is stored securely in the macOS Keychain — it is never saved in plain text in preference files.
Scene Cut is designed to consume very few tokens per request. In practice, a few dollars are enough to perform many complete audits and dozens of suggestions, even on a feature film. The cost of AI usage remains marginal compared to the time saved.
For regular professional use, Claude Sonnet offers the best balance of quality and relevance for subtitling. For occasional use or a limited budget, Claude Haiku and GPT-4o mini are more affordable alternatives.
## Using a Local Model with Ollama
For those who prefer keeping data local — or who work without an internet connection — Scene Cut is compatible with Ollama, a local AI model server that runs directly on your machine.
1. Install Ollama from ollama.com
2. Download a model: `ollama pull llama3.1` (or `mistral`, `qwen2.5`...)
3. In Scene Cut, choose Ollama as the provider
4. The default address (`http://localhost:11434`) works without modification
5. No API key needed
Local models are generally less capable than cloud models for fine linguistic analysis, but they work entirely offline and do not transmit any data externally.
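As a quick sanity check that the local server is reachable, you can query Ollama's REST API directly (this assumes a default installation listening on port 11434):

```shell
# List the models currently available to the local Ollama server
curl http://localhost:11434/api/tags

# Send a one-off prompt to a pulled model (llama3.1 here)
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.1", "prompt": "Hello", "stream": false}'
```

If both commands return JSON, Scene Cut should be able to connect with the default Ollama settings.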
Using a Custom Endpoint
The Custom provider lets you connect to any service compatible with the OpenAI API format. You only need to provide the base URL and the model name. This is useful for companies that host their own models or use an API proxy.
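For reference, an OpenAI-compatible service is expected to accept requests of the following shape. The base URL, API key, and model name below are placeholders for illustration, not values shipped with Scene Cut:

```shell
curl https://your-server.example.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $YOUR_API_KEY" \
  -d '{
        "model": "your-model-name",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

If a service answers this request correctly, it should generally work with the Custom provider; enter the base URL (everything before `/v1/chat/completions`) and the model name in Preferences > AI.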