feat: Add OpenAI-compatible API support to AI Writer operation #250

Open
aurkinet wants to merge 2 commits into directus-labs:main from aurkinet:feature/openai-compatible-support

Conversation

@aurkinet

Adds support for any OpenAI-compatible API to the AI Writer operation, letting users target local models, cloud providers, and any other service that implements the OpenAI Chat Completions API format.

To test:

  1. Select "OpenAI Compatible (Custom)" as AI Provider
  2. Enter custom endpoint (e.g., https://router.huggingface.co/v1)
  3. Provide API key
  4. Choose model or enter custom model name (e.g., Qwen/Qwen3-Next-80B-A3B-Instruct:novita)
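For reference, the request the operation ends up sending follows the OpenAI Chat Completions shape. A minimal sketch of the request construction (function and field names here are illustrative, not the extension's actual code):

```typescript
// Sketch: build an OpenAI Chat Completions-style request for a custom endpoint.
// All names below are illustrative; the extension's real implementation differs.
interface ChatRequest {
  endpoint: string;
  headers: Record<string, string>;
  body: { model: string; messages: { role: "user" | "system"; content: string }[] };
}

function buildChatRequest(
  baseEndpoint: string, // e.g. https://router.huggingface.co/v1
  apiKey: string,
  model: string,
  prompt: string,
): ChatRequest {
  return {
    // The Chat Completions path is appended to the base endpoint.
    endpoint: `${baseEndpoint.replace(/\/+$/, "")}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      // OpenAI-compatible services expect Bearer auth.
      Authorization: `Bearer ${apiKey}`,
    },
    body: { model, messages: [{ role: "user", content: prompt }] },
  };
}
```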

@NemeZZiZZ

You can use the AI Model field with `meta.options: {.., allowOther: true}` instead of a special custom field, so users can enter any desired model that is not hardcoded.


Copilot AI left a comment


Pull request overview

Adds an “OpenAI Compatible (Custom)” provider option to the AI Writer operation so users can target OpenAI Chat Completions–compatible endpoints beyond OpenAI itself.

Changes:

  • Add new UI options for “OpenAI Compatible (Custom)”, including custom endpoint, API key, and custom model name input.
  • Extend provider selection + OpenAI provider implementation to support custom endpoints and custom model names.
  • Expand Directus sandbox request scopes and update README documentation for the new provider.

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated 6 comments.

A summary per file:

| File | Description |
| --- | --- |
| packages/ai-writer-operation/src/app.ts | Adds “OpenAI Compatible (Custom)” provider choice and related config fields in the operation UI. |
| packages/ai-writer-operation/src/api.ts | Extends the operation options interface with custom endpoint/key/model fields. |
| packages/ai-writer-operation/src/Provider/ProviderFactory.ts | Routes the openai-compatible provider to the OpenAI provider and fixes a typo in error text. |
| packages/ai-writer-operation/src/Provider/OpenAi.ts | Adds custom endpoint + API key handling and allows custom model name substitution. |
| packages/ai-writer-operation/package.json | Broadens sandbox request URL scopes to allow arbitrary OpenAI-compatible endpoints. |
| packages/ai-writer-operation/README.md | Updates docs and adds guidance for OpenAI-compatible providers. |


throw new InvalidPayloadError({ reason: 'OpenAI API Key is missing' });
}
// Determine if this is a custom OpenAI-compatible provider or standard OpenAI
const isCustomProvider = options.aiProvider === 'openai-compatible';

Copilot AI Mar 10, 2026


isCustomProvider is checked with a case-sensitive equality (options.aiProvider === 'openai-compatible'), but getProvider() routes here using a case-insensitive comparison. If aiProvider ever comes through with different casing (e.g., via API/DB), this will incorrectly treat it as standard OpenAI and require apiKeyOpenAi / use the OpenAI endpoint. Normalize casing here (e.g., compare options.aiProvider?.toLowerCase()), or pass a canonical provider value into the provider classes.

Suggested change
const isCustomProvider = options.aiProvider === 'openai-compatible';
const isCustomProvider = options.aiProvider?.toLowerCase() === 'openai-compatible';
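A standalone sketch of the normalized check (the helper name is hypothetical; the real code inlines this comparison):

```typescript
// Sketch: case-insensitive provider check, tolerant of missing values.
// Returns true only for the canonical 'openai-compatible' id in any casing.
function isCustomProvider(aiProvider?: string): boolean {
  return aiProvider?.toLowerCase() === "openai-compatible";
}
```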

Comment on lines +22 to +24
const endpoint = options.customEndpoint.endsWith('/chat/completions')
? options.customEndpoint
: `${options.customEndpoint.replace(/\/$/, '')}/chat/completions`;

Copilot AI Mar 10, 2026


The endpoint normalization can produce a duplicated path when the user provides a full /chat/completions/ URL with a trailing slash (e.g., .../chat/completions/ becomes .../chat/completions/chat/completions). Consider trimming trailing slashes before the endsWith('/chat/completions') check (or using new URL() path joining) to make this robust.

Suggested change
const endpoint = options.customEndpoint.endsWith('/chat/completions')
? options.customEndpoint
: `${options.customEndpoint.replace(/\/$/, '')}/chat/completions`;
const baseEndpoint = options.customEndpoint.replace(/\/+$/, '');
const endpoint = baseEndpoint.endsWith('/chat/completions')
? baseEndpoint
: `${baseEndpoint}/chat/completions`;
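Extracted as a helper, the suggested normalization behaves like this (helper name hypothetical):

```typescript
// Sketch: strip all trailing slashes first, then append the Chat Completions
// path only if it is not already present. This avoids the duplicated
// '/chat/completions/chat/completions' case for inputs with trailing slashes.
function normalizeEndpoint(customEndpoint: string): string {
  const base = customEndpoint.replace(/\/+$/, "");
  return base.endsWith("/chat/completions") ? base : `${base}/chat/completions`;
}
```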

Comment on lines +39 to +41
// Use custom model name if provided, otherwise use the selected model
const modelName = this.options.model === 'custom' && this.options.customModelName
? this.options.customModelName

Copilot AI Mar 10, 2026


When model is set to 'custom' but customModelName is missing/empty, the request will be sent with model: 'custom'. Add server-side validation to require a non-empty customModelName when model === 'custom' (don’t rely on the app UI’s conditional required field).

Suggested change
// Use custom model name if provided, otherwise use the selected model
const modelName = this.options.model === 'custom' && this.options.customModelName
? this.options.customModelName
// Validate that a custom model name is provided when using a custom model
if (this.options.model === 'custom' && !this.options.customModelName) {
throw new InvalidPayloadError({
reason: 'Custom Model Name is missing',
});
}
// Use custom model name if provided, otherwise use the selected model
const modelName = this.options.model === 'custom' && this.options.customModelName
? this.options.customModelName
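The validation plus fallback can be sketched as one helper (hypothetical name, and a plain `Error` stands in for Directus's `InvalidPayloadError`):

```typescript
// Sketch: require a non-empty custom model name when model === 'custom',
// so the literal string 'custom' is never sent to the API as a model id.
function resolveModelName(model: string, customModelName?: string): string {
  if (model === "custom") {
    if (!customModelName || customModelName.trim() === "") {
      // The operation would throw InvalidPayloadError here instead.
      throw new Error("Custom Model Name is missing");
    }
    return customModelName;
  }
  return model;
}
```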

Comment on lines 51 to 52
],
"urls": [

Copilot AI Mar 10, 2026


The new request scope patterns (https://**/v1/** and especially http://**/v1/**) allow this operation to make network calls to essentially any host, including over plaintext HTTP. If custom endpoints are required, consider restricting to HTTPS-only and/or limiting HTTP to localhost, and validate customEndpoint to reject localhost/private IP ranges by default to reduce SSRF risk.

Suggested change
"urls": [
"https://**/v1/**"
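One way to complement the tighter scope is endpoint validation before use. A sketch (hypothetical helper; the pattern list is illustrative, not an exhaustive SSRF defense, since it does not resolve DNS and a hostname pointing at a private IP would still pass):

```typescript
// Sketch: reject plaintext HTTP and obvious loopback/private hosts up front.
function validateCustomEndpoint(raw: string): URL {
  const url = new URL(raw); // throws on malformed input
  if (url.protocol !== "https:") {
    throw new Error("Custom endpoint must use HTTPS");
  }
  const privateHosts = [
    /^localhost$/i,
    /^127\./,           // IPv4 loopback
    /^10\./,            // RFC 1918
    /^192\.168\./,      // RFC 1918
    /^172\.(1[6-9]|2\d|3[01])\./, // RFC 1918 172.16.0.0/12
    /^\[::1\]$/,        // IPv6 loopback (WHATWG URL keeps the brackets)
  ];
  if (privateHosts.some((re) => re.test(url.hostname))) {
    throw new Error("Custom endpoint must not target localhost or private IP ranges");
  }
  return url;
}
```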

}

throw new Error(`Unsoported AI Provider ${options.aiProvider}`);
throw new Error(`Unsupported AI Provider ${options.aiProvider}`);

Copilot AI Mar 10, 2026


For unsupported providers, this throws a generic Error while other validation errors use InvalidPayloadError. Consider throwing InvalidPayloadError here as well (with a clear reason) so API consumers get consistent error types/status codes.

Suggested change
throw new Error(`Unsupported AI Provider ${options.aiProvider}`);
throw new InvalidPayloadError({ reason: `Unsupported AI Provider: ${options.aiProvider}` });

Comment on lines +19 to +29
The extension now supports any OpenAI-compatible API by selecting "OpenAI Compatible (Custom)" as the AI Provider. This allows you to use services like:
- Local models (Ollama, LM Studio, etc.)
- Cloud providers (Azure OpenAI, Google Vertex AI, etc.)
- Other OpenAI-compatible services (Groq, Together AI, etc.)

When using a custom provider, you'll need to:
1. Select "OpenAI Compatible (Custom)" as the AI Provider
2. Enter your custom API endpoint (e.g., `https://api.example.com/v1`)
3. Provide your API key
4. Choose a model from the dropdown or select "Custom Model" to enter a specific model name


Copilot AI Mar 10, 2026


The README implies Azure OpenAI / Vertex AI work directly as “OpenAI-compatible APIs”, but Azure’s endpoints/auth (api-key header + api-version + deployments path) and Vertex AI are not Chat Completions-compatible by default. Please clarify that this works with services that expose the OpenAI Chat Completions-compatible endpoint + Bearer auth (or mention that Azure/Vertex require a compatibility/proxy layer).

Suggested change
The extension now supports any OpenAI-compatible API by selecting "OpenAI Compatible (Custom)" as the AI Provider. This allows you to use services like:
- Local models (Ollama, LM Studio, etc.)
- Cloud providers (Azure OpenAI, Google Vertex AI, etc.)
- Other OpenAI-compatible services (Groq, Together AI, etc.)
When using a custom provider, you'll need to:
1. Select "OpenAI Compatible (Custom)" as the AI Provider
2. Enter your custom API endpoint (e.g., `https://api.example.com/v1`)
3. Provide your API key
4. Choose a model from the dropdown or select "Custom Model" to enter a specific model name
The extension supports providers that expose an **OpenAI Chat Completions-compatible HTTP API** (OpenAI-style paths and request/response format, using `Authorization: Bearer <api-key>`), by selecting "OpenAI Compatible (Custom)" as the AI Provider. This allows you to use services like:
- Local models (Ollama, LM Studio, etc.) when configured to expose an OpenAI-compatible endpoint
- Cloud providers (Azure OpenAI, Google Vertex AI, etc.) **via an OpenAI-compatible/translation layer or proxy** that normalizes their endpoints and auth to the OpenAI Chat Completions format
- Other OpenAI-compatible services (Groq, Together AI, etc.) that implement the Chat Completions API
When using a custom provider, you'll need to:
1. Select "OpenAI Compatible (Custom)" as the AI Provider
2. Enter your custom API endpoint (e.g., `https://api.example.com/v1`) that implements the OpenAI Chat Completions-compatible interface
3. Provide your API key (sent as a Bearer token)
4. Choose a model from the dropdown or select "Custom Model" to enter a specific model name
> Note: Azure OpenAI and Google Vertex AI do **not** use the OpenAI Chat Completions URL structure or authentication scheme by default. To use them here, you must either:
> - Use a compatibility/proxy layer that exposes an OpenAI Chat Completions-compatible endpoint, or
> - Use a provider-managed endpoint explicitly documented as OpenAI Chat Completions-compatible.

