Class ProviderConfig
This allows using your own OpenAI, Azure OpenAI, or other compatible API
endpoints instead of the default Copilot backend. All setter methods return
this for method chaining.
Example Usage - OpenAI
var provider = new ProviderConfig().setType("openai")
.setBaseUrl("https://api.openai.com/v1")
.setApiKey("sk-...");
Example Usage - Azure OpenAI
var provider = new ProviderConfig().setType("azure")
.setAzure(new AzureOptions().setEndpoint("https://my-resource.openai.azure.com").setDeployment("gpt-4"));
- Since:
- 1.0.0
Constructor Summary
Constructors
- ProviderConfig()
Method Summary
- getApiKey() - Gets the API key.
- getAzure() - Gets the Azure-specific options.
- getBaseUrl() - Gets the base URL for the API.
- getBearerToken() - Gets the bearer token.
- getHeaders() - Gets the custom HTTP headers for outbound provider requests.
- getMaxOutputTokens() - Gets the maximum output token override.
- getMaxPromptTokens() - Gets the maximum prompt token override.
- getModelId() - Gets the well-known model name used by the runtime.
- getType() - Gets the provider type.
- getWireApi() - Gets the wire API format.
- getWireModel() - Gets the model name sent to the provider API for inference.
- setApiKey(String apiKey) - Sets the API key for authentication.
- setAzure(AzureOptions azure) - Sets Azure-specific options for Azure OpenAI Service.
- setBaseUrl(String baseUrl) - Sets the base URL for the API.
- setBearerToken(String bearerToken) - Sets a bearer token for authentication.
- setHeaders(Map<String, String> headers) - Sets custom HTTP headers to include in outbound provider requests.
- setMaxOutputTokens(Integer maxOutputTokens) - Sets the maximum output tokens override.
- setMaxPromptTokens(Integer maxPromptTokens) - Sets the maximum prompt tokens override.
- setModelId(String modelId) - Sets the well-known model name used by the runtime.
- setType(String type) - Sets the provider type.
- setWireApi(String wireApi) - Sets the wire API format for custom providers.
- setWireModel(String wireModel) - Sets the model name sent to the provider API for inference.
-
Constructor Details
-
ProviderConfig
public ProviderConfig()
-
-
Method Details
-
getType
Gets the provider type.
- Returns:
- the provider type (e.g., "openai", "azure")
-
setType
Sets the provider type. Supported types include:
- "openai" - OpenAI API
- "azure" - Azure OpenAI Service
- Parameters:
- type - the provider type
- Returns:
- this config for method chaining
-
getWireApi
Gets the wire API format.
- Returns:
- the wire API format
-
setWireApi
Sets the wire API format for custom providers. This specifies the API format when using a custom provider with a different wire protocol.
- Parameters:
- wireApi - the wire API format
- Returns:
- this config for method chaining
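A minimal sketch of combining a custom endpoint with a wire API override. The local URL and the "chat" value are illustrative assumptions, not documented values; consult your provider for the formats the SDK actually accepts.

```java
// Hypothetical: "chat" is a placeholder wire API value, and the base URL
// assumes a local OpenAI-compatible server.
var provider = new ProviderConfig()
        .setType("openai")
        .setBaseUrl("http://localhost:8080/v1")
        .setWireApi("chat");
```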
-
getBaseUrl
Gets the base URL for the API.
- Returns:
- the API base URL
-
setBaseUrl
Sets the base URL for the API. For OpenAI, this is typically "https://api.openai.com/v1".
- Parameters:
- baseUrl - the API base URL
- Returns:
- this config for method chaining
-
getApiKey
Gets the API key.
- Returns:
- the API key
-
setApiKey
Sets the API key for authentication.
- Parameters:
- apiKey - the API key
- Returns:
- this config for method chaining
-
getBearerToken
Gets the bearer token.
- Returns:
- the bearer token
-
setBearerToken
Sets a bearer token for authentication. This is an alternative to API key authentication.
Note: The bearer token is a static token string. The SDK does not refresh it automatically; if the token expires, requests will fail and you will need to create a new session with a fresh token.
- Parameters:
- bearerToken - the bearer token
- Returns:
- this config for method chaining
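A sketch of bearer-token authentication. The gateway URL is an assumption, and fetchTokenSomehow() is a hypothetical helper standing in for however your environment issues tokens (e.g. an OAuth client).

```java
// Bearer-token auth instead of an API key. The token string is static:
// the SDK will not refresh it, so re-create the session when it expires.
String token = fetchTokenSomehow(); // hypothetical helper
var provider = new ProviderConfig()
        .setType("openai")
        .setBaseUrl("https://gateway.example.com/v1") // assumed gateway URL
        .setBearerToken(token);
```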
-
getAzure
Gets the Azure-specific options.
- Returns:
- the Azure options
-
setAzure
Sets Azure-specific options for Azure OpenAI Service.
- Parameters:
- azure - the Azure options
- Returns:
- this config for method chaining
-
getHeaders
Gets the custom HTTP headers for outbound provider requests.
- Returns:
- the headers map, or null if not set
-
setHeaders
Sets custom HTTP headers to include in outbound provider requests. Use this to pass additional authentication headers or custom metadata to the provider API.
- Parameters:
- headers - the headers map
- Returns:
- this config for method chaining
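A sketch of attaching extra headers, e.g. for a corporate gateway that requires its own auth header. The header names and values here are illustrative, not part of any documented contract.

```java
import java.util.Map;

// Extra headers sent with every outbound provider request.
// Both header names below are hypothetical examples.
var provider = new ProviderConfig()
        .setType("openai")
        .setHeaders(Map.of(
                "X-Org-Auth", "internal-token",  // hypothetical gateway auth
                "X-Request-Source", "my-service" // hypothetical metadata
        ));
```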
-
getModelId
Gets the well-known model name used by the runtime. Used to look up agent configuration (tools, prompts, reasoning behavior) and default token limits. Also used as the wire model when getWireModel() is not set.
- Returns:
- the model ID, or null if not set
-
setModelId
Sets the well-known model name used by the runtime. Used to look up agent configuration (tools, prompts, reasoning behavior) and default token limits. Also used as the wire model when getWireModel() is not set. Falls back to SessionConfig.getModel().
- Parameters:
- modelId - the model ID
- Returns:
- this config for method chaining
-
getWireModel
Gets the model name sent to the provider API for inference.
- Returns:
- the wire model name, or null if not set
-
setWireModel
Sets the model name sent to the provider API for inference. Use this when the provider's model name (e.g. an Azure deployment name or a custom fine-tune name) differs from getModelId(). Falls back to getModelId(), then SessionConfig.getModel().
- Parameters:
- wireModel - the wire model name
- Returns:
- this config for method chaining
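A sketch of the model-id/wire-model split for Azure, where the deployment name on the wire differs from the well-known model name. The deployment name is a hypothetical placeholder.

```java
// The runtime resolves tools, prompts, and token limits from the
// well-known model id, while the request carries the deployment name.
var provider = new ProviderConfig()
        .setType("azure")
        .setModelId("gpt-4")             // well-known name for runtime lookup
        .setWireModel("my-gpt4-deploy"); // hypothetical Azure deployment name
```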
-
getMaxPromptTokens
Gets the maximum prompt token override.
- Returns:
- the max prompt tokens, or null if not set
-
setMaxPromptTokens
Sets the maximum prompt tokens override. Overrides the resolved model's default max prompt tokens. The runtime triggers conversation compaction before sending a request when the prompt (system message, history, tool definitions, user message) would exceed this limit.
- Parameters:
- maxPromptTokens - the max prompt tokens
- Returns:
- this config for method chaining
-
getMaxOutputTokens
Gets the maximum output token override.
- Returns:
- the max output tokens, or null if not set
-
setMaxOutputTokens
Sets the maximum output tokens override. Overrides the resolved model's default max output tokens. When this limit is hit, the model stops generating and returns a truncated response.
- Parameters:
- maxOutputTokens - the max output tokens
- Returns:
- this config for method chaining
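A sketch of tightening both token limits below the resolved model's defaults; the specific numbers are arbitrary examples.

```java
// Compaction triggers before the prompt would exceed 8000 tokens,
// and responses are capped at 1024 output tokens.
var provider = new ProviderConfig()
        .setType("openai")
        .setMaxPromptTokens(8000)
        .setMaxOutputTokens(1024);
```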
-