4 changes: 2 additions & 2 deletions README.md
@@ -15,7 +15,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [x] [Hugging Face](https://huggingface.co/docs)
- [x] [Ollama](https://github.com/ollama/ollama/tree/main/docs)
- [ ] [Anthropic](https://docs.anthropic.com)
- [ ] [Naver](https://api.ncloud-docs.com/docs/ai-naver-clovastudio-summary)
- [ ] ~~[Naver](https://api.ncloud-docs.com/docs/ai-naver-clovastudio-summary)~~
- [x] [LG](https://github.com/LG-AI-EXAONE)
- [x] [OpenAI](https://openai.com/api)
- [x] [Upstage](https://console.upstage.ai/docs/getting-started)
@@ -77,7 +77,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-in-local-container)
- [Use GitHub Models](./docs/github-models.md#run-in-local-container)
- [Use Docker Model Runner](./docs/docker-model-runner.md#run-in-local-container)
- ~~Use Foundry Local~~ 👉 NOT SUPPORTED
- [Use Foundry Local](./docs/foundry-local.md#run-in-local-container)
- [Use Hugging Face](./docs/hugging-face.md#run-in-local-container)
- [Use Ollama](./docs/ollama.md#run-on-local-container)
- [Use LG](./docs/lg.md#run-in-local-container)
111 changes: 108 additions & 3 deletions docs/foundry-local.md
@@ -1,6 +1,6 @@
# OpenChat Playground with Foundry Local

This page describes how to run OpenChat Playground (OCP) with Foundry Local models integration.
This page describes how to run OpenChat Playground (OCP) with [Foundry Local](https://learn.microsoft.com/azure/ai-foundry/foundry-local/what-is-foundry-local) integration.

## Get the repository root

@@ -18,7 +18,7 @@ This page describes how to run OpenChat Playground (OCP) with Foundry Local mode

## Run on local machine

1. Make sure the Foundry Local server is up and running.
1. Make sure the Foundry Local server is up and running with the following command.

```bash
foundry service start
@@ -74,4 +74,109 @@ This page describes how to run OpenChat Playground (OCP) with Foundry Local mode
--alias qwen2.5-7b
```

1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.
1. Open your web browser, navigate to `http://localhost:5280`, and enter prompts.

## Run in local container

1. Make sure the Foundry Local server is up and running.

name-of-okja (Contributor, Author):

The port appears to be picked at random, so I added guidance to set it explicitly to 55438.

justinyoo (Contributor, Oct 27, 2025):

  • What is the basis for explicitly using port 55438?
  • We can choose the port ourselves with a command such as `foundry service set --port`.

name-of-okja (Contributor, Author, Oct 27, 2025):

> What is the basis for explicitly using port 55438?

It is 55438, the default value in appsettings.json:

"FoundryLocal": {
  "Alias": "phi-4-mini",
  "Endpoint": "http://127.0.0.1:55438/v1"
},

I see two ways we could guide users here:

  1. Have users enter the port printed by the `foundry local start` command.
  2. Make 55438 the default port of OCP's Foundry Local connector and ask users to match it.

The latter seemed a bit more convenient, so I went with `foundry service set --port`. I also added instructions to the document for using a different port:

> Alternatively, if you want to run with a different port, say `63997`, other than the default one, set it first by running the following command.

For the record, 55438 was simply the randomly assigned port that `foundry local start` produced while I was working on this.

```bash
foundry service start
```

1. Get the Foundry Local service port.

```bash
# bash/zsh
FL_PORT_NUMBER=$(foundry service set --show true | sed -n '/^{/,/^}/p' | jq -r ".serviceSettings.port")
```

```powershell
# PowerShell
$FL_PORT_NUMBER = (foundry service set --show true | `
ForEach-Object { `
if ($_ -match '^{') { $capture = $true } `
if ($capture) { $_ } `
if ($_ -match '^}') { $capture = $false } `
} | Out-String | ConvertFrom-Json).serviceSettings.port
```
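Both pipelines above do the same thing: isolate the JSON object that `foundry service set --show true` prints alongside other status text, then read `serviceSettings.port`. As an illustration only (the exact CLI output shape is an assumption here, mocked from the documented `{ "serviceSettings": { "port": ... } }` structure), the extraction logic can be sketched in Python:

```python
import json
import re

def extract_service_port(cli_output: str) -> int:
    """Pull the first top-level {...} JSON object out of mixed CLI output
    and return serviceSettings.port (output shape is assumed, not verified)."""
    match = re.search(r"^\{.*?^\}", cli_output, re.DOTALL | re.MULTILINE)
    if match is None:
        raise ValueError("no JSON object found in CLI output")
    return json.loads(match.group(0))["serviceSettings"]["port"]

# Mocked-up output resembling the shape the jq/ConvertFrom-Json pipelines expect.
sample = 'Service status: running\n{\n  "serviceSettings": { "port": 55438 }\n}\nDone.'
print(extract_service_port(sample))  # → 55438
```

The real output may vary between Foundry Local versions, which is why both the `jq` and the `ConvertFrom-Json` variants first slice out only the lines between `{` and `}`.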

1. Download the Foundry Local model. The default model OCP uses is `phi-4-mini`.

name-of-okja (Contributor, Author):

The document first downloads the phi-4-mini model, and then, further down, retrieves the Model ID through the `foundry service list` command.

justinyoo (Contributor):

We do not use the ModelId.

name-of-okja (Contributor, Author):

The Foundry Local SDK sample code also uses the ModelId:
var chatClient = client.GetChatClient(model?.ModelId);

If you pass the alias instead, e.g. client.GetChatClient("phi-4-mini"), the following error occurs:

System.ClientModel.ClientResultException: Service request failed.
Status: 400 (Bad Request)

   at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.<>c__DisplayClass19_0.<<CompleteChatStreamingAsync>b__0>d.MoveNext()

```bash
foundry model download phi-4-mini
```

Alternatively, if you want to run with a different model, say `qwen2.5-7b`, other than the default one, download it first by running the following command.

```bash
foundry model download qwen2.5-7b
```

Note that the model MUST be one listed in the CLI output of `foundry model list`.

1. Load the Foundry Local model. The default model OCP uses is `phi-4-mini`.

```bash
foundry model load phi-4-mini
```

Alternatively, if you want to run with a different model, say `qwen2.5-7b`, other than the default one, load it first by running the following command.

```bash
foundry model load qwen2.5-7b
```

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Run the app. The `{{Model ID}}` refers to the `Model ID` shown in the output of the `foundry service list` command.

> **NOTE**: Make sure it MUST be the model ID, instead of alias.

name-of-okja (Contributor, Author):

I wrote this so that you take the Model ID and pass it where the alias parameter would go.

justinyoo (Contributor):

We do not use the ModelId.

name-of-okja (Contributor, Author):

The client.GetChatClient method takes a model ID as its parameter, but without FoundryLocalManager there is no way to resolve the ModelId from the alias alone.

```bash
# bash/zsh - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest \
--connector-type FoundryLocal \
--base-url http://host.docker.internal:$FL_PORT_NUMBER/ \
--model "Phi-4-mini-instruct-generic-gpu:4" \
--disable-foundrylocal-manager
```

```powershell
# PowerShell - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest `
--connector-type FoundryLocal `
--base-url http://host.docker.internal:$FL_PORT_NUMBER/ `
--model {{Model ID}} `
--disable-foundrylocal-manager
```

```bash
# bash/zsh - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
--connector-type FoundryLocal \
--base-url http://host.docker.internal:$FL_PORT_NUMBER/ \
--model {{Model ID}} \
--disable-foundrylocal-manager
```

```powershell
# PowerShell - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
--connector-type FoundryLocal `
--base-url http://host.docker.internal:$FL_PORT_NUMBER/ `
--model {{Model ID}} `
--disable-foundrylocal-manager
```

name-of-okja (Contributor, Author):

> Alternatively, if you want to run with a different model, say qwen2.5-7b, other than the default one, download it first by running the following command.

For the alternative-model example, I could not think of anything better than `--alias {{Model ID}}`, so I left it out for now.

justinyoo (Contributor):

  • We do not use the ModelId.
  • How did you confirm that `v1` is appended to the endpoint?

name-of-okja (Contributor, Author):

> We do not use the ModelId.

The client.GetChatClient method takes a model ID as its parameter, but without FoundryLocalManager there is no way to resolve the ModelId from the alias alone.

> How did you confirm that `v1` is appended to the endpoint?

It is in FoundryLocalManager.cs:
public Uri Endpoint => new Uri(ServiceUri, "/v1");

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.
28 changes: 19 additions & 9 deletions src/OpenChat.PlaygroundApp/Abstractions/ArgumentOptions.cs
@@ -32,7 +32,11 @@ private static readonly (ConnectorType ConnectorType, string Argument, bool IsSw
(ConnectorType.DockerModelRunner, ArgumentOptionConstants.DockerModelRunner.BaseUrl, false),
(ConnectorType.DockerModelRunner, ArgumentOptionConstants.DockerModelRunner.Model, false),
// Foundry Local
(ConnectorType.FoundryLocal, ArgumentOptionConstants.FoundryLocal.BaseUrl, false),
(ConnectorType.FoundryLocal, ArgumentOptionConstants.FoundryLocal.Alias, false),
(ConnectorType.FoundryLocal, ArgumentOptionConstants.FoundryLocal.Model, false),
(ConnectorType.FoundryLocal, ArgumentOptionConstants.FoundryLocal.DisableFoundryLocalManager, true),
(ConnectorType.FoundryLocal, ArgumentOptionConstants.FoundryLocal.DisableFoundryLocalManagerInShort, true),
// Hugging Face
(ConnectorType.HuggingFace, ArgumentOptionConstants.HuggingFace.BaseUrl, false),
(ConnectorType.HuggingFace, ArgumentOptionConstants.HuggingFace.Model, false),
@@ -212,9 +216,11 @@ public static AppSettings Parse(IConfiguration config, string[] args)

case FoundryLocalArgumentOptions foundryLocal:
settings.FoundryLocal ??= new FoundryLocalSettings();
settings.FoundryLocal.Alias = foundryLocal.Alias ?? settings.FoundryLocal.Alias;
settings.FoundryLocal.BaseUrl = foundryLocal.BaseUrl ?? settings.FoundryLocal.BaseUrl;
settings.FoundryLocal.AliasOrModel = foundryLocal.AliasOrModel ?? settings.FoundryLocal.AliasOrModel;
settings.FoundryLocal.DisableFoundryLocalManager = foundryLocal.DisableFoundryLocalManager;

settings.Model = foundryLocal.Alias ?? settings.FoundryLocal.Alias;
settings.Model = foundryLocal.AliasOrModel ?? settings.FoundryLocal.AliasOrModel;
break;

case HuggingFaceArgumentOptions huggingFace:
@@ -361,10 +367,10 @@ private static void DisplayHelpForAmazonBedrock()
Console.WriteLine(" ** Amazon Bedrock: **");
Console.ForegroundColor = foregroundColor;

Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.AccessKeyId} The AWSCredentials Access Key ID.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.SecretAccessKey} The AWSCredentials Secret Access Key.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.Region} The AWS region.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.ModelId} The model ID. Default to 'anthropic.claude-sonnet-4-20250514-v1:0'");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.AccessKeyId} The AWSCredentials Access Key ID.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.SecretAccessKey} The AWSCredentials Secret Access Key.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.Region} The AWS region.");
Console.WriteLine($" {ArgumentOptionConstants.AmazonBedrock.ModelId} The model ID. Default to 'anthropic.claude-sonnet-4-20250514-v1:0'");
Console.WriteLine();
}

@@ -424,7 +430,10 @@ private static void DisplayHelpForFoundryLocal()
Console.WriteLine(" ** Foundry Local: **");
Console.ForegroundColor = foregroundColor;

Console.WriteLine(" TBD");
Console.WriteLine($" {ArgumentOptionConstants.FoundryLocal.BaseUrl} The endpoint URL. Default to 'http://localhost:<random_port>/'");
Console.WriteLine($" {ArgumentOptionConstants.FoundryLocal.Alias}|{ArgumentOptionConstants.FoundryLocal.Model} The alias or model ID. Default to 'phi-4-mini'");
Console.WriteLine($" {ArgumentOptionConstants.FoundryLocal.DisableFoundryLocalManager}|{ArgumentOptionConstants.FoundryLocal.DisableFoundryLocalManagerInShort} Disable the built-in Foundry local manager.");
Console.WriteLine($" When this flag is set, you must specify '--base-url'.");
Console.WriteLine();
}

@@ -471,7 +480,8 @@ private static void DisplayHelpForLG()
Console.WriteLine(" ** LG: **");
Console.ForegroundColor = foregroundColor;

Console.WriteLine(" TBD");
Console.WriteLine($" {ArgumentOptionConstants.LG.BaseUrl} The baseURL. Default to 'http://localhost:11434'");
Console.WriteLine($" {ArgumentOptionConstants.LG.Model} The model name. Default to 'hf.co/LGAI-EXAONE/EXAONE-4.0-1.2B-GGUF'");
Console.WriteLine();
}

@@ -493,7 +503,7 @@ private static void DisplayHelpForOpenAI()
Console.WriteLine(" ** OpenAI: **");
Console.ForegroundColor = foregroundColor;

Console.WriteLine($" {ArgumentOptionConstants.OpenAI.ApiKey} The OpenAI API key. (Env: OPENAI_API_KEY)");
Console.WriteLine($" {ArgumentOptionConstants.OpenAI.ApiKey} The OpenAI API key.");
Console.WriteLine($" {ArgumentOptionConstants.OpenAI.Model} The OpenAI model name. Default to 'gpt-4.1-mini'");
Console.WriteLine();
}
14 changes: 12 additions & 2 deletions src/OpenChat.PlaygroundApp/Configurations/FoundryLocalSettings.cs
@@ -17,7 +17,17 @@ public partial class AppSettings
public class FoundryLocalSettings : LanguageModelSettings
{
/// <summary>
/// Gets or sets the alias of FoundryLocal.
/// Gets or sets the Base URL of Foundry Local. If `DisableFoundryLocalManager` is set, this value must be provided.
/// </summary>
public string? Alias { get; set; }
public string? BaseUrl { get; set; }

/// <summary>
/// Gets or sets either alias or model ID of Foundry Local.
/// </summary>
public string? AliasOrModel { get; set; }

/// <summary>
/// Gets or sets a value indicating whether to disable the automatic Foundry Local manager and use a manually configured endpoint.
/// </summary>
public bool DisableFoundryLocalManager { get; set; }
}
65 changes: 56 additions & 9 deletions src/OpenChat.PlaygroundApp/Connectors/FoundryLocalConnector.cs
@@ -1,4 +1,5 @@
using System.ClientModel;
using System.Text.RegularExpressions;

using Microsoft.AI.Foundry.Local;
using Microsoft.Extensions.AI;
@@ -16,6 +17,10 @@ namespace OpenChat.PlaygroundApp.Connectors;
/// <param name="settings"><see cref="AppSettings"/> instance.</param>
public class FoundryLocalConnector(AppSettings settings) : LanguageModelConnector(settings.FoundryLocal)
{
private const string ApiKey = "OPENAI_API_KEY";
name-of-okja (Contributor, Author):

I took the default value defined in FoundryLocalManager:

public string ApiKey { get; internal set; } = "OPENAI_API_KEY";


private static readonly Regex modelIdSuffix = new(":[0-9]+$", RegexOptions.Compiled | RegexOptions.IgnoreCase);

private readonly AppSettings _appSettings = settings ?? throw new ArgumentNullException(nameof(settings));

/// <inheritdoc/>
@@ -26,9 +31,21 @@ public override bool EnsureLanguageModelSettingsValid()
throw new InvalidOperationException("Missing configuration: FoundryLocal.");
}

if (string.IsNullOrWhiteSpace(settings.Alias!.Trim()))
if (settings.DisableFoundryLocalManager == true &&
string.IsNullOrWhiteSpace(settings.BaseUrl!.Trim()) == true)
{
throw new InvalidOperationException("Missing configuration: FoundryLocal:Alias.");
throw new InvalidOperationException("Missing configuration: FoundryLocal:BaseUrl is required when DisableFoundryLocalManager is enabled.");
}

if (string.IsNullOrWhiteSpace(settings.AliasOrModel!.Trim()) == true)
{
throw new InvalidOperationException("Missing configuration: FoundryLocal:AliasOrModel.");
}

if (settings.DisableFoundryLocalManager == true &&
modelIdSuffix.IsMatch(settings.AliasOrModel!.Trim()!) == false)
{
throw new InvalidOperationException("When DisableFoundryLocalManager is true, FoundryLocal:AliasOrModel must be the exact model name with version suffix.");
}

return true;
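The `--disable-flm` path requires the exact model name with a version suffix; the connector enforces this with the `modelIdSuffix` regex (`:[0-9]+$`). As a minimal illustration (not the connector itself), the same rule can be sketched in Python using the model names that appear on this page:

```python
import re

# Same pattern as the C# modelIdSuffix regex: a ':' followed by digits at the end.
MODEL_ID_SUFFIX = re.compile(r":[0-9]+$", re.IGNORECASE)

def looks_like_model_id(alias_or_model: str) -> bool:
    """True for a full model ID with version suffix, False for a bare alias."""
    return MODEL_ID_SUFFIX.search(alias_or_model.strip()) is not None

print(looks_like_model_id("Phi-4-mini-instruct-generic-gpu:4"))  # → True
print(looks_like_model_id("phi-4-mini"))                         # → False
```

This is why the validation above rejects a bare alias such as `phi-4-mini` when `DisableFoundryLocalManager` is set: without the manager there is nothing to resolve the alias into a full model ID.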
@@ -38,23 +55,53 @@ public override bool EnsureLanguageModelSettingsValid()
public override async Task<IChatClient> GetChatClientAsync()
{
var settings = this.Settings as FoundryLocalSettings;
var alias = settings!.Alias!.Trim() ?? throw new InvalidOperationException("Missing configuration: FoundryLocal:Alias.");

var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias).ConfigureAwait(false);
var model = await manager.GetModelInfoAsync(aliasOrModelId: alias).ConfigureAwait(false);
(Uri? endpoint, string? modelId) = settings!.DisableFoundryLocalManager == true
? ParseFromModelId(settings)
: await ParseFromManagerAsync(settings).ConfigureAwait(false);

var credential = new ApiKeyCredential(manager.ApiKey);
var credential = new ApiKeyCredential(ApiKey);
var options = new OpenAIClientOptions()
{
Endpoint = manager.Endpoint,
Endpoint = endpoint,
};

var client = new OpenAIClient(credential, options);
var chatClient = client.GetChatClient(model?.ModelId)
var chatClient = client.GetChatClient(modelId)
.AsIChatClient();

Console.WriteLine($"The {this._appSettings.ConnectorType} connector created with model: {alias}");
Console.WriteLine($"The {this._appSettings.ConnectorType} connector created with model: {modelId}");
Comment on lines +70 to +73

justinyoo (Contributor):

We do not use modelId.

name-of-okja (Contributor, Author):

If the alias value is passed as-is to client.GetChatClient, an error occurs:

System.ClientModel.ClientResultException: Service request failed.
Status: 400 (Bad Request)

   at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.<>c__DisplayClass19_0.<<CompleteChatStreamingAsync>b__0>d.MoveNext()

return chatClient;
}

private static (Uri? endpoint, string? modelId) ParseFromModelId(FoundryLocalSettings settings)
{
var baseUrl = settings.BaseUrl!.Trim() ?? throw new InvalidOperationException("Missing configuration: FoundryLocal:BaseUrl.");
if (Uri.IsWellFormedUriString(baseUrl, UriKind.Absolute) == false)
{
throw new UriFormatException($"Invalid URI: The Foundry Local base URL '{baseUrl}' is not a valid URI.");
}

var endpoint = new Uri($"{baseUrl.TrimEnd('/')}/v1");
var modelId = settings.AliasOrModel!.Trim() ?? throw new InvalidOperationException("Missing configuration: FoundryLocal:AliasOrModel.");
if (modelIdSuffix.IsMatch(modelId) == false)
{
throw new InvalidOperationException("When DisableFoundryLocalManager is true, FoundryLocal:AliasOrModel must be the exact model name with version suffix.");
}

return (endpoint, modelId);
}

private static async Task<(Uri? endpoint, string? modelId)> ParseFromManagerAsync(FoundryLocalSettings settings)
{
var alias = settings!.AliasOrModel!.Trim() ?? throw new InvalidOperationException("Missing configuration: FoundryLocal:AliasOrModel.");
var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias).ConfigureAwait(false);
var model = await manager.GetModelInfoAsync(aliasOrModelId: alias).ConfigureAwait(false);

var endpoint = manager.Endpoint;
var modelId = model!.ModelId;

return (endpoint, modelId);
}
}
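The manual endpoint built in `ParseFromModelId` trims trailing slashes from `--base-url` and appends `/v1`, landing on the same OpenAI-compatible path that `FoundryLocalManager` exposes via `new Uri(ServiceUri, "/v1")`. A small Python sketch of that construction, for illustration:

```python
def foundry_endpoint(base_url: str) -> str:
    """Mirror the connector's manual endpoint construction:
    strip trailing slashes, then append the OpenAI-compatible /v1 path."""
    return f"{base_url.rstrip('/')}/v1"

print(foundry_endpoint("http://host.docker.internal:55438/"))  # → http://host.docker.internal:55438/v1
print(foundry_endpoint("http://127.0.0.1:55438"))              # → http://127.0.0.1:55438/v1
```

Trimming first keeps the result stable whether or not the user's `--base-url` ends with a slash.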
20 changes: 20 additions & 0 deletions src/OpenChat.PlaygroundApp/Constants/ArgumentOptionConstants.cs
@@ -130,10 +130,30 @@ public static class DockerModelRunner
/// </summary>
public static class FoundryLocal
{
/// <summary>
/// Defines the constant for '--base-url'.
/// </summary>
public const string BaseUrl = "--base-url";

/// <summary>
/// Defines the constant for '--alias'.
/// </summary>
public const string Alias = "--alias";

/// <summary>
/// Defines the constant for '--model'.
/// </summary>
public const string Model = "--model";

/// <summary>
/// Defines the constant for '--disable-foundry-local-manager'.
/// </summary>
public const string DisableFoundryLocalManager = "--disable-foundry-local-manager";

/// <summary>
/// Defines the constant for '--disable-flm'.
/// </summary>
public const string DisableFoundryLocalManagerInShort = "--disable-flm";
}

/// <summary>