These samples demonstrate how to use the Foundry Local C# SDK. Each sample uses a unified project file that automatically detects your operating system and selects the optimal NuGet package:
- Windows: Uses `Microsoft.AI.Foundry.Local.WinML` for hardware acceleration via Windows ML.
- macOS / Linux: Uses `Microsoft.AI.Foundry.Local` for cross-platform support.
Both packages provide the same APIs, so the same source code works on all platforms.
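As a rough sketch of what that shared API looks like, the chat-completion flow in the `native-chat-completions` sample is approximately the following (names such as `FoundryLocalManager.StartModelAsync` and the `"phi-3.5-mini"` alias are taken from the Foundry Local documentation and samples; treat exact signatures and the model alias as assumptions and check the sample source for the authoritative version):

```csharp
using System.ClientModel;
using Microsoft.AI.Foundry.Local;
using OpenAI;
using OpenAI.Chat;

// Illustrative model alias; any model supported by Foundry Local works here.
var alias = "phi-3.5-mini";

// Start the Foundry Local service and load the model (downloading it on first use).
var manager = await FoundryLocalManager.StartModelAsync(aliasOrModelId: alias);
var model = await manager.GetModelInfoAsync(aliasOrModelId: alias);

// Foundry Local exposes an OpenAI-compatible endpoint, so the standard
// OpenAI .NET client can talk to the local service.
var key = new ApiKeyCredential(manager.ApiKey);
var client = new OpenAIClient(key, new OpenAIClientOptions { Endpoint = manager.Endpoint });

var chatClient = client.GetChatClient(model?.ModelId);
var completion = chatClient.CompleteChat("Why is the sky blue?");
Console.WriteLine(completion.Value.Content[0].Text);
```

Because the same source compiles against both NuGet packages, this sketch applies unchanged on Windows, macOS, and Linux; only the package reference in the project file differs.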
| Sample | Description |
|---|---|
| native-chat-completions | Initialize the SDK, download a model, and run chat completions. |
| audio-transcription-example | Transcribe audio files using the Foundry Local SDK. |
| foundry-local-web-server | Set up a local OpenAI-compliant web server. |
| tool-calling-foundry-local-sdk | Use tool calling with native chat completions. |
| tool-calling-foundry-local-web-server | Use tool calling with the local web server. |
| model-management-example | Manage models, variant selection, and updates. |
| tutorial-chat-assistant | Build an interactive chat assistant (tutorial). |
| tutorial-document-summarizer | Summarize documents with AI (tutorial). |
| tutorial-tool-calling | Create a tool-calling assistant (tutorial). |
| tutorial-voice-to-text | Transcribe and summarize audio (tutorial). |
1. Clone the repository:

   ```shell
   git clone https://github.com/microsoft/Foundry-Local.git
   cd Foundry-Local/samples/cs
   ```

2. Open and run a sample:

   ```shell
   cd native-chat-completions
   dotnet run
   ```