From 5f3d53cc2119d767fabcfd3367abb6fb05aa5eb6 Mon Sep 17 00:00:00 2001 From: "Qiutong Shen (from Dev Box)" Date: Tue, 12 May 2026 17:24:45 +0800 Subject: [PATCH] fix(windows-ai): typos and misspellings across documentation Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com> --- docs/apis/image-generation.md | 2 +- docs/apis/index.md | 2 +- docs/apis/video-super-resolution.md | 4 ++-- docs/cards/windows-studio-effects-application-card.md | 2 +- docs/new-windows-ml/api-reference.md | 2 +- docs/npu-devices/index.md | 2 +- docs/samples/index.md | 2 +- 7 files changed, 8 insertions(+), 8 deletions(-) diff --git a/docs/apis/image-generation.md b/docs/apis/image-generation.md index 226709d8..93e4a874 100644 --- a/docs/apis/image-generation.md +++ b/docs/apis/image-generation.md @@ -259,7 +259,7 @@ public async Task CreateImageFromPromptAndCustomOptions() options.Seed = 1234; ContentFilterOptions contentFilterOptions = new ContentFilterOptions(); contentFilterOptions.PromptMaxAllowedSeverityLevel = TextContentFilterSeverity(SeverityLevel.Low); - contentFilterOptions.ImageMaxAllowedSeverityLevel = ImageContentFilterSeverity(SeverityLevel.Minimium); + contentFilterOptions.ImageMaxAllowedSeverityLevel = ImageContentFilterSeverity(SeverityLevel.Minimum); options.ContentFilterOptions = contentFilterOptions; var result = model.GenerateImageFromTextPrompt("Cat in spaceship", options); diff --git a/docs/apis/index.md b/docs/apis/index.md index fbcf822c..528e0440 100644 --- a/docs/apis/index.md +++ b/docs/apis/index.md @@ -35,7 +35,7 @@ See the [Windows AI APIs with WinUI sample app](https://github.com/microsoft/Win To build your first Windows app with Visual Studio and some simple Windows AI APIs, just meet the prerequisites and use the provided example code in [Get started building an app with Windows AI APIs](./get-started.md). 
-From there, you can jump into short tutorials that build an app leveraging specific Windows AI APIs such as the [Phi Silica walthrough](./phi-silica-tutorial.md), [Imaging walthrough](./imaging-tutorial.md) and [OCR walthrough](./text-recognition-tutorial.md). +From there, you can jump into short tutorials that build an app leveraging specific Windows AI APIs such as the [Phi Silica walkthrough](./phi-silica-tutorial.md), [Imaging walkthrough](./imaging-tutorial.md) and [OCR walkthrough](./text-recognition-tutorial.md). ## Try the APIs and models on your PC diff --git a/docs/apis/video-super-resolution.md b/docs/apis/video-super-resolution.md index c1be2389..5928c89f 100644 --- a/docs/apis/video-super-resolution.md +++ b/docs/apis/video-super-resolution.md @@ -35,7 +35,7 @@ VSR currently supports the following resolution, format, and FPS ranges: ## Create a VideoScaler session -The following example shows how to create a VSR session. First, get an instance of [ExecutionProviderCatalog](/windows/windows-app-sdk/api/winrt/microsoft.windows.ai.machinelearning.executionprovidercatalog) and call [EnsureAndRegisterCertifiedAsync](/windows/windows-app-sdk/api/winrt/microsoft.windows.ai.machinelearning.executionprovidercatalog.ensureandregistercertifiedasync) to load the available models. Call **GetReadyState** on the **VideoScalar** class to determine if the video scaler is ready to process frames. If not, call **EnsureReadyAsync** to initialize the video scaler. +The following example shows how to create a VSR session. First, get an instance of [ExecutionProviderCatalog](/windows/windows-app-sdk/api/winrt/microsoft.windows.ai.machinelearning.executionprovidercatalog) and call [EnsureAndRegisterCertifiedAsync](/windows/windows-app-sdk/api/winrt/microsoft.windows.ai.machinelearning.executionprovidercatalog.ensureandregistercertifiedasync) to load the available models. 
Call **GetReadyState** on the **VideoScaler** class to determine if the video scaler is ready to process frames. If not, call **EnsureReadyAsync** to initialize the video scaler. ```csharp @@ -177,7 +177,7 @@ Next, a [Direct3DSurface](/uwp/api/windows.graphics.directx.direct3d11.idirect3d ## Scale a SoftwareBitmap using ImageBuffer -The following code example demonstrates the use of **VideoScalar** class to upscale a [SoftwareBitmap](/uwp/api/windows.graphics.imaging.softwarebitmap). This example does not represent a typical usage of the VSR APIs. It is less performant than using Direct3D. But you can use this example to experiment with the VSR APIs without setting up a camera or video streaming pipeline. Because the video scaler requires a **BGR8** when using an **ImageBuffer**, some helper methods are required to convert the pixel format of the supplied **SoftwareBitmap**. +The following code example demonstrates the use of the **VideoScaler** class to upscale a [SoftwareBitmap](/uwp/api/windows.graphics.imaging.softwarebitmap). This example does not represent a typical usage of the VSR APIs. It is less performant than using Direct3D, but you can use this example to experiment with the VSR APIs without setting up a camera or video streaming pipeline. Because the video scaler requires a **BGR8** pixel format when using an **ImageBuffer**, some helper methods are required to convert the pixel format of the supplied **SoftwareBitmap**. 
The example code in this article is based on the VSR component of the [Windows AI API samples](https://github.com/microsoft/WindowsAppSDK-Samples/tree/release/experimental/Samples/WindowsAIFoundry) diff --git a/docs/cards/windows-studio-effects-application-card.md b/docs/cards/windows-studio-effects-application-card.md index 165a0662..8be6d761 100644 --- a/docs/cards/windows-studio-effects-application-card.md +++ b/docs/cards/windows-studio-effects-application-card.md @@ -10,7 +10,7 @@ ms.service: windows # Application card: Windows Studio Effects -Microsoft's Application and Platform Cards are intended to help you understand how our AI technology works, the choices application owners can make that influence application performance and behavior, and the importance of considering the whole application, including the technology, the people, and the environment. Application cards are created for AI applications and platform cards are created for AI platform platforms. These resources can support the development or deployment of your own applications and can be shared with users or stakeholders impacted by them. +Microsoft's Application and Platform Cards are intended to help you understand how our AI technology works, the choices application owners can make that influence application performance and behavior, and the importance of considering the whole application, including the technology, the people, and the environment. Application cards are created for AI applications and platform cards are created for AI platform services. These resources can support the development or deployment of your own applications and can be shared with users or stakeholders impacted by them. As part of its commitment to responsible AI, Microsoft adheres to six core principles: fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability. 
These principles are embedded in the Responsible AI Standard, which guides teams in designing, building, and testing AI applications. Application and Platform Cards play a key role in operationalizing these principles by offering transparency around capabilities, intended uses, and limitations. For further insight, readers are encouraged to explore Microsoft's [Responsible AI Transparency Report](https://www.microsoft.com/ai/responsible-ai) and code of conduct, which outline how [enterprise customers](/legal/ai-code-of-conduct) and [individuals](https://www.microsoft.com/servicesagreement#3_codeOfConduct) can engage with AI responsibly. diff --git a/docs/new-windows-ml/api-reference.md b/docs/new-windows-ml/api-reference.md index b4c827b8..94740e19 100644 --- a/docs/new-windows-ml/api-reference.md +++ b/docs/new-windows-ml/api-reference.md @@ -7,7 +7,7 @@ ms.topic: article # Windows ML APIs -For conceptual guidance, see [Run ONNX models with Windows ML)](./run-onnx-models.md). +For conceptual guidance, see [Run ONNX models with Windows ML](./run-onnx-models.md). 
You can think of the APIs in the *Microsoft.WindowsAppSDK.ML* NuGet package as being the superset of these two sets: diff --git a/docs/npu-devices/index.md b/docs/npu-devices/index.md index defa1b74..45bbced4 100644 --- a/docs/npu-devices/index.md +++ b/docs/npu-devices/index.md @@ -32,7 +32,7 @@ This guidance is specific to [Copilot+ PCs](https://www.microsoft.com/windows/co Many of the new Windows AI features require an NPU with the ability to run at 40+ TOPS, including but not limited to: - Microsoft Surface Laptop Copilot+ PC -- Microsoft Surface Pro Copilot + PC +- Microsoft Surface Pro Copilot+ PC - HP OmniBook X 14 - Dell Latitude 7455, XPS 13, and Inspiron 14 - Acer Swift 14 AI diff --git a/docs/samples/index.md b/docs/samples/index.md index 8a1dc4ab..4ad57502 100644 --- a/docs/samples/index.md +++ b/docs/samples/index.md @@ -52,7 +52,7 @@ These samples show how to enhance your Windows apps with AI using local APIs and **Description**: This AI-powered note taking application demonstrates the use of APIs including [OCR Text Recognition](../apis/text-recognition.md), Audio Transcription through local ML model, Semantic Search through a local embeddings model, local language model usage with Phi3 for summarization, autocomplete, and text reasoning, and Retrieval Augmented Generation (RAG) for grounding language models to real data. 
-**Features**: Semantic search with local model, Audio transcription with local model, Local Retreval Augmented generation (RAG) with [Phi3](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx), Local Text summarization and reasoning with Phi3, Text extraction from images with [OCR API](../apis/text-recognition.md) +**Features**: Semantic search with local model, Audio transcription with local model, Local Retrieval Augmented Generation (RAG) with [Phi3](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx), Local Text summarization and reasoning with Phi3, Text extraction from images with [OCR API](../apis/text-recognition.md) **App Type**: [C#](/dotnet/csharp/), [WinUI 3](/windows/apps/winui/winui3/)