Merged
Changes from 3 commits
5 changes: 5 additions & 0 deletions cspell.json
@@ -8,6 +8,7 @@
"appname",
"azurecontainerapps",
"Benitez",
"BYOM",
"candidateid",
"candidatenumber",
"chatbots",
@@ -21,6 +22,7 @@
"devtunnel",
"devtunnels",
"docusign",
"DTMF",
"Dunnam",
"duplicative",
"Durow",
@@ -30,6 +32,7 @@
"Frontmatter",
"Helpdesk",
"hiringagent",
"HITL",
"inlinehilite",
"Jammes",
"jobdescription",
@@ -44,6 +47,7 @@
"mkdocs",
"MSRC",
"multimodal",
"Noda",
"Omnichannel",
"onboardsupport",
"opensource",
@@ -61,6 +65,7 @@
"resumenumber",
"RXXXXX",
"Signup",
"SSML",
"Streamable",
"subprocessor",
"subprocessors",
33 changes: 33 additions & 0 deletions docs/commander-preview/01-advanced-mcp/index.md
@@ -0,0 +1,33 @@
---
prev:
text: 'Commander Preview Overview'
link: '/commander-preview'
next:
text: 'Computer Using Agents (CUA)'
link: '/commander-preview/02-computer-using-agents'
---

# 🔌 Mission 01: Advanced MCP Server Usage

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Go beyond basic MCP concepts and connect your Copilot Studio agent to MCP servers using the MCP onboarding wizard or a custom connector in Power Apps. Configure authentication (none, API key, or OAuth 2.0 with dynamic discovery, dynamic, or manual flows). Add MCP server tools and resources to your agent and selectively enable or disable individual tools. Explore the built-in MCP servers catalog (including Dataverse, Dynamics 365, Fabric, and the Microsoft 365 MCP servers) and troubleshoot common integration issues.
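If you go the custom-connector route, the connector is described by an OpenAPI specification whose operation is marked as an MCP endpoint. A minimal sketch, assuming the streamable HTTP transport (the server name, host, and path are placeholders, not a real server):

```yaml
swagger: '2.0'
info:
  title: Weather MCP Server        # hypothetical server name
  version: 1.0.0
host: contoso-mcp.example.com      # placeholder host; substitute your server
basePath: /
schemes: [https]
paths:
  /mcp:
    post:
      summary: Weather MCP endpoint
      x-ms-agentic-protocol: mcp-streamable-1.0   # flags the operation as a streamable MCP endpoint
      operationId: InvokeMCP
      responses:
        '200':
          description: Success
```

The `x-ms-agentic-protocol` extension is what tells Copilot Studio to treat this operation as an MCP server rather than an ordinary connector action.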

## 🔎 Objectives

In this mission, you'll learn:

1. How to connect your agent to an existing MCP server using the MCP onboarding wizard or a custom connector in Power Apps
1. How to configure MCP authentication (none, API key, or OAuth 2.0 with dynamic discovery, dynamic, or manual flows)
1. How to add MCP server tools and resources to your agent and selectively enable or disable individual tools
1. How to use the built-in MCP servers catalog (Dataverse, Dynamics 365, and Fabric)
1. How to troubleshoot common MCP integration issues and known limitations

## 📖 References

- [Create a new MCP server](https://learn.microsoft.com/en-us/microsoft-copilot-studio/mcp-create-new-server)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
33 changes: 33 additions & 0 deletions docs/commander-preview/02-computer-using-agents/index.md
@@ -0,0 +1,33 @@
---
prev:
text: 'Advanced MCP Server Usage'
link: '/commander-preview/01-advanced-mcp'
next:
text: 'Code Interpreter'
link: '/commander-preview/03-code-interpreter'
---

# 🖥️ Mission 02: Computer Using Agents (CUA)

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Use Computer Use to let agents automate apps and websites with a virtual mouse and keyboard. Learn when to use CUA versus RPA, how to prompt effectively, how to store credentials securely in Azure Key Vault, how to let CUA request human assistance (HITL), how to provide inputs, and which host types are available.

## 🔎 Objectives

In this mission, you'll learn:

1. How to enable and configure Computer Use for your agents
1. When to choose CUA over traditional RPA approaches
1. How to prompt computer-using agents effectively
1. How to store credentials securely with Azure Key Vault
1. How to implement human-in-the-loop assistance, provide inputs, and select appropriate host types

## 📖 References

- [Computer Use](https://learn.microsoft.com/en-us/microsoft-copilot-studio/computer-use)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
33 changes: 33 additions & 0 deletions docs/commander-preview/03-code-interpreter/index.md
@@ -0,0 +1,33 @@
---
prev:
text: 'Computer Using Agents (CUA)'
link: '/commander-preview/02-computer-using-agents'
next:
text: 'Human in the Loop (HITL)'
link: '/commander-preview/04-human-in-the-loop'
---

# 🐍 Mission 03: Code Interpreter

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Enable Code Interpreter to generate and execute Python code in a sandbox for data preparation, statistical analysis, table joins, forecasting, charting, and file generation from structured data (CSV, Excel). Leverage deterministic, reproducible computation instead of best-effort LLM reasoning. Ingest data through user-uploaded files in chat or SharePoint document library knowledge sources. Combine Code Interpreter with image generation and custom skills to extend agent capabilities, and query and update Dataverse with code.
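To see why deterministic execution matters, consider the kind of code the interpreter generates for a structured-data request. A stdlib-only sketch (the column names and the naive forecast are illustrative, not what the service actually emits): the same input always produces the same numbers, unlike best-effort LLM arithmetic.

```python
import csv
import io
from statistics import mean

def summarize_sales(csv_text: str) -> dict:
    """Deterministically aggregate a CSV of monthly sales."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    totals = [float(r["sales"]) for r in rows]
    return {
        "months": len(rows),
        "total": sum(totals),
        "average": mean(totals),
        # naive next-month forecast: mean of the observed values
        "forecast": mean(totals),
    }

data = "month,sales\nJan,100\nFeb,120\nMar,110\n"
print(summarize_sales(data)["total"])  # → 330.0
```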

## 🔎 Objectives

In this mission, you'll learn:

1. How to enable Code Interpreter at the prompt and agent level
1. How to analyze structured data (CSV, Excel) for data preparation, statistical analysis, forecasting, and chart generation
1. The difference between deterministic (prompt-level) and non-deterministic (agent-level) execution
1. How to ingest structured data via user-uploaded files or SharePoint knowledge sources
1. How to extend capabilities with image generation, custom skills, and Dataverse code queries

## 📖 References

- [Code interpreter](https://learn.microsoft.com/en-us/microsoft-365-copilot/extensibility/code-interpreter)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
35 changes: 35 additions & 0 deletions docs/commander-preview/04-human-in-the-loop/index.md
@@ -0,0 +1,35 @@
---
prev:
text: 'Code Interpreter'
link: '/commander-preview/03-code-interpreter'
next:
text: 'Voice-enabled Agents'
link: '/commander-preview/05-voice-enabled-agents'
---

# 🤝 Mission 04: Human in the Loop (HITL)

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Build multistage approval workflows in agent flows that combine AI-driven decisions with manual human review stages. Configure the Human in the Loop connector with manual and AI approval stages, conditions for dynamic routing, and the Request for Information (RFI) action to pause flows and collect structured input from reviewers. Map flow variables to approval inputs, write effective AI stage instructions, and review decisions in the approvals center.

## 🔎 Objectives

In this mission, you'll learn:

1. How to add and configure **multistage approvals** using the Human in the Loop connector in agent flows
1. How to set up **manual and AI approval stages** with approval types, assignees, input mappings, and automated document evaluation
1. How to add **conditions** to dynamically route, approve, reject, or skip stages based on business rules
1. How to use the **Request for Information** action to pause flows, collect structured input from reviewers, and map flow variables to approval inputs
1. How to write **effective AI stage instructions** and review AI decisions and rationale in the approvals center
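The conditional routing described above can be pictured as a function from request attributes to the stages the request must pass. In agent flows you express this with connector conditions rather than code; the thresholds and stage names below are invented for illustration only.

```python
def route_stages(amount: float, risk_flag: bool) -> list:
    """Return the approval stages a request passes through.

    Thresholds and stage names are hypothetical.
    """
    stages = ["ai_review"]            # an AI stage screens every request first
    if risk_flag or amount >= 10_000:
        stages.append("manager")      # manual review for risky or large requests
    if amount >= 100_000:
        stages.append("finance")      # extra manual stage for very large requests
    return stages

print(route_stages(250_000, False))  # → ['ai_review', 'manager', 'finance']
```

Small, clean requests stay fully automated, while the conditions route edge cases to human reviewers — the core HITL pattern.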

## 📖 References

- [Agent flows overview](https://learn.microsoft.com/en-us/microsoft-copilot-studio/flows-overview)
- [Multi-stage AI Approvals](https://learn.microsoft.com/en-us/microsoft-copilot-studio/flows-advanced-approvals)
- [Request more information](https://learn.microsoft.com/en-us/microsoft-copilot-studio/flows-request-for-information)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
35 changes: 35 additions & 0 deletions docs/commander-preview/05-voice-enabled-agents/index.md
@@ -0,0 +1,35 @@
---
prev:
text: 'Human in the Loop (HITL)'
link: '/commander-preview/04-human-in-the-loop'
next:
text: 'Localization'
link: '/commander-preview/06-localization'
---

# 🎙️ Mission 05: Voice-enabled Agents

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Build agents with interactive voice response (IVR) capabilities using the Voice template. Handle phone calls with speech and dual-tone multi-frequency (DTMF) input, which lets callers use their phone keypad to select options or provide information. Configure barge-in so users can interrupt, set up silence detection and timeouts, customize your agent's voice with Speech Synthesis Markup Language (SSML), and manage call flow with call transfer and termination. Use Azure Communication Services to provision phone numbers for your agent.
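At its core, DTMF input is a lookup from keypad digits to agent behavior. A minimal sketch (the topic names are hypothetical; in Copilot Studio you wire digits to topics in the authoring canvas rather than in code):

```python
# Hypothetical mapping of keypad digits to agent topics
MENU = {
    "1": "check_order_status",
    "2": "speak_to_agent",
    "0": "repeat_menu",
}

def handle_dtmf(digit: str) -> str:
    """Route a DTMF keypress to a topic, reprompting on anything else."""
    return MENU.get(digit, "reprompt")

print(handle_dtmf("2"))  # → speak_to_agent
```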

## 🔎 Objectives

In this mission, you'll learn:

1. How to create a voice-enabled agent using the Voice template
1. How to collect user input via speech and dual-tone multi-frequency (DTMF)
1. How to configure barge-in, silence detection, and timeouts, and customize the agent's voice with SSML
1. How to manage call flow with latency messages, call transfer, and termination
1. How to provision phone numbers with Azure Communication Services
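As a taste of SSML voice customization, a fragment like the following adjusts speaking rate and pitch for a greeting (the `en-US-JennyNeural` voice name is an assumption; use a voice available in your region):

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xml:lang="en-US">
  <voice name="en-US-JennyNeural">
    <prosody rate="-10%" pitch="+2%">
      Thanks for calling. How can I help you today?
    </prosody>
  </voice>
</speak>
```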

## 📖 References

- [Voice overview](https://learn.microsoft.com/en-us/microsoft-copilot-studio/voice-overview)
- [Voice template](https://learn.microsoft.com/en-us/microsoft-copilot-studio/voice-build-from-template)
- [Configure voice capabilities](https://learn.microsoft.com/en-us/microsoft-copilot-studio/voice-configuration)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
38 changes: 38 additions & 0 deletions docs/commander-preview/06-localization/index.md
@@ -0,0 +1,38 @@
---
prev:
text: 'Voice-enabled Agents'
link: '/commander-preview/05-voice-enabled-agents'
next:
text: 'Extend with Azure AI & BYOM'
link: '/commander-preview/07-azure-ai-byom'
---

# 🌍 Mission 06: Localization

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Configure multilingual agents that communicate in different languages and handle regional date and time formats correctly. Set primary and secondary languages, manage localization with JSON and ResX translation files, and enable automatic or dynamic language switching using generative orchestration. Localize Adaptive Cards with dynamic content, accommodate time zones using system variables and Power Fx, and test multilingual experiences in the test panel.

## 🔎 Objectives

In this mission, you'll learn:

1. How to configure primary and secondary languages with region settings
1. How to manage localization using JSON and ResX translation files
1. How to set up automatic language detection and dynamic language switching with generative orchestration
1. How to localize Adaptive Cards and test multilingual experiences in the test panel
1. How to accommodate time zones using `Conversation.LocalTimeZone`, Noda Time zone IDs, and Power Fx date conversion
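The time-zone objective above maps onto IANA zone IDs, the same identifiers Noda Time uses. A Python sketch of the conversion `Conversation.LocalTimeZone` enables in Power Fx (the sample timestamp is arbitrary):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_local(utc_dt: datetime, tz_id: str) -> datetime:
    """Convert a UTC timestamp to the user's time zone.

    tz_id is an IANA/Noda Time zone ID, the kind of value
    Conversation.LocalTimeZone holds, e.g. "America/Los_Angeles".
    """
    return utc_dt.astimezone(ZoneInfo(tz_id))

utc = datetime(2026, 1, 15, 17, 0, tzinfo=timezone.utc)
print(to_local(utc, "America/Los_Angeles").hour)  # → 9 (UTC-8 in January)
```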

## 📖 References

- [Configure and create multilingual agents](https://learn.microsoft.com/en-us/microsoft-copilot-studio/multilingual)
- [Language support](https://learn.microsoft.com/en-us/microsoft-copilot-studio/authoring-language-support)
- [Accommodate time zones](https://learn.microsoft.com/en-us/microsoft-copilot-studio/manage-date-and-time)
- [Regional settings and supported locales](https://learn.microsoft.com/en-us/microsoft-copilot-studio/data-localization)
- [Design effective language understanding - Localization and languages](https://learn.microsoft.com/en-us/microsoft-copilot-studio/guidance/language-understanding#localization-and-languages)
- [Auto-detect language sample solution](https://github.com/microsoft/CopilotStudioSamples/tree/main/AutoDetectLanguageSample)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
34 changes: 34 additions & 0 deletions docs/commander-preview/07-azure-ai-byom/index.md
@@ -0,0 +1,34 @@
---
prev:
text: 'Localization'
link: '/commander-preview/06-localization'
next:
text: 'Knowledge Deep Dive'
link: '/commander-preview/08-knowledge-deep-dive'
---

# 🧠 Mission 07: Extend with Azure AI & BYOM

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Use Azure AI Search for enterprise data retrieval, Azure AI Foundry for bring-your-own-data copilots, and BYOM to run custom or fine-tuned models. Add prompts as reusable tools inside topics to extend agent capabilities.

## 🔎 Objectives

In this mission, you'll learn:

1. How to integrate Azure AI Search for enterprise knowledge retrieval
1. How to use Azure AI Foundry for bring-your-own-data scenarios
1. How to bring your own models (BYOM) with custom or fine-tuned models
1. How to add prompts as reusable tools inside topics

## 📖 References

- [Azure AI Search](https://learn.microsoft.com/en-us/microsoft-copilot-studio/knowledge-azure-ai-search)
- [Azure AI Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/concepts/use-your-data)
- [BYOM](https://learn.microsoft.com/en-us/ai-builder/byom-for-your-prompts)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
34 changes: 34 additions & 0 deletions docs/commander-preview/08-knowledge-deep-dive/index.md
@@ -0,0 +1,34 @@
---
prev:
text: 'Extend with Azure AI & BYOM'
link: '/commander-preview/07-azure-ai-byom'
next:
text: 'Handoff to Human Agents'
link: '/commander-preview/09-handoff-to-human'
---

# 📚 Mission 08: Knowledge Deep Dive

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Optimize websites for agent ingestion, and connect SharePoint and OneDrive content while understanding their limitations and how your agent reasons over the information. Use Bing Search for dynamic external knowledge retrieval. Understand how security and identity govern how information is accessed and used.

## 🔎 Objectives

In this mission, you'll learn:

1. How to optimize public websites for agent knowledge ingestion
1. How to connect SharePoint and OneDrive content as knowledge sources
1. How agents reason over connected knowledge and current limitations
1. How to use Bing Search for dynamic external knowledge retrieval
1. How security and identity govern knowledge access and usage

## 📖 References

- [Knowledge sources](https://learn.microsoft.com/en-us/microsoft-copilot-studio/knowledge-copilot-studio)
- [Guidance for public websites](https://learn.microsoft.com/en-us/microsoft-copilot-studio/guidance/generative-ai-public-websites)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
36 changes: 36 additions & 0 deletions docs/commander-preview/09-handoff-to-human/index.md
@@ -0,0 +1,36 @@
---
prev:
text: 'Knowledge Deep Dive'
link: '/commander-preview/08-knowledge-deep-dive'
next:
text: 'Governance with Auth, Channels & DLP'
link: '/commander-preview/10-governance-auth-dlp'
---

# 🔄 Mission 09: Handoff to Human Agents

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Configure your agent to hand off conversations to live human agents seamlessly and contextually. Understand how the Escalate system topic and Transfer conversation node work, and how handoff can be triggered implicitly (when a user asks for a human) or explicitly (within a topic flow). Learn how context variables such as conversation history, topics, and user-defined variables are passed to the engagement hub so the live agent can resume without the customer repeating themselves. Explore supported engagement hubs including Dynamics 365 Customer Service, ServiceNow, Salesforce, LivePerson, and generic hubs via custom adapters.
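Conceptually, the handoff hands the engagement hub a bundle of context so the live agent can pick up mid-conversation. A sketch of what such a payload might contain (the field and variable names are illustrative, not the documented handoff schema):

```python
import json

def build_handoff_context(history, variables):
    """Bundle conversation context for the engagement hub.

    The payload shape here is a hypothetical illustration.
    """
    return {"conversationHistory": history, "variables": variables}

payload = build_handoff_context(
    [{"role": "user", "text": "I need to talk to a human"}],
    {"va_Topic": "Escalate", "va_CustomerName": "Ada"},
)
print(json.dumps(payload, indent=2))
```

Whatever the hub, the goal is the same: the customer never has to repeat what they already told the agent.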

## 🔎 Objectives

In this mission, you'll learn:

1. How to configure the Escalate system topic and add a Transfer conversation node
1. The difference between implicit triggers (user requests a human) and explicit triggers (topic-driven escalation)
1. How context variables (conversation history, topics, phrases, private messages) are passed to the engagement hub
1. How to connect to supported engagement hubs: Dynamics 365 Customer Service, ServiceNow, Salesforce, LivePerson, and generic hubs
1. How to customize agent behavior for escalation scenarios (greeting, escalate link, no-match messages)

## 📖 References

- [Customer engagement overview](https://learn.microsoft.com/en-us/microsoft-copilot-studio/customer-copilot-overview)
- [Handoff pattern](https://learn.microsoft.com/en-us/microsoft-copilot-studio/advanced-hand-off)
- [Generic handoff configuration](https://learn.microsoft.com/en-us/microsoft-copilot-studio/configure-generic-handoff)
- [Configure handoff to Dynamics 365 Customer Service](https://learn.microsoft.com/en-us/microsoft-copilot-studio/configuration-hand-off-omnichannel)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.
35 changes: 35 additions & 0 deletions docs/commander-preview/10-governance-auth-dlp/index.md
@@ -0,0 +1,35 @@
---
prev:
text: 'Handoff to Human Agents'
link: '/commander-preview/09-handoff-to-human'
next:
text: 'Test, Monitor & Debug'
link: '/commander-preview/11-test-monitor-debug'
---

# 🔐 Mission 10: Governance Essentials with Authentication, Channels & DLP

## 🕵️‍♂️ **CLASSIFIED**

## 🎯 Mission Brief

Secure agent access using single sign-on (SSO) with Entra ID and manual authentication for APIs. Compare the Streaming API with Direct Line for embedding agents in custom apps. Prevent anonymous access using DLP policies.

## 🔎 Objectives

In this mission, you'll learn:

1. How to configure Single Sign-On (SSO) with Entra ID
1. How to set up manual authentication for API access
1. How to choose between the Streaming API and Direct Line for custom app embedding
1. How to prevent anonymous access using DLP policies
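For the Direct Line path, the standard pattern is to exchange the channel secret for a short-lived token on your server so the secret never reaches the browser. A stdlib sketch of building that request against the classic Bot Framework endpoint (your channel configuration may expose its own token endpoint instead):

```python
import urllib.request

TOKEN_URL = "https://directline.botframework.com/v3/directline/tokens/generate"

def token_request(secret: str) -> urllib.request.Request:
    """Build the POST that trades the Direct Line secret for a token.

    Run this server-side; the client then connects with the returned
    token rather than the long-lived secret.
    """
    return urllib.request.Request(
        TOKEN_URL,
        method="POST",
        headers={"Authorization": f"Bearer {secret}"},
    )

req = token_request("YOUR_DIRECT_LINE_SECRET")
print(req.get_method(), req.full_url)
```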

## 📖 References

- [Single Sign-On](https://learn.microsoft.com/en-us/microsoft-copilot-studio/configure-sso)
- [Manual authentication](https://learn.microsoft.com/en-us/microsoft-copilot-studio/configuration-end-user-authentication)
- [Data loss prevention policies](https://learn.microsoft.com/en-us/microsoft-copilot-studio/admin-data-loss-prevention)
- [Connect agent to Teams and Microsoft 365](https://learn.microsoft.com/en-us/microsoft-copilot-studio/publication-add-bot-to-microsoft-teams)

> [!NOTE]
> 🚧 This mission is under construction. Check back soon for the full walkthrough.