The Microsoft 365 Commercial Support Team resolves customer support cases and provides you assistance to be successful and realize the full potential and value of your purchase. Our support services extend across the entire lifecycle and include pre-sales, onboarding and deployment, usage and management, accounts and billing, and break-fix support. We also spend a considerable amount of time working to improve the supportability of Microsoft 365 services to reduce the number of issues you experience as well as minimize the effort and time it takes to resolve your issues if they do occur.
Today, we’re excited to share more about some of our supportability work with Microsoft 365 Copilot.
Copilot for Microsoft 365 combines the power of large language models (LLMs) with your organization’s data – all in the flow of work – to turn your words into one of the most powerful productivity tools on the planet. It works alongside Microsoft 365 Apps such as Word, Excel, PowerPoint, Outlook, Teams, and more. Copilot provides real-time intelligent assistance, enabling users to enhance their creativity, productivity, and skills. Copilot for Microsoft 365 has been in use by tens of thousands of enterprise customers as part of our Early Access Program (EAP) for several months, and our support teams have been there every step of the way assisting IT Admins with their questions. On November 1, Copilot became generally available for enterprise customers worldwide.
A small group of senior engineers in our Microsoft 365 Commercial Support Team have been working closely with the Copilot product engineering teams since the early stages of development, well before the first customer ever onboarded, to learn the service and build readiness and training assets for our global support teams. Today, I’m excited to welcome three of these engineers to share their knowledge, insights, and guidance on getting ready for Copilot for Microsoft 365 and ways IT Admins can be most effective in deploying, managing, and supporting Copilot with users in their organizations.
Brian: Welcome Jason, Rob, and Parth! It’s great to have your Copilot expertise here, and I know you’ve been working with Copilot for the Office apps (Word, Excel, PowerPoint), Outlook, and Microsoft Teams for quite a while. What’s it been like getting the support team ready for the Copilot for Microsoft 365 launch? How has it been different from other product releases that you’ve worked on over the years?
Jason: The development cycle was rapid and compressed, especially from when we first engaged to the launch of the Early Access Program in July. It required a lot of teamwork and coordination to gather and digest information from various sources and product technologies and then turn that into consumable content for support engineers. The synergy and the way teams have come together within support and across product engineering and marketing has been super impressive. I’m sure it will contribute to a positive experience for customers.
Parth: Just to add to Jason’s comments, Copilot was definitely a unique experience given how quickly it went from concept to getting into the hands of our valuable Early Access Program customers. Keeping up with feature progress, and specifically which platforms each feature would land on, meant paying really close attention to engineering team progress daily. Our planning program managers were super awesome in coordinating with the engineering teams to get us access to new Copilot features and information so we could use and test the service and develop troubleshooting documentation.
Brian: What are some things that customers should know about the Copilot service and how it operates, including how data, security, and privacy are handled?
Rob: Copilot for Microsoft 365 respects the permissions model and only shows you data that you have permission to view. It’s important to use the permission models in Microsoft 365 to make sure the right people have access to the right content. Copilot only searches for information within your organization, and it does not search in other organizations that you might have access to. When you use Copilot, your prompts, the data retrieved, and the results stay within the Microsoft 365 service boundary, following our privacy, security, and compliance commitments. Copilot uses Azure OpenAI Services, not OpenAI’s public services, so all of the processing stays within the Microsoft 365 service boundary. Here’s a great illustration of the Copilot architecture, showing how this works across the different components:
Illustration courtesy of Microsoft 365 Copilot Overview
Brian: What are some of the key considerations for IT Admins to keep in mind as they prepare for onboarding and deployment?
Jason: First, get really familiar with the various public Microsoft resources available, at minimum the Learn article that describes requirements: Microsoft 365 Copilot requirements | Microsoft Learn. We have a Tech Community blog that steps through how to prepare your organization for Microsoft 365 Copilot: How to prepare for Microsoft 365 Copilot – Microsoft Community Hub. In addition, there is a Microsoft Learn training path specific to Copilot, so we’d recommend that as well: MS-012 Prepare your organization for Microsoft 365 Copilot – Training | Microsoft Learn.
And then keep in mind it’s a “Copilot,” not an “Autopilot,” particularly when it comes to new content creation. Copilot is highly proficient at generating content, but newly generated content always needs to be factually verified.
Rob: Prompts are the commands or questions that you type or speak to Copilot to request assistance. If you are not sure what to ask Copilot, try choosing from our selection of prompts to create, edit, and get more done over at our Copilot Lab on https://aka.ms/CopilotLab.
Brian: Talk a little about indexing with Semantic Index and what to expect at both the user and tenant level following Copilot purchase and license assignment.
Jason: The Semantic Index is an indexing map of the data within your tenant, and indexing is triggered automatically once Copilot has been purchased. It provides semantically related results in Search and Copilot based on the indexed content and relationships, which is important for providing context and keeping the results “grounded” in relevant data. Note that you can prevent specific SharePoint sites from appearing in search and omit them from the index.
Semantic Index does not require any administrative configuration. The indexing process is automated and adheres to the same trust standards as the rest of the Microsoft 365 suite. At the tenant level, the indexing process reaches about 95% coverage of OneDrive and SharePoint Online text-based documents within a week, with full coverage taking up to 28 days (about 4 weeks).
Semantic indexing respects any site and library indexing restrictions as well as the People and Item insights settings found in the Search & Intelligence admin portal. User-level indexing includes user mailbox content, which is typically indexed within 2 days, with outliers sometimes taking up to 2 weeks.
After initial indexing is complete, updates happen automatically in near real time, though OneDrive and SharePoint document content can sometimes take up to 48 hours (about 2 days) to be added to the index.
Check out this video to learn more – Semantic Index for Copilot: Explained by Microsoft.
Brian: Can you share some tips and guidance for some of the most common questions you’re seeing from customers right now?
All: Make sure you fully understand the Copilot for Microsoft 365 requirements. For the Microsoft 365 desktop apps, make sure they are on the Current Channel and running the latest build. This will also apply later when Copilot arrives in the Monthly Enterprise Channel (MEC).
Parth: Right now, it can take up to 72 hours for a Copilot license assignment to take effect. We expect that time to decrease in the near future as Copilot continues to evolve.
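Admins typically assign Copilot licenses through the Microsoft 365 admin center, but assignment can also be scripted against the Microsoft Graph `assignLicense` endpoint. Here is a minimal sketch of what such a call looks like; the user principal name and SKU ID below are placeholders, and the authenticated POST itself (which requires an OAuth bearer token) is deliberately left out:

```python
import json

# Microsoft Graph v1.0 endpoint for license assignment.
GRAPH_ASSIGN_LICENSE = "https://graph.microsoft.com/v1.0/users/{user_id}/assignLicense"

def build_assign_license_request(user_id: str, sku_id: str) -> tuple[str, dict]:
    """Build the URL and JSON body for a Microsoft Graph assignLicense call.

    The actual POST (with an Authorization: Bearer header) is omitted so
    the payload shape itself stays easy to inspect and test.
    """
    url = GRAPH_ASSIGN_LICENSE.format(user_id=user_id)
    body = {
        "addLicenses": [{"skuId": sku_id, "disabledPlans": []}],
        "removeLicenses": [],
    }
    return url, body

# Placeholder user and SKU ID -- substitute your tenant's actual values.
url, body = build_assign_license_request(
    "user@contoso.com", "00000000-0000-0000-0000-000000000000"
)
print(url)
print(json.dumps(body, indent=2))
```

Even after a successful call, keep the 72-hour propagation window above in mind before concluding that a license has failed to apply.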
In terms of language support, here are the languages currently supported by Copilot for Microsoft 365 for prompts and responses: Chinese (Simplified), English, French, German, Italian, Japanese, Portuguese (Brazil), Spanish. Note that Copilot in Excel is currently supported in English only.
Here are a few other suggestions for getting the most out of Microsoft 365 Copilot interactions:
Things to do:
- **Be clear and specific.** Provide specific instructions to Copilot, such as the topic, purpose, tone, and required length.
- **Check for accuracy.** Occasionally, Copilot may make mistakes. Always check Copilot’s responses for accuracy, grammar, and style, and watch out for irrelevant or inappropriate content.
- **Keep it conversational.** Give feedback to Copilot based on the quality of its responses to help the AI learn and match your preferences. Provide Copilot with contextual details to help it generate more accurate, consistent responses (for example, the genre, characters, and plot of a story). Use clear and specific keywords or phrases when asking Copilot to write a piece of text for you; this helps it generate more relevant and creative copy. Using kind and respectful language when chatting with Copilot also helps foster collaboration and improves the AI’s responsiveness and performance.
- **Ask for feedback.** Requesting feedback from Copilot helps it understand your needs and preferences and provide more relevant, helpful responses. Use correct punctuation, capitalization, and grammar when writing prompts, as this helps the AI produce better quality text. Avoid vague language and be as clear as possible to receive better quality responses.

Things to avoid:
- **Giving conflicting instructions.** Prompting Copilot to perform a task that includes multiple or conflicting pieces of information in the same request can confuse the AI and result in lower quality responses.
- **Requesting inappropriate or unethical content.** Copilot is not responsible for the content or the consequences of your writing. You should respect local laws, rules, and the rights of others.
- **Interrupting or changing topics abruptly.** This can disrupt Copilot’s writing process. Always close or finish a task before starting a new one; when starting a new task, write “new task.”
- **Using slang, jargon, or informal language.** This may cause Copilot to give low quality, inappropriate, or unprofessional responses.
**Copilot is not showing up, or shows up only in some places.** When Copilot is not appearing as expected and all requirements have been met, consider the following:
- **Some SharePoint content is being excluded.** Content from any site or library that has been excluded from search indexing will not be included in Copilot results.
- **Search & Intelligence portal controls affect Copilot behavior.** Disabling People or Item Insights will prevent “people data” from document collaboration and recommended documents from being included in Copilot results. Disabling these settings also has effects beyond Copilot, such as in the microsoft365.com Feed and My Content areas.
Brian: And finally, what are some of your favorite resources and sites that admins should know about and keep close at hand as they work with Copilot?
All: Here are a few of our favorites:
- Copilot requirements: Microsoft 365 Copilot requirements
- Features currently available in Copilot: Microsoft 365 Copilot – Service Description
- Copilot learning site: Microsoft 365 Copilot documentation
- Microsoft Mechanics YouTube playlist on Microsoft 365 Copilot: Microsoft 365 Copilot Mechanics – YouTube
- Prompt engineering: Copilot Lab (cloud.microsoft) and Get tips on Copilot prompts (aka.ms)
- Responsible AI: Empowering responsible AI practices | Microsoft AI
- Data, Privacy, and Security for Copilot: Data, Privacy, and Security for Microsoft 365 Copilot (aka.ms)
- Microsoft Teams & Copilot: Frequently asked questions about Copilot in Microsoft Teams – Microsoft Support
- Microsoft 365 Blog: Copilot articles
Brian: Thank you, Jason, Rob, and Parth, for sharing all these great insights and information!
Jason Haak is a Senior Support Escalation Engineer in the CSS Modern Work Supportability Team focused on Office with Microsoft 365.
Parth Sharma is a Support Escalation Engineer in the CSS Modern Work Supportability Team focused on Microsoft Teams with Microsoft 365.
Rob Whaley is a Senior Support Escalation Engineer in the CSS Modern Work Supportability Team focused on Exchange and Outlook with Microsoft 365.
Brian Stoner is a Director in the CSS Modern Work Supportability Team where he leads a team of technical and business program managers.