Copilot for Microsoft 365 – Office 365 for IT Pros
https://office365itpros.com – Mastering Office 365 and Microsoft 365

Copilot Usage Report APIs Available (13 September 2024)
https://office365itpros.com/2024/09/13/copilot-usage-report-api/

Copilot Usage Reports Weak on Detail

Announced in message center notification MC877369 (29 August 2024, Microsoft 365 roadmap item 396562), the Microsoft Graph beta usage reports API now includes support for Copilot for Microsoft 365 tenant usage data. All tenants that use Copilot for Microsoft 365 should now have access to the usage data.

Microsoft says that the availability of this information will “facilitate the creation of customized reporting and analytics,” but the fact is that the data exposed by the API is bare-bones. On the upside, the data matches what’s available in the report section of the Microsoft 365 admin center (Figure 1).

  • Tenant-level summary of Copilot-enabled (licensed) users and active users.
  • Adoption trend (tenant summary) over time.
  • Last activity date for Copilot interaction in different apps for each user.
Figure 1: Copilot usage reports in the Microsoft 365 admin center

Accounts accessing the Graph data must have a Copilot for Microsoft 365 license.

User Count Summary

The user count summary report returns a count of the user accounts licensed for Copilot for Microsoft 365 (enabled users) and a count of the users with an active interaction with Copilot in each app during the reporting period (7, 30, 90, or 180 days). Unsurprisingly, when someone is enabled for Copilot in one app, they’re usually enabled for all:

  • Teams
  • Outlook (classic, new Outlook for Windows, OWA)
  • Word
  • Excel
  • PowerPoint
  • Copilot Graph-grounded chat (aka Copilot Chat)
  • OneNote
  • Loop
$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUserCountSummary(period='D90')"
$Data = Invoke-GraphRequest -Uri $Uri -Method Get
$Data.value.adoptionByProduct

Name                           Value
----                           -----
loopEnabledUsers               100
reportPeriod                   90
oneNoteActiveUsers             3
wordEnabledUsers               100
powerPointEnabledUsers         100
microsoftTeamsActiveUsers      97
oneNoteEnabledUsers            100
excelActiveUsers               43
loopActiveUsers                2
copilotChatEnabledUsers        100
outlookEnabledUsers            100
anyAppEnabledUsers             100
anyAppActiveUsers              97
microsoftTeamsEnabledUsers     100
excelEnabledUsers              100
wordActiveUsers                61
powerPointActiveUsers          12
copilotChatActiveUsers         73
outlookActiveUsers             18
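A simple way to make the summary data more digestible is to compute an active percentage for each app. The following sketch uses sample values taken from the adoptionByProduct output shown above; in practice you would read the values from the hashtable returned by the Graph request.

```powershell
# Sample values copied from the adoptionByProduct output above (illustrative only)
$Adoption = @{
    wordEnabledUsers    = 100; wordActiveUsers    = 61
    excelEnabledUsers   = 100; excelActiveUsers   = 43
    outlookEnabledUsers = 100; outlookActiveUsers = 18
}
# Calculate the percentage of enabled users who were active in each app
foreach ($App in 'word','excel','outlook') {
    $Enabled = $Adoption["$($App)EnabledUsers"]
    $Active  = $Adoption["$($App)ActiveUsers"]
    $Percent = [math]::Round(($Active / $Enabled) * 100, 1)
    Write-Host ("{0,-10} {1,5}% active" -f $App, $Percent)
}
```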

User Activity Detail

This report is the most interesting because it details the last activity date for Copilot interaction by users with each of the various Copilot-enabled apps. In addition, the last activity date for any Copilot interaction with any of the supported apps is published (lastActivityDate). An array (value) holds a separate usage report for each Copilot-enabled account.

The user principal name and display name are obfuscated if the tenant data privacy control is enabled. In the following extract, we see that the user has never used Copilot for Loop and OneNote and hasn’t used Copilot with PowerPoint since April 11, 2024:

$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUsageUserDetail(period='D90')"
$Data = Invoke-GraphRequest -Uri $Uri -Method Get
$Data.value[0]

Name                           Value
----                           -----
copilotActivityUserDetailsByP… {System.Collections.Hashtable}
microsoftTeamsCopilotLastActi… 2024-09-05
outlookCopilotLastActivityDate 2024-08-29
lastActivityDate               2024-09-05
reportRefreshDate              2024-09-08
excelCopilotLastActivityDate   2024-09-05
loopCopilotLastActivityDate
oneNoteCopilotLastActivityDate
copilotChatLastActivityDate    2024-09-05
powerPointCopilotLastActivity… 2024-04-11
userPrincipalName              Tony.Redmond@office365itpros.com
displayName                    Tony Redmond
wordCopilotLastActivityDate    2024-09-05
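One practical use of this report is finding licensed users who show no Copilot activity at all during the reporting period. This sketch assumes $Data holds the response from the getMicrosoft365CopilotUsageUserDetail call shown above; the property names are those visible in the output.

```powershell
# Assumes $Data holds the response from the user detail report (see above)
# A user with no value for lastActivityDate recorded no Copilot activity in any app
$InactiveUsers = $Data.value | Where-Object { -not $_.lastActivityDate }
# List the accounts that might be candidates for license reassignment
$InactiveUsers | ForEach-Object { $_.userPrincipalName }
```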

Adoption Trend over Time

This report returns an array called adoptionByDate with entries for each day during the reporting period (7, 30, 90, or 180 days). The purpose of the report is to track progress in Copilot adoption over time and to note if any specific action had an effect. For instance, you might run an education campaign to teach users how to generate effective results using Copilot in Excel. Over the weeks following the campaign, you’d expect to see the number of users who use Copilot in Excel to grow.

$Uri = "https://graph.microsoft.com/beta/reports/getMicrosoft365CopilotUserCountTrend(period='D90')"
$Data = Invoke-GraphRequest -Uri $Uri -Method Get
$Data.value.adoptionByDate

reportDate                     2024-06-17
excelEnabledUsers              100
wordActiveUsers                51
powerPointActiveUsers          11
copilotChatActiveUsers         66
outlookActiveUsers             15
loopEnabledUsers               100
oneNoteActiveUsers             1
wordEnabledUsers               100
powerPointEnabledUsers         100
microsoftTeamsActiveUsers      86
oneNoteEnabledUsers            1
excelActiveUsers               21
loopActiveUsers                1
copilotChatEnabledUsers        100
outlookEnabledUsers            100
anyAppEnabledUsers             100
anyAppActiveUsers              86
microsoftTeamsEnabledUsers     100
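To track the effect of something like the Excel education campaign mentioned above, you can extract the daily active user count for a single app from the trend data. This sketch assumes $Data holds the response from the getMicrosoft365CopilotUserCountTrend call shown earlier, and that the daily entries live in the adoptionByDate array described above.

```powershell
# Assumes $Data holds the response from the user count trend report (see above)
# Pull out the daily count of active Copilot users in Excel, oldest first,
# to see whether adoption grows over the reporting period
$Data.value.adoptionByDate |
    Sort-Object reportDate |
    Select-Object reportDate, excelActiveUsers
```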

Track Copilot Activity Using Audit Records instead of Copilot Usage Reports

Although it’s nice to have Copilot usage reports included in the Graph API, the information exposed isn’t very informative about how people use Copilot. The data tells you only that someone used Copilot in an app on a given day; or at least, that they clicked a Copilot button. Any enterprise that invests large sums in expensive Copilot for Microsoft 365 licenses will find a dearth of detail here for understanding whether the investment is justified. In many cases, you will be better off analyzing the audit records captured for Copilot interactions to figure out what’s really going on.
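The audit-based approach can be sketched with the Exchange Online management module. Treat this as a sketch rather than a production script: the date range is arbitrary, and the grouping simply shows which operations generated the events.

```powershell
# Retrieve Copilot interaction audit events from the unified audit log
# (requires the Exchange Online management module and sufficient permissions)
Connect-ExchangeOnline
$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-90) -EndDate (Get-Date) `
    -RecordType CopilotInteraction -ResultSize 5000 -Formatted
# Summarize the events by operation to get a feel for the level of activity
$Records | Group-Object Operations | Select-Object Name, Count
```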


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot’s Automatic Summary for Word Documents (5 September 2024)
https://office365itpros.com/2024/09/05/automatic-document-summary-word/

Automatic Document Summary in a Bulleted List

Last week, I referenced the update for Word where Copilot for Microsoft 365 generates an automatic summary for documents. This is covered in message center notification MC871010 (Microsoft 365 roadmap item 399921). Automatic summaries are included in Copilot for Microsoft 365 and Microsoft Copilot Pro (the version that doesn’t ground prompts using Graph data).

As soon as I published the article where I referred to the feature, it turned up in Word. Figure 1 shows the automatic summary generated for a document (in this case, the source of an article).

Figure 1: Copilot generates an automatic document summary

The summary is the same output as the bulleted list Copilot generates if you open the Copilot pane and ask Copilot to summarize the document. Clicking the Ask a question button opens the Copilot pane with the summary prepopulated, ready for the user to delve deeper.

The summary is only available after a document is saved and closed. The next time someone opens the document, the summary pane appears at the top of the document and Copilot generates the summary. The pane remains at the top of the document and doesn’t appear on every page. If Copilot thinks it necessary (for instance, if more text is added to a document), it displays a Check for new summary button to prompt the user to ask Copilot to regenerate the summary.

Apart from removing the Copilot license from an account (in which case the summaries don’t appear), there doesn’t seem to be a way to disable the feature. You can collapse the summary, but it’s still there and can be expanded at any time.

Summarizing Large Word Documents

When Microsoft launched Copilot support for Word, several restrictions existed. For instance, Word couldn’t ground user prompts against internet content. More importantly, summarization could only handle relatively small documents. The guidance was that Word could handle documents with up to 15,000 words but would struggle thereafter.

This sounds like a lot, and it’s probably enough to handle a large percentage of the documents generated in office environments. However, summaries really come into their own when they extract information from large documents such as contracts and plans. The restriction, which results from the size of the prompt that can be sent to the LLM, proved to be a big issue.

Microsoft responded in August 2024 with an announcement that Word can now summarize documents of up to 80,000 words, a limit Microsoft says is four times greater than before. The new limit is rolling out for the desktop, mobile, and browser versions of Word. For Windows, the increased limit is available in Version 2310 (Build 16919.20000) or later.

Processing Even Larger Word Documents

Eighty thousand words sounds like a lot. At an average of 650 words per page, that’s about 123 pages filled with text. I wanted to see how Copilot summaries coped with larger documents.

According to this source, the maximum size of a text-only Word document is 32 MB. With other elements included, the theoretical size extends to 512 MB. I don’t have documents quite that big, but I do have the source document for the Office 365 for IT Pros eBook. At 1,242 pages and 679,800 characters, including many figures, tables, cross-references, and so on, the file size is 29.4 MB.

Copilot attempted to generate a summary for Office 365 for IT Pros but failed. This wasn’t surprising because the file is so much larger than the maximum supported.

The current size of the Automating Microsoft 365 with PowerShell eBook file is 1.72 MB and spans 113,600 words in 255 pages. That’s much closer to the documented limit, and Copilot was able to generate a summary (Figure 2).

Figure 2: Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook

Although the bulleted list contains information extracted from the file, it doesn’t reflect the true content of the document because Copilot was unable to send the entire file to the LLM for processing. The bulleted list comes from the first two of four chapters and completely ignores the chapters dealing with the Graph API and Microsoft Graph PowerShell SDK.

Summaries For Standard Documents

Microsoft hasn’t published any documentation that I can find for Copilot’s automatic document summary feature. When it appears, perhaps the documentation will describe how to disable the feature for those who don’t want it. If not, we’ll just have to cope with automatic summaries. At least they will work for regular Word documents of less than 80,000 words.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Withdraws Copilot Catch Up Feature (30 August 2024)
https://office365itpros.com/2024/08/30/copilot-catch-up/

Copilot Catch Up Fails to Impress

On August 28, 2024, Microsoft posted a revision to message center notification MC799636 (Microsoft 365 roadmap item 388746) to announce its decision to completely roll back the deployment of the new “catch up” feature for Copilot for Microsoft 365 and Teams. The rollback starts on September 2. Previously, the feature was supposed to reach general availability in early August 2024.

According to Microsoft, the deployment had reached 50% of eligible users (accounts with Copilot for Microsoft 365 licenses). Given that many large enterprises have committed to Copilot for Microsoft 365, the number of affected users might run to several million.

The Black Box of Card Determination

The blurb for the Copilot catch up feature says that it “helps users [to] take action on important updates.” In other words, Copilot had found something it deemed of interest to the signed-in user and brought the item to their attention in a series of cards that rotated through a carousel. In my case, the feature only turned up a couple of days ago. Figure 1 shows what I see. The three cards cover a Word document (chapter 2 for the Office 365 for IT Pros eBook), a calendar invitation, and a Loop workspace.

Figure 1: Copilot catch up

Like many elements of artificial intelligence, the process to determine what cards to display is a black box. Looking at the cards chosen by Copilot, I can see the logic of selecting the Word document because it’s a reminder that its content has changed, and I need to review the updates. The meeting is probably there because it happens later today. To help me prepare, Copilot found an email sent about a Teams service outage which frankly is of zero relevance to the meeting. The sole connection is that the word Teams appears in the subject for both the meeting invitation and email. The selection reminds me of “I must find something” instead of “I must find something useful.” I can’t account for why Copilot chose the Loop workspace because nothing has happened in it for months. Perhaps Copilot Catch Up wanted to be diverse in its choice of application sources.

I don’t think I shall miss the carousel. It seems like a modern take on the many ways that Office apps suggest documents to users, or the way that the now-deprecated Delve highlighted documents. The Viva Insights feature available in Outlook clients is another example of how Microsoft seeks to extract value from user data to highlight “things to do.”

Automatic Document Summaries by Copilot for Microsoft 365

The list key points link on the card for the Word document (Figure 2) appears to do what I expect to see when Microsoft deploys message center notification MC871010 (Microsoft 365 roadmap item 399921), scheduled for late August. This update promises that Copilot “will generate a summary in the window at the top of the Word document.”

Figure 2: Bulleted points for a Word document generated by Copilot for Microsoft 365

MC866152 (23 August 2024) also covers the same ground for Microsoft Copilot (the version that doesn’t use the Graph).

I have some problems with the idea of automatic summaries because I fear the feature might slow down the opening of large documents, based on current performance in the Copilot chat app. It seems like a terrific demo feature that works well for two-page documents. I can’t imagine how it will cope with the 1,300 pages of the Office 365 for IT Pros eBook source file. MC866152 says that the summary window can be collapsed to hide it. There’s no detail about what I really want to do, which is to disable automatic summaries for all Word documents.

Copilot Can’t do Everything

Figuring out what features will really be interesting and useful is an aspect of software engineering that is often very difficult. Sometimes I think Microsoft tries too hard, tries to be too clever, or utterly fails to understand how people outside Microsoft work. Applying artificial intelligence to as many aspects of Microsoft 365 as possible is madness, even if it keeps senior management happy.

The truth is that not every idea discussed around a conference table in Redmond or virtually in a Teams call is valuable. Sometimes it takes exposure in the harsh light of reality to figure out what works and what doesn’t. The demise of Copilot catch up is a reminder to us all that just because a new feature appears, its value needs to be assessed in terms of how it contributes to the success of the business.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Stream Development Presses Ahead After Migration Finishes (24 May 2024)
https://office365itpros.com/2024/05/24/stream-browser-client-may24/

New Features Show up in Stream Browser Client

The migration period from Stream Classic to Stream on SharePoint finished on April 15, 2024. Given that Stream no longer has a management client because everything is handled by SharePoint Online and OneDrive for Business, it’s possible that Stream slipped down the list of tenant administrator priorities once the migration finished. However, development continues on the Stream browser client (although Microsoft plans to retire the Stream mobile client on 1 July 2024), and three recent advances deserve attention.

Copilot for Stream

Message center notification MC765809 (2 April 2024, Microsoft 365 roadmap item 188616) announces the arrival of Copilot for Stream. This only matters if a tenant has Copilot for Microsoft 365, but I think its arrival underscores the importance of transcripts.

Most people are familiar with Teams meeting transcripts and know that the transcript is the basis for Copilot party tricks like extracting a list of action items or summarizing what happened during a meeting. At one time, Microsoft stopped creating transcripts for videos uploaded to Stream, possibly because the transcripts complicated the migration process. However, that concern passed, and Microsoft resumed generating transcripts for uploaded video files in mid-2023. If a video doesn’t have a transcript, it’s easy to generate one.

Once a transcript is available, Copilot for Stream can use it to do much the same thing as it does for Teams meeting recordings:

  • Summarize what happens during the video.
  • Create a list of action items.
  • Ask questions about who said what.
  • Ask if specific topics are discussed in a video.

Figure 1 shows Copilot for Stream after generating a list of actions from a video (which happens to be a Teams meeting recording featuring the awesome talent of the Office 365 for IT Pros eBook author team).

Figure 1: Copilot for Stream in the Stream browser client

Video Trimming

The old Stream client could trim content from the start and end of videos, but doing so meant regenerating the video file, which took some time. Today, Stream takes a different approach. Instead of removing parts of a video from the start, end, or middle, Stream hides content based on time codes. For example, you can tell Stream to hide the first fifteen seconds of a Teams meeting recording so that viewers don’t have to watch the Teams recording introduction screen. Trimming a video in this manner avoids regenerating an edited copy without the trimmed sections; when the time comes to play the video, Stream simply ignores them.

Hidden content is always available to video owners and others with full access, and the trimmed sections are viewable if someone downloads and plays a copy of the video. In addition, trimming doesn’t affect the transcript: although Stream doesn’t display the parts of the transcript for trimmed sections, the full text is available for eDiscovery and for Copilot to process. For these reasons, if you want a video where it is impossible for viewers to access removed content, use a tool like Clipchamp to generate a new version of the video without the parts you don’t want people to see. Stream should then produce a transcript that reflects the edited content.

Video Interactivity

The interactivity feature of the Stream player supports the addition of elements at specific time codes within a video. You can add:

  1. Forms. Insert a link to a Microsoft form (created beforehand) for display to viewers starting at a timecode. A form can conduct a survey, poll viewers, or quiz the viewers about the content of the video.
  2. Callouts. Add a text or hyperlink callout to display to viewers between two timecodes. This function can be used to highlight important points in a video with a couple of lines of text. The callout editor is rudimentary and supports the selection of a limited range of colors. You can insert multiple callouts in a video.

The ability to add forms to a video is covered in MC688632 (3 April 2024, Microsoft 365 roadmap item 180796), while callouts are covered in MC688631 (last updated 4 April 2024, Microsoft 365 roadmap item 180795).

During my testing, I discovered that adding a callout to a video often caused Stream to become unresponsive to the point where I had to kill the page. Do not attempt to add a callout (or form) while a video is playing! Make sure playback is stopped first, or there’s a high chance that something will go wrong. Maybe I was just unfortunate, or the problem lies with the Edge browser. In any case, Figure 2 shows what a callout (with text and a hyperlink) added to a video looks like.

Figure 2: A callout displayed for a video by the Stream browser client

As noted above, Stream displays a callout for a defined period. This could be the complete video or just specific sections.

Keep an Eye on Stream

I confess it had been a while since I took a detailed look at the Stream client to acquaint myself with recent developments and new features. As is the norm within a Microsoft 365 tenant, other things had demanded attention and stole time, or maybe it was just that I now expect the Stream client to work without me having to think too much about what’s going on. In any case, the new features are welcome, even if you don’t have Copilot.


Support the work of the Office 365 for IT Pros team by subscribing to the Office 365 for IT Pros eBook. Your support pays for the time we need to track, analyze, and document the changing world of Microsoft 365 and Office 365.

Disabling Bits of Copilot for Microsoft 365 (30 April 2024)
https://office365itpros.com/2024/04/30/copilot-for-microsoft-365-service-plans/

Exerting Control Over Individual Copilot for Microsoft 365 Components

No doubt inspired by the article explaining how to remove individual features (service plans) from Microsoft 365 licenses, a reader asked if it is possible to control where Copilot for Microsoft 365 functionality surfaces in different apps. There’s no GUI in the Microsoft 365 admin center to disable bits of Copilot for a tenant. You can disable apps belonging to the Copilot license for a user account (Figure 1), but the question is what apps are bundled with Copilot for Microsoft 365 and what happens if administrators disable the apps for users.

Figure 1: Copilot for Microsoft 365 apps for a user account

The Copilot for Microsoft 365 Service Plans

Looking into the details of the Copilot for Microsoft 365 license with the Microsoft Graph PowerShell SDK, we discover that the product (SKU) identifier is 639dec6b-bb19-468b-871c-c5c441c4b0cb and that the license covers eight service plans. As you’ll recall, a service plan governs functionality within a license that can be enabled or disabled. The Microsoft 365 admin center refers to service plans as apps when displaying the license information for a user.

Here’s how to find the license detail with PowerShell:

Connect-MgGraph -Scopes Directory.Read.All -NoWelcome
$CopilotSKU = Get-MgSubscribedSku | Where-Object SkuPartNumber -match "Microsoft_365_Copilot"
$CopilotSku.ServicePlans | Format-Table ServicePlanName, ServicePlanId

ServicePlanName                    ServicePlanId
---------------                    -------------
COPILOT_STUDIO_IN_COPILOT_FOR_M365 fe6c28b3-d468-44ea-bbd0-a10a5167435c
M365_COPILOT_SHAREPOINT            0aedf20c-091d-420b-aadf-30c042609612
GRAPH_CONNECTORS_COPILOT           82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_CONNECTORS            89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_APPS                  a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_TEAMS                 b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_BUSINESS_CHAT         3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_INTELLIGENT_SEARCH    931e4a88-a67f-48b5-814f-16a5f1e6028d

Table 1 summarizes the service plans included in the Copilot for Microsoft 365 license.

Service Plan Name                    User-Friendly Feature Name                           Service Plan Id
GRAPH_CONNECTORS_COPILOT             Graph Connectors in Microsoft 365 Copilot            82d30987-df9b-4486-b146-198b21d164c7
M365_COPILOT_INTELLIGENT_SEARCH      Intelligent Search (Semantic Index)                  931e4a88-a67f-48b5-814f-16a5f1e6028d
M365_COPILOT_BUSINESS_CHAT           Microsoft Copilot with Graph-grounded chat           3f30311c-6b1e-48a4-ab79-725b469da960
M365_COPILOT_TEAMS                   Microsoft 365 Copilot in Microsoft Teams             b95945de-b3bd-46db-8437-f2beb6ea2347
M365_COPILOT_APPS                    Microsoft 365 Copilot in Productivity Apps (Office)  a62f8878-de10-42f3-b68f-6149a25ceb97
M365_COPILOT_CONNECTORS              Power Platform Connectors in Microsoft 365 Copilot   89f1c4c8-0878-40f7-804d-869c9128ab5d
M365_COPILOT_SHAREPOINT              Microsoft 365 Copilot in SharePoint                  0aedf20c-091d-420b-aadf-30c042609612
COPILOT_STUDIO_IN_COPILOT_FOR_M365   Copilot Studio                                       fe6c28b3-d468-44ea-bbd0-a10a5167435c
Table 1: Copilot for Microsoft 365 Service Plans

What the Copilot for Microsoft 365 Service Plans Do

The Copilot service plans split into those governing user-facing features and background or administrative functionality.

User functionality:

  • Microsoft Copilot with Graph-grounded chat
  • Microsoft 365 Copilot in Microsoft Teams (app, summarization of chats and meeting discussions, ability to rewrite/adjust messages before posting to chats or channel conversations)
  • Microsoft 365 Copilot in Productivity Apps (Word, Excel, PowerPoint, Outlook (Win32 and Monarch), Loop, OneNote)

Teams and the productivity apps support Copilot in the desktop, browser, and mobile platforms.

Background and administrative functionality:

  • Graph Connectors in Microsoft 365 Copilot
  • Power Platform Connectors in Microsoft 365 Copilot
  • Intelligent Search (the semantic index)
  • Microsoft 365 Copilot in SharePoint
  • Copilot Studio (Figure 2)

Figure 2: Copilot Studio

Turning Off Bits of Copilot

Getting back to the original question, control is available over the chat app, Copilot in Teams, and the generalized bucket of productivity apps. However, you cannot turn off Copilot for Word and Excel while keeping it available in PowerPoint and Outlook: the productivity apps are either all enabled or all disabled for Copilot. Granular control isn’t available.

Copilot for Office depends on the Microsoft 365 enterprise apps (the subscription version of Office). Using another version, like Office 2024 (a preview is available now), isn’t possible because those apps don’t include the UI and code needed to communicate with Copilot.

The answer to the question is that you can turn bits of Copilot for Microsoft 365 off. For instance, not everyone needs access to Copilot Studio. I’m not sure that I would disable any of the other service plans for background and administrative activity because you don’t know if the action might affect how the user-facing apps work. Disabling a user app certainly works and the license change will be effective within fifteen minutes for browser-based apps (Figure 3) and a few hours for desktop apps, depending on when the app refreshes its license information.

Figure 3: Microsoft Copilot chat discovers that it doesn’t have a license

If an organization is paying $360 per user per year for Copilot for Microsoft 365 licenses, surely the imperative is to extract maximum value from the investment rather than restricting what people can use? But if you do decide to disable service plans from the Copilot for Microsoft 365 license, a script will happily do the job for you.
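The disabling step can be sketched with the Microsoft Graph PowerShell SDK. The SKU and service plan identifiers come from Table 1 above; the user principal name is a placeholder you would replace with a real account.

```powershell
# A sketch: disable the Copilot Studio service plan within a user's
# Copilot for Microsoft 365 license. Identifiers are from Table 1.
Connect-MgGraph -Scopes User.ReadWrite.All -NoWelcome
$CopilotSkuId        = '639dec6b-bb19-468b-871c-c5c441c4b0cb'   # Copilot for Microsoft 365 SKU
$CopilotStudioPlanId = 'fe6c28b3-d468-44ea-bbd0-a10a5167435c'   # COPILOT_STUDIO_IN_COPILOT_FOR_M365
# Reassign the license with the unwanted service plan listed as disabled
Set-MgUserLicense -UserId 'user@domain.com' -AddLicenses @(
    @{ SkuId = $CopilotSkuId; DisabledPlans = @($CopilotStudioPlanId) }
) -RemoveLicenses @()
```

Remember that the change takes effect within about fifteen minutes for browser-based apps and a few hours for desktop apps.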


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Interpreting Audit Events for Copilot for Microsoft 365 (3 April 2024)
https://office365itpros.com/2024/04/03/copilot-for-microsoft-365-audit/

Using Audit Events to Determine Level of Copilot Activity

Updated 9-Apr-2024

In an article reporting how Copilot usage had dropped off after the initial novelty of having a digital assistant, MVP Joe Stocker described how he used ChatGPT to analyze information about Copilot interactions using Copilot for Microsoft 365 audit events retrieved from the unified audit log. It’s an interesting technique that caused me to investigate the information captured for Copilot interactions.

Running an audit search to find Copilot interactions returns the audit events captured when Microsoft 365 apps use Copilot. Outlook is an exception: neither OWA nor Outlook for Windows captures Copilot audit events when drafting or refining message text or summarizing a message thread. I wasn’t too surprised because, until recently, Microsoft had said that the Monarch client would be the only email client supported for Copilot.

Interestingly, even though neither Monarch nor any other Outlook client logs audit events for Copilot interactions, usage data for Copilot for Outlook shows up in the usage reports section of the Microsoft 365 admin center. The usage data does not reveal which Outlook client was used. I assume that the lack of audit events for Outlook interactions is an oversight that Microsoft will address soon.

Update: According to message center notification MC773621 (9 Apr 2024), Microsoft plans to ship support for eDiscovery of Copilot interactions in Outlook (all clients) in mid-April. This implies that audit events will become available.

What’s in the Copilot for Microsoft 365 Audit Events

The unified audit log is a terrific resource for Microsoft 365 administrators to learn about what really happens inside a tenant. Audit events have two parts. The first is fixed and is consistent across all workloads. This part contains properties like the record identifier, user identifier, and timestamp. The second part contains the AuditData multi-value property and is under the control of the workload developers. Audit events generated by SharePoint Online are different to those generated by Exchange Online or Teams. The inconsistency in audit content means that it takes time to understand and interpret the information contained in audit events.

Microsoft defines a schema for Copilot audit events. Regretfully, the formatting of the information written into the AuditData property is inconsistent across the different clients. Some differences are expected because of the variation in content processed by Copilot, but the inconsistencies make it more difficult to understand and analyze what happened than it should be.
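
To work with that detail, a script must convert the JSON in the AuditData property into a PowerShell object. Here's a minimal sketch of the approach (the CopilotEventData and AppHost property names reflect my reading of the Copilot schema and may differ across clients, so treat them as assumptions):

```powershell
# Retrieve Copilot interaction events from the unified audit log
[array]$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) `
  -RecordType CopilotInteraction -ResultSize 5000 -Formatted

ForEach ($Record in $Records) {
  # The second part of the event is JSON under the control of the workload developers
  $AuditData = $Record.AuditData | ConvertFrom-Json
  [PSCustomObject]@{
    Timestamp = $Record.CreationDate
    User      = $Record.UserIds
    # Property names vary by client, so check what each event actually contains
    App       = $AuditData.CopilotEventData.AppHost
  }
}
```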

Adjusting Audit Search Results for Accuracy

When reviewing audit events generated for Copilot interactions, the fundamental principle is that every prompt processed by Copilot results in an audit event. For instance, if a user asks Copilot for Microsoft 365 chat a question and follows up with a couple of clarifications, the audit log ingests three audit events. The overall total of Copilot interaction audit events is therefore not the total of discrete user conversations with Copilot. Rather, it is the count of Copilot prompts.

It’s also important to remove duplicates of audit events from the set returned by an audit search. To improve the accuracy of audit searches, Microsoft is introducing a high completeness mode for the Search-UnifiedAuditLog cmdlet. If you don’t use a high completeness search (which takes much longer to complete), make sure to remove duplicates before analyzing the search results. I usually sort the audit search results by unique values in the identity property to remove duplicates. The Purview audit search feature does the same when it downloads audit search results.
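
For example, a normal search followed by deduplication on the identity property looks like this (a sketch; the -HighCompleteness switch is the new mode mentioned above and might not yet be available in every tenant):

```powershell
# Normal search: fast, but the results can contain duplicate events
[array]$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
  -RecordType CopilotInteraction -ResultSize 5000

# Remove duplicates by sorting on the unique event identifier
$Records = $Records | Sort-Object -Property Identity -Unique

# Alternative: a high completeness search avoids duplicates but takes much longer
# [array]$Records = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) `
#   -RecordType CopilotInteraction -ResultSize 5000 -HighCompleteness
```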

After some trial and error, I created a PowerShell script to extract Copilot audit events from the unified audit log and parse the information contained in the AuditData property. An example of the output is shown in Figure 1.

Copilot for Microsoft 365 audit events.
Figure 1: Copilot for Microsoft 365 audit events

You can download the script I used to find and analyze Copilot for Microsoft 365 audit events from GitHub.

Using Copilot for Excel to Analyze Audit Events

Given that we’re exploring Copilot for Microsoft 365, it made sense to try to use Copilot for Excel to interpret the results. Copilot for Excel can only work with tables and the workbooks must be enabled for AutoSave. In other words, the workbooks must be stored in OneDrive for Business or SharePoint Online. I used the ImportExcel PowerShell module to generate a worksheet with a table of audit events with the following command:

$Report | Export-Excel -Path c:\Temp\Copilot.xlsx -WorksheetName "Copilot Interactions" -Title "Copilot for Microsoft 365 Interactions" -TitleBold -TableName "CopilotEvents"

Figure 2 shows some of the analysis generated by Copilot for Excel. It’s important to say that Copilot for Excel is still in preview, which accounts for the difficulties reported by many users when prompting Copilot for Excel to do something simple, like create a chart.

Using Copilot for Excel to analyze audit events
Figure 2: Using Copilot for Excel to analyze audit events

Some Gaps in Copilot Audit Coverage

Microsoft touts the data security and compliance coverage for Copilot in Purview solutions like auditing. Not capturing audit events for Outlook clients seems like a big gap that must be filled ASAP. Microsoft should also improve the consistency of the information included in the audit events to make the data easier to understand and use. But I have been waiting for various Microsoft development groups to improve their audit events since 2016, and I suspect that I'll be waiting a while longer before I see better Copilot events.


Learn how to exploit the data available to Microsoft 365 tenant administrators through the Office 365 for IT Pros eBook. We love figuring out how things work.

]]>
https://office365itpros.com/2024/04/03/copilot-for-microsoft-365-audit/feed/ 0 64310
Microsoft Grounds Copilot Apps with Graph and Web Content https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-grounding https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/#comments Mon, 25 Mar 2024 08:00:00 +0000 https://office365itpros.com/?p=64268

Office Apps Get Better Grounding in Copilot for Microsoft 365

Message center notification MC734281 (12 March 2024) might have passed by without too much attention unless you’re particularly interested in Copilot for Microsoft 365. The notification informs tenants that Word, Excel, PowerPoint, and OneNote will ground user prompts by reference to enterprise data and the web. As Microsoft notes, this is like what happens when users interact with Copilot for Microsoft 365 chat.

Grounding against enterprise data means that when Copilot responds to user prompts, it will seek additional context by attempting to find relevant information in Microsoft 365 repositories using Graph requests. Web grounding means that Copilot will use Bing search to find relevant information from sites within and outside the enterprise. The fact that major apps will start to use grounded requests from April 2024 might come as a surprise. After all, Microsoft has long cited Copilot’s ability to use the “abundance of data” stored in Microsoft 365 as a major advantage of Copilot for Microsoft 365 over other AI tools that don’t have access to Microsoft 365 repositories.

The rollout starts with Word (Windows and Online) and progresses to PowerPoint, Excel, and OneNote. Microsoft expects to complete the deployment by September 2024.

The Importance of Grounding

Microsoft explains that grounding is “the process of using large language models (LLMs) with information that is use-case specific, relevant, and not available as part of the LLM’s trained knowledge.” In other words, if you ask Copilot for Microsoft 365 to do something and grounding doesn’t happen, it relies on the user prompt to query the LLM.

Until now, users have been able to ground prompts in apps like Word by including up to three reference documents in the prompt. Let me illustrate the importance of grounding by showing an example of two briefing notes generated by Copilot in Word about the Midnight Blizzard attack against Microsoft in January 2024. Copilot generated the first briefing note without any reference documents. Because it couldn’t search the Graph or web for relevant information, the grounding of the prompt was poor, and Copilot could only use whatever information is in the LLM.

As shown in Figure 1, the generated text included several inaccurate statements (hallucinations), including the remarkable assertion that the attack led to a drop of $400 billion in Microsoft’s market value, together with a declaration that the attack had deprived millions of Microsoft cloud users of access to services.

Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents).
Figure 1: Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents)

If some relevant reference documents are included in the prompt, Copilot’s generated text becomes more accurate and balanced (Figure 2).

Briefing note about Midnight Blizzard generated by Copilot for Word with reference material.
Figure 2: Briefing note about Midnight Blizzard generated by Copilot for Word with reference material

The important point here is that after Microsoft updates Copilot to allow the Office apps to ground prompts using Graph and web material, the chances of Copilot generating absolute rubbish lessen considerably. That is, if Copilot can find relevant information through its searches. Adding reference documents to prompts in Copilot for Word will generate even better results because the reference documents should give Copilot a more precise context to work with.

Microsoft says that Graph grounding is enabled for all user prompts and that Copilot requests will use “the file context” (whatever file is open at the time) plus web searches as well. Copilot for Microsoft 365 chat uses Graph and web lookups today.

The Quality of AI-Generated Text

In some respects, I was shocked that it has taken so long for Microsoft to ground Copilot requests in these important apps. Copilot for Microsoft 365 is evolving rapidly, but the ability to generate high-quality text at general availability seems like an essential rather than a nice-to-have feature. I’ve always been suspicious about the quality of the text generated by Copilot for Word, and this revelation certainly explains a lot.

Take Your Time

The advice of Directions on Microsoft analyst Wes Miller that organizations should pace themselves and understand exactly what they are buying before they invest in expensive Copilot licenses is accurate. Things are changing, and the hyperbole around Copilot is like a dust storm that obscures detail. Why rush in where angels fear to tread?

Before making your mind up about Copilot, take the time to read the article posted by MVP Joe Stocker where he reports a drop-off of Copilot activity after the novelty effect of asking the AI to perform tasks wears off. Although the sample size was small, this emphasizes the need to support users on their Copilot journey, especially as important new functionality like Graph and web grounding appears.

And if you attend the Microsoft 365 Conference in Orlando at the end of April, make sure that you come to my session about not letting Copilot for Microsoft 365 become a vanity project. You might even enjoy what I have to say!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem, including in Copilot. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/03/25/copilot-for-microsoft-365-grounding/feed/ 1 64268
Can Copilot for Microsoft 365 Save Users 14 Hours a Month? https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/?utm_source=rss&utm_medium=rss&utm_campaign=copilot-for-microsoft-365-14hrs https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/#respond Tue, 12 Mar 2024 01:00:00 +0000 https://office365itpros.com/?p=64051

It All Depends on the Person and How They Use Office

Personal perspectives of using technology are often valuable guides to how useful products will be in production. Given the current hype around Copilot for Microsoft 365, I was interested to read a LinkedIn post by Microsoft employee Luka Perne. Based on his use of Copilot over several months logged on a per-task basis, Perne believes he saves 14 hours per month. That’s quite an impressive number that more than justifies the $30/month Copilot license.

It’s always important to put personal observations in context and ask yourself if a product would work as well for you, especially when reading a report written by someone who works for the vendor. I’m sure that some gain enormously from Copilot for Microsoft 365, just as I’m equally convinced that success with Copilot depends on many individual factors.

Not a Marketing Document

What I liked about this report is that it is not trying to sell Copilot. If you look at Microsoft’s marketing material, Copilot works wonderfully because what you see are carefully selected scenes that show Copilot working with data selected to demonstrate its strengths. This coverage is more practical and informative.

For instance, Perne makes the point that people go through a learning curve as they interact with Copilot. Some progress faster and discover how to extract value quickly. Others struggle with prompts or are unsure how Copilot can help. That’s why it’s important to educate and support users during a Copilot deployment project.

Where Success is Found for Copilot for Microsoft 365

Microsoft employees working in engineering and services roles tend to be more comfortable with new technology than the average Microsoft 365 user. Copilot support for users (informal and formal) is likely better and more comprehensive than elsewhere, and users are motivated to explore the capabilities of the technology, including mastering the technique of constructing effective prompts. Overall, I suspect that a technology like Copilot is adopted more easily inside Microsoft than in customer environments.

Perne says that he’s been working with Copilot for four months. Some will gain the same increase in productivity he reports, but I suspect it will take others many months before they do the same.

As Perne notes, he values specific Copilot features. This matches my own experience, where the summaries generated by Copilot for Teams meetings, Outlook email threads, and documents (Figure 1) are easily the most valuable in terms of time savings. Anyone who has ever worked with Microsoft (especially the corporate teams) can attest to the number of meetings that people attend, and the ability to generate a quality summary based on the meeting transcript is much appreciated, especially when multiple meetings occur at the same time.

Working with Copilot for Microsoft 365 in a Word document.
Figure 1: Working with Copilot for Microsoft 365 in a Word document

Copilot’s ability to create and rewrite text can help people who are unsure of their writing skills. In my case, I think I do just as well by reviewing the suggestions made by Editor or Grammarly. Copilot is good at generating the outline of a document. However, the accuracy of the material Copilot uses to flesh out the outline depends on being able to find relevant information in SharePoint Online or OneDrive for Business. Without something to use, Copilot often strays into made-up text that reads well without being accurate.

Perne generated the graphics in his article with Excel, but notes the limitations Copilot currently has in Excel, like only working with tables of fewer than 10,000 rows. I’m sure this is an area that Microsoft will improve in the future. For now, I agree with the observation that I’ve picked up enough Excel over the years to survive without Copilot for the kind of worksheets I deal with.

The assertion that Copilot always delivered improved results for a non-native English speaker when it came to generating or rewriting text was insightful, and I think fair. Many large organizations have a corporate language in which most communication takes place. For Microsoft, that language is English, and I can see how useful Copilot is when asked to rewrite or correct text. The output will be bland, but it will be precise and readable, and that’s important in email and documents.

Can You Track Your Copilot Results?

The net is that many factors influence the ability of Copilot for Microsoft 365 to save time for people. If you’re technically literate, skilled in using Word, PowerPoint, Outlook, and Excel, attend a lot of meetings, and store the material you work with in SharePoint Online and OneDrive for Business, the probability is that you will achieve good results. Whether you save 14 hours per month is another matter. Tracking savings using the same methodology as Perne is certainly one way to assess the outcome, if you’re as good as he was at noting results.


Keep up to date with developments like Copilot for Microsoft 365 by subscribing to the Office 365 for IT Pros eBook. Our monthly updates make sure that our subscribers understand the most important changes happening across Office 365.

]]>
https://office365itpros.com/2024/03/12/copilot-for-microsoft-365-14hrs/feed/ 0 64051
Microsoft Announces Restricted SharePoint Search https://office365itpros.com/2024/03/07/restricted-sharepoint-search/?utm_source=rss&utm_medium=rss&utm_campaign=restricted-sharepoint-search https://office365itpros.com/2024/03/07/restricted-sharepoint-search/#comments Thu, 07 Mar 2024 01:00:00 +0000 https://office365itpros.com/?p=64044

Restricted SharePoint Search Confines Copilot for Microsoft 365 Access to Curated Sites and User Content

The lights are obviously burning late in SharePoint engineering as Microsoft figures out new methods to help customers avoid inadvertent disclosure of confidential information through text generated by Copilot for Microsoft 365. Last month, we discussed how to exclude SharePoint Online sites and document libraries from search results to block Copilot access. Now, Microsoft has created Restricted SharePoint Search to deliver a more elegant (but possibly flawed) solution to allow organizations to control the sites accessible to Copilot.

Tenants with Copilot for Microsoft 365 licenses can enable Restricted SharePoint Search “from April onward.” By default, Restricted SharePoint Search is disabled and administrators will have to run some PowerShell commands to enable it for a tenant.
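
Microsoft hadn’t published the final commands when I wrote this article, so treat the following as a sketch based on preview documentation (the cmdlet names could change before general availability, and the contoso URLs are placeholders):

```powershell
# Connect to the SharePoint Online management endpoint for your tenant
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Enable Restricted SharePoint Search for the tenant
Set-SPOTenantRestrictedSearchMode -Mode Enabled

# Add sites to the curated allowed list (maximum of 100 sites)
Add-SPOTenantRestrictedSearchAllowedList -SitesList @(
  "https://contoso.sharepoint.com/sites/HR",
  "https://contoso.sharepoint.com/sites/Products"
)

# Check what's currently on the list
Get-SPOTenantRestrictedSearchAllowedList
```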

What is Restricted SharePoint Search?

Essentially, Restricted SharePoint Search disables enterprise-wide search. Instead, tenants can select up to 100 SharePoint Online sites that users can access (subject to the access defined for the sites). In this context, “users” includes Copilot for Microsoft 365 because when Copilot is active, it always operates as the signed-in user.

The restriction placed on enterprise search is pretty severe. Given the profusion of Microsoft 365 Groups and Teams, each of which has its own SharePoint Online site (and possibly several sub-sites for shared and private channels), limiting search to a curated list of 100 sites excludes most sites in anything but small tenants. Even my own small tenant supports over 400 sites.

Easing the Restrictions to Accommodate User Data

Microsoft will no doubt argue that the impact of limiting search to 100 sites is ameliorated by allowing users to search content from:

  • Frequently visited SharePoint Online sites. However, Microsoft has not defined how many sites are in this category and how users can add sites to the list. For instance, does marking a site as a favorite put it on the list?
  • Files in OneDrive for Business accounts that they have access to. This includes the user’s own OneDrive account, so they have full access to all their personal documents and other information stored in OneDrive.
  • Files that are directly shared with users.
  • Files that users create, edit, or view. In other words, if you touch a file stored in SharePoint Online, Copilot for Microsoft 365 can find and use that content.

I don’t know how Microsoft determined that 100 was a good number for the curated sites list. Determining what sites go onto the list and what sites are excluded will be an interesting exercise for many organizations. It seems like the intention is for tenants to include important corporate sites that everyone needs access to, like those holding HR information or details of released products and public documentation while relying on the frequently visited sites lists to deliver user-specific search results. It would be interesting to know if Microsoft uses Restricted SharePoint Search internally and, if so, how they selected the 100 sites.

Restricted SharePoint Search does not affect how Microsoft Purview solutions like eDiscovery work. One way of thinking about the restriction is that it’s a form of trimming similar to the security trimming that Search already does to make sure that users only ever see sites and files in search results that they are entitled to access. This trimming further limits results to the 100 curated sites plus the user’s OneDrive for Business account and files shared with them.

The Impact of Restricted SharePoint Search

It could be that this scheme will work well, but as Microsoft points out, Restricted SharePoint Search “limits the content Copilot can search and reference when responding” and “may impact its ability to provide accurate and comprehensive responses to prompts.”

Microsoft says that the new solution will help customers review and audit site permissions while continuing to deploy Copilot for Microsoft 365. A cynic might say that Restricted SharePoint Search is a cobbled-together patch rushed out to assuage the concerns of customers who have heard about potential data disclosure problems and slowed their Copilot planning as a result. It’s absolutely the right thing for Microsoft to address those concerns, but Restricted SharePoint Search seems like a sticking plaster that’s been applied until Microsoft can come up with a more flexible long-term solution. I guess we’ll know more when the software reaches customers in April and can assess just how well the 100 site limit works.


]]>
https://office365itpros.com/2024/03/07/restricted-sharepoint-search/feed/ 2 64044
Stopping Copilot Access to SharePoint Online Sites and Document Libraries https://office365itpros.com/2024/02/21/exclude-sharepoint-site-from-copilot/?utm_source=rss&utm_medium=rss&utm_campaign=exclude-sharepoint-site-from-copilot https://office365itpros.com/2024/02/21/exclude-sharepoint-site-from-copilot/#comments Wed, 21 Feb 2024 01:00:00 +0000 https://office365itpros.com/?p=63738

Exclude SharePoint Site from Copilot by Blocking Search Indexing

One of the fundamental concepts underpinning Copilot for Microsoft 365 is the use of Graph queries to find information stored in Microsoft 365 to help ground user prompts. Grounding is the process of providing additional context to make it easier for Copilot to return high-quality responses to user prompts. For instance, if someone asks Copilot to write a briefing note about Office 365, Copilot first queries Microsoft 365 repositories like SharePoint Online to discover what information the user already has about the topic. Optionally, if allowed by the tenant, Copilot can query the web to find additional information.

After gathering information, Copilot refines the prompt and sends it to the Large Language Model (LLM) for processing. Eventually, possibly after further refinement, Copilot returns the response to the user.

Copilot Access to Content Stored in Microsoft 365 Repositories

One of the things you quickly learn about Copilot for Microsoft 365 is that the quality and reliability of generated text is highly dependent on the availability of information. For instance, Copilot is very good at summarizing Teams meetings because it has the meeting transcript to process. However, if you ask Copilot to draft text about a topic where it cannot find anything in Microsoft 365 to ground the prompt, Copilot will certainly generate a response, but the text might not be as useful as you expect. The output will certainly follow the requested format (a report, for instance), but the content might surprise because it is likely to come from a web search that might or might not retrieve useful information.

Users can guide Copilot for Word by providing up to three reference documents. In effect, the user instructs Copilot that it should use the reference documents to ground the prompt. This works well, unless the documents you want to use are large (I am told that Microsoft is increasing the maximum supported size for reference documents).

All of this means that anyone contemplating a deployment of Copilot for Microsoft 365 should store information within Microsoft 365 to create what Microsoft calls an “abundance of data” for Copilot to consume. SharePoint Online and OneDrive for Business are prime repositories, but it’s possible that some SharePoint Online sites contain confidential or other information that the organization doesn’t want Copilot to consume.

Remember, Copilot can only use information that the signed-in account using Copilot can access. An account that has access to a site holding confidential information could find that Copilot retrieves and uses that information in its responses. The user is responsible for checking the text generated by Copilot, but accidents do happen, especially when time is short to get a document out.

Preventing Copilot Access to Sensitive Information

Two methods help to avoid accidental disclosure of confidential information. First, you can protect files with sensitivity labels. If Copilot consumes protected documents, it applies the same sensitivity label to the output.

However, not every organization uses sensitivity labels. In this situation, an organization can decide to exclude selected SharePoint Sites from indexing (Figure 1) by both Microsoft Search and the semantic index. If content is not indexed, it can’t be found by queries and therefore cannot be consumed by Copilot.

Figure 1: Exclude SharePoint Site from Copilot Access by Stopping it Appearing in Search Results

But what happens if you have a SharePoint site with several document libraries and want to make the content available from some libraries and not others? The answer is the same except that the exclusion from search results is applied through the advanced settings of document library settings (Figure 2).

Settings for a document library.
Figure 2: Settings for a document library

The downside of excluding sites or libraries from search results is that people can’t use SharePoint search to find documents.
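
For administrators who prefer scripting the exclusion instead of clicking through site settings, PnP PowerShell exposes the underlying NoCrawl property. A sketch, assuming the PnP.PowerShell module is installed and using a hypothetical site URL:

```powershell
# Connect to the site to exclude (requires the PnP.PowerShell module)
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Confidential" -Interactive

# NoCrawl removes the site from the search index at the next crawl
Set-PnPWeb -NoCrawl

# To restore indexing later:
# Set-PnPWeb -NoCrawl:$false
```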

Testing Excluded Sites and Document Libraries

How do you know site and document library exclusions work? The easiest way is to create a document with an unusual phrase in the excluded site or library and then attempt to use it with Copilot for Word. I created a document about ‘Project Derrigimlagh’ and included the phrase ‘wicked worms’ several times in the content. I then created a new Word document and added the document from the excluded library as a reference (Figure 3).

Selecting a reference file for Copilot for Word
Figure 3: Selecting a reference file for Copilot for Word

You might ask why the document can be added as a reference. The dialog shows recent documents, and the document is in this category, so it shows up. However, when Copilot attempts to consume the document, it cannot access the content. The result is that the prompt cannot be grounded and Copilot flags this as a failure to generate high-quality content (Figure 4). This is a general-purpose error that Copilot issues anytime it believes that it cannot respond to a prompt.

Copilot for Word can't generate high-quality content
Figure 4: Copilot for Word can’t generate high-quality content

Interestingly, when I removed the reference document and reran the prompt, Copilot generated text explaining the potential use of wicked worms as a biofuel source. This is emphatically not the content stored in the excluded document library. The information about Derrigimlagh came from the internet, and making wicked worms into a biofuel source is probably due to published material about using worms in a biorefinery. In any case, it’s a good example of how AI-based text generation needs to be treated with caution.

Use Sensitivity Labels If Possible

If an organization has implemented sensitivity labels, I think this is a better method to protect confidential material, if only because of the persistence of labels to generated documents. You can also define a default sensitivity label for a document library to make sure that everything stored in the library is protected and use auto-label policies to find and protect confidential material stored across all sites.

In a nutshell, sensitivity labels are more flexible and powerful, but it’s nice to have the backup of being able to exclude complete sites and individual document libraries. Just another thing to consider in a Copilot deployment!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

]]>
https://office365itpros.com/2024/02/21/exclude-sharepoint-site-from-copilot/feed/ 1 63738
Copilot for Microsoft 365 to Support Outlook Classic https://office365itpros.com/2024/02/15/outlook-win32-copilot-support/?utm_source=rss&utm_medium=rss&utm_campaign=outlook-win32-copilot-support https://office365itpros.com/2024/02/15/outlook-win32-copilot-support/#comments Thu, 15 Feb 2024 01:00:00 +0000 https://office365itpros.com/?p=63712

Outlook Win32 Copilot Support Coming. Teams Gets a Better Integration

After removing the major barriers blocking adoption of Microsoft 365 Copilot last month, Microsoft has quietly dropped its insistence that Copilot would only support the Outlook Monarch client. The latest version of the Microsoft 365 Copilot requirements documentation (2 February 2024) says that Copilot works with the new Outlook client on Windows and Mac (Outlook mobile is also supported) and then notes that “Microsoft Copilot for Microsoft 365 will be supported on classic Outlook for Windows (Win32 desktop app) in the future.”

A link to the Microsoft 365 roadmap lists three items relating to the introduction of Copilot functionality in the classic Outlook client together with dates when the rollout is supposed to start:

  • Coaching by Copilot (190927) – February 2024
  • Draft by Copilot (190937) – March 2024. Figure 1 shows the draft created by Copilot in OWA.
  • Summarize by Copilot (180900) – November 2023

Figure 1: Copilot drafts a message about Outlook Win32 Copilot Support

According to the roadmap, Microsoft added items 190927 and 190937 on 6 December 2023, and 180900 on 10 December 2023. Don’t pay too much attention to the purported rollout dates until you see a Microsoft 365 message center announcement describing when the new functionality will be available in the preview and other Office channels. Even then, announced dates are often optimistic and end up being delayed. I’m pretty sure that Outlook Win32 support will only extend to the subscription version of Outlook packaged in Microsoft 365 Apps for enterprise, but we’ll see when Microsoft shares more details.

No Formal Announcement for Outlook Win32 Copilot Support

Speaking of details, I can’t find a formal Microsoft announcement about the change in direction. Ever since the original Copilot for Microsoft 365 announcement in March 2023, Microsoft held to the line that Monarch was the only supported Outlook desktop client. As I noted in August, this position applied despite the fact that Microsoft’s One Outlook program includes the ability for Outlook desktop to use code developed for Monarch/OWA. The only logical conclusion is that Microsoft hoped to use Copilot to drive customers to embrace Monarch.

The sad fact is that Monarch is still not fit for purpose in the eyes of many Outlook users. The lack of offline access and PST support are just two issues that must be addressed before Monarch has a chance to replace the classic client.

Although they’re rolling Monarch out as a replacement for the standard Windows mail and calendar client, Microsoft knows that the software lacks many features needed for success in commercial environments. All the missing functionality is on a list for development, but the fact remains that it’s very hard to force people to change to a client that doesn’t do what they need, and this became a blocking factor for Copilot adoption.

Given that making it easy for customers to use Copilot matters far more to Microsoft than forcing an earlier switchover to Monarch, the choice for senior management must have been simple, and that’s probably why the restriction is gone. Customers will applaud the new reality.

New Copilot Experience in Teams

Meanwhile, on 12 February 2024, Microsoft announced a new Copilot experience in Teams. Like the rest of Teams, the experience comes in the form of an app that administrators can control through app setup policies. According to Microsoft, the major changes are better prompts, access to Copilot Lab to see prompts that you might use, and a list of your Copilot chat history.

The app delivers a chat experience, so it should come as no surprise that Teams can store and reveal previous interactions with Copilot. The chat messages are captured for compliance purposes, just like personal and group chats, and can be retrieved by content searches for eDiscovery.

Just to be sure that Copilot support for Outlook Win32 is a reality, I asked Copilot in Teams (Figure 2) about Outlook Win32 Copilot support. After thinking for a bit, Copilot duly responded to confirm support and cited two references: one the requirements documentation, the other a document stored in a SharePoint Online site. Website content is only available to Copilot if it’s enabled for the tenant and the user chooses to include it in searches.

Copilot in Teams confirms support for Outlook Win32.
Figure 2: Copilot in Teams confirms Outlook Win32 Copilot Support

More Change Coming

I suspect that the Copilot for Microsoft 365 journey will have other ups and downs as customers identify barriers to adoption, problems, bugs, and other issues, and Microsoft removes them. As with the initial development of Teams in the 2017-2020 period (albeit accelerated in part by the Covid pandemic), I expect lots of change. Stay tuned.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Copilot for Teams Extracts Real Value from Meeting Transcripts https://office365itpros.com/2024/01/22/copilot-for-teams-meeting/ Mon, 22 Jan 2024 01:00:00 +0000

Too Many Meetings – Perhaps an Opportunity for Copilot for Teams

We live in a world of perpetual meetings. At least, it seems like that at times. Microsoft has invested substantial effort to make meetings more useful (and bearable) through innovations like avatars and the Meet app. To help people manage their calendars more intelligently, enterprise Office 365 SKUs have included analytics about meeting habits based on calendar events since 2015 (now in Viva Insights – Figure 1). Humans are creatures of habit, so whether anyone uses the analytics to moderate how they participate in meetings is quite another matter.

Viva Insights analyzes meeting habits.
Figure 1: Viva Insights analyzes meeting habits.

Generating Meeting Content from Transcripts

This brings me to Copilot for Teams (part of Copilot for Microsoft 365) and its ability to generate meeting notes for Teams meetings and allow people to ask questions about what happened during a meeting. This aspect of Copilot for Microsoft 365 is one of its most compelling features.

Copilot uses the meeting transcript to generate notes and summaries. The transcript captures spoken contributions to meetings, including speaker attributions, in the form of short snippets. The transcript later becomes the basis for the captions for the meeting recording.
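To make the transcript structure concrete, here’s a minimal sketch in Python. The `Snippet` shape and the merging logic are my illustration of how short, speaker-attributed fragments might be coalesced into conversational turns before summarization; this is not Microsoft’s actual transcript schema or Copilot’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Snippet:
    """One attributed fragment of a meeting transcript (illustrative shape only)."""
    speaker: str
    start_seconds: float
    text: str

def merge_snippets(snippets: list[Snippet]) -> list[Snippet]:
    """Coalesce consecutive snippets from the same speaker into longer turns,
    roughly the kind of pre-processing a summarizer might apply."""
    turns: list[Snippet] = []
    for s in snippets:
        if turns and turns[-1].speaker == s.speaker:
            # Same speaker still talking: extend the current turn.
            turns[-1] = Snippet(s.speaker, turns[-1].start_seconds,
                                turns[-1].text + " " + s.text)
        else:
            turns.append(Snippet(s.speaker, s.start_seconds, s.text))
    return turns

# Hypothetical example: three captured snippets become two speaker turns.
transcript = [
    Snippet("Tony", 0.0, "Let's review the book schedule."),
    Snippet("Tony", 4.2, "Chapter updates are due Friday."),
    Snippet("Paul", 9.8, "The Teams chapter is nearly done."),
]
print(merge_snippets(transcript))
```

The point of the sketch is simply that speaker attribution survives the merge, which is what lets Copilot answer questions like “what did Paul say about the Teams chapter?”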

Copilot needs about five minutes of transcript capture before it can make sense of what’s happening in a meeting. After that point, participants can prompt Copilot for information about topics, action items, and contributions based on what’s happened in the meeting so far. This is all very nice, but if you’re paying attention to the meeting, you’ll probably be able to figure out how to move the meeting forward or end discussion about a topic without help. On the other hand, people who nod off (as I have) or pay more attention to email or chat while others are talking will appreciate the ability to catch up.

True magic happens once the meeting is over. I really like the AI notes generated from the transcript. The overall accuracy of the notes is reasonably high, but it’s important to check the notes before accepting that AI-generated text is a full and faithful account of meeting proceedings. Some errors are simple (like referring to Michel de Rooij as Michelle – Figure 2). Other errors are more fundamental when the AI misunderstands what was said.

Using a meeting transcript to interact with Copilot for Teams.
Figure 2: Using a meeting transcript to interact with Copilot for Teams

What’s really good is using Copilot for Teams to interrogate the meeting notes. Microsoft’s documentation suggests some questions to ask, like what was the mood of a call. I asked Copilot to summarize the mood of a recent call about the Office 365 for IT Pros eBook (Figure 2), and was told:

The mood of the meeting was mostly friendly and cooperative, with some jokes and banter among the participants. There were no signs of conflict or tension, and everyone seemed to respect each other’s opinions and contributions. The only exception was when Tony made a sarcastic remark about Paul’s airplane being stuck in the snow, which Paul did not appreciate. However, this did not seem to affect the overall tone of the meeting, which was positive and productive.

I’m not sure that I was being sarcastic when I asked Paul about his airplane being grounded by 7 inches of snow at a Kentucky municipal airport, but there you are. As with other aspects of Copilot for Microsoft 365, users only extract full value from meeting notes when they learn how to interact with the AI.

The intelligent recap feature available in Teams Premium includes some of the capabilities used by Copilot for Teams like AI-generated notes. The big difference between the two is that intelligent recap only includes a static transcript that can’t be interrogated after the event.

Using Copilot Without a Meeting Transcript

Originally, meetings had to be recorded and transcribed for Copilot to work. This is a suitable arrangement for most meetings, but sometimes people don’t want to record sensitive meetings. A Teams meeting option is available to allow Copilot to work without recording a meeting. Essentially, instead of capturing a permanent transcript that’s stored for eDiscovery purposes, Teams uses Microsoft Audio Services to create a temporary transcript that Copilot can use during the meeting. Once the meeting concludes, Teams discards the transcript.

Chasing Value from Copilot

Hype surrounds Copilot for Microsoft 365. Despite the removal of previous limitations, a Copilot deployment still costs $30/user/month with a year-long commitment. In other words, a ten-user test will cost $3,600 for Copilot licenses. Figuring out how to extract value from that investment is important unless you like throwing money away. Extracting information from Teams meeting transcripts can be extraordinarily valuable for some people (I wish I had had this capability when I worked as a Vice President at HP). Whether it’s sufficient to justify the investment in Copilot is a question that only you can answer.
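The licensing math above is easy to sketch. This small Python helper (the function name is mine; the $30/user/month price and 12-month commitment come from the text) computes the minimum outlay for a pilot of any size:

```python
def copilot_license_cost(users: int, price_per_user_month: float = 30.0,
                         commitment_months: int = 12) -> float:
    """Minimum license outlay for a Copilot for Microsoft 365 pilot.

    Assumes the full annual commitment must be paid for every licensed
    user, so a pilot cannot be trimmed mid-term to save money.
    """
    return users * price_per_user_month * commitment_months

# The ten-user test from the text: 10 * $30 * 12 months.
print(copilot_license_cost(10))  # 3600.0
```

Running the numbers like this for different pilot sizes makes it easier to decide how many licenses a trial deployment can justify.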

