Copilot’s Automatic Summary for Word Documents
Thu, 05 Sep 2024

Automatic Document Summary in a Bulleted List

Last week, I referenced the update for Word where Copilot for Microsoft 365 generates an automatic summary for documents. This is covered in message center notification MC871010 (Microsoft 365 roadmap item 399921). Automatic summaries are included in Copilot for Microsoft 365 and Microsoft Copilot Pro (the version that doesn’t ground prompts using Graph data).

As soon as I published the article where I referred to the feature, it turned up in Word. Figure 1 shows the automatic summary generated for a document (in this case, the source of an article).

Figure 1: Copilot generates an automatic document summary

The summary is the same output as the bulleted list Copilot generates if you open the Copilot pane and ask it to summarize the document. Clicking the Ask a question button opens the Copilot pane with the summary prepopulated, ready for the user to delve deeper.

The summary is only available after a document is saved and closed. The next time someone opens the document, the summary pane appears at the top of the document and Copilot generates the summary. The pane remains at the top of the document and doesn’t appear on every page. If Copilot thinks it necessary (for instance, if more text is added to a document), it displays a Check for new summary button to prompt the user to ask Copilot to regenerate the summary.

Apart from removing the Copilot license from an account (in which case the summaries don’t appear), there doesn’t seem to be a way to disable the feature. You can collapse the summary, but it’s still there and can be expanded at any time.
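If stripping the license is the route you take, the job is easy to script. Here’s a minimal sketch using the Microsoft Graph PowerShell SDK; the SKU part number for Copilot is an assumption on my part, so check the output of Get-MgSubscribedSku for the exact value in your tenant.

```powershell
# Minimal sketch: remove the Copilot license from an account with the
# Microsoft Graph PowerShell SDK. The 'Microsoft_365_Copilot' SKU part
# number is an assumption - verify it with Get-MgSubscribedSku.
Connect-MgGraph -Scopes 'User.ReadWrite.All'
$copilotSku = Get-MgSubscribedSku -All | Where-Object SkuPartNumber -eq 'Microsoft_365_Copilot'
# Removing the license removes automatic summaries along with everything else Copilot does
Set-MgUserLicense -UserId 'Jane.Doe@office365itpros.com' `
   -RemoveLicenses @($copilotSku.SkuId) -AddLicenses @()
```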

Summarizing Large Word Documents

When Microsoft launched Copilot support for Word, several restrictions existed. For instance, Word couldn’t ground user prompts against internet content. More importantly, summarization could only handle relatively small documents. The guidance was that Word could handle documents with up to 15,000 words but would struggle thereafter.

This sounds like a lot, and it’s probably enough to handle a large percentage of the documents generated within office environments. However, summaries really come into their own when they extract information from large documents such as contracts and plans. The restriction, a consequence of the size of the prompt that could be sent to the LLM, proved to be a big issue.

Microsoft responded in August 2024 with an announcement that Word can now summarize documents of up to 80,000 words. Microsoft describes the new limit as four times greater than the previous one. The increase is rolling out to the desktop, mobile, and browser versions of Word. For Windows, it is available in Version 2310 (Build 16919.20000) or later.

Processing Even Larger Word Documents

Eighty thousand words sounds like a lot. At an average of 650 words per page, that’s 123 pages filled with text. I wanted to see how Copilot summaries coped with larger documents.

According to this source, the maximum size of a text-only Word document is 32 MB. With other elements included, the theoretical size extends to 512 MB. I don’t have documents quite that big, but I do have the source document for the Office 365 for IT Pros eBook. At 1,242 pages and 679,800 words, including many figures, tables, cross-references, and so on, the file size is 29.4 MB.
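Before pointing Copilot at a big file, it’s worth measuring it first. This rough sketch uses Word COM automation (Windows with Word installed; the path is a placeholder) to report file size and word count against the documented limit.

```powershell
# Rough pre-flight check against the 80,000-word summarization limit.
$path = 'C:\Temp\Office365forITPros.docx'
"File size: {0:N1} MB" -f ((Get-Item $path).Length / 1MB)

$word = New-Object -ComObject Word.Application
$word.Visible = $false
$doc = $word.Documents.Open($path, $false, $true)   # ConfirmConversions:$false, ReadOnly:$true
$wdStatisticWords = 0                               # WdStatistic enumeration value
$words = $doc.ComputeStatistics($wdStatisticWords)
"Word count: {0:N0}" -f $words
if ($words -gt 80000) { "Over the 80,000-word limit - expect a partial summary or a failure" }
$doc.Close($false)                                  # close without saving
$word.Quit()
```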

Copilot attempted to generate a summary for Office 365 for IT Pros but failed. This wasn’t surprising because the file is so much larger than the maximum supported.

The current size of the Automating Microsoft 365 with PowerShell eBook file is 1.72 MB and spans 113,600 words in 255 pages. That’s much closer to the documented limit, and Copilot was able to generate a summary (Figure 2).

Figure 2: Automatic document summary generated for the Automating Microsoft 365 with PowerShell eBook

Although the bulleted list contains information extracted from the file, it doesn’t reflect the true content of the document because Copilot was unable to send the entire file to the LLM for processing. The bulleted list comes from the first two of four chapters and completely ignores the chapters dealing with the Graph API and Microsoft Graph PowerShell SDK.

Summaries For Standard Documents

Microsoft hasn’t published any documentation that I can find for Copilot’s automatic document summary feature. When it appears, perhaps the documentation will describe how to disable the feature for those who don’t want it. If not, we’ll just have to cope with automatic summaries. At least they will work for regular Word documents of less than 80,000 words.


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Microsoft Grounds Copilot Apps with Graph and Web Content
Mon, 25 Mar 2024

Office Apps Get Better Grounding in Copilot for Microsoft 365

Message center notification MC734281 (12 March 2024) might have passed by without attracting much attention unless you’re particularly interested in Copilot for Microsoft 365. The notification informs tenants that Word, Excel, PowerPoint, and OneNote will ground user prompts by reference to enterprise data and the web. As Microsoft notes, this is similar to what happens when users interact with Copilot for Microsoft 365 chat.
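If you like to track notifications like MC734281 programmatically, a quick sketch with the Graph PowerShell SDK’s service announcement cmdlets does the job (cmdlet and scope names as I understand them; verify against your SDK version).

```powershell
# Fetch the details of message center notification MC734281.
Connect-MgGraph -Scopes 'ServiceMessage.Read.All'
Get-MgServiceAnnouncementMessage -ServiceUpdateMessageId 'MC734281' |
    Select-Object Title, Services, StartDateTime, LastModifiedDateTime
```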

Grounding against enterprise data means that when Copilot responds to user prompts, it will seek additional context by attempting to find relevant information in Microsoft 365 repositories using Graph requests. Web grounding means that Copilot will use Bing search to find relevant information from sites within and outside the enterprise. The fact that major apps will start to use grounded requests from April 2024 might come as a surprise. After all, Microsoft has long cited Copilot’s ability to use the “abundance of data” stored in Microsoft 365 as a major advantage of Copilot for Microsoft 365 over other AI tools that don’t have access to Microsoft 365 repositories.

The rollout starts with Word (Windows and Online) and progresses to PowerPoint, Excel, and OneNote. Microsoft expects to complete the deployment by September 2024.

The Importance of Grounding

Microsoft explains that grounding is “the process of using large language models (LLMs) with information that is use-case specific, relevant, and not available as part of the LLM’s trained knowledge.” In other words, if you ask Copilot for Microsoft 365 to do something and grounding doesn’t happen, it relies on the user prompt to query the LLM.

Until now, users have been able to ground prompts in apps like Word by including up to three reference documents in the prompt. Let me illustrate the importance of grounding by showing an example of two briefing notes generated by Copilot in Word about the Midnight Blizzard attack against Microsoft in January 2024. Copilot generated the first briefing note without any reference documents. Because it couldn’t search the Graph or web for relevant information, the grounding of the prompt was poor, and Copilot could only use whatever information is in the LLM.

As shown in Figure 1, the generated text included several inaccurate statements (hallucinations), including the remarkable assertion that the attack led to a drop of $400 billion in Microsoft’s market value, together with a declaration that the attack had deprived millions of Microsoft cloud users of access to services.

Figure 1: Briefing note about Midnight Blizzard generated by Copilot for Microsoft 365 (without reference documents)

If some relevant reference documents are included in the prompt, Copilot’s generated text becomes more accurate and balanced (Figure 2).

Figure 2: Briefing note about Midnight Blizzard generated by Copilot for Word with reference material

The important point here is that after Microsoft updates Copilot to allow the Office apps to ground prompts using Graph and web material, the chances of Copilot generating absolute rubbish lessen considerably. That is, if Copilot can find relevant information through its searches. Adding reference documents to prompts in Copilot for Word will generate even better results because the reference documents should give Copilot a more precise context to work with.

Microsoft says that Graph grounding is enabled for all user prompts and that Copilot requests will use “the file context” (whatever file is open at the time) plus web searches as well. Copilot for Microsoft 365 chat uses Graph and web lookups today.

The Quality of AI-Generated Text

In some respects, I was shocked that it has taken so long for Microsoft to ground Copilot requests in these important apps. Copilot for Microsoft 365 is evolving rapidly, but the ability to generate high-quality text at general availability seems like an essential rather than a nice to have feature. I’ve always been suspicious about the quality of the text generated by Word and this revelation certainly explains a lot.

Take Your Time

The advice of Directions on Microsoft analyst Wes Miller that organizations should pace themselves and understand exactly what they are buying before they invest in expensive Copilot licenses is accurate. Things are changing, and the hyperbole around Copilot is like a dust storm that obscures detail. Why rush in where angels fear to tread?

Before making your mind up about Copilot, take the time to read the article posted by MVP Joe Stocker where he reports a drop-off of Copilot activity after the novelty effect of asking the AI to perform tasks wears off. Although the sample size was small, this emphasizes the need to support users on their Copilot journey, especially as important new functionality like Graph and web grounding appears.

And if you attend the Microsoft 365 Conference in Orlando at the end of April, make sure that you come to my session about not letting Copilot for Microsoft 365 become a vanity project. You might even enjoy what I have to say!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across the Microsoft 365 ecosystem, including in Copilot. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Stopping Copilot Access to SharePoint Online Sites and Document Libraries
Wed, 21 Feb 2024

Exclude SharePoint Site from Copilot by Blocking Search Indexing

One of the fundamental concepts underpinning Copilot for Microsoft 365 is the use of Graph queries to find information stored in Microsoft 365 to help ground user prompts. Grounding is the process of providing additional context to make it easier for Copilot to return high-quality responses to user prompts. For instance, if someone asks Copilot to write a briefing note about Office 365, Copilot first queries Microsoft 365 repositories like SharePoint Online to discover what information the user already has about the topic. Optionally, if allowed by the tenant, Copilot can query the web to find additional information.

After gathering information, Copilot refines the prompt and sends it to the Large Language Model (LLM) for processing. Eventually, possibly after further refinement, Copilot returns the response to the user.

Copilot Access to Content Stored in Microsoft 365 Repositories

One of the things you quickly learn about Copilot for Microsoft 365 is that the quality and reliability of generated text is highly dependent on the availability of information. For instance, Copilot is very good at summarizing Teams meetings because it has the meeting transcript to process. However, if you ask Copilot to draft text about a topic where it cannot find anything in Microsoft 365 to ground the prompt, Copilot will certainly generate a response, but the text might not be as useful as you expect. The output will follow the requested format (a report, for instance), but the content is likely to surprise because it probably comes from a web search that might or might not retrieve useful information.

Users can guide Copilot for Word by providing up to three reference documents. In effect, the user instructs Copilot that it should use the reference documents to ground the prompt. This works well, unless the documents you want to use are large (I am told that Microsoft is increasing the maximum supported size for reference documents).

All of this means that anyone contemplating a deployment of Copilot for Microsoft 365 should store information within Microsoft 365 to create what Microsoft calls an “abundance of data” for Copilot to consume. SharePoint Online and OneDrive for Business are prime repositories, but it’s possible that some SharePoint Online sites contain confidential or other information that the organization doesn’t want Copilot to consume.

Remember, Copilot can only use information that the signed-in account using Copilot can access. An account that has access to a site holding confidential information could find that Copilot retrieves and uses that information in its responses. The user is responsible for checking the text generated by Copilot, but accidents do happen, especially when time is short to get a document out.

Preventing Copilot Access to Sensitive Information

Two methods help to avoid accidental disclosure of confidential information. First, you can protect files with sensitivity labels. If Copilot consumes protected documents, it applies the same sensitivity label to the output.

However, not every organization uses sensitivity labels. In this situation, an organization can decide to exclude selected SharePoint Sites from indexing (Figure 1) by both Microsoft Search and the semantic index. If content is not indexed, it can’t be found by queries and therefore cannot be consumed by Copilot.

Figure 1: Exclude SharePoint Site from Copilot Access by Stopping it Appearing in Search Results

But what happens if you have a SharePoint site with several document libraries and want to make the content available from some libraries and not others? The answer is the same, except that the exclusion from search results is applied through the advanced settings of the document library (Figure 2).

Figure 2: Settings for a document library

The downside of excluding sites or libraries from search results is that people can’t use SharePoint search to find documents.
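Both exclusions can be scripted too. Here’s a minimal PnP PowerShell sketch that sets the underlying NoCrawl property for a site and for a library; the site URL and library name are placeholders.

```powershell
# Minimal sketch with PnP PowerShell. NoCrawl is the property behind the
# search visibility settings shown in Figures 1 and 2.
Connect-PnPOnline -Url 'https://tenant.sharepoint.com/sites/Confidential' -Interactive

# Exclude the whole site from search results (and therefore from Copilot)
$web = Get-PnPWeb
$web.NoCrawl = $true
$web.Update()
Invoke-PnPQuery

# Or exclude a single document library
$list = Get-PnPList -Identity 'Board Documents' -Includes NoCrawl
$list.NoCrawl = $true
$list.Update()
Invoke-PnPQuery
```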

Testing Excluded Sites and Document Libraries

How do you know site and document library exclusions work? The easiest way is to create a document with an unusual phrase in the excluded site or library and then attempt to use it with Copilot for Word. I created a document about ‘Project Derrigimlagh’ and included the phrase ‘wicked worms’ several times in the content. I then created a new Word document and added the document from the excluded library as a reference (Figure 3).

Figure 3: Selecting a reference file for Copilot for Word

You might ask why the document can be added as a reference. The dialog shows recent documents, and the document is in this category, so it shows up. However, when Copilot attempts to consume the document, it cannot access the content. The result is that the prompt cannot be grounded and Copilot flags this as a failure to generate high-quality content (Figure 4). This is a general-purpose error that Copilot issues anytime it believes that it cannot respond to a prompt.

Figure 4: Copilot for Word can’t generate high-quality content

Interestingly, when I removed the reference document and reran the prompt, Copilot generated text explaining the potential use of wicked worms as a biofuel source. This is emphatically not the content stored in the excluded document library. The information about Derrigimlagh came from the internet, and the idea of making wicked worms into a biofuel source probably derives from published material about using worms in a biorefinery. In any case, it’s a good example of how AI-based text generation needs to be treated with caution.

Use Sensitivity Labels If Possible

If an organization has implemented sensitivity labels, I think this is a better method to protect confidential material, if only because of the persistence of labels to generated documents. You can also define a default sensitivity label for a document library to make sure that everything stored in the library is protected and use auto-label policies to find and protect confidential material stored across all sites.
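As a hedged sketch of the library-level default, recent PnP.PowerShell builds expose a parameter for it on Set-PnPList (check that the parameter exists in your module version; the site URL, library, and label name are placeholders).

```powershell
# Hedged sketch: set a default sensitivity label on a document library.
Connect-PnPOnline -Url 'https://tenant.sharepoint.com/sites/Confidential' -Interactive
Set-PnPList -Identity 'Board Documents' -DefaultSensitivityLabelForLibrary 'Confidential'
```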

In a nutshell, sensitivity labels are more flexible and powerful, but it’s nice to have the backup of being able to exclude complete sites and individual document libraries. Just another thing to consider in a Copilot deployment!


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.

Don’t Feed Large Reference Documents to Copilot for Word
Tue, 02 Jan 2024

Copilot for Word Reference Documents Can be Too Large to Process

I was happily using Copilot for Word to generate, refine, and summarize text when I ran into an issue that afflicts all AI technologies based on large language models (LLMs): the prompts generated for the LLM to process support a limited number of characters. I can’t say precisely what that limit is because I can’t find any documentation for the issue, but I can say that incorporating a large reference document into a prompt causes Copilot some difficulty.

Take the prompt shown in Figure 1. As a reference document, I added a 518 KB 27-page Word document which happens to be the first chapter of the Office 365 for IT Pros eBook. I asked Copilot to use the information to help it generate a brief overview of the value Office 365 brings to customers.

Figure 1: Adding a reference document to a Copilot for Word prompt

Copilot worked away and began to generate text. After several seconds, the output was ready but came with the caveat that Copilot couldn’t process the reference document fully (Figure 2). The output generated by Copilot is “based only on the first part of those files.” In some cases, this might not make a difference, but the latter half of the reference document contained information that I thought Copilot should include.

Copilot for Word reports a reference document is too long.
Figure 2: Copilot for Word reports a reference document is too long

The question is why Copilot can’t use the full content of large reference documents. Here’s what I think is happening.

Grounding and Retrieval Augmented Generation

Copilot for Word uses reference documents to help ground the prompt entered by the user with additional context. In other words, the content of the reference documents helps Copilot understand what the user wants. Copilot uses a technique called Retrieval Augmented Generation (RAG). According to an interesting Microsoft article about grounding LLMs, “RAG is a process for retrieving information relevant to a task, providing it to the language model along with a prompt, and relying on the model to use this specific information when responding.”

Limits exist in grounding large language models. Copilot allows users to include a maximum of 2,000 characters in their prompts. Copilot adds content extracted from the reference documents and other information found in the semantic index to the prompt to provide the context for the LLM to process. The semantic index holds information about documents available to the user stored in SharePoint Online or OneDrive for Business or ingested via a Graph Connector. The maximum size of a prompt must cover whatever the user enters plus the information extracted from reference documents during grounding.

I have very large Word documents of well over 1,000 pages, but it would be unreasonable to tell Copilot to use these files to ground prompts. There’s too much content covering too many varying topics for Copilot to make much sense of such beasts.

Good Copilot for Word Reference Documents

A good reference document is one whose content is adjacent to the topic you ask Copilot to generate text about. Ideally, the document is well structured by being divided into clear sections that cover different points. A human should be able to scan the document quickly and tell you what it’s about. My tests indicate that Copilot for Word generates the best results when reference documents are structured, contain material pertinent to the prompt, and are less than 10 pages. Your mileage might vary.

Although chapter 1 of the Office 365 for IT Pros eBook is packed full of useful and pertinent information, it’s just too much for Copilot to consider when attempting to respond to the user prompt. Copilot would be much happier if I provided it with a five-page overview of Office 365.

Other Copilots Have Limits Too

Encountering difficulties using long reference documents is similar to the limit that exists when Copilot for Outlook attempts to summarize a long email thread. According to the support article covering the topic, “In the case of a very long thread, not all messages may be used, as there are limitations of how much can be passed into the LLMs.”

Copilot for GitHub also has limits, as attested in many questions developers ask about its use (here’s an example).

In other Copilots, the type of information being processed might reduce the chance of Copilot running into issues. For instance, when Copilot for Teams summarizes the discussion from a meeting, it uses the meeting transcription as its basis. Even a very long meeting is unlikely to trouble Copilot too much because, assuming the meeting has an agenda, the discussion flows from point to point and has a reasonable structure.

Preparing for Copilot

All of which brings me back to a central point about preparing for a Copilot for Microsoft 365 deployment. You can deploy all the software you want, including the tools available in Syntex (soon to be SharePoint Premium) to prepare content and Microsoft Purview to protect content. But at the end of the day, Copilot will be asked to process documents created by human beings. Whether those documents make good reference documents remains to be seen.

It’s a hard nut to crack. Humans never wrote documents to be processed by AI. They created documents to meet goals, explain projects, lay out solutions, and so on. Sometimes the documents are well-structured and easily navigated. Other times they’re a challenge for even their authors to interpret, especially as time goes by. Some documents remain accurate even after years and some are outdated in the weeks following publication. It will be interesting to see how Copilot copes with the flaws and imperfections of human output.


Insight like this doesn’t come easily. You’ve got to know the technology and understand how to look behind the scenes. Benefit from the knowledge and experience of the Office 365 for IT Pros team by subscribing to the best eBook covering Office 365 and the wider Microsoft 365 ecosystem.

Using Microsoft 365 Copilot for Word
Thu, 14 Dec 2023

Copilot for Word Will Help Many Authors Create Better Text

As folks might know, I write quite a few articles about technical topics. Recently, I’ve had the assistance of Microsoft 365 Copilot in Word. Not because I felt the need for any help but rather in the spirit of discovering if Copilot lives up to its billing of ushering in “a new era of writing, leveraging the power of AI. It can help you go from a blank page to a finished document in a fraction of the time it would take to compose text on your own.”

Good technical articles tell a story. They start by introducing a topic and explaining why it’s of interest before progressing to a deeper discussion covering interesting facets of the topic. The final step is to reach a conclusion. Copilot for Word aims to help by assisting authors to structure their text, write concise sentences, and start drafting based on a prompt submitted by the author.

Starting Off with Copilot for Word

Writing the first few sentences can be the hardest part of an article. To help, Copilot for Word can generate text by responding to a user prompt. A prompt is how you tell Copilot what to do, and it can be up to 2,000 characters long.

Crafting good prompts is a skill, just like it is to build good keyword searches of the type used to find information with Google or another search engine. Figure 1 shows my first attempt at a prompt for this article.

Figure 1: Prompting Copilot for Word

I wasn’t happy with the content generated by Copilot because it read like the text of a marketing brochure. This isn’t altogether surprising given two facts. First, my prompt wasn’t precise enough. Second, generative AI tools like Copilot can only create text based on previous content. The response obviously originated from Microsoft marketing content that lauded the powers of Copilot.

A second attempt was more concise and precise (Figure 2) and produced more acceptable text (Figure 3).

Figure 2: Refining a prompt for Copilot for Word
Figure 3: The text generated by Copilot for Word

Although better, I would never use the text generated by Copilot. It has value (especially the last three points), but it’s just not my style. The point to remember is that Copilot supports refinement of its output through further prompts. The text shown in Figure 3 is the result of asking Copilot to “make the text more concise.”

Using Reference Documents

A prompt can include links (references) for up to three documents, which must be stored in a Microsoft 365 repository. Copilot uses references to “ground” the prompt with additional context to allow it to respond to prompts better. When starting to write about a new topic, you might not have a usable reference, but in many business situations there should be something that helps, such as a document relating to a project or customer. The prompt shown in Figure 4 asks Copilot to write an article about the January 2024 update for the Office 365 for IT Pros eBook and includes a reference document (an article about the December 2023 update).

Figure 4: Including a reference document in a Copilot for Word prompt

The generated text (Figure 5) follows the structure of the reference document, and I have no complaints about the opening paragraph. Copilot even figured out that the January update is #103. The problems mount swiftly thereafter as Copilot’s generated text promises a new chapter on Microsoft Viva and an updated chapter on Copilot for Microsoft 365, neither of which exist. I also don’t know what the integration between Teams and Syntex refers to, and the new Teams Pro license is a predecessor of Teams Premium. Later, we’re told that Microsoft Lists will launch in February 2024. These are Copilot hallucinations.

Figure 5: Copilot generates an article about an Office 365 for IT Pros monthly update

This experience underlines the necessity to check everything generated by Copilot. You have no idea where Copilot might source information and whether that data is obsolete or simply just wrong. Tenants can limit Copilot’s range by preventing it from searching internet sources for information, but even the best corporate information stored in SharePoint Online or OneDrive for Business can contain errors (and often does).

Rewrites with Copilot for Word

Apart from generating text, Copilot for Word can rewrite text. Figure 6 shows a rewrite of the second paragraph from this article. The version generated by Copilot uses the “professional” style (the other styles are “neutral,” “casual,” “concise,” and “imaginative”).

Figure 6: Text rewritten by Copilot for Word

The two versions are reasonably close. I prefer mine because it’s written in my style, but the alternative is acceptable.

Rewrite is useful when reviewing someone else’s text. I often edit articles submitted to Practical365.com for publication. Because authors come from many countries, their level of English technical writing varies greatly. Being able to have Copilot rewrite text often helps me understand the true intent of an author.

The Usefulness of Copilot for Word

I’ve tried many different text proofing tools in Word, from built-in ones like Microsoft Editor to external ones like Grammarly. They all have their pros and cons, and their own quirks. Copilot for Word is more user-friendly and intuitive than any existing tool. Provided users remember to check the generated text carefully, Copilot will help many people write better. The downside is the $30/user/month cost for Microsoft 365 Copilot licenses (currently, you can’t buy a Copilot license just for Word).

Microsoft 365 Copilot obviously covers much more than generating better text with Word. That being said, it’s nice that the integration of AI into one of the more venerable parts of Microsoft 365 works so well.

Summarizing Copilot for Word

It seems apt to close with the summary generated by Copilot for this article (Figure 7). Copilot summarizes documents by scanning the text to find the main ideas. What’s surprising in this text is the inclusion of ideas that are not in the document, such as “What Copilot for Word cannot do.” Copilot cites paragraphs five and six as the source, but neither paragraph mentions anything about weather or visuals, or that Copilot for Word is limited to outputting text in bullet points or paragraphs. This information must have come from the foundational LLMs used by Copilot.

Figure 7: Copilot summary of a document’s content

I’m sure Copilot included the information to be helpful, but it’s jarring to find the AI introducing new ideas in summaries. Oh well, this kind of thing gives people like me something to write about…


So much change, all the time. It’s a challenge to stay abreast of all the updates Microsoft makes across Office 365. Subscribe to the Office 365 for IT Pros eBook to receive monthly insights into what happens, why it happens, and what new features and capabilities mean for your tenant.
