Published on 24 January 2024

Orange trials Copilot, Microsoft 365’s latest AI tool

Since September 2023, 300 employees have been testing Copilot, an AI-powered tool that works alongside the Microsoft 365 suite. The user trial has enabled us to identify the most promising use cases and any potential deployment issues, thanks to the breadth of experience and diversity of the business units involved. Here’s an overview of a project that is as stimulating as it is challenging.


What is Copilot? 

Microsoft 365 is mainly known for its suite of apps, including Teams, Outlook, Word, Excel, and PowerPoint. Since last November, this range of tools has been joined by Copilot, a built-in AI chatbot, which can generate content, combine various interactions, accomplish tasks faster, and provide user assistance. The aim is to make life easier for professionals by creating a more seamless experience when using existing software. 


A broad and representative test across departments

To identify the real potential for progress, we conducted a large-scale experiment involving 300 employees, representing a full range of the Group’s business units and departments. 

The goal is to determine exactly where Copilot can create the most value and to target the lines of business that will benefit the most from a possible deployment.

Pierrick Besson, Orange Consulting, responsible for supporting the testers

From September 2023 to the first half of 2024, everyone involved is testing the various tools and proposed use cases and providing weekly feedback. As real and ongoing engagement is essential to the quality of the survey, some employees have left the program while others have joined it. 

Company and employee security comes first

When it comes to generative AI, security comes first for the company, its employees, and the data used, which is why many precautions have been taken. 

This starts with hosting: Copilot is hosted in Europe alongside Orange’s data, and it can only access data native to each tester’s desktop environment. Everything classified as confidential – for example, sensitive or top-level executive data – has been excluded from the test.

As Pierrick Besson notes, “Security, and the need to understand what was being done and how it was being done, was a key user concern from the beginning. That’s why clear guidelines were defined at the start, so everyone was as well informed as possible.”

The full analysis and outcomes will be available at the end of the test; however, some interesting trends are already emerging. 

A need for support

The initial decision was to let everyone freely access the use cases made available by Microsoft. But to include as many people as possible, explains Pierrick Besson, it was essential to start by properly explaining how AI works, to identify the user benefits and teach everyone how to use it to innovate in their working practices. Open Sessions and special support for each profession were also gradually introduced. 

Proportional ownership of the data processed

Secondly, with developers not included in the trial, the business units using Copilot the most are those that handle the most data – for example, the legal department, which drafts and compares contracts and needs to search for specific clauses.

The tool helps us produce FAQs on contracts to enable operational staff to understand and search for important provisions, for example. The first draft produced by the assistant requires verification and refinement, but the tone and the terms used are accessible and the time saved is significant.

Yannick Jobard, lawyer and tester

The same goes for customer relations, sales teams, and human resources – that is, roles that involve interacting with many people. Norbert Andor, in charge of recruitment and employee experience, has enjoyed using the tool to create and analyze Excel tables, for example, providing effective support when writing job offers and job descriptions.

Limitations when integrating with traditional tools

There are limitations, however. In some cases, users have identified value-creating use cases but have been unable to integrate them into their daily working practices: situations in which Copilot may be useful but cannot yet fully interact with certain software, particularly PowerPoint.

However, the tool is very recent and has been rolled out very quickly. It is normal that it still needs to learn and that various apps have to be adjusted to work well with it.

Norbert Andor, in charge of recruitment and employee experience

The importance of data quality

Another observation, made by Yannick Jobard and echoed by Pierrick Besson, is that it is important to think in advance about the types of data that could be useful, so that use cases can be selected according to how well they meet the teams’ needs. “Each profession will need an investment in terms of time and data,” says Norbert Andor.

A dynamic organization, the key to integration

Finally, we can already emphasize the importance of iterative deployments when it comes to introducing Copilot into the organization and integrating lines of business upstream to develop the system properly. This includes integrating existing business tools into Copilot. 

Positive criteria for successful long-term adoption

The success of our Lead the Future strategic plan will be linked, among other things, to evolving the Group’s enterprise model to make things simpler, faster, and more efficient. The use of generative AI will support this transformation, while keeping people, organizational agility, and simplified processes at its heart. We must ensure any efficiency gained by generative AI is actually felt by employees and enables better quality, simplified processes, and a better overall experience. It’s essential everyone understands the tools and feels reassured about using them.  

“Using the tool will require us to develop certain behaviors and skills,” emphasizes Norbert Andor. “We’ll have to learn to work with it and develop the right lines of prompting and questioning. In addition, everyone must be aware of the biases generated by these assistants and always make sure that when using them, they continue to uphold the company’s priorities, values, and commitments.”

The extensive feedback on user experiences is handled in two ways: the first stream helps to identify use cases, while the second helps to detect and address issues related to ethics, human resources, legal matters, intellectual property, security, and governance.

A wider deployment of the tool is under careful consideration. On the one hand, several other generative AI tools already exist and are securely accessible within the Group – for example, Dinootoo, a conversational agent designed to help Orange discover the power of generative AI. On the other hand, the cost of such a tool means any investment must be weighed against the benefits it can deliver.