Before a public organisation or school implements Microsoft’s artificial intelligence service Copilot, it is worth studying the report ‘Piloting Copilot for Microsoft 365’ from the Norwegian University of Science and Technology (NTNU). As part of the Norwegian Data Protection Authority’s regulatory sandbox, NTNU tested Copilot and published eight main conclusions in early summer 2024 that everyone can benefit from. Below is a thorough summary of those conclusions.
Conclusion 1:
Copilot works best when you already know what you are asking it to help with. It can be a valuable tool for those who already have a good understanding of the task at hand, but a challenge for those tackling a task for the first time.
A thorough understanding of the subject matter you are working on with Copilot means you can easily spot errors in its output. It also means you can give the service more precise instructions and get more out of it: you can get started on a task faster and get help structuring your work. As a sounding board, Copilot can also contribute ideas and angles and suggest improvements. But if you lack that knowledge and are doing a task for the first time, it is a different story: you may end up spending more time quality-assuring what comes out of Copilot than the tool saves you.
Conclusion 2:
Copilot can influence the exercise of public authority. The Nordics have a high level of trust in the public sector and clear public administration rules, and it’s important that tools like Copilot make us better, not worse.
The public sector has a responsibility to avoid errors and must have full control over how a decision is reached. It must be able to account for the data and judgement involved, and the process must be transparent in order to maintain public trust. Copilot can give wrong answers or make erroneous judgements. It is important to recognise this and have mechanisms in place to identify and correct such errors, in other words human control. There is a high risk that the public sector will make mistakes if everything Copilot suggests is taken at face value.
Conclusion 3:
Copilot processes huge amounts of personal data in new and uncontrolled ways and can pose major challenges to individual rights and public administration.
Everything that you as an individual or a public body give Copilot access to can be used and combined in completely new ways. And what happens to all the data that citizens send to the public sector? It is important to conduct a DPIA (Data Protection Impact Assessment) before deployment.
Conclusion 4:
Microsoft’s ecosystem, Microsoft 365 and especially Copilot, is a major challenge to manage and requires specialised staff to keep up with the many ongoing changes.
Microsoft has an “opt out” policy for many of its services. This means that new features, add-ons and extensions are enabled by default, and administrators must actively disable them if they should not be available to users. Keeping track of this requires sufficient staff.
The ecosystem is also very expensive, which is another reason to keep an eye on it. It is a good idea to have an exit strategy from Microsoft, because it is costly and unhealthy to be dependent on a single supplier. An exit strategy lets you stay in control, makes you more flexible and gives your organisation a better negotiating position with the vendor, since you are not locked into one solution.
Conclusion 5:
Copilot is still in development. It is an immature technology that is constantly changing.
An effective way to adapt to evolving products is to make active use of piloting, testing and project methodology. This involves trying out new technologies, such as Copilot, in a controlled environment before rolling them out fully across the organisation.
Conclusion 6:
New tools affect an organisation, and it is important to involve all stakeholders, including trade unions.
New features in established tools can provoke reactions among users. For example, automatic recording of digital meetings, with subsequent transcription and summarisation or interpretation of the content, can be perceived negatively, especially if the content is misrepresented. There is no guarantee that the tool will interpret the content correctly, which can lead to misunderstandings or misinformation. There is also a risk of chilling effects, such as employees being unwilling to attend meetings that are recorded, or behaving differently than they otherwise would.
Conclusion 7:
Copilot is also bossware. Employers can assess employee performance and behaviour based on the written content they have access to, such as files and documents, Teams chats and other Teams content, email correspondence, emotional reactions (emojis), and transcripts from automatically recorded meetings. Employees have no way of knowing whether such an assessment of them has taken place.
It is possible to deal with this problem. In addition to thorough training of all parties, Copilot can, for example, be prevented from accessing areas where the tone is informal or which employees consider private; what these areas are will vary from organisation to organisation. Giving Copilot access to email or team chat, for example, should be carefully considered. Another, probably more controversial, option is to consider whether managers with HR responsibilities should have access to Copilot at all. This may leave managers “technologically behind”, but it could help protect employee privacy and prevent potential misuse of the tool. There also appears to be a tool that can detect whether such questions are being asked, but it was not tested in this project.
Conclusion 8:
Copilot sometimes works really well. But don’t rely blindly on its references.
During the project period there were several “aha” moments where Copilot proved to be an excellent tool. For example, it is good at extracting the essence of large files and compiling it into a new, more focused document, a task that could easily take a person days but that Copilot completes in minutes. This means you can get started on tasks faster. As a tool for writing a first draft, Copilot can be very useful for many. Copilot can provide references, but it is important to note that, as the tool works today, you cannot rely blindly on them.
Co-translated by deepl.com