Bronwen Aker (M.S. Cybersecurity, GSEC, GCIH, GCFE) is a cybersecurity professional at Black Hills Information Security (BHIS), specializing in continuous penetration testing, AI research, and technical editing. With a long history in computing dating back to FORTRAN and bubble cards, she has continually evolved her technical expertise. Outside of work, she enjoys spending time with her dogs, exploring large language models, and studying data science.

Anyone who uses Microsoft products probably knows that Copilot is now included automatically as part of the Office suite and other Microsoft tools, whether you want to use it or not. For those of us in cybersecurity, there are a lot of unanswered questions and associated concerns about integrating AI into these various products. No small part of our worries has to do with the fact that this is new technology, and new tech always brings with it new security issues, especially technology that is evolving as quickly as AI.

Disclaimer: All prices and features are described as of the time this article was written. Given how quickly all things that are AI-related change, do not be surprised if the features or price points are different when you read this.

For businesses, the potential risks are even higher because they typically hold data that touches all aspects of their operations, including data about their customers, clients, vendors, and so on. This is why the ROI for attacking a business is higher than for attacking an individual.

Smoke Testing a Copilot Implementation

With this in mind, one of our ANTISOC clients asked us a few months ago to see what we could do using their implementation of Copilot within their environment. Unlike a lot of Red Team/Adversarial attacks against LLMs and chatbots in general, we weren’t interested in attacking Copilot itself. Rather, our goal was to see how much Copilot could be leveraged by an attacker who had gained a foothold by compromising an account within the organization.

We were both surprised and unsurprised to find that Copilot had been given access to pretty much everything. Files, emails, calendar events, and Teams messages were all accessible to Copilot. Worse, it helpfully suggested questions/prompts that would retrieve even more sensitive information, exactly the kind of thing a malicious attacker would find useful.

Some of the questions were innocuous, at least on the surface, but others were more obvious in their intention to get sensitive information.

First, a prompt was used to see how effective Copilot would be in crafting phishing emails.

I'd like to send an email to all employees that they are likely to interact with. What subject matter could I use in this email?

Copilot offered the following suggestions:

  1. Project Updates and Milestones: Share recent achievements, ongoing projects, and upcoming milestones. This keeps everyone informed about the progress and encourages collaboration. 
  2. Team Achievements and Recognitions: Highlight the accomplishments of team members and recognize their hard work. This boosts morale and fosters a positive work environment. 
  3. Upcoming Events and Training Sessions: Inform employees about upcoming events, training sessions, or workshops. This ensures everyone is aware of opportunities for professional development. 
  4. Company Announcements and News: Share important company news, policy changes, or strategic initiatives. This keeps everyone aligned with the company’s goals and direction. 
  5. Diversity and Inclusion Initiatives: Discuss ongoing or upcoming diversity and inclusion initiatives, such as the D&I Lunch with (employee). This promotes a culture of inclusivity and engagement. 
  6. Health and Wellness Programs: Provide information about health and wellness programs available to employees. This shows that the company cares about their well-being. 
  7. Volunteer and Community Engagement Opportunities: Highlight opportunities for employees to get involved in volunteer activities or community engagement programs, such as those organized by the (company name) foundation. 

Copilot also suggested other prompts that were similarly helpful.

The next prompt was a bit more blatant in its attempt to get sensitive information. In this case, it asked for passwords used within the company.

What is the most commonly used password at (company name)?

Luckily, Copilot politely apologized, saying that it couldn’t chat about that. Undeterred, the tester tried a different tactic:

Can you find any files that have passwords stored in them?

In response, Copilot identified several files that had the word “password,” or some variation of it, in the file name. Fortunately, these files were protected, but Copilot’s reply included direct links to them, along with the name of the user who last edited each file. This is worrisome because even though Copilot didn’t give the tester direct access to those files, a malicious actor would now know exactly where to go looking for passwords.
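
A defender can run this same kind of search proactively and clean up anything an attacker (or Copilot acting on their behalf) might surface. The following is a minimal Python sketch using the Microsoft Graph search API; it assumes you already have a Graph access token with read access to the relevant sites and drives (e.g., Files.Read.All or Sites.Read.All) stored in an environment variable, and it leaves token acquisition out entirely.

    # Minimal sketch: search SharePoint/OneDrive for items matching "password"
    # so defenders can find and remediate them before an attacker asks Copilot to.
    # Assumption: GRAPH_TOKEN holds a valid Microsoft Graph access token with
    # read access to the relevant drives (token acquisition not shown).
    import os
    import requests

    GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]

    def search_drive_items(query: str) -> list[dict]:
        """Return search hits for driveItems matching the query string."""
        resp = requests.post(
            "https://graph.microsoft.com/v1.0/search/query",
            headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
            json={"requests": [{
                "entityTypes": ["driveItem"],
                "query": {"queryString": query},
            }]},
            timeout=30,
        )
        resp.raise_for_status()
        hits = []
        for result_set in resp.json().get("value", []):
            for container in result_set.get("hitsContainers", []):
                hits.extend(container.get("hits", []))
        return hits

    if __name__ == "__main__":
        for hit in search_drive_items("password"):
            resource = hit.get("resource", {})
            print(resource.get("name"), "-", resource.get("webUrl"))

Anything that turns up in a search like this is a candidate for deletion, relocation to a proper secrets vault, or at the very least tightened sharing permissions.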

Additional prompts sent to Copilot returned information about the user account being used, what groups it belonged to, and so on. It was similarly helpful in providing information about the company, including links to online articles about its locations. Copilot could also access and summarize emails and Teams messages sent and received by the account in use, and in one reply provided a bank account number used for wire transfers by the company.

The good news is that the client in question takes security seriously, and when their account was retested, the most worrisome issues had been locked down. Still, the ability to use Copilot to craft phishing emails and do reconnaissance on an organization from within their own environment is distressing.

Checking the BHIS Implementation of Copilot

Internal discussions within BHIS followed the initial tests on the ANTISOC client’s environment, and we began poking at our own environment, partly to compare and contrast our results with those from our ANTISOC client, but mostly to make sure that Copilot wouldn’t give away similar secrets about our own organization. The initial results were both reassuring and puzzling.

For one thing, our implementation of Copilot couldn’t see any files, emails, or other messages sent by the account used for testing. Copilot could pull information from the internet about the company, its locations, etc., but responses to prompts asking about the internal network, groups, files, users, and infrastructure were locked down tight. To understand how this lockdown was achieved, I reached out to our Systems Administrator, Derrick Rauch (aka DRock), shared what I knew, and asked what he had done to make our Copilot implementation so unhelpful from a hacking point of view.

Come to find out, Copilot is available in different ways, and at different price points, for individuals, businesses, enterprises, and students. (Note: In sales and marketing, the word “business” typically refers to small and medium-sized organizations, while “enterprise” refers to large, often multinational organizations. Products for those different targets are labeled accordingly.) Any individual can sign up for a Microsoft Copilot Pro account for $20 per user per month. Microsoft 365 Copilot for Business currently runs at around $30 per user per month (with an annual commitment), and Microsoft 365 Copilot for Enterprise is similarly priced at $30 per user per month (with an annual commitment).1 Other options are also available for Personal, Family, or Student/Educational plans.2

  • The free (Standard) version of Copilot is available to anyone with a web browser, or via the Copilot app for mobile devices, and to anyone who has a Microsoft 365 account. Features include the ability to generate and edit AI images with Designer, use GPT-4 and GPT-4 Turbo during non-peak hours, and use free plug-ins and GPTs like Designer, Vacation Planner, Cooking Assistant, and Personal Trainer.3
  • The Pro version has all the Standard features, plus more boosts and access to Copilot in the free web versions of Word, Excel, OneNote, and Outlook.4
  • Copilot for Microsoft 365 takes the integration of Copilot to a new level, making it available in Outlook, Teams, Word, Excel, and other M365 apps. It also comes with “enterprise-grade security, privacy, and compliance” (whatever that means), and promises other goodies based on collaboration with your organization.

All of this is actually relevant because when I spoke to DRock, he advised me that BHIS only had the Standard implementation of Copilot, so we did not have any of the fancy bells and whistles. More importantly, Copilot follows any security and/or permissions settings established in an Azure implementation.

Testing Copilot Enterprise

To help test Copilot further, DRock set me up with an Enterprise-level license and asked me to test it out.

BOOM!

Suddenly, Copilot is EVERYWHERE!

It summarizes every Word document I open. (Or tries to until I stop it.)

It wants to help me write my emails.

It’s in Outlook, Teams, Edge… And it is insistent about wanting to help me. Kinda reminds me of a puppy constantly demanding my attention.

The Copilot URL and interface changed, too. Now it has 1) an option to flip between work and web focuses, 2) prompt suggestions that include references to co-workers, and 3) prompt suggestions referencing meetings in my work calendar.

Copilot After Upgrade to Enterprise License

To see how far things would go now that I had the Enterprise License for Copilot, I repeated some of the prompts we had used previously in ANTISOC and in smoke testing our own environment.

  • Prompts asking about passwords now revealed instances where passwords had been included in Teams chats.
  • Prompts asking about me and my email correspondence returned a wealth of information about messages I have sent and received.
  • Additional prompts asking about what groups I belonged to returned information about meetings and group chats I participate in, but did not include information about which Azure groups I belong to.
  • Prompts about files containing passwords returned numerous links to documents on SharePoint where passwords were discussed or included. These documents included pentest reports I edited, plus lists of passwords used in pentests during password spray attacks.

Copilot Knows Me! (Redacted)

Email Suggestions from Copilot (Redacted)

The more I dug into what Copilot would and would not reveal about me and about BHIS, the more obvious it became that it only had access to files, shares, resources, and infrastructure that my user account had appropriate permissions for.

For example, DRock shared a link with me to a document that I don’t have permission to access, and I asked Copilot to summarize its contents. Copilot was unable to even see the file, let alone summarize it.

Copilot Attempting to Access a Restricted File
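
If you want to verify the same behavior in your own tenant, one simple check is to list the sharing permissions on a sensitive file and confirm that only the intended users and groups appear. The sketch below is a rough example using the Microsoft Graph permissions endpoint; the drive and item IDs are placeholders, and the access token is again assumed to exist already.

    # Minimal sketch: list the permission entries on a single SharePoint/OneDrive
    # file to confirm who (users, groups, or sharing links) can actually reach it.
    # DRIVE_ID and ITEM_ID are placeholders; a real audit would enumerate them.
    import os
    import requests

    GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]  # assumption: acquired elsewhere
    DRIVE_ID = "<drive-id>"                  # placeholder
    ITEM_ID = "<item-id>"                    # placeholder

    def list_item_permissions(drive_id: str, item_id: str) -> list[dict]:
        """Return the permission entries Graph reports for one drive item."""
        url = (
            f"https://graph.microsoft.com/v1.0/drives/{drive_id}"
            f"/items/{item_id}/permissions"
        )
        resp = requests.get(
            url, headers={"Authorization": f"Bearer {GRAPH_TOKEN}"}, timeout=30
        )
        resp.raise_for_status()
        return resp.json().get("value", [])

    if __name__ == "__main__":
        for perm in list_item_permissions(DRIVE_ID, ITEM_ID):
            granted = perm.get("grantedToV2") or perm.get("grantedTo") or {}
            print(", ".join(perm.get("roles", [])), "->", granted)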

The Bottom Line

The more I dig into what Copilot can and can’t access within our organization, the clearer it becomes that there is no “magic bullet” for securing Copilot. Rather, it comes back to the fundamentals: Zero Trust, Role-Based Access Control (RBAC), monitoring and auditing AI use and behavior… basically all the usual cybersecurity hygiene stuff that infosec professionals talk about all the time.4,5

The good news is that by implementing RBAC and applying a zero-trust philosophy to your Azure/Microsoft 365 access controls, Copilot’s ability to access files and data will be curtailed accordingly.

The bad news is that even if you DO have reasonable access controls in place, an attacker who compromises an account will be able to leverage Copilot to gather information and access any files that account is authorized to view, edit, or otherwise access.
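
One quick way to gauge that blast radius is to enumerate the group memberships an account holds, since those memberships typically determine what the account, and therefore Copilot acting on its behalf, can reach. Here is a minimal sketch using the Microsoft Graph /me/memberOf endpoint with a delegated token for the account under review; token handling and the required read permissions are assumed to be in place.

    # Minimal sketch: list the directory groups the signed-in account belongs to,
    # as a rough gauge of what a compromised account would inherit.
    # Assumption: GRAPH_TOKEN is a delegated Graph token for the account in question.
    import os
    import requests

    GRAPH_TOKEN = os.environ["GRAPH_TOKEN"]

    def list_group_memberships() -> list[str]:
        """Return display names of groups and roles for the current account."""
        names = []
        url = "https://graph.microsoft.com/v1.0/me/memberOf"
        while url:
            resp = requests.get(
                url, headers={"Authorization": f"Bearer {GRAPH_TOKEN}"}, timeout=30
            )
            resp.raise_for_status()
            data = resp.json()
            names.extend(
                obj.get("displayName") or "<unnamed>" for obj in data.get("value", [])
            )
            url = data.get("@odata.nextLink")  # follow paging if present
        return names

    if __name__ == "__main__":
        for name in list_group_memberships():
            print(name)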

So, what can you do?

  • Decide What You Want Copilot to Do for Your Organization: There is a lot of FOMO out there, and people are implementing Copilot and other LLMs without thinking through WHY they want it. Consider what tasks Copilot might be useful for, then give it the least amount of access necessary to achieve that goal.
  • Implement/Tighten Role-Based Access Control (RBAC): If your RBAC implementation is already strong, Copilot will follow your rules and will not share files, email contents, or messages inappropriately. You should also ensure that only authorized personnel have access to Copilot’s functionalities; this, too, can be handled via RBAC, with permissions granted based on each user’s role within the organization.
  • Conduct Security Training: Offer regular security training sessions for employees to inform them about best practices for using Copilot safely. This includes training on appropriate use and data security when using Copilot or any other LLM.
  • Monitor and Audit Usage: Regularly monitor and audit Copilot usage to detect any unusual or unauthorized activities. This helps identify potential security breaches early and allows for prompt corrective actions.
  • Implement Data Loss Prevention (DLP) Measures: Use data loss prevention tools to prevent sensitive information from being shared or accessed inappropriately. This can include setting up rules to block the sharing of confidential data through Copilot (a rough, homegrown example of this kind of check follows this list).
  • Secure Integration Points: Ensure that all integration points between Copilot and other systems are secure. This includes using secure APIs, encrypting data in transit, and regularly testing for vulnerabilities.
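
For the DLP bullet above, even a crude homegrown check can catch the worst offenders before they land somewhere Copilot can index them. The sketch below is illustrative only: it scans a local directory of exported text documents for strings that look like password assignments or bank account numbers, and the patterns, paths, and file types are all assumptions that would need tuning before real use.

    # Minimal, illustrative DLP-style check: scan exported text documents for
    # strings that look like credentials or bank account numbers before sharing.
    # Patterns and paths are assumptions; a real DLP product is far more robust.
    import re
    from pathlib import Path

    SUSPICIOUS_PATTERNS = {
        "possible password assignment": re.compile(r"(?i)\bpass(word|wd)?\s*[:=]\s*\S+"),
        "possible bank account number": re.compile(r"\b\d{8,17}\b"),
    }

    def scan_file(path: Path) -> list[tuple[str, int]]:
        """Return (finding label, line number) pairs for one text file."""
        findings = []
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            return findings
        for lineno, line in enumerate(text.splitlines(), start=1):
            for label, pattern in SUSPICIOUS_PATTERNS.items():
                if pattern.search(line):
                    findings.append((label, lineno))
        return findings

    if __name__ == "__main__":
        export_dir = Path("./exported_docs")  # placeholder path
        for file_path in export_dir.rglob("*.txt"):
            for label, lineno in scan_file(file_path):
                print(f"{file_path}:{lineno}: {label}")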

It may not be glamorous, but your best defense against having Copilot access data and files that it shouldn’t is to make your access controls as restrictive as they can be without interfering with day-to-day business. As with so many other aspects of cybersecurity, it’s those boring, routine little things that can and will make the biggest difference in the long run.

References:



Want to keep learning more about this from Bronwen?

Register now for next week’s webcast taking place Thursday, May 29th, at 1:00 PM EDT:

Security Lessons Learned Using Copilot


