

How to Use Your LMS to Circulate AI Acceptable Usage Policies

Clare Knight

Commercial Operations Lead

As organisations adopt new technologies such as Generative AI, the need to circulate relevant policies and ensure employees truly understand them has never been greater.

Traditional methods – such as emailing PDFs for digital signature and then storing them in shared areas – are no longer sufficient. They’re hard to find, hard to track and tedious for managers to chase up. They are also easy for recipients to simply ignore.

This is where combining policy distribution with your Learning Management System (LMS) becomes a really useful approach.

Here’s what’s possible using the right LMS to help with your AI Policy Circulation:

Centralised Policy Distribution & Tracking

When configured correctly, using your LMS to distribute and manage AI policies can deliver everything you’d expect from a dedicated policy management solution – and more besides.

Just like any other learning content, your AI policies can be centrally published, version-controlled and targeted to the right audiences.

And, just as you might with mandatory learning, you can associate your AI acceptable usage policies with acknowledgements, audit trails and reminders for individuals who have not responded to your policy review requests within a defined timeframe.

Checking AI Policy Understanding with Quizzes

Importantly, rather than simply requiring a ‘read and understood’ tick in a box, you can use your LMS to reinforce your policy statement with short quizzes or knowledge checks, helping to confirm understanding, highlight key principles, and identify areas where additional guidance or training may be required.

Many LMSs can generate quizzes automatically, producing questions such as:

Q: What is the primary purpose of our AI Policy?

A. To replace staff with AI
B. To support safe and effective use of AI in the business
C. To promote the use of public AI tools
D. To eliminate the need for human oversight
Correct answer: B
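To make this concrete, a knowledge-check item like the one above might be represented and scored along these lines. This is a minimal sketch only – your LMS will have its own question format and grading engine:

```python
from dataclasses import dataclass

@dataclass
class QuizItem:
    """One multiple-choice knowledge-check question."""
    question: str
    options: dict[str, str]   # option key -> option text
    correct: str              # key of the correct option

    def score(self, answer: str) -> bool:
        # Case-insensitive match against the correct option key
        return answer.strip().upper() == self.correct


item = QuizItem(
    question="What is the primary purpose of our AI Policy?",
    options={
        "A": "To replace staff with AI",
        "B": "To support safe and effective use of AI in the business",
        "C": "To promote the use of public AI tools",
        "D": "To eliminate the need for human oversight",
    },
    correct="B",
)
```

Passing a learner's answer to `item.score()` then tells you whether understanding was demonstrated or whether follow-up guidance is needed.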

Tuning AI Training According to Audience

You can also pair policy circulation with targeted training on AI. This might include role-specific learning for different audiences, ensuring employees receive guidance that reflects how they will actually use AI in their day-to-day work.

Where your LMS supports it, AI learning can also be embedded directly into relevant workspaces – for example, delivering Copilot training in ‘bite-sized chunks’ within the relevant Microsoft Teams – helping to reinforce good practice at the point of use rather than relying solely on standalone courses.

Blocking Copilot Access Until AI Training Is Completed

While Microsoft 365 does not natively ‘gate’ access to AI licences behind training, organisations with the right skills can use their LMS completion data alongside a little Power Automate to assign Copilot licences only after the required policy review and training have been completed.
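For illustration, the gating logic could look something like the sketch below. The course IDs and SKU value are placeholders, not a real configuration; in practice, a Power Automate flow triggered by LMS completion data, with a Condition step and an HTTP action calling Microsoft Graph’s `assignLicense` endpoint, would do the equivalent:

```python
# Illustrative sketch: gate Copilot licence assignment on LMS completions.
# The item IDs and SKU below are placeholder assumptions, not real values.

COPILOT_SKU_ID = "00000000-0000-0000-0000-000000000000"  # real SKU GUIDs come from GET /subscribedSkus

# Hypothetical IDs for the required LMS items
REQUIRED_COMPLETIONS = {"ai-acceptable-use-policy", "ai-fundamentals-training"}


def eligible_for_copilot(completed_items: set) -> bool:
    """True only once every required policy review and course is complete."""
    return REQUIRED_COMPLETIONS.issubset(completed_items)


def assign_license_body(sku_id: str) -> dict:
    """Request body for Microsoft Graph: POST /users/{id}/assignLicense."""
    return {"addLicenses": [{"skuId": sku_id}], "removeLicenses": []}
```

Keeping the eligibility check in one place means the same rule can be reused whether the flow runs on a schedule or is triggered per completion event.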

Apart from helping ensure that AI technology in your enterprise is used responsibly, properly trained users can reduce the other costs associated with AI. Read more on this subject.

Easing Policy Management & Updates

As with managing eLearning, using an LMS to manage policy distribution makes it far easier for those responsible for governance and compliance to see who has and hasn’t complied, follow up with individuals or groups, and manage renewals over time. We expect there may be many updates to AI policies over time as new risks emerge!

And, just as with the mandatory learning you might have, your AI policies can be reissued as and when they change, acknowledgements can be time-limited, and reports can be produced on demand to demonstrate due diligence.

This provides a clear, auditable record that policies were communicated, understood, and reinforced – not simply published as a one-off exercise and then forgotten.

Such proof may become very important down the track if/when the ‘proverbial’ hits the fan.

Examples of AI misuse that have hit the UK press in the last year have been quite shocking, and have involved what you might consider ‘respectable organisations’.

Conclusion

As we said in our Halloween article, AI can be scary, but with a clear policy, proper training, good data governance and enterprise-grade tools, you can reduce the chance of accidental data exposure, loss of reputation and uncontrolled expenditure (let’s stop with the downsides already) whilst still extracting the goodness in AI for your business.

Deliver & track your AI learning, understanding and policy compliance in Microsoft 365

This is just one of the solutions we can offer to enterprises to ‘tame’ the AI beast…