While everyone is enthusiastically adopting AI, there are things you need to include in your contracts with clients and your contracts with your team to be transparent and safe in how AI is used. In this guest blog post, Annabel Kaye of KoffeeKlatch explores what you need to be aware of.

Generative AI
Generative AI is built into tools like Canva and many other applications, helping you and your team generate content. That might be visual content, words, or just ideas.
You may not own all the copyright in content created using AI. The idea that you can just ‘change it enough’ to make it yours is not entirely accurate. You may end up with mixed copyright ownership, or a blend of your copyright and non-copyright material. It is hard to tell, as most of the world’s governments are still working out what to do about this.
Copyright in your ts and cs
If your work is about creating unique content and charging your clients for it, you will have to make it plain in your own client ts and cs what type of copyright you are assigning. We did a major update for this on KoffeeKlatch VA contracts in 2024, and it really has helped everyone get clear on when unique content is being provided, and whether that includes the copyright (assuming, of course, that you own it).
Copyright and your team
To assign the right type of copyright to your client at the right time, you need to be sure that your team members are assigning the appropriate rights to you. They too may not own the rights, or all the rights, depending on how that content was created.
Whilst it is a dangerous thing to try to restrict the tools your self-employed team use, in this instance you will want them to notify you if they are using AI to generate content that is being sold on to your clients. That way you can check the platform, check the rights, and get those ducks in a row.
There is no suggestion that you should prohibit the use of AI – just that you should be ‘platform-aware’ and check out the ts and cs on each platform. Some require crediting the AI (Canva does), and that may be an issue for some clients!
Analytical AI
Analytical AI goes through data and information and helps you see patterns or trends. You need to be careful about data shared with you by your clients. If it includes personal data, you can’t just run an analytics bot through it – it is up to your client to specify whether that is OK.
As many of these bots are ‘learning’ bots, there may be data privacy implications if they are going through data that identifies named customers. Your client will need to update their data privacy arrangements and make suitable arrangements with their customers before you do so.
Your own data privacy policy
You will always want to update your own data privacy policy and data audit when you add new software to your business. If you are using this type of AI on data you collect directly for your business, this is where you start.
Before you let the bot into your data, you should be sure:
- It is from a reputable supplier.
- You are clear how it uses, stores and shares data.
- That it is compatible with your data privacy policy.
You may want to create your own AI policy to set alongside your data privacy policy – or simply update your data privacy policy.
Your team processes
Where your team access personal data shared by your end client with you and your team, that will be governed by the Data Processing Agreement issued by your client (or your KoffeeKlatch Data Processing form and documents if they did not issue one). You will pass those instructions on to your outsourced team members using a similar format.
You should double-check with them that they are not allowing AI bots into their local hard drives for other clients. Some bots will go through everything on a hard drive, and if you permit local storage you may be looking at a data breach!
As the data controller (the client) specifies what should be done with their data, you should never let AI bots anywhere near it without their say-so, and you should not let your team let bots run wild either.
Rogue bots
Not all AI bots are created equal. Some are rogue, using the names of well-respected AIs slightly altered to trick you. Others seem harmless in what they promise but set off processes in your software that you may not be aware of.
You will need your contracts to require your team members to notify you of AI bots they are using so you can check out what is going on. If it is your equipment, you can simply ban them from installing anything without your say-so. But if it is theirs, you both need to arrive at transparency, as they are self-employed and it is their machine!
If you’re running a team, you will want to be sure they know what’s what. Having a child use your smartphone or laptop for entertainment and download a bot by mistake can set off all sorts of security problems that nobody wants to have.
Update your contracts
We have updated KoffeeKlatch contracts for clients and for team members, and expect to do so several more times as the world of AI changes and the legislative framework around it evolves. At this stage we have kept it as simple as possible, but we are keeping an eye on developments so that we know when and what to update.
You can’t afford to sit on old contracts, old data privacy agreements and old data privacy policies and think that what you did before is fine for today. You don’t need to be afraid, but you do need to be aware and keep your contracts up to date so they give you the tools you need to protect your data, your clients’ data, and everybody’s business.

Annabel Kaye has been helping VAs with contracts and GDPR support for over 16 years. There have been a lot of changes in that time. Two years ago, Jo Brianti joined the KoffeeKlatch team as a Director. Together the two of them provide a clear technical support system designed to help VAs create a profitable business that does not ignore the realities and legalities of the way you work today.
Annabel Kaye