LICENSING YOUR WORK, IMAGE, OR LIKENESS TO AI COMPANIES: LEGAL RISKS AND KEY CONSIDERATIONS.

In recent months, some of the world’s most recognisable voices have taken deliberate steps to engage with artificial intelligence (AI) on their own terms. In November 2025, Hollywood actors Matthew McConaughey and Michael Caine announced that they had licensed digital versions of their voices to ElevenLabs.[1] In January 2026, McConaughey revealed that he had trademarked his signature phrase “alright, alright, alright” to curb potential AI misuse.[2] Taken together, these developments signal a shift towards performance artists and creatives taking control of the narrative rather than waiting for an infringing act to occur before taking action.

For Kenyan creatives and performers, this global conversation has profound local implications as AI technologies become more accessible and widespread across Africa. Questions around ownership, control, and compensation are no longer abstract; they are real considerations that performance artists, sports personalities, and creatives may be confronted with. This article briefly discusses the key factors such individuals should pay close attention to when considering licensing their work, image, or likeness to AI companies.

What AI Platforms Need: Understanding Your Licensable Assets.

There are three kinds of data that AI platforms need for training, particularly generative systems such as the large language models used by companies like OpenAI: publicly available information; images, sound recordings, videos, logos, or text protected by copyright, image rights, or other related intellectual property regimes; and information that was previously protected as IP but, due to the lapse of the protection period, is now freely available for use.

What you can license depends on your role and what you’ve created. Creatives who produce original works (writers, photographers, artists, producers) can license their copyrighted material; performers can license both their performances under copyright and their voice or likeness under constitutional privacy protections as well as data protection laws; while sportspeople, influencers, and public figures typically license their image, name, and likeness based on constitutional privacy protections rather than copyright.

AI Licensing Agreements: A New Paradigm.

While licensing is not a new concept, licensing your work, image, or likeness to an AI company represents a different paradigm. The legal and commercial assumptions that underpin traditional licensing arrangements do not translate neatly into the AI context. Here are five things that we think you need to pay attention to if you are considering licensing your work, image, or likeness to an AI company:

a. Scope of Use

In traditional licensing agreements, how your work will be used is clearly defined: you can track its use and set limits on the purpose of the use. With AI, however, particularly large language models, even where limits are set, there remains a risk of open-ended use of your work, image, or likeness. This is because training data is not typically made available to the public in its original form; it is instead absorbed into the AI model and influences how the system generates future outputs.[3]

b. Attribution

Traditional licensing arrangements usually link attribution and credit to identifiable uses such as a soundtrack, film credit, or book cover. Your work is directly consumed by the audience, distribution channels are known, and credit is visible.

AI licensing complicates attribution significantly. Where a work or likeness is used for training purposes, the final generated output by the system may not be directly traceable to any individual creator, even though it is influenced by your work, image, or likeness. Given the scale at which AI systems operate, with millions of users generating outputs, there is currently no practical mechanism for tracking how, when, or to what extent a particular creator’s contribution has shaped those outputs.

c. Termination and the Problem of Permanence.

Termination in traditional licensing is relatively straightforward: when the licence expires, use of the work, image, or likeness must cease. In AI licensing, however, termination is far more complex. Once an AI model has been trained, it cannot readily “unlearn” the data; even where unlearning is technically possible, it is a resource-intensive process that the company may be unwilling to undertake. Even if the agreement expires or is terminated, your contribution may continue to influence outputs indefinitely. This permanence fundamentally alters the risk profile of AI licensing and undermines the practical effectiveness of termination clauses.

While Kenya’s Data Protection Act does provide certain protections, these are restricted to personal data and may therefore apply more readily to performers or sportspeople than to copyright works generally. Consequently, while data protection law may impose duties on AI companies as data controllers or processors, it does not eliminate the problem of permanence that arises once AI training has taken place.

d. Auditing AI Use.

Conventional licence agreements allow licensors to audit usage, enabling them to verify the number of plays, territories, box office receipts, sales records, distribution channels, and licence compliance, among other factors, to ensure accurate usage reporting and compensation.

AI systems operate as black boxes: it is difficult to observe what happens inside them or how models are trained, and there is no way to verify which outputs were influenced by your work, image, or likeness. Furthermore, because AI companies control all the data, you are entirely dependent on their reporting.

e. Compensation

Traditional compensation models are well established and tied to measurable metrics built on decades of industry practice, allowing you to benchmark your compensation against similar deals.

When it comes to AI licensing, the market is still very new and there are no established valuation methods. The models in use vary widely: some creators receive only one-time payments, while others negotiate ongoing royalties with no standard rate. Generative AI platforms do not support traditional royalty models, as outputs cannot be tracked the way streams, sales, plays, or broadcasts can. As a result, compensation in AI licensing remains largely speculative.

These structural uncertainties have led a number of scholars to question whether AI licensing is viable for most individual creators. In practice, meaningful AI licensing arrangements tend to favour large content owners with extensive catalogues, brand recognition, and significant bargaining power. Independent creators and smaller rights holders often lack the leverage and technical insight necessary to negotiate terms that adequately reflect long-term value, making them more vulnerable to accepting compensation that may significantly undervalue future use.[4]

Licensing to AI Companies is a Risk-Based Decision, not a Routine Transaction.

Any Kenyan considering licensing their work, image, or likeness to an AI company must do so with a clear understanding that the law, technical safeguards, and industry standards are still evolving and are not keeping pace with AI development. While regulatory and contractual innovations are emerging, many of the challenges identified above, such as traceability of use, meaningful attribution, auditing, and containment of downstream exploitation, cannot currently be resolved through drafting alone.

That said, a recent decision by a German regional court characterised AI systems, specifically large language models, as incapable of independent creation, treating their outputs as communicating memorised content rather than generating wholly new works. The court viewed such AI outputs as a form of direct communication that grants users access to protected works.[5] While this reasoning is not binding in Kenya, it reflects a principle already embedded in Kenya’s Copyright Act: unauthorised use of protected works constitutes infringement. The uncertainty in Kenya, however, is not whether copyright protection exists, but how courts will interpret and enforce these protections in the AI context.

As a result, it is important to approach AI licensing not as a conventional rights transaction but as a risk-allocation decision made under conditions of uncertainty. Entering into such agreements will often require compromise, particularly regarding control, transparency, and enforcement; and these compromises should be made consciously rather than by default.

In this context, compensation becomes the most crucial protective mechanism. Where attribution is difficult, auditing is limited, and termination does not negate use, the economic terms must reflect the permanence, unpredictability, and potential future value of the licensed contribution. Creators should prioritise compensation structures that account for long-term and evolving use, including higher upfront payments, ongoing revenue participation, minimum guarantees, or mechanisms for periodic review.

Equally important is preserving the possibility of renegotiation should material changes occur, whether through technological advances, expanded use cases, or shifts in the legal landscape. Until more robust technical and legal solutions are developed, fair compensation and contractual flexibility remain the most practical tools available to balance the asymmetry inherent in AI licensing arrangements.

An Evolving Conversation on Ownership, Control, and Value

Licensing creative works, images, or likenesses to AI companies is no longer a speculative future issue; it is a present-day commercial and legal reality. As this article has shown, while traditional intellectual property concepts such as attribution, auditing, and compensation still provide a useful starting point, they operate very differently in the context of AI. The opacity of AI systems, the difficulty of tracing usage, the permanence of training data, and the absence of established valuation models mean that entering these agreements requires a heightened level of awareness, caution, and strategic thinking. AI licensing should be approached not as a routine transaction but as a carefully negotiated allocation of risk and value.

This conversation is far from settled. Laws, industry practices, and technical safeguards are still catching up, and the perspectives of those affected will shape how fair and sustainable these arrangements become. That said, this is not a discussion to sit out. AI is already reshaping how creative value is extracted and monetised, and informed participation is one way to retain agency.

If you are considering licensing your work, image, or likeness, or have already done so, what concerns you most, and where do you see the greatest gaps in protection? Join the conversation by sharing your thoughts, questions, or experiences in the comments below.

[1] Jeffrey, C. (2025, November 12). ‘Matthew McConaughey and Michael Caine are licensing their voices to AI’. TechSpot. Accessed on 30 January 2026.

[2] Cobbina, K. (2026, January 01). ‘Matthew McConaughey trademarks iconic phrase to stop AI misuse’. BBC. Accessed on 30 January 2026 via https://www.bbc.com/news/articles/cp87z6vexl3o

[3] Dawen Zhang, Boming Xia, Yue Liu, Xiwei Xu, Thong Hoang, Zhenchang Xing, Mark Staples, Qinghua Lu, and Liming Zhu. 2024. Privacy and Copyright Protection in Generative AI: A Lifecycle Perspective. In Proceedings of the IEEE/ACM 3rd International Conference on AI Engineering – Software Engineering for AI (CAIN ’24). Association for Computing Machinery, New York, NY, USA, 92–97. https://doi.org/10.1145/3644815.3644952

[4] Sag, M. (2025, November 19). ‘The False Hope of Content Licensing at Internet Scale’. ProMarket. Accessed via https://www.promarket.org/2025/11/19/the-false-hope-of-content-licensing-at-internet-scale/

[5] ‘Copyright Infringement in Form of a Reproduction of Preexisting Works in a Large Language Model’. GRUR International, 2026, ikag009. https://doi.org/10.1093/grurint/ikag009
