Matthew McConaughey Trademarked Himself

Matthew McConaughey has done something no major Hollywood actor had done before him: he trademarked himself. Not a logo. Not a company name. Himself, his face, his voice and his famous catchphrase. The move is novel and deliberate, and it tells you a great deal about where the law stands on artificial intelligence, likeness rights and the very question of who owns you.

What McConaughey Actually Did

Over the past several months, McConaughey secured eight approved trademark registrations from the U.S. Patent and Trademark Office. The registrations cover short video and audio clips of his face and voice, including a porch scene, a Christmas tree setting, and the three-word line he has been repeating since his 1993 film debut in Dazed and Confused: “Alright, alright, alright.”

This is not a defensive registration of a brand name or a product. It is a registration of the man himself, what he looks like, what he sounds like and how he moves. The stated purpose is to give him a federal legal tool to pursue anyone who uses AI to clone his likeness or voice without his permission.

“My team and I want to know that when my voice or likeness is ever used, it is because I approved and signed off on it,” McConaughey told the Wall Street Journal. His attorney put it more bluntly: they now have a mechanism to stop someone in their tracks or take them to federal court.

McConaughey and his legal team are not responding to a known threat. They are not aware of any AI deepfake involving the actor. This is proactive. That is the point.

Why Trademark and Not Something Else

The legal landscape for protecting your identity against AI misuse is, to put it charitably, a work in progress. Understanding why McConaughey’s team chose the trademark route requires a quick tour of what the other options look like.

Copyright protects creative works: books, films, music, software. It does not protect a person’s face or voice as such. You cannot copyright yourself.

Right of publicity laws protect against the unauthorized commercial use of your name, image or likeness. Every state handles this differently. California and New York have strong protections. Many other states have weak ones or none at all. More importantly, the application of right of publicity law to AI-generated content is largely untested. Courts have not yet worked through the foundational questions of whether training an AI model on someone’s likeness constitutes a violation, whether generating a synthetic version of a voice triggers liability and which state’s law governs when the AI operator is in one state and the victim is in another.

There is no federal right of publicity statute. Congress introduced a bill in 2024, the NO FAKES Act, that would create a national standard, but it has not come to a vote. Until it does, right of publicity claims are a patchwork quilt with gaps large enough to walk through.

Trademark law is different. It is federal. It is established. It provides a cause of action in federal court with well-understood remedies including injunctions and damages. The core requirement is that the mark be used in commerce and that it be distinctive, meaning it identifies the source of goods or services. For McConaughey, whose face and voice are commercially exploited across films, commercials and endorsement deals, meeting that standard is not particularly difficult. His identity functions as a brand. He is now formally treating it as one.

How Other Celebrities Have Dealt With This Problem

McConaughey is not the first celebrity to confront AI misuse of his identity. He is simply the first to try this particular legal strategy at this scale. That distinction may already be narrowing.

Scarlett Johansson famously turned down an offer from OpenAI to voice its ChatGPT assistant. OpenAI released a voice that sounded strikingly similar to her anyway. After Johansson’s attorney sent a letter expressing that she was shocked and angered by the resemblance, OpenAI paused the voice and its president apologized. The episode produced no litigation, no legislation and no lasting legal clarity. Johansson’s attorney, Kevin Yorn, also represents McConaughey, a fact that is not incidental to the trademark strategy now being pursued.

Taylor Swift’s situation has evolved considerably. She dealt with explicit AI-generated deepfakes in early 2024 that circulated widely on social media, and during the 2024 presidential election cycle, AI-generated images falsely suggested that Swift had endorsed Donald Trump, which the then-candidate reposted and shared as genuine. The backlash from the deepfake crisis generated bipartisan congressional attention and contributed to momentum for the NO FAKES Act. But Swift’s experience also illustrated the limits of existing law: platform takedowns are slow, state remedies are inconsistent and the perpetrators are often anonymous and unreachable.

Now Swift appears to have drawn the same conclusion as McConaughey. On April 24, Swift’s company TAS Rights Management submitted three trademark applications to the U.S. Patent and Trademark Office. Two of the applications are sound trademarks covering her voice, and the third is a visual trademark describing a photograph of Swift holding a pink guitar, wearing a multicolored iridescent bodysuit with silver boots, standing on a pink stage. Trademark attorney Josh Gerben, who first publicized the filings, said they are specifically designed to protect Swift from threats posed by artificial intelligence, and that while right-of-publicity laws already offer some defense, trademarks may provide an additional layer of protection in combating fake endorsements, false advertisements and manipulated media. As Gerben explained the theory: if a lawsuit were filed over an AI using Swift’s voice, she could argue that any use of her voice that sounds like the registered trademark violates her trademark rights. The applications have not yet been approved.

Tom Hanks warned his followers about an AI-generated advertisement using his likeness without authorization. He had no commercial remedy beyond a public statement.

Some celebrities have trademarked catchphrases, as Lizzo did with hers, but McConaughey’s approach of securing a broad trademark on his actual visual and audio likeness was without direct precedent among major entertainers when his registrations were approved. Swift’s filings suggest the strategy is becoming a model.

What This Means for Businesses

If you run a business that uses AI-generated content in advertising, marketing or customer-facing communications, the McConaughey registrations and Swift’s pending applications are a meaningful development. They create a federal hook for claims that might otherwise have been limited to slower, less predictable state court proceedings.

The practical implication is straightforward: do not use AI to generate content featuring recognizable individuals without explicit, documented authorization. This was already legally risky. It is now more so. The trademark registrations give rights holders a new theory of liability that does not depend on the unsettled boundaries of right of publicity law.

Businesses should also revisit their AI vendor agreements. If you are using a third-party AI platform to generate images, video or audio, your contract should clearly address who bears liability if the output implicates someone’s likeness rights. Most standard terms of service place that responsibility on the user. Read yours.

Finally, if your business model involves licensing content featuring real people, including influencer partnerships, spokesperson arrangements or archival footage, make sure your agreements explicitly address AI-generated derivatives. The gap between “you may use this image” and “you may use this image to train an AI model or generate synthetic versions of this person” is where litigation is going to happen.

What This Means for Private Individuals

Here is the honest answer: the McConaughey strategy is not directly available to most people, and it would not work the same way even if it were attempted.

Trademark law requires use in commerce. It requires distinctiveness, meaning the mark must function as a source identifier for goods or services. For a private individual who is not a public figure with a commercially exploited identity, obtaining and enforcing a trademark on your own likeness would be legally difficult to justify and practically impossible to enforce at scale.

That does not mean private individuals are without recourse. Depending on the state, right of publicity laws may apply. Defamation law covers false statements of fact. The federal DEFIANCE Act, which passed the Senate unanimously but has not yet cleared the House, would create a civil cause of action for victims of non-consensual intimate deepfakes. But the broader problem of AI-generated synthetic media featuring private individuals in non-intimate but still harmful contexts, including fake statements, manufactured scandals and identity fraud, remains largely outside settled federal law.

The NO FAKES Act, if enacted, would change this by creating a federal right of publicity applicable to everyone, not just public figures. Until that happens, private individuals are navigating a patchwork that varies significantly depending on where they live and what was done to them.

The Deeper Point

McConaughey’s legal team acknowledged they do not know how a court will ultimately rule on these registrations. The USPTO approving a trademark application is not the same as a federal judge sustaining an infringement claim built on that registration. The theory has not been tested in litigation.

What the trademark strategy accomplishes right now is deterrence and leverage. It signals that unauthorized use will be met with a federal lawsuit in a venue with well-established procedures. It creates a paper trail of prior rights. It forces anyone generating McConaughey’s likeness without consent to think twice about whether the legal exposure is worth it.

That deterrence effect does not require the theory to be airtight. It requires it to be credible. And given the legal team assembled around it, it is.

The broader lesson is that the law governing AI and identity is being made in real time, by lawyers willing to test novel theories in front of courts that have not yet drawn the lines. McConaughey’s team is not waiting for Congress. Neither, it now appears, is Taylor Swift. They are building the argument that the existing trademark framework already covers this ground, and daring someone to disagree in federal court.

For businesses and individuals alike, the takeaway is the same: the legal infrastructure for AI accountability is being constructed right now, faster than most people realize, and the decisions being made in trademark offices, in courtrooms and in congressional committees will shape the rules for years to come. Pay attention.

