Musk says xAI used distillation on OpenAI models

In a court trial tied to OpenAI’s nonprofit mission, Elon Musk testified that xAI “partly” used distillation on OpenAI models to train Grok.

Elon Musk’s testimony in a California federal courtroom has put a spotlight on a fast-growing AI technique called “distillation,” with implications reaching far beyond any single model.

In the trial connected to Musk’s lawsuit against OpenAI and its top leaders, he was asked whether xAI had used distillation methods on OpenAI models to train Grok. He did not frame it as a one-off, instead describing distillation as something practiced more broadly across the industry, and replying “Partly” to the direct question. The exchange matters because distillation can help rival labs build capable systems without matching the same level of expensive compute investment.

Insight: If distillation is widely used, it can compress the advantage that comes from large-scale infrastructure spending, making it harder for leading labs to protect their edge through sheer resources alone.

The broader case, now unfolding in federal court, centers on Musk’s claim that OpenAI deviated from its original nonprofit mission by shifting toward a for-profit structure. Musk’s testimony comes as the trial enters new phases, with the court hearing from the tech executive who helped set OpenAI’s early agenda and is now challenging its direction.

Alongside the distillation questions, Musk also addressed a separate assertion he made last year about xAI’s expected position in the AI race. In his testimony, he described the competitive landscape by ranking major AI providers, placing Anthropic at the top, followed by OpenAI, Google, and Chinese open-source models. He also characterized xAI as a smaller organization.

Insight: Market perceptions of who is leading in AI are increasingly tied to technical strategies, not just funding or compute, which is why admissions like this can influence investor and partner confidence.

For the companies targeted by distillation, the concern is not only competition but also how such training approaches may clash with product terms and usage policies. In this context, distillation is often described as extracting knowledge by repeatedly prompting publicly accessible chatbots and APIs to capture their behavior, then using that behavior to train newer systems.
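The query-then-train loop described above can be sketched in miniature. The following is an illustrative toy, not any lab’s actual pipeline: a small linear “teacher” stands in for a deployed model, synthetic prompts stand in for user queries, and a fresh “student” is fit to the teacher’s soft output distributions rather than to ground-truth labels. All names and parameters here are hypothetical.

```python
# Toy sketch of knowledge distillation (hypothetical setup for illustration):
# a "student" model is trained to imitate the output distributions of a
# larger, API-accessible "teacher" model.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, temp=1.0):
    # Temperature-scaled softmax; higher temp yields softer distributions.
    z = z / temp
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# "Teacher": a fixed linear model standing in for a large deployed model.
W_teacher = rng.normal(size=(4, 3))

def query_teacher(x, temp=2.0):
    # Analogous to repeatedly prompting a public model and logging outputs.
    return softmax(x @ W_teacher, temp)

# Build a synthetic "transcript" dataset by querying the teacher.
X = rng.normal(size=(500, 4))
soft_labels = query_teacher(X)

# "Student": trained on the teacher's soft labels via cross-entropy
# and plain gradient descent, never seeing any ground-truth labels.
W_student = np.zeros((4, 3))
for _ in range(300):
    probs = softmax(X @ W_student, temp=2.0)
    grad = X.T @ (probs - soft_labels) / len(X)
    W_student -= 0.5 * grad

# After training, the student closely tracks the teacher's distributions.
gap = np.abs(softmax(X @ W_student, 2.0) - soft_labels).mean()
print(f"mean probability gap: {gap:.4f}")
```

The key point the toy makes concrete: the student needs only the teacher’s outputs, not its weights or training data, which is why the practice turns on contract terms rather than on access to a rival’s infrastructure.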

Misryoum understands that efforts to counter distillation have been discussed within the industry, including initiatives that aim to share approaches for reducing systematic querying and protecting model behavior. While it is not clear that distillation is always framed as an outright violation of law, the tension often comes down to contract terms and what counts as acceptable use.

Insight: The court’s attention to distillation signals that the AI industry’s next battles may be fought in legal venues and policy frameworks, not just in labs and benchmarks.