In Chinese tech offices, a new arms race has begun. It runs not on hardware, but on .skill files.
A GitHub project called 同事.skill (“colleague.skill”) has gone viral in China’s corporate world. The concept is simple and brutal: feed an AI model enough information about a coworker — their work logic, domain knowledge, communication style, even their daily habits — and you produce a digital replica capable of substituting for them. Once the “colleague skill” is deployed, the original employee becomes, in theory, redundant.

Workers coined a word for it: 蒸馏 (zhēngliù) — distillation. As in, boiling a person down to their replicable essence.
The trend reflects a broader anxiety gripping China’s workforce. Some 60% of Chinese employees already use AI tools weekly, nearly double the rate of American workers. In a labor market where one major job platform saw postings for college graduates fall 22% in the first half of 2025, the fear of being “skilled away” is not paranoia — it’s rational.
The Distillation Economy
The logic behind colleague.skill is an extension of something companies have always done: systematize institutional knowledge so it outlives the person who holds it. AI just makes it faster, cheaper, and scalable. What used to take years of process documentation and knowledge transfer can now be compressed into a prompt file.
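What such a prompt file contains isn’t documented in the project itself, so the example below is purely illustrative — the name, section headings, and frontmatter fields are invented for this sketch, loosely following the common convention of an instruction file with metadata up top:

```markdown
---
name: colleague-zhang-wei          # invented identifier, not from the project
description: Emulates a senior accounts analyst's workflow and communication style
---

# Work logic
- Reconcile the vendor ledger every Monday before the 10:00 standup.
- Flag any variance above threshold; summarize the rest in one line.

# Communication style
- Short, numbered replies; never more than three points per message.
- Escalations go to the team lead first, never directly to finance.
```

A file like this is cheap to write and cheap to run — which is exactly why workers find the prospect unsettling.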
The implications are stark. Companies under pressure to cut headcount now have a plausible — and technically credible — path: capture the employee’s knowledge, then eliminate the employee. The skill file becomes the asset; the human becomes a liability.
This is happening inside a labor market already under significant strain. Job postings in functions susceptible to AI — programming, accounting, editing, sales — have declined sharply in China since 2018, according to an analysis of over a million job listings by Peking University. Youth unemployment in the 16–24 age bracket has hovered between 15% and 19%. One Shanghai worker, whose employer cut 30% of its workforce in 2025, described the atmosphere as feeling like Squid Game: “You can get eliminated anytime.”
The Countermove: Anti-Distillation
On April 3, 2026, a creator going by the name Deng Xiaoxian posted a video announcing her response: 反蒸馏.skill — the anti-distillation skill.
Her pitch was direct: “We’re all out here working like cattle. Nobody wants to be turned into a skill file and lose their job. So I invented this.”
The tool works by taking your existing skill file — the one you’ve been asked to submit to management — and running it through a “cleaning” layer. This layer strips out the decision-making heuristics, contextual judgment, and tacit knowledge that make your work actually valuable, replacing them with technically correct but strategically hollow language. The result looks complete. It isn’t.
Anti-distillation.skill produces two outputs:
- A “clean” version for submission — structured, professional, and deliberately vague where it matters
- A private backup — your real knowledge, kept for yourself
The cleaned version comes in three intensities: light, medium, and heavy. Light is for companies that audit carefully. Heavy is for environments where management just wants to check a box. “If your company is just going through the motions,” Deng says, “use heavy cleaning. They’ll just check whether you submitted it.”
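The tool’s internals aren’t described beyond this, so the following is a minimal sketch of the idea rather than the actual implementation — the field names, intensity tiers, and placeholder text are all assumptions for illustration:

```python
from copy import deepcopy

# Hypothetical field names; real skill files are free-form documents.
TACIT_FIELDS = {"heuristics", "edge_cases", "judgment_calls"}

# Higher intensity strips more kinds of tacit knowledge before submission.
INTENSITY = {
    "light": {"heuristics"},
    "medium": {"heuristics", "judgment_calls"},
    "heavy": TACIT_FIELDS,
}

def anti_distill(skill: dict, intensity: str = "medium"):
    """Return (clean copy for submission, private backup of the original)."""
    backup = deepcopy(skill)  # your real knowledge, kept for yourself
    clean = deepcopy(skill)
    for field in INTENSITY[intensity]:
        if field in clean:
            # Swap substance for technically correct but hollow language.
            clean[field] = "Follow standard procedure and escalate as needed."
    return clean, backup

skill = {
    "role": "Accounts reconciliation",
    "workflow": "Export ledger, match entries, flag variances",
    "heuristics": "Variances under 50 RMB on vendor X are rounding noise",
    "judgment_calls": "Escalate only when the variance recurs two cycles",
}
clean, backup = anti_distill(skill, intensity="heavy")
```

The design choice the tool banks on: the submitted file keeps its structure (role, workflow) so it passes a checkbox audit, while the judgment that actually distinguishes the worker survives only in the private backup.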
The tool went viral on GitHub within days.
What This Tells Us About the Moment
There’s a temptation to read this as a quirky tech story — workers using AI to outsmart their employers’ AI. But it’s more than that.
It’s a signal that the knowledge extraction problem is now mainstream. Companies globally are trying to codify what their employees know before deciding who to keep. Workers, realizing this, are starting to game the extraction process. The result is a kind of institutional knowledge arms race, with each side deploying AI against the other.
This dynamic is playing out against a global backdrop where AI was cited as the leading cause of U.S. layoffs in March 2026, accounting for roughly a quarter of the 60,000+ job cuts announced that month. Globally, AI-attributed layoffs are projected to run nine times higher in 2026 than in 2025, according to a Duke University/Federal Reserve survey of 750 CFOs. Finance job openings have already fallen to their lowest level since the 2008 financial crisis — not because banks are struggling, but because they need fewer people to do the same work.
The honest truth, noted by some analysts, is that not all “AI layoffs” are really about AI — some are post-pandemic corrections dressed up in tech language because it plays better with investors. But that cynicism cuts both ways. When workers can’t tell the difference between genuine automation and strategic optics, they respond to the threat they perceive, not the one that’s technically real.
Colleague.skill and anti-distillation.skill are what that response looks like.
The Deeper Stakes
What anti-distillation.skill is really protecting isn’t data. It’s judgment — the part of expertise that can’t be easily documented because it’s embedded in context, intuition, and years of navigating specific situations.
That’s also, not coincidentally, what makes experienced workers hard to replace. AI models can approximate a lot, but they struggle with the tacit knowledge that makes the difference between a good call and a bad one. The colleague.skill project implicitly acknowledges this — it tries to capture not just what someone knows, but how they think and work.
Anti-distillation.skill is a bet that the gap between documented knowledge and actual judgment is large enough to matter. Given where AI capabilities currently sit, it’s probably a good bet — for now.
The longer-term question is how long that gap holds. As AI gets better at reading between the lines of what employees submit, the cleaning tools will need to get more sophisticated. And so it goes: each new capability on one side generates a countermeasure on the other.
China’s workers, navigating one of the world’s most intense AI adoption environments, have simply made this dynamic visible — and given it a name.