"Code Washing as a Service" Goes Viral! "Open Source Code Whitewashing" Site Launches, Satirizing Free-Riding Enterprises: Open Source Compliance is Too Expensive! CC Reverse-Engineers Client Software and Resells at One-Tenth the Price!


Editor: Yun Zhao

Recently, a practice involving "using Claude Code to reverse-engineer and clone client software, then reselling it at a low price" has sparked heated discussion within the industry.

X user Todd Saunders revealed that a seed-stage startup publicly admitted in a board presentation that its "go-to-market strategy" is to gain access to a large incumbent's software through a client license, use the Claude Code AI tool to clone the entire system, and offer it at a 90% lower price.

The startup's venture capital backers endorsed this strategy, even responding in writing that it was a good idea.

Saunders commented that this likely violates terms of service and may also violate trade secret laws ("but I am certainly not a lawyer"), calling it utterly insane that a reputable venture capital firm would include such content in board documents.

Reverse-engineering and cloning a product using a client's authorized access is extremely unethical. I don't know how else to describe it.

It is not about "developing a better product," but rather directly copy-pasting and selling it at a lower price.

"I don't know how this will ultimately play out, but we all knew this day would come eventually, and now it has truly arrived." Image

Just a day later, people online fired back with mockery.

Someone built a fictional project steeped in black humor: "Clean Room as a Service".

This high-end "performance art" is truly breathtaking. It reveals how many companies are bypassing "open source compliance" issues and satirizes the practice of cloning software using AI while believing it is unproblematic.

"Use this software to help you achieve 0 attribution, 0 copyright, and 0 obligations!" Hackernews quickly spread the news, attracting nearly 500 comments. Image

Satire: Code Washing Service

This "performance art website" immediately mocked the business world's neglect of open source agreements and claimed to offer a so-called "code washing" service—re-implementing open source projects via AI to circumvent legal obligations.

Core Theme: Under "Clean Room as a Service", the website claims to "liberate" developers from the "constraints" of open source licenses.


Advanced Satire:

"Open Source Software is Too Problematic"

The website listed four reasons why companies hate open source licenses:

  • 1: Apache License Attribution Requirements — Corporate legal teams hate writing "This software contains parts of..." in documents. The website satirically asks: "Those maintainers work for free; why should they get attribution?"

  • 2: AGPL Contamination — Worrying that accidentally introducing a single line of AGPL code will force the entire company's private codebase to be open-sourced.

  • 3: Compliance Costs — Tracking licenses for hundreds or thousands of dependency libraries is time-consuming; legal review can take weeks. What if we just ignored all of this?

  • 4: Giving Back to the Community — Some licenses require you to contribute improvements. The website satirizes: "Shareholders invest not so you can help strangers."

Solution: AI-Driven "Clean Room" Reconstruction

The website claims to provide a method that allows you to never give any attribution to open source maintainers again.

  • Technical Principle: It claims its private AI system has never seen the source code, but instead independently analyzes documentation, API specifications, and interfaces to rewrite functionally identical code from scratch.

  • Legal Outcome: The output is "legally independent code," not a derivative work, does not inherit the original open source license, and carries no obligations.

Core Advantages of "Code Washing as a Service"

Zero Supply Chain Risk: Every line of code is generated by our robots. No stolen maintainer accounts, no geopolitical payloads, no "Christmas patch nightmares".

Zero Compliance Overhead: No AGPL contamination, no attribution clauses. Your legal team can finally get back to doing real work.

No Longer Dependent on Strangers: Your stack depends on MalusCorp—a company with contracts, Service Level Agreements (SLAs), and a fixed office address. Unlike the maintainers of left-pad, we are legally obligated to stay engaged.

Delivered 100% CVE-Free: Newly generated code, untouched by humans and absent from known vulnerability databases. Your compliance dashboard will turn from red to green overnight.

"Liberation" Process: 0 Attribution Requirements, 0 Copyright, 0 Obligations

How is this achieved? The process is very simple.

Step 1: Upload Manifest — Upload your package.json or requirements.txt.

Step 2: Isolated Analysis — "Legally trained robots" only read public documentation and API definitions, absolutely never looking at the original code.

Step 3: Independent Rewriting — Another group of completely isolated robots implements the functionality from scratch based on the specifications.

Step 4: Gain Freedom — The delivered code adopts the MalusCorp-0 License: a private license extremely friendly to enterprises, with zero attribution requirements, zero contagion, and zero obligations.


In simple terms, you just upload a dependency manifest, and this AI system will independently recreate every package in the software bill of materials from scratch.
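The two-stage process the site describes can be caricatured in a few lines of code. This is a purely illustrative sketch: the names `spec_agent`, `impl_agent`, and `clean_room_pipeline`, and the left-pad example, are all invented here and appear nowhere on the actual site.

```python
# A toy caricature of the satirical two-stage "clean room" pipeline.
# All function names here are invented for illustration; the real site
# describes the idea only in prose.

def spec_agent(public_docs: str) -> str:
    """Stage 1: reads only public documentation and emits a code-free spec."""
    return f"SPEC (no source code seen): {public_docs}"

def impl_agent(spec: str) -> str:
    """Stage 2: a separate 'agent' implements the spec from scratch,
    never having seen the original repository."""
    # Hard-coded toy output standing in for an AI rewrite of left-pad.
    return "def left_pad(s, n, ch=' '):\n    return s.rjust(n, ch)"

def clean_room_pipeline(public_docs: str) -> str:
    spec = spec_agent(public_docs)   # only docs go in...
    return impl_agent(spec)          # ...only "independent" code comes out

generated = clean_room_pipeline("left-pad: pad a string to length n")
scope: dict = {}
exec(generated, scope)
print(scope["left_pad"]("7", 3, "0"))  # prints 007
```

The joke, of course, is that the "isolation" between the two stages is an organizational fiction: both agents belong to the same pipeline serving the same goal.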

They also declare:

"Our process is meticulously designed to be legally compliant, down to the last detail." One group of agents only analyzes public documentation: README files, API specifications, type definitions. They generate a detailed specification containing no code.

Another group of completely independent agents, who never communicate with the first group, have never seen the original source code, and have never even browsed the Git repository, implement that specification from scratch.

The final generated code belongs to you. It follows the MalusCorp-0 License: zero attribution requirements, zero copyright protection, zero obligations.

Absurd Statistics

The website shows 847,293 projects processed, while the "Attribution Given" counter reads $0.

More pointedly, a note at the bottom reads: "Services provided through our offshore subsidiaries located in jurisdictions that do not recognize software copyright."

"Open Source Compliance" is Too Expensive, Don't Open an "Open Source Project Office"

That's not all. The website also published an even more satirical blog post: "Thank You for Your Service: Generosity Is About to Die, and a Commercial Alternative Is Here!"

The article exposes the "open source compliance" tricks adopted by many companies currently on the market.

To manage the risks open source brings, modern enterprises have built an extremely complex institutional apparatus. They spend millions of dollars annually on Software Composition Analysis (SCA) tools like Snyk and Black Duck, and they establish "Open Source Program Offices" (OSPOs), departments whose sole purpose is to prove to skeptical leadership why the company should keep relying on code written by people who explicitly refuse to be relied upon.
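To make the compliance burden concrete, here is a crude, minimal sketch of the kind of check SCA tools automate at scale, scanning locally installed Python packages for copyleft license markers. The `COPYLEFT_MARKERS` keyword list is a simplification invented for this example; real tools resolve SPDX license expressions and full transitive dependency graphs.

```python
# Minimal license-audit sketch: flag installed distributions whose
# metadata mentions a copyleft license family. Real SCA tools do far
# more (SPDX expressions, dependency graphs); this only greps metadata.
from importlib import metadata

COPYLEFT_MARKERS = ("AGPL", "GPL", "Affero")  # crude keyword match

def audit_installed_licenses():
    flagged = []
    for dist in metadata.distributions():
        meta = dist.metadata
        name = meta.get("Name", "unknown")
        license_field = meta.get("License") or ""
        classifiers = meta.get_all("Classifier") or []
        haystack = " ".join([license_field, *classifiers])
        if any(marker in haystack for marker in COPYLEFT_MARKERS):
            flagged.append((name, license_field))
    return flagged

for name, lic in audit_installed_licenses():
    print(f"legal review needed: {name} ({lic!r})")
```

Multiply this check across thousands of transitive dependencies, quarterly audits, and legal sign-off, and the "millions of dollars annually" figure starts to look plausible.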

The OSPO might be the most heart-wrenching institution in modern enterprise software. Its members are a group of true idealists who believe in open source, yet are assigned the daunting task of "proving to the CFO that sharing is profitable." They sit on the periphery of the organization, writing reports, organizing "upstream-first" initiatives, and quietly wondering why their budgets keep shrinking. Meanwhile, research data increasingly shows that the situation is not on their side.

Look at the license agreements themselves. The "copyleft" clause of the AGPL can turn an accidental code import into a legal obligation to open-source the entire private codebase. Companies therefore ban it outright, which means engineers must keep a mental map of which packages are safe and which will detonate the company's intellectual property strategy. Meanwhile, "Contributor License Agreements" (CLAs) require contributors to contractually assign copyright to a foundation or company, and once such a company decides "open source" is no longer strategically valuable, it can use those copyrights to relicense the project.

And as a customer, you are paying for all of this. You pay for tools, for teams, for legal reviews and audits.

You pay for emergency responses when a maintainer you've never heard of decides to express sensitive views through your production infrastructure.

You are funding a massive risk management system built around "free code," whose most ardent advocates once promised it was "free/libre."

The post takes a typical enterprise with 500+ engineers as an example and itemizes the annual cost of "free" software: roughly $4 million per year in compliance spending, versus just $50,000 for the site's "Malus Full Liberation Package", a cost reduction of 98.7%!
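Taking the article's figures at face value, the claimed reduction checks out arithmetically; the site's 98.7% appears to truncate rather than round the exact 98.75%.

```python
# Verifying the post's claimed cost reduction from its own numbers.
annual_compliance_cost = 4_000_000  # SCA tools, OSPO staff, legal review, audits
liberation_package = 50_000         # the satirical "Malus Full Liberation Package"

reduction_pct = 100 * (annual_compliance_cost - liberation_package) / annual_compliance_cost
print(f"cost reduction: {reduction_pct:.2f}%")  # prints 98.75%
```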

"To Those Who Criticize Us"

Now comes the sharpest part. The website's author keeps up the pitch-perfect passive-aggressive satire, mocking enterprises that "use AI to replicate open source software."

I anticipated the objections. If there were none, I would be disappointed.

Some will say this is exploitation, that we extract the creativity of open source while discarding its contributors. To this I say: yes, that description is accurate. But it is also an accurate description of every company that uses open source without giving back (which covers almost every company on Earth). We are simply being honest about it, and charging a premium for the privilege.

Some will say this is a misuse of AI: Believing models should create new things rather than circumvent existing legal protections. This sentimentality is moving; I encourage those holding this view to continue building new things, preferably using the MalusCorp-0 License, which can be downloaded from our website.

The most profound objection: The open source commons is a grand experiment intended to let users control technology and protect it through digital property and licensing systems. If AI can easily bypass these protections, the incentive structure will collapse, and the commons will wither.

I admit, this is likely true.

But I must gently point out that this argument assumes the "commons" was originally flourishing. It assumes maintainers received fair compensation, assumes community governance was functioning well, assumes the social contract between producers and consumers was fulfilled. But evidence suggests otherwise: maintainers are burning out at record rates, and critical infrastructure relies on people's spare time. The social contract has long been broken; we are simply providing a commercial alternative, no longer pretending it still exists.

To the Open Source Community: You can step back now; AI will take it from here.

To Our Customers: We built Malus for you. We built it because you deserve software infrastructure that comes with legal contracts rather than prayers, support phone lines rather than GitHub Issues; its license agreement says "do whatever you want", not "you can do whatever you want, but here are 47 additional conditions".

To the Open Source Community: We built Malus because of you, not to ignore you. Your creativity has been, and continues to be, truly genius. We simply found a way to separate these "genius ideas" from the "inconvenience of dealing with the people who produce them." If this means nothing else, at least it is very efficient.

The future of software is neither "open" nor "closed." It is "liberated"—liberated from the constraints of protocols written for an old world where "copying required effort"; liberated from the hands of a generation of developers who believed "sharing code was the reward itself." Facts have fully proven that regarding "sharing," they were right, but regarding "reward," they were completely wrong.

We owe them a debt, and we have no intention of repaying it. But at least, we have the decency to say thank you.

So: Thank you. From the bottom of our hearts. Leave the rest to us.

A Final Note

The website's author is Michael Nolan, a long-time member of the open source community who is active in open source, technology ethics, and open technology circles.

He currently serves as the Director of federationof.tech. He is also the Associate Director of @OpenAtRIT (the Open Source/Open Technology program at Rochester Institute of Technology).


Just a week earlier, he had argued on a podcast that the arrival of the generative AI era has already ended traditional open source software!


Obviously, the trend of AI tools accelerating software replication is unavoidable.

Some netizens argued that unrestrained AI plagiarism should be opposed.

"Previously, similar behaviors were mostly limited to UI plagiarism, but now core systems can be efficiently replicated, which may violate terms of service and trade secret laws. This is clearly inconsistent with industry ethics." Image

Others felt that regulatory policy should be updated in a more targeted way, though enforcement costs must of course be considered.

Even before any such rules arrive, netizens pointed out that while this "morally bankrupt" strategy may yield short-term profits, the edge-case handling and user trust that the original vendor has accumulated over years are hard to "plagiarize and replicate."

But an undeniable trend is that AI cloning everything will push existing companies to build higher barriers.

What do you make of all this? Share your thoughts in the comments.

Reference Links:

https://x.com/toddsaunders/status/2031546116991275511

https://malus.sh/blog.html

