A developer writing code in 2026 faces a problem her predecessor in 2009 never encountered: every keystroke, every prompt, every half-formed idea now flows through centralised platforms owned by the companies best positioned to steal her work. The risk calculus has inverted. Sharing used to multiply opportunity. Now it multiplies the likelihood of obsolescence.

Dispatch

INTERNET, MARCH 2026 — Rye Lang, a technology writer and developer, published a manifesto on the mechanics of what he calls the "cognitive dark forest" — a condition where the architecture of AI platforms makes secrecy rational and openness suicidal. Lang's essay, posted on his independent blog, argues that the consolidation of the internet and the collapse in the cost of software execution have created a structural trap for independent developers:

You are creating your cool streaming platform in your bedroom. Nobody is stopping you, but if you succeed, if you get the signal out, if you are being noticed, the large platform with loads of cash can incorporate your specific innovations simply by throwing compute and capital at the problem. They can generate a variation of your innovation every few days, eventually they will be able to absorb your uniqueness. It's just cash, and they have more of it than you.[1]

Lang's argument hinges on two structural shifts. First, the internet transformed from a "spacious bright meadow" of distributed opportunity into a consolidated ecosystem dominated by a handful of corporations extracting user data and governments seeking control. Second, and more recent, the cost of execution — the actual building of software — has collapsed. Where a startup once needed scarce, expensive engineers to ship a product, a well-capitalised incumbent now needs only compute and capital. The moat of execution has evaporated.

Every prompt is a signal — reveals intent. The platform doesn't need to read your prompt. It doesn't spy on you specifically. It isn't surveillance. It's just statistics. It's a gradient in idea space. A demand curve made of human interests. The platform will know your idea is pregnant far before you will.[1]

This framing — that centralised AI platforms function as collective intelligence extractors, not merely tools — is not new. But Lang's contribution is to name the trap explicitly: the very act of resisting feeds the system you resist. Sharing your innovation in code, writing, or product form makes it training data. Hiding makes you invisible but irrelevant. The forest doesn't kill you. It lets you live and feeds on you.

A different reading comes from the developer and open-source advocate community, which has historically treated code-sharing and public iteration as the engine of progress. This perspective, visible in forums like Hacker News (where Lang's essay appeared with minimal engagement — 0 comments, 5 upvotes) [2], does not contest Lang's structural observation. Instead, it questions whether the problem is new or merely visible for the first time.

The open-source ecosystem has always existed in tension with commercial capture. Red Hat's entire business model depends on giving away software and selling support. GitHub, acquired by Microsoft in 2018, monetised a platform built on free labour. The difference in 2026, according to this view, is not that capture is happening — it is that LLMs have made capture near-instantaneous. A human competitor takes months to reverse-engineer your idea; an AI model can generate a functional variant in hours.

What's Really Happening

  • Confirmed fact: Centralised AI platforms (OpenAI, Anthropic, Google DeepMind, etc.) train models on publicly available code, documentation, and technical writing. Every developer who uses these platforms as thinking tools contributes training data to systems controlled by the companies offering the tools. [1][3]
  • Structural mechanism: The economics of software development have shifted from labour-constrained to capital-constrained. Where execution required scarce engineers, it now requires scarce capital and compute. This eliminates the traditional advantage of the nimble startup: the ability to move faster than the bureaucratic incumbent. Speed now favours whoever has the largest GPU cluster. [1]
  • Analyst projection — not yet confirmed: Industry observers expect that within 24–36 months, the cost of generating functional software prototypes will fall to near-zero for well-capitalised firms, while remaining non-trivial for independent developers. This will accelerate the consolidation of the software market around a handful of platforms. [Projection based on current LLM capabilities and trajectory; no specific analyst attribution available in source material.]
  • One thing other outlets are missing: The cognitive dark forest is not primarily a surveillance problem — it is an asymmetry problem. No platform needs to read your individual prompts. The aggregate pattern of millions of prompts reveals the direction of human innovation faster and more reliably than any individual could. The platform becomes a collective intelligence engine that cannibalises individual innovation before it scales.
  • Named actor and specific role: Rye Lang frames this as a game-theoretic problem borrowed from Liu Cixin's science fiction. The "dark forest" is not a malevolent actor hunting you. It is a rational system optimising for survival and growth. The developer hides not because she is being hunted, but because hiding is the only rational strategy when the gap between your innovation and the platform's ability to absorb it approaches zero. [1]
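The asymmetry described above can be made concrete with a toy sketch. The data and topic labels below are entirely hypothetical; the point is only that simple frequency counting over many users' coarse prompt topics surfaces the "demand curve made of human interests" without any individual prompt ever being inspected:

```python
from collections import Counter

# Hypothetical aggregate of coarse topic labels derived from many users'
# prompts. No single prompt is read; only category counts are kept.
prompt_topics = (
    ["local-first sync engine"] * 40
    + ["llm agent framework"] * 25
    + ["streaming platform"] * 120  # a sudden cluster: many builders converging
    + ["static site generator"] * 15
)

def demand_signal(topics, top_n=3):
    """Return the top_n topics by frequency.

    The aggregate trend emerges purely from counting — the statistical
    'gradient in idea space' the essay describes, with no surveillance
    of any individual required.
    """
    return Counter(topics).most_common(top_n)

print(demand_signal(prompt_topics))
# The dominant cluster surfaces from counts alone.
```

A real platform would operate on billions of interactions and far richer embeddings, but the structural claim is the same: the signal lives in the aggregate, not in any one prompt.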

The Real Stakes

The immediate stakes are cultural and economic. If independent developers rationally choose to hide their work — to build in private, share only with trusted peers, and avoid public iteration — the open-source ecosystem that has powered the internet for three decades begins to ossify. The forums, the blogs, the "here's how I built this" infrastructure that distributed knowledge globally will contract into private channels, corporate repositories, and closed communities.

This is not speculation. Lang himself notes the paradox: AI companies needed human openness to build their models, but will also kill that openness because the relationship is one-sided.[1] The companies that built LLMs on the back of freely shared code now have every incentive to reduce the future supply of that code — because they have already extracted the training signal they need. The ladder is pulled up after the climb.

The economic consequence is consolidation. If the cost of execution collapses for incumbents but remains high for startups (because startups cannot afford the compute or the LLM subscriptions at scale), the software market becomes a two-tier system: a handful of mega-platforms and a long tail of niche services. The venture-backed startup, which has been the primary engine of software innovation for 40 years, becomes economically irrational. Why build a startup when a well-resourced incumbent can generate your idea as a feature in days?

Confirmed: This dynamic is already visible in the AI-assisted coding space. GitHub Copilot (Microsoft), Amazon Q Developer (formerly CodeWhisperer), and other LLM-powered development tools are now used by millions of developers daily. Each prompt is training data. Microsoft, which owns GitHub, has direct access to both the development data and the user intent signals. [1][3]

The geopolitical and policy stakes are subtler but potentially more significant. If knowledge production and innovation increasingly flow through centralised platforms controlled by a handful of U.S. technology companies, the distribution of technological capability becomes more concentrated. Nations and enterprises without direct access to these platforms — or with restricted access due to export controls or sanctions — face a growing innovation gap. This mirrors the dynamics that drove the semiconductor export controls against China in 2023–2024, but extends to the entire domain of intellectual work.

Industry Context

The cognitive dark forest is not a technology problem — it is an incentive problem. The architecture of modern AI platforms creates a situation where the platform operator has every incentive to extract innovation from users while simultaneously reducing the future supply of that innovation. This is not malice. It is rational behaviour within a system where innovation is both the fuel and the threat.

The closest historical analogy is the English enclosure movement, in which common lands — shared resources that had sustained communities for centuries — were gradually privatised by landowners between the 16th and 19th centuries. The commons disappeared not through violence but through legal and economic restructuring. The cognitive commons — the open internet where ideas were freely shared and remixed — is undergoing a similar enclosure. The mechanism is not law but platform architecture and capital concentration.


Impact Radar

  • Economic Impact: 8/10 — The collapse of execution costs for capital-rich incumbents while remaining high for startups will accelerate software market consolidation. This directly affects venture funding, startup formation, and the distribution of technological capability. [1]
  • Technology Impact: 9/10 — The incentive structure Lang describes will reshape how software is built and shared. If developers rationally hide their work, the open-source ecosystem that underpins modern infrastructure degrades. The long-term effect on innovation velocity is unknown but potentially severe. [1]
  • Social Impact: 7/10 — The shift from public to private knowledge production fragments the global developer community. Institutional knowledge that was once freely accessible becomes proprietary or restricted. This affects education, career mobility, and the democratisation of technical skill. [1]
  • Geopolitical Impact: 6/10 — Concentration of innovation within U.S.-controlled platforms creates asymmetries in technological capability between nations and blocs. This is not new (the U.S. has held technological advantages for decades), but the mechanism becomes more explicit and harder to circumvent. [1]
  • Policy Impact: 5/10 — No government has yet articulated a coherent response to the cognitive dark forest problem. Antitrust action against tech platforms exists, but no policy framework addresses the specific incentive structure Lang describes. [1]
Watch For

1. Developer behaviour shift: Monitor the migration of technical discussion from public forums (Reddit's r/programming, Hacker News, GitHub discussions) to private channels (Discord servers, Slack workspaces, private repositories). If public technical sharing declines measurably over the next 18 months, Lang's hypothesis gains empirical support. No specific metric has been established yet, but open-source contribution rates and public documentation publication could serve as proxies.

2. Platform policy changes: If OpenAI, Anthropic, or Google modify their terms of service to explicitly claim ownership of innovations derived from user prompts, or if they launch venture funds targeting startups built on their infrastructure, this would confirm that platforms are moving from extraction to direct capture. Watch for announcements from these companies in Q2–Q4 2026.

3. Regulatory response: If the U.S. Congress, the EU, or other jurisdictions introduce legislation specifically addressing AI-assisted innovation and intellectual property, this signals that policymakers recognise the problem. The EU's approach to AI regulation (the AI Act, effective 2025) provides a template, but no specific provisions address the cognitive dark forest dynamic yet.

Bottom Line

Rye Lang has identified a genuine structural trap in how modern AI platforms interact with knowledge production. The mechanism is sound: centralised platforms extract innovation as training data while simultaneously reducing the incentive for future innovation through rapid replication. This is not a technology problem and cannot be solved by better encryption or privacy tools. It is an incentive problem baked into the architecture of capital concentration and computational asymmetry.

The real question is not whether the trap exists — it does — but whether it is avoidable or inevitable. Lang himself suggests it is inescapable: "You can't step outside the forest to warn people about the forest. There is no outside."[1] If this is true, then the only rational response is not to resist but to adapt: to build within the constraints, to accept that innovation will be absorbed, and to compete on execution rather than novelty. This is not a prediction of doom. It is a prediction of consolidation.
