The accusation is stark: Google is dismantling the open web piece by piece, not with one decisive blow, but through a sequence of technical, commercial, and governance decisions that, when connected, form a coherent strategy. The essay “Google is killing the open web”, published on Oblomov’s blog, reconstructs that timeline and ties it together with a single thread: the progressive replacement of open, multi-vendor standards with mechanisms and APIs that favor a more centralized ecosystem, one instrumental to Google’s advertising and data business.

This report expands on that thesis, contrasts the most significant milestones, adds technical context, and includes objections and nuances from those who see many of these changes as a reasonable evolution of the web platform. The result is not a summary judgment, but a realistic x-ray of a process spanning more than a decade that today shapes how the web is developed, published, and consumed.

A necessary starting point: standards, browsers, and de facto power

The web was born as a realm of open standards (HTTP, HTML, CSS, DNS) governed by organizations like the W3C and the IETF. The first “browser war” of the 1990s showed that when a dominant actor “extends” the web with proprietary technologies, the result is fragmentation and dependency. Chrome appeared in 2008 in a different context: the explosion of mobile, the rise of centralized services, and the decline of Internet Explorer. On that wave, Google drove a cycle of rapid innovation (V8 engine, release cycles, modern APIs) often channeled through WHATWG, a forum where browsers—led by Google—coordinate the evolution of HTML and related standards.

For critics, this represents a “de facto power” that sidelines the W3C and reduces checks and balances. For defenders, it is a way to avoid paralysis and bring the web where users are.

2013 as a turning point: RSS, XMPP, MathML, and the first push against XSLT/XML

The essay highlights 2013 as a pivotal year:

  • Google Reader shutdown. This was not just the end of a product; it weakened content discovery via RSS/Atom, the backbone of blogs, media, and later, podcasts. The official reason was declining usage. The real effect: news consumption shifted even further toward opaque algorithms on platforms.
  • End of XMPP federation in Google Chat (Facebook followed with Messenger in 2014). Interoperable messaging shrank, walled gardens grew.
  • MathML was removed from Chrome. A decade later, it returned thanks to external work. For classrooms and accessibility, rendering math without images or JavaScript was and remains crucial.
  • First serious attempt to push XSLT/XML out of the browser. XSLT allows documents (like RSS feeds) to be transformed into HTML without JavaScript. Critics say discouraging XSLT increases dependency on JS and server-side logic. A minimal sketch follows this list.
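
To make that concrete, here is a minimal, hypothetical sketch of the mechanism: one processing instruction in a feed points the browser at an XSLT stylesheet, and the XML renders as a readable HTML page with no JavaScript involved (file names are illustrative).

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="feed.xsl"?>
<!-- feed.xml: an ordinary RSS document, plus one line of presentation wiring -->
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>https://example.com/first-post</link>
    </item>
  </channel>
</rss>
```

The referenced stylesheet is a handful of declarative templates; the browser’s built-in XSLT 1.0 engine does the rest:

```xml
<!-- feed.xsl: turns the feed above into an HTML page, entirely client-side -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/rss/channel">
    <html>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <ul>
          <xsl:for-each select="item">
            <li><a href="{link}"><xsl:value-of select="title"/></a></li>
          </xsl:for-each>
        </ul>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```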

Realistic view. None of these events alone “kills” the open web. Taken together, they shift the center of gravity: fewer client-side standards for presenting data, more reliance on JS and backend logic, more control in layers where Google is strongest.

2015: AMP, <keygen>, and the dawn of the “app mode”

  • AMP (Accelerated Mobile Pages). Promised speed on mobile. Critics argued the “magic” was simply loading less junk and caching on third-party CDNs (primarily Google). AMP pages gained visibility in search, pushing publishers to duplicate work and surrender control.
  • Deprecation of <keygen>. An HTML element for generating key pairs in the browser, enabling certificate-based mutual authentication without intermediaries (see the sketch after this list). Figures like Tim Berners-Lee lamented the loss of a mechanism that gave users more sovereignty over their credentials.
  • SMIL targeted. Declarative SVG animations were sidelined. From then on, almost everything moved to CSS+JS, reinforcing client-side logic (and reliance on JS toolchains).
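
For context, a historical sketch of the <keygen> flow as it worked before removal; the enrollment endpoint and field names are hypothetical.

```html
<!-- A certificate-enrollment form from the <keygen> era -->
<form method="post" action="/enroll">
  <!-- On submit, the browser generated a key pair locally, kept the
       private key in its own store, and sent only the public key
       (as a signed SPKAC blob) to the server -->
  <keygen name="spkac" keytype="rsa" challenge="server-issued-nonce">
  <input type="submit" value="Request certificate">
</form>
<!-- The server would sign the public key and return an X.509 certificate,
     which the browser installed for TLS client authentication -->
```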

Balance. AMP may have faded, but it left a legacy: mobile performance was solved on Google’s terms. And the removal of <keygen> closed the door to more distributed authentication models.

2018–2020: RSS fades from browsers, URLs lose prominence

Firefox removed native RSS support (“Live Bookmarks”), pushing users to extensions. Chrome was never supportive of feeds. For average users, RSS stopped being a first-class citizen. In parallel, Chrome explored hiding URLs for “usability.” Critics saw this as another step in which the site matters more than the address (and with it, the ability to judge origin and authenticity).

2019–2023: Manifest V3, Web Integrity, and the JPEG XL case

  • Manifest V3. Chrome changed its extension model “for security,” replacing the blocking webRequest API with the more limited, rule-based declarativeNetRequest. Observers and developers argued this weakened ad blockers by capping filter-rule counts and removing the ability to run arbitrary filtering code on each request. Google denies anti-adblock motives, but the public perception was that the change directly benefited its ad business.
  • Web Environment Integrity (WEI). Presented as anti-cheat/anti-fraud, many saw it as “browser DRM”: servers could verify if a client was “trusted.” Backlash slowed it down, but the idea left a mark.
  • JPEG XL. An open format with better compression (lossy and lossless), progressive decoding, transparency, and animation. Chrome removed support despite positive trials. Critics saw a lost chance to cut bandwidth costs across the web. Google argued it lacked adoption and clear benefits over AVIF/WebP.

2024–2025: RSS out of Google News, XSLT targeted again

In 2024, Google stopped accepting RSS feeds for inclusion in Google News. Discovering news outlets became even more dependent on internal algorithms.
In 2025, the debate over removing XSLT from browsers resurfaced, this time via WHATWG. Critics note that newer versions have long existed (XSLT 2.0 since 2007, XSLT 3.0 since 2017, the latter adding JSON support), yet browsers never shipped anything beyond 1.0. The uncomfortable thesis: if you spend years neglecting a feature, its “low adoption” becomes a self-fulfilling prophecy.

Why XSLT/XML and RSS still matter (even if you never used them)

  • Presentation without JS. XSLT is a declarative templating language: it transforms node trees (XML/HTML/SVG) into others, with built-in structural validation. Less attack surface than string concatenation.
  • Cost and simplicity. Sites can serve XML+XSLT (feeds, sitemaps, catalogs, tabular data) and let the browser render it. Lower server CPU, fewer bytes transferred.
  • Sovereignty and portability. RSS/Atom allow users to subscribe and migrate between clients freely. It underpins podcasts and many federated flows.
  • Accessibility and science. MathML and TEI/XML in digital humanities are clear cases where the client should render content directly, without heavy toolchains.
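
The accessibility point is easy to see in practice. A minimal sketch: the quadratic formula as plain MathML inside an HTML page, rendered natively in browsers that support MathML Core, with no images and no JavaScript.

```html
<!-- Block-level MathML: x = (−b ± √(b² − 4ac)) / 2a -->
<math display="block">
  <mi>x</mi><mo>=</mo>
  <mfrac>
    <mrow>
      <mo>−</mo><mi>b</mi><mo>±</mo>
      <msqrt>
        <msup><mi>b</mi><mn>2</mn></msup>
        <mo>−</mo><mn>4</mn><mi>a</mi><mi>c</mi>
      </msqrt>
    </mrow>
    <mrow><mn>2</mn><mi>a</mi></mrow>
  </mfrac>
</math>
```

A screen reader can walk this structure element by element, something a rendered image of the same formula cannot offer.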

Yes, all of this can be done with JS. The real question: why remove open, mature options that diversify the ways the web can be built?

Google’s arguments (and why they don’t convince everyone)

  • “Security and maintenance costs.” Maintaining old XML/XSLT parsers is expensive and prone to CVEs. Critics respond: if the issue is outdated implementations, then update (XSLT 3.0, modern libraries) instead of erasing the feature from the spec.
  • “Low adoption.” A circular metric: years of neglect yield fewer users. Plus, browsers never upgraded past XSLT 1.0, blocking broader adoption.
  • “Simplify browser code.” Valid in principle, but clashes with the constant addition of new JS APIs. Critics argue the problem isn’t trimming features, but which features are trimmed—almost always those that give users more autonomy.

A broader lens: not just Google, but Google matters more

It would be unfair to paint this as a single-villain tale. Apple limits Web APIs in iOS; Mozilla has made controversial moves (RSS, various integrations); Microsoft pivoted to Chromium and prioritizes its platform. The difference is scale: with Chrome’s market share and its search dominance, what Google decides often becomes the norm. Even “small” changes can cascade across the web.

Is this “killing the open web”? A sober assessment

Two truths can coexist:

  1. The web platform is more powerful than ever (graphics, multimedia, typography, WebGPU, PWAs, advanced privacy in alternative browsers).
  2. Effective control over what gets prioritized—and what gets dropped—is highly concentrated. When features like XSLT, <keygen>, or JPEG XL disappear, the web loses diverse paths for publishing, authentication, and distribution.

It’s not an apocalypse. It’s an ongoing erosion that narrows the routes available outside heavy JavaScript, centralized services, or blessed APIs.

What users, publishers, and developers can realistically do

  • Keep feeds (RSS/Atom) visible. If you publish, expose them. If you read, use feed readers.
  • Serve XML with XSLT in clear cases (human-readable sitemaps, lists, catalogs, docs). If the browser doesn’t support it, fall back to server-side transforms or a polyfill such as SaxonJS (see the sketch after this list).
  • Rethink JS use. Use islands (HTMX, Alpine), SSR, or static rendering where possible. Less JS where it adds no value.
  • Diversify browsers and extensions. Reduces agenda-setting power.
  • Engage in issue trackers. Technical, civil pressure can shift priorities (it already has with some APIs).
  • Document and teach. Explain what tools like XSLT, MathML, and client certificates do, so defaults don’t go unchallenged.
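
For the XSLT fallback mentioned above, a sketch assuming SaxonJS 2’s browser API: the stylesheet is compiled ahead of time to SEF form with Saxonica’s xslt3 command-line tool, and the page replaces its body with the transform’s output (file names are hypothetical).

```html
<!-- fallback.html: render feed.xml via XSLT even without native browser support -->
<!-- Assumes the stylesheet was pre-compiled:
     npx xslt3 -xsl:feed.xsl -export:feed.sef.json -nogo -->
<!DOCTYPE html>
<html>
  <body>
    <script src="SaxonJS2.js"></script>
    <script>
      // SaxonJS runs the compiled stylesheet client-side and
      // replaces the page body with the transformed HTML
      SaxonJS.transform({
        stylesheetLocation: "feed.sef.json",
        sourceLocation: "feed.xml",
        destination: "replaceBody"
      });
    </script>
  </body>
</html>
```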

Looking ahead: the next 12–24 months

  • XSLT in browsers. If removed, expect more server-side transforms and occasional polyfills. Humanities and documentation communities may move to custom toolchains.
  • Images. Without JPEG XL, AVIF/WebP will keep growing. Industry will ask for better tooling and stable profiles to avoid codec churn.
  • Extensions. With Manifest V3 now in place, some filtering will remain, but full network-level blocking will shift outside the browser (DNS, routers).
  • Environment integrity. Attempts at new “trust signals” won’t vanish. The key will be limiting them to genuine fraud cases while resisting shortcuts that turn the browser into an enforcer against its own user.

Conclusion: preserving multiple paths matters

A healthy web is not the one that adopts every “new” feature without looking back, but the one that adds without excluding. RSS, XSLT, MathML, and client certificates are not relics; they are alternative routes that balance power between publishers, readers, and platforms.

When a systemically powerful actor shuts those down, the map shrinks. It’s not the death of the open web, but it is a web that’s smaller and less open. That deserves debate, informed pressure, and—above all—building alternatives.

Frequently Asked Questions (FAQ)

1) Does XSLT still make sense in 2025?
Yes, in scenarios where it’s valuable to separate data and presentation, reduce JS, and validate structures (feeds, sitemaps, catalogs, docs). XSLT 3.0 even works on JSON. Without native support, you can render server-side or use polyfills.
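
A sketch of the JSON claim, assuming an XSLT 3.0 processor: XPath 3.1’s json-to-xml() parses a JSON string into an XML tree that ordinary templates can walk (the parameter and its sample value are illustrative).

```xml
<xsl:stylesheet version="3.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:fn="http://www.w3.org/2005/xpath-functions">
  <!-- A JSON string supplied as a parameter -->
  <xsl:param name="input" select="'[&quot;alpha&quot;,&quot;beta&quot;]'"/>
  <xsl:template name="xsl:initial-template">
    <ul>
      <!-- json-to-xml() maps a JSON array to fn:array/fn:string elements -->
      <xsl:for-each select="json-to-xml($input)/fn:array/fn:string">
        <li><xsl:value-of select="."/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>
```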

2) Why insist on RSS if “nobody uses it anymore”?
RSS/Atom powers podcasts, syndicates millions of sites (WordPress includes it by default), and gives readers agency: you choose sources without algorithms. It’s a lightweight, open tool. Its disappearance from products is about priorities, not usefulness.
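
The podcast claim is literal: an episode is just an RSS item carrying an enclosure, which any client can fetch directly. A minimal sketch (URL and size illustrative):

```xml
<item>
  <title>Episode 1</title>
  <!-- The enclosure is what makes an RSS item a podcast episode -->
  <enclosure url="https://example.com/ep1.mp3" length="12345678" type="audio/mpeg"/>
</item>
```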

3) Aren’t Google’s security arguments valid?
Security matters. The disagreement is about treatment: update implementations and maintain standards, versus removing features that reduce the web’s diversity. Maintenance costs are real, but what gets maintained is as much a policy decision as a technical one.

4) What alternatives exist to reduce dependence on big platforms?
There’s no silver bullet. Diversify (browsers, hosting, feed readers), minimize JS where unnecessary, stick to open standards (web, DNS, interoperable email), and document reproducible practices. Independence isn’t absolute, but it can be gradual and cumulative.

via: wok.oblomov.eu
