Why Public APIs Are Shuttering in the Age of AI

There’s nothing particularly new about APIs calling it quits and closing up shop.

Twitter, for example, ended free access to its API in 2023 as part of a monetization push by Elon Musk. Netflix shuttered its public API for third-party developers back in 2014 and implemented strict rules around data scraping for the APIs that it brought in to replace them.

The aims of these closures were pretty clear: Musk hoped to create additional revenue streams after shelling out $44 billion to purchase what we now call X, while Netflix wanted to focus on creating APIs for internal and partner consumption. Whether people liked them or not, the closures formed part of a wider business strategy.

Recently, we’ve seen a number of formerly public-facing APIs become more limited or shutter completely at pace, including services from Slack, Salesforce, and OpenAI. As our own Bill Doerrfeld comments, it’s “in line with the ongoing move from a previous open, global internet toward more geographically-dependent, protected estates, a trend called digital sovereignty.”

But, as we’ll see below, many of these closures feel different in their nature and might not necessarily have come by choice. With the rise of agentic API consumption and web scraping, companies may increasingly feel obliged to shutter APIs to protect their data and intellectual property, as well as cover their own backs against data breaches and security risks.

Below, we’ll cover why such moves could prove to be hugely problematic for third-party developers and how they could ultimately change the face of the internet.

The Golden Age of Public APIs

Throughout the 2000s and 2010s, public APIs were seen as a way to accelerate growth, attract users, and dominate the market. At one point, literally millions of applications were built with or relied on Facebook’s Open Graph API. One notable example is the almighty Farmville, which itself shuttered in 2020 when browser support for Flash was discontinued.

Early versions of the Google Maps API were instrumental in the creation of thousands of apps and tools, enabling developers to outsource maps and geolocation features with a few clicks. Without the Google Maps API, the SoLoMo (social, local, mobile) revolution would probably never have taken hold in the 2010s to the extent that it did. The API was everywhere.

Using Twitter’s API, developers created services like Tweetbot, Echofon, and Favstar.fm. These apps endured because they dramatically expanded Twitter’s feature set without the company’s own developers having to do any work on them. However, today’s companies don’t necessarily want third-party developers eating their lunch anymore.

Reddit, for example, killed off apps and readers like Apollo, RedditIsFun, and BaconReader in 2023 when it significantly hiked the price of consuming its APIs. That’s a significant change in direction from its purchase of AlienBlue, which Reddit acquired back in 2014 and subsequently rebranded as its official app for iOS after ditching its own attempt at one. A similar story occurred around the shuttering of Instagram’s public API back in 2015.

Walling Off the Gardens

Make no mistake: the API closures are now coming thick and fast.

Strava, for example, stopped sharing fitness data with third parties in 2024, citing privacy and AI scraping as two major concerns. OpenAI is killing off its Assistants API, a beta attempt at building AI agents, in 2026. At the end of 2024, Spotify removed access to many of its API features, ending a long history of its inclusion in hackathons and coding experiments.

Often leaving breaking changes in their wake, some API closures are perceived as brutal and unceremonious. Marvel’s API, live for more than a decade, shuttered with a 150-word email and a voucher for one free month of Marvel Unlimited. As Kin Lane comments on a post about its shutdown, “the first rule of API club is that you will have your heart broken repeatedly.”

These closures are no longer just about the economic viability of building and maintaining public APIs. In many instances, they’re designed to restrict data from being accessed by large language models (LLMs). LLM-driven clients are quickly earning a reputation for disregarding the API contract, and that can pose significant security risks.
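For providers who want to limit that access without shuttering anything, a common first step is screening traffic from known AI crawlers at the edge. The sketch below is a minimal, illustrative check: the token list is partial and hypothetical in composition (drawn from publicly documented crawler user agents such as GPTBot and CCBot), and a real deployment would also verify the crawlers’ published IP ranges, since user agents are trivially spoofed.

```python
# Partial, illustrative list of user-agent tokens used by known AI crawlers.
# Not exhaustive; providers typically maintain and update their own lists.
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot")

def is_ai_crawler(user_agent: str) -> bool:
    """Crude server-side check for AI crawler traffic.

    User agents can be faked, so this is only a first filter;
    stricter setups cross-check the request IP against the
    ranges each crawler operator publishes.
    """
    return any(token in user_agent for token in AI_CRAWLER_TOKENS)
```

A middleware or reverse-proxy rule built on a check like this can then return a 403, serve a reduced payload, or route the request to a paid tier.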

Since there’s no putting the genie back in the bottle when it comes to revoking data from LLMs, many companies are erring on the side of caution and pulling up the ladder sooner rather than later. But that’s not the only reason they might want to limit LLM and agentic access to their data.

The Rise of Proprietary Services

These days, it can feel like every tool and app you’ve ever used is introducing an agent, bot, or copilot designed to unlock its true potential — whether or not anyone actually asked for it.

And, in fairness, some of these tools are very slick. Take Alibaba.com’s Accio, for example, a B2B product sourcing agent built primarily with Alibaba’s own open-source Qwen LLM. The agent can do everything from sourcing and comparing products to managing supplier outreach.

Competent developers could, however, pair Alibaba.com’s APIs with their own LLM to replicate most of those capabilities at a fraction of the price. While most consumers will happily pay a premium for the convenience of an all-in-one user experience, the mere possibility suggests that sites like this will introduce measures to stop power users from taking the DIY approach.

In practice, this could mean a shuttering of one or more APIs, a shift from public to private, or the introduction of limitations. For example, while it isn’t closing its entire API, Slack recently reduced the rate limit on its conversation history and replies APIs (conversations.history and conversations.replies) to one request per minute.
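At one request per minute, a client consuming such an endpoint has little choice but to pace itself. The sketch below is a minimal throttle wrapper, assuming a fixed minimum interval between calls; the Slack endpoint it’s meant to wrap is real, but the wrapper itself (class and method names included) is hypothetical. The clock and sleep functions are injectable so the behavior can be tested without actually waiting a minute.

```python
import time

class MinIntervalThrottle:
    """Enforce a minimum interval between calls, e.g. a
    one-request-per-minute limit on conversations.history."""

    def __init__(self, min_interval_s: float,
                 clock=time.monotonic, sleep=time.sleep):
        self.min_interval_s = min_interval_s
        self._clock = clock
        self._sleep = sleep
        self._last_call = None  # timestamp of the previous call, if any

    def wait(self) -> float:
        """Block until the next call is allowed; return seconds slept."""
        slept = 0.0
        if self._last_call is not None:
            remaining = self.min_interval_s - (self._clock() - self._last_call)
            if remaining > 0:
                self._sleep(remaining)
                slept = remaining
        self._last_call = self._clock()
        return slept
```

A consumer would call `throttle.wait()` immediately before each request; a production version would also honor the `Retry-After` header on any 429 response rather than relying on client-side pacing alone.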

Tweaks like these are an easy way for companies to prevent customers from using their data to power an LLM, strong-arming them into using official offerings instead. In Slack’s case, those offerings take the form of “AI in Slack” features available via subscription.

API Monetization and the AI Factor

While AI isn’t the only factor in play here — successfully monetizing an API, particularly a public API that’s widely used, remains something of an art form — it’s a significant one. With the use of agentic consumption and bot scraping accelerating rapidly, this may just be the beginning.

Then again, as companies adapt and figure out how best to treat AI agents and LLMs, we may find ourselves in another golden age of APIs. It’s telling, for example, that, in an echo of Twitter and Facebook’s early days, Mastodon, Threads, and Bluesky all currently offer relatively open APIs.

And Gartner predicts a 30% increase in API consumption by 2026, driven by AI and LLMs, with greater adoption of MCP likely to expand API usage, too. That could be good news for providers of paid APIs who are able to get their heads around agentic consumption in time.

On the other side of the coin, it means that consumers should be prepared for additional restrictions and APIs being shuttered. That could mean additional expenses in your existing workflows or, at worst, having to say goodbye to services you’ve been using for years.

A middle ground will likely emerge, with limited data points available via public APIs and more focus on private or partner APIs that businesses explicitly permit LLMs to consume. API providers looking to monetize their offerings will need to look closely at costs and may want to consider entirely separate pricing models for agentic AI consumption.
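What a separate agentic pricing model might look like can be sketched in a few lines. The tiers and per-request rates below are entirely made up for illustration; no real provider's pricing is implied. The idea is simply that traffic classified as agentic (however the provider detects it) is metered at a different rate than traditional interactive traffic.

```python
# Hypothetical per-1,000-request rates; the tier names and prices
# are illustrative, not any real provider's.
RATES_USD_PER_1K = {
    "interactive": 0.50,  # traditional app traffic
    "agent": 2.00,        # autonomous / LLM-driven consumers
}

def monthly_bill(requests_by_tier: dict) -> float:
    """Sum per-tier request counts against per-1k-request rates."""
    total = 0.0
    for tier, count in requests_by_tier.items():
        total += (count / 1000) * RATES_USD_PER_1K[tier]
    return round(total, 2)
```

Under this toy model, 10,000 interactive requests plus 5,000 agentic requests bill very differently than 15,000 interactive ones, which is exactly the distinction providers are starting to care about.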

The Future of APIs and AI Is Murky

While it’s almost certain that the explosive growth of AI will lead to increased API consumption, that may or may not be a good thing for the space, depending on your role in the industry.

As we’ve seen above, there’s a ton of potential for API providers to further monetize their products by opening up to LLMs or by building their own proprietary AI-driven tools. The downside is that doing so requires fencing off data that may currently be open or free.

If that day does come, there isn’t a whole lot that API consumers can do about it.

That’s the danger of building a house on someone else’s land. So, if you’re planning to use a company’s API for something, it might be a smart idea to look at what it’s doing around AI. If it’s launching a new smart tool, you might consider that a warning shot for your own project.

Overall, it’s difficult not to feel like the utopian future promised by APIs — transparency, adaptability, workflow automation, and so on — might be under threat right now. Hopefully, this will all prove to be just a bump in the road, rather than a complete change in direction that takes us toward data silo-ization, vendor lock-in, and the erosion of interoperability as we know it.
