Finance · AI · APIs

The Real Reason Financial APIs Are Losing Their Moat

By Armando J. Perez-Carreno · Featuring Tommy Carter

I talked with Tommy Carter from Benzinga about why the financial API business that printed money in the 2010s is getting commoditized by AI agents, why hedge funds ask about your archive before anything else, and how prediction markets are quietly turning into their own asset class.

If your business is selling access to a financial API, the moat you thought you had is evaporating. Tommy Carter is on the product side at Benzinga, selling data feeds to the top five US banks and hundreds of smaller clients, and he said it plainly: most data sets have been commoditized, most APIs can now be substituted, and the value is shifting to whoever owns the data that is actually hard to get.

In this episode, I talked with Tommy Carter from Benzinga, the financial media company in Detroit that writes stock market news for retail investors and licenses the same feeds to brokerages, hedge funds, and quant shops. Tommy studied mechanical engineering, got pulled into a project manager role, and ended up running API product development. Engineering school taught him the problem-solving toolbox rather than the content, which is how most engineering educations actually get used in the working world.

Here is the shift he is watching in real time. Ten years ago, the reason a brokerage or a fintech licensed a Benzinga API was straightforward. Building an internal team to gather stock market news, SEC filings, press releases, and analyst commentary would have been insane. Licensing was the only sane path. That is why APIs were a moat. Data providers sat on top of decades of editorial infrastructure, and every consumer app had to pay to play.

Now, a hobbyist at two in the morning can sign up for a Benzinga key with no credit card, wire it into Claude, and build a generative dashboard more personalized than a Bloomberg Terminal. Benzinga deliberately made it that easy, because the old model (fill out a form, wait for someone in London to call you back three days later, get the key on Wednesday) is dead. If you make someone talk to a salesperson to try a key, you are losing to whoever does not. And once the MCP server ecosystem landed, plugging an API into an LLM stopped being a dev project. It is a one-click operation.
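
To make "one-click" concrete, here is roughly what the wiring looks like underneath: a tiny MCP server that exposes a news lookup as a tool Claude can call. This is a sketch under assumptions; the endpoint URL, query parameters, and response shape are stand-ins for illustration, not confirmed Benzinga documentation.

```python
# Minimal sketch: wrapping a news API as an MCP tool an LLM client
# (e.g. Claude Desktop) can call. Endpoint and parameter names are
# assumptions for illustration, not confirmed Benzinga docs.
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("benzinga-news")

@mcp.tool()
def latest_news(ticker: str, limit: int = 10) -> str:
    """Fetch recent headlines for a ticker (hypothetical endpoint)."""
    resp = requests.get(
        "https://api.benzinga.com/api/v2/news",  # assumed endpoint
        params={
            "token": os.environ["BENZINGA_API_KEY"],  # key from the free signup
            "tickers": ticker,
            "pageSize": limit,
        },
        headers={"Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    stories = resp.json()  # assumed: a JSON list of story objects
    return "\n".join(s.get("title", "") for s in stories)

if __name__ == "__main__":
    # stdio transport by default; register this script in the
    # client's MCP config and the agent can pull headlines on demand.
    mcp.run()
```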

But the deeper point is what happens after the agent has the key. AI agents can increasingly compile their own data sets. They can scrape, synthesize, cross-reference, and normalize. A lot of the APIs that used to be differentiated are now substitutable. So the value shifts. Tommy put it cleanly: companies should stop building simple API products and simple dashboard products, and focus on delivering data that is hard to get. The moat is the source, not the pipe.

The example he gave that I keep thinking about is the 84-terabyte historical news archive Benzinga sells. Why would anyone want 84 terabytes of stock market news? Because if you want to train a model to trade a strategy, say buy when a positive earnings surprise hits and sell when specific negative catalysts land, you need timestamped news paired with historical price data so you can back-test it. Every hedge fund and quant shop that calls Benzinga opens the conversation by asking about the archive and whether the data is point-in-time, meaning each news entry reflects what was known on that date, not what was rewritten later. That archive is the moat. You cannot spin up a 20-year corpus of timestamped financial journalism with an agent overnight.
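
A minimal sketch of why point-in-time matters, with a made-up schema and a toy signal: every entry price is taken at the first close on or after the headline's timestamp, so the backtest only ever sees what was knowable at the time.

```python
# Sketch of a point-in-time backtest: pair timestamped news with
# prices as of the headline's timestamp, never a later revision.
# Column names and the toy signal are illustrative, not Benzinga's schema.
import pandas as pd

def backtest_earnings_surprise(news: pd.DataFrame, prices: pd.DataFrame,
                               hold_days: int = 5) -> pd.Series:
    """news: columns ['timestamp', 'ticker', 'tag'].
    prices: daily closes, DatetimeIndex, one column per ticker."""
    signals = news[news["tag"] == "positive-earnings-surprise"]
    returns = []
    for _, event in signals.iterrows():
        px = prices[event["ticker"]]
        # Entry at the first close ON OR AFTER the headline timestamp.
        # This is where point-in-time discipline bites: a backfilled or
        # rewritten timestamp silently leaks future information.
        after = px.loc[px.index >= event["timestamp"]]
        if len(after) <= hold_days:
            continue  # not enough history left to close the trade
        entry, exit_ = after.iloc[0], after.iloc[hold_days]
        returns.append(exit_ / entry - 1.0)
    return pd.Series(returns, name="trade_return")
```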

The other frontier Tommy brought up, and I had no idea how big this was getting, is prediction markets. Platforms like Kalshi and Polymarket are letting users bet on real-world outcomes that were never tradable before. How many rides did Uber deliver last quarter? Will the US and Iran sign a peace deal by April 30th? How many tweets will Elon Musk send between April 7th and April 14th? I pulled up Polymarket while we were talking, and one of the tweet-count markets had $14 million of weekly volume. This is not a novelty anymore. It is a parallel market for KPIs, political events, sports, crypto, geopolitics, and corporate metrics, and it is drawing real liquidity. The companies being bet on are not involved at all. It is just a free market around observable outcomes.

A lot of traders are already running prediction market arbitrage, pairing a Polymarket position against a related traditional-market position to lock in small guaranteed profits. You can see where the AI agents are going with this. Give an agent access to both a prediction market and the underlying news feed, let it run strategies overnight, and you have a whole new category of small quant operations that did not exist two years ago. Some of it will blow up spectacularly. Some of it will quietly print. All of it is going to reshape how data providers think about what they sell.
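
The arithmetic behind the simplest version is worth sketching. When the same binary outcome trades on two venues, buying YES on one and NO on the other locks in the gap whenever the combined cost is under a dollar. The quotes and fee below are invented, and real trades also eat slippage, capital lockup, and resolution risk.

```python
# Toy arbitrage check across two venues quoting the same binary outcome.
# Prices are dollars per $1 payout; the numbers and fee are made up.
def binary_arb(yes_price_a: float, no_price_b: float, fee: float = 0.0) -> float:
    """Profit per contract pair from buying YES on venue A and NO on venue B.

    Exactly one leg pays $1 at settlement, so the pair always returns $1.
    A positive result is a locked-in edge (before slippage and capital costs).
    """
    cost = yes_price_a + no_price_b + fee
    return 1.0 - cost

if __name__ == "__main__":
    # Hypothetical quotes: YES at $0.46 on one venue, NO at $0.50 on another.
    edge = binary_arb(0.46, 0.50, fee=0.01)
    print(f"guaranteed edge per pair: ${edge:.2f}")  # $0.03
```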

At the end of the day, if you are building a data business, the old playbook of "charge enterprise prices for access to a clean feed" is running out of runway. The new playbook is own the corpus that nobody else can easily compile, make it trivially accessible so the hobbyist at 2 a.m. can actually try it, and accept that the long tail of AI-assisted builders is the new enterprise customer. Friction is a tax on adoption. Uniqueness is the only moat that lasts.

