Alex is a ghost name for a new, independent tech-investment strategist and writer who has followed Jason’s work since the announcement of DOMINAIT.ai, and who recently dug deep into the “circular deals” debate Jason laid out in his recent article before contacting him for an interview.
He asked to use a ghost name because of the controversy surrounding the article; “AR” is not yet ready to risk harming his career by posting something that could place him on one side or the other. We have chosen to respect this request.
Please note: neither we at DOMINAIT.ai, nor Jason Criddle, Jason Criddle & Associates, SmartrHoldings, nor any associated companies are “picking a side” either. This is simply an opinion piece citing articles and quotes from real people closely associated with Sam Altman and OpenAI.
Alex Reiner (AR): Jason, thanks for taking the time. Your recent article:
(https://dominait.ai/building-real-value-in-an-age-of-circular-deals/) truly pulled no punches when it came to Sam Altman and the phenomenon you call circular deals. You wrote:
“Circular deals drive up valuations and in turn, drive up more debt and a view of being successful.”
And:
“There is a fine line between doing a deal that raises value and doing a deal that raises the perception of value.”
Those are strong statements. What stirred you to publish this now?
Jason Criddle (JC): Thanks, Alex. The timing felt right because we’re in the middle of an AI gold rush… with companies raising valuations by the billions each day, talk of trillion-dollar markets and products, and terms like “AGI” being tossed around without merit. My concern is that the hype is increasingly being built on perception rather than value.
When I see firms like OpenAI announce massive infrastructure deals with valuations running wild, and yet, very little path to profitability visible, my radar goes off immediately. When the board of OpenAI claimed they “no longer had confidence” in Altman because he “was not consistently candid,” it became more than a valuation problem. It became a governance and ethics problem.
AR: Let’s dig into that. There’s a report from Helen Toner, former OpenAI board member, who said in an interview:
“When ChatGPT came out in November 2022, the board was not informed in advance about that. We learned about ChatGPT on Twitter.”
That’s a striking level of disclosure breakdown. How does that feed into your critique of circular deals?
JC: It’s exactly the connection. If you’re orchestrating huge deals while internally trading equity, posting big numbers, chasing valuations, but you can’t even keep your board in the loop, the perception of value begins to hinge on optics.
One “circular deal” I refer to is the architecture where Company A invests in Company B, Company B invests back in Company A, each claiming higher valuations, yet minimal external capital or revenue, just internal loops. When governance fails, the risk is that those loops collapse.
AR: You use the term circular deals a lot in your article. For readers, can you offer a succinct definition?
JC: Sure. A circular deal is a transaction or structure that appears to inject new value (equity purchases, infrastructure commitments, and partner deals) but has minimal external economic substance. The money often doesn’t exchange hands in the way it’s portrayed, or the equity is swapped for services rather than cash. It can still be legal, but it raises questions about whether value is created or just mirrored. When valuations and debt expand on the basis of those loops, the foundation is weaker. And when the public or less-sophisticated investors come in believing value is cash-backed, we have a problem.
A simple, hypothetical example of one of these “infrastructure” deals: a company gives OpenAI access to $5 billion worth of GPUs. No purchase happens. No money changes hands. OpenAI then announces a “$100-billion deal,” raises its own valuation on paper, and claims to be worth $100 billion more.
No profits. No revenue. Nothing really happened except new numbers written on a balance sheet. But the general public now thinks the company is worth more.
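The arithmetic of that hypothetical loop can be sketched in a few lines. This is purely illustrative: the numbers, the starting valuation, and the `circular_deal` function are all invented for this example, not figures or mechanics attributed to any real company.

```python
# Illustrative sketch of the "circular deal" arithmetic described above.
# All figures are hypothetical.

def circular_deal(paper_valuation, headline_deal_value, cash_exchanged=0.0):
    """Return (new_paper_valuation, cash_received).

    The headline deal value gets added to the story the market hears,
    while the cash actually received stays at whatever really changed
    hands -- in the loop described above, zero.
    """
    return paper_valuation + headline_deal_value, cash_exchanged

# A company starts at a $300B paper valuation with no profits.
valuation = 300e9

# A partner "commits" GPU access; the press release calls it a "$100B deal".
# No purchase happens, so cash_exchanged defaults to zero.
valuation, cash = circular_deal(valuation, headline_deal_value=100e9)

print(f"Paper valuation: ${valuation / 1e9:.0f}B")  # grew by $100B
print(f"Cash received:   ${cash / 1e9:.0f}B")       # grew by $0
```

The gap between those two printed numbers is exactly the gap Jason describes between raising value and raising the perception of value.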
Photo courtesy of CNBC
AR: Interesting. I found a Reddit thread where someone described Altman in scathing terms. One comment on r/ArtificialInteligence read:
“Altman’s whole job as a tech CEO was to lie. It’s that simple.”
How do you interpret comments like that?
JC: That kind of comment signals a deeper erosion of trust. It might be hyperbolic… Reddit tends to be reductive, but when you have dozens or hundreds of posts saying “Altman lied … manipulative,” you begin to see a pattern of perception. And perception drives flows: talent, investment, and brand. It ties back to value. If the marketplace begins believing the leadership is not being candid, the perceived value becomes more about the story than the numbers.
AR: Let’s pause on perception vs. reality. Your philosophy at DOMINAIT.ai is very different. You said:
“While it’s great and all that OpenAI can publicly announce they’re seeking a trillion-plus dollars, it’s not based on profits. It’s all speculation used to make the company look like it’s worth more so you can extract money from new investors.”
How does DOMINAIT mirror a different path?
JC: At DOMINAIT.ai, we built Ryker from my own savings, not hype. We didn’t raise huge venture rounds with no product. I didn't raise a dime to build the product or the company.
We built our framework, the Ryker system, internally, leveraged our own teams and revenues from my own ventures like Jason Criddle & Associates, and we are only now opening to outside investors. After the product has been built. After a proof of concept through our beta program. Our goal is to build a product users love before asking outsiders to bet on us. So when valuation discussions come, they’re grounded in deliverables, not inflated promises. That difference matters when dealing with risk, trust, and long-term impact.
AR: Going back to Altman and OpenAI: there was a Business Insider piece that described serious chart errors during the GPT-5 demo:
“Several charts included in OpenAI’s GPT-5 livestream… visually mis-represented the data.”
Is that symptomatic of the same problem?
JC: Yes. When you get to the point where the public demo has errors in core visuals and you have to call it a “mega chart screwup,” the question is: are we listening to honest mistakes, or is the narrative so tightly managed that every crack in the veneer is magnified? In isolation it might be harmless, but taken together with board issues, trust erosion, and big valuations based on these circular structures, it fits a pattern of show over substance.
Even a couple of emails I got the other day, just a few hours apart, made me chuckle a bit. I understand all companies can make errors. But since Altman spends so much time making toys, I thought to myself… maybe they should make a new survey generator.
AR: That is pretty funny. On that note, I also found a quote from Reddit:
“Sam lied to board members… No board of directors would be able to maintain a working relationship with a CEO after he gets caught doing that.”
You commented in your article:
“When circular deals dominate, consumers and small investors pay the price.”
What’s the path for that risk playing out?
JC: The risk plays out when faith in the loop falters. If investors, partners, or regulators sense the structure is brittle, the cycle reverses. Valuations collapse, debt becomes burdensome, and talent leaves. Most importantly, end-users or the broader economy get hurt. When a company goes public at an inflated valuation with no profits, small investors often buy at peak and lose when initial investors cash out. It crashes the stock. That’s the cycle I want to avoid in AI. Build value first, then ask for valuation. Don’t ask for valuation because you built hype.
AR: Finally, for those reading this now who are considering investing in AI companies, what two key questions should they ask?
JC:
1. What’s the path to real cash flow? Not speculative revenue, not “infrastructure commitments,” but an actual monetizable product.
2. Where is the transparency? Board oversight, independent audit, and real governance. If you hear “we’ll release details later” or “our valuation is untapped,” that’s a warning. That's investors and executives counting stacks of cash behind closed doors.
AR (closing): Thank you, Jason. Your stance is clear: focus on the fundamentals, under-promise, over-deliver, and don’t confuse image for substance.
With many voices questioning Sam Altman’s leadership and the broader AI investment climate, your message offers a refreshing counterpoint: build sustainably, govern well, and deliver value so the hype doesn’t collapse underneath you.
“OpenAI fires co-founder and CEO Sam Altman for allegedly lying to company board,” The Guardian (Nov 17 2023) — https://www.theguardian.com/technology/2023/nov/17/openai-ceo-sam-altman-fired
Reddit thread: “Former OpenAI employee describes Sam Altman as ‘nice’ but also a deceptive, manipulative liar,” r/OpenAI (Nov 21 2023) — https://www.reddit.com/r/OpenAI/comments/1804u5y
Reddit thread: “Ilya accused Sam Altman of a ‘consistent pattern of lying,’” r/ChatGPT (Nov 02 2025) — https://www.reddit.com/r/ChatGPT/comments/1omdq04
“OpenAI board first learned about ChatGPT from Twitter, according to former member,” Ars Technica (May 2024) — https://arstechnica.com/information-technology/2024/05/openai-board-only-learned-about-chatgpt-from-twitter-according-to-former-member/
“OpenAI CEO Sam Altman: ‘I personally think we have been on the wrong side of history here’ concerning open source,” Economic Times (Feb 01 2025) — https://m.economictimes.com/tech/artificial-intelligence/been-on-the-wrong-side-of-history-sam-altman-on-openais-closed-source-approach/amp_articleshow/117837885.cms
“I don’t trust Sam Altman,” Reddit thread r/ArtificialInteligence (Jun 17 2024) — https://www.reddit.com/r/ArtificialInteligence/comments/1d2pq9c
“Meta exec calls OpenAI’s Sam Altman ‘dishonest’ over claims of ‘$100 M signing bonuses’ to poach AI talent,” NY Post (Jun 27 2025) — https://nypost.com/2025/06/27/business/meta-exec-calls-openais-sam-altman-dishonest-over-claims-of-100m-signing-bonuses-to-poach-ai-talent/
For investment inquiries with DOMINAIT.ai: