The first thing I notice when I watch a zero-knowledge (ZK)-driven chain over a longer time horizon isn’t user growth or headline transaction counts; it’s how uneven the computation footprint is. Activity doesn’t scale linearly with usage. Instead, it comes in bursts: proof generation spikes, verification settles into a steady rhythm, and liquidity seems to orbit around those cycles rather than around simple demand for blockspace.
That asymmetry tells you early that this isn’t a typical execution environment. It’s a verification market.
When I track wallets interacting with these systems, I see two distinct classes of participants forming pretty quickly. On one side, you have infrastructure-aligned actors: provers, node operators, and teams running specialized hardware. Their behavior is slow, deliberate, and capital-intensive. They’re not rotating in and out; they’re positioning for throughput dominance. On the other side, you get more familiar crypto-native flows: capital that treats the network like a yield surface, moving in when incentives expand and fading when they compress.
What’s interesting is how little overlap there is between those two groups.
Speculators and yield farmers tend to cluster around moments when proof demand increases: airdrops, ecosystem campaigns, or onboarding waves. They’re not interacting with ZK as a primitive; they’re interacting with the economic layer built on top of it. Meanwhile, the infrastructure side is playing a completely different game. They’re optimizing latency, proof costs, and hardware efficiency, which has very little to do with token price in the short term but everything to do with long-term positioning.
That split reveals something deeper about the structure: the network’s core value isn’t in execution, it’s in compression of trust. And that changes how capital behaves.
In most chains, execution drives fees, fees drive validator rewards, and rewards anchor token demand. Here, verification is cheap by design, while proof generation is expensive and offloaded. So the economic gravity shifts away from the chain itself and toward whoever controls the proving layer.
This is where incentive design becomes critical.
If you look closely at how tokens are distributed in these ecosystems, a large portion is often directed toward subsidizing proof generation or incentivizing participation in early-stage infrastructure. That creates a temporary bridge between speculative capital and long-term infrastructure buildout. But it also introduces fragility.
Liquidity pacing, in particular, becomes non-trivial. When emissions are high, you see rapid capital inflows chasing yield: TVL spikes, transaction counts rise, and the network appears to be gaining traction. But that liquidity is rarely durable. It’s there for the spread, not for the system. The more interesting signal is whether infrastructure commitments continue after emissions normalize.

Running a prover isn’t like staking tokens in a typical PoS system. It requires real-world resources: hardware, energy, and optimization expertise. That creates a higher barrier to exit, which in theory should lead to stickier capital. But only if the economics justify it. If proof generation rewards compress too quickly, even these participants start to scale back, and you can see it in subtle ways: longer proof times, reduced throughput, or increased reliance on centralized operators.

From a market microstructure perspective, this creates very distinct liquidity windows. I’ve noticed that activity tends to cluster around discrete events: major integrations, upgrades to proving systems, or changes in cost efficiency. These aren’t just narrative catalysts; they directly impact the cost structure of the network. When proving becomes cheaper or faster, you often see a temporary surge in activity as new use cases become viable. Liquidity flows in, but it’s opportunistic. Traders position around these shifts, knowing that the window may not stay open for long.
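To put rough numbers on that prover-exit dynamic, here’s a minimal back-of-the-envelope sketch in Python. Every figure in it (reward per proof, proofs per day, hardware cost, power draw) is an assumption made purely for illustration, not data from any specific network; the point is only that a modest compression in per-proof rewards can flip a marginal operator from profitable to unprofitable.

```python
# Illustrative prover break-even sketch. All numbers are hypothetical,
# chosen only to show the shape of the economics, not any real network.

def prover_margin(reward_per_proof: float,
                  proofs_per_day: float,
                  hardware_cost: float,
                  hardware_lifetime_days: float,
                  power_kw: float,
                  electricity_per_kwh: float) -> float:
    """Daily profit for one proving rig: revenue minus amortized hardware
    and energy. Optimization labor is ignored for simplicity."""
    revenue = reward_per_proof * proofs_per_day
    hardware_amortization = hardware_cost / hardware_lifetime_days
    energy = power_kw * 24 * electricity_per_kwh
    return revenue - hardware_amortization - energy

# A marginal operator: a small cut in per-proof rewards flips daily profit.
for reward in (0.50, 0.40, 0.30):
    margin = prover_margin(reward_per_proof=reward,
                           proofs_per_day=50,
                           hardware_cost=12_000,
                           hardware_lifetime_days=730,
                           power_kw=1.2,
                           electricity_per_kwh=0.12)
    print(f"reward {reward:.2f} -> daily margin {margin:+.2f}")
```

The exact thresholds don’t matter; what matters is that the costs are fixed and physical while the reward is a protocol parameter, so reward compression shows up first in exactly the subtle ways described above.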
There’s also a feedback loop between verification costs and application design. When verification is cheap, developers push more computation off-chain, increasing reliance on proofs. That, in turn, increases demand for provers, which can tighten margins unless efficiency improves. It’s a constant balancing act between cost compression and demand expansion.

Compared to earlier cycles, where blockspace itself was the scarce commodity, this feels like a different regime. Scarcity is no longer about execution capacity; it’s about computational efficiency and who can deliver it at scale.

The long-term question is whether this model creates a durable economic layer or just a transitional phase.
If incentives remain heavily emission-driven, then a lot of the current activity is likely to fade as those emissions taper off. We’ve seen this pattern before: capital that looks committed until the yield disappears. But if the network can reach a point where proof generation is both efficient and economically sustainable without heavy subsidies, then you start to get something more interesting: a self-reinforcing system where infrastructure investment leads to lower costs, which drives more usage, which in turn justifies further investment.
That’s the flywheel everyone is implicitly betting on.

The risk is that the system never quite escapes its bootstrap phase. If proving remains too expensive, or if rewards don’t adequately compensate for the capital required, then participation consolidates. You end up with a small set of dominant operators, and the decentralization narrative starts to weaken, not because the protocol failed, but because the economics pushed it in that direction.

What I think the market is underestimating right now is how sensitive this entire structure is to cost curves.
People tend to focus on adoption metrics (users, transactions, ecosystem growth), but in ZK systems, those are downstream effects. The real driver is whether the cost of generating proofs can consistently decline faster than demand increases. If that happens, the system scales naturally. If it doesn’t, you get periodic bursts of activity followed by stagnation.
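One way to see that condition, as a toy model with made-up rates rather than anything measured: if per-proof cost falls by some fraction each period while proof demand grows by another, total proving spend only stays contained when the cost decline outpaces the demand growth. The sketch below just compounds the two rates.

```python
# Toy model of the scaling condition: does per-proof cost fall faster
# than proof demand grows? All rates are illustrative assumptions.

def total_proving_spend(initial_cost: float,
                        initial_demand: float,
                        cost_decline: float,   # fractional cost drop per period
                        demand_growth: float,  # fractional demand growth per period
                        periods: int) -> list:
    spend = []
    cost, demand = initial_cost, initial_demand
    for _ in range(periods):
        spend.append(cost * demand)
        cost *= (1 - cost_decline)
        demand *= (1 + demand_growth)
    return spend

# Regime A: costs fall 20% per period against 10% demand growth -> spend shrinks.
# Regime B: costs fall only 5% against the same growth -> spend compounds.
for label, decline in (("A", 0.20), ("B", 0.05)):
    print(label, [round(x) for x in total_proving_spend(1.0, 1_000, decline, 0.10, 8)])
```

Regime A is the natural-scaling path; regime B is the bursts-then-stagnation path, because someone (emissions, users, or operators eating losses) has to absorb the compounding spend.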
From where I’m sitting, watching flows and behavior over time, this doesn’t look like a typical L1 or L2 growth story. It looks more like an emerging compute market, where the winners aren’t necessarily the ones with the most users today, but the ones who can most efficiently turn computation into verifiable outcomes.

And that’s a much harder game to price.

#night @MidnightNetwork $NIGHT
