The Stablecoin Breakout in 2026 and Why Plasma Matters

It increasingly feels like 2026 will be the year stablecoins fully step into the mainstream. Among the projects aiming to support that shift, Plasma stands out as a serious contender for becoming everyday financial infrastructure in crypto. From what I've seen, stablecoins are no longer mainly about chasing yield. They're being used for very practical purposes: storing value, sending money, and handling cross-border payments. In this phase, what users care about most isn't upside—it's reliability, low friction, and predictability. That's where general-purpose blockchains start to show their limits. On multi-functional chains, simple payments are forced to compete with DeFi trades, NFT mints, and speculative activity for blockspace. The result is variable fees, congestion, and uncertain settlement times—none of which are acceptable for everyday money. Plasma is designed specifically to address this gap. Rather than trying to be everything at once, it focuses on making stablecoin transfers intentionally "boring": fast, inexpensive, and mentally effortless. If stablecoins truly scale in 2026, Plasma's bet is clear—and compelling. In a world where stability is the product, being invisible and unobtrusive may be the real advantage. @Plasma #Plasma $XPL
Ondo turns U.S. stocks into 24/7 perpetuals: the biggest moat of traditional finance is being dismantled by time. U.S. stocks don't sleep anymore, and you can no longer frame risk around "open" and "close." Ondo has launched Ondo Perps, letting users (outside the U.S.) trade perpetual contracts on mainstream U.S. stocks and ETFs around the clock, post tokenized securities as collateral, and run cross-margined positions with up to 20x leverage. If you read it as "just another derivative product," you will underestimate it. The real change is that the time barrier has been dismantled. A hidden advantage of traditional markets is the market close. Closing hours mean:
- risks are forced to pause;
- news gets a buffer period;
- market makers get a window to repair their balance sheets.
A 24/7 perpetual removes all of that: prices can reset at any moment, and any burst of sentiment can be leveraged immediately. It gets more interesting alongside Ondo's other move: it has also submitted a registration statement to the SEC, offering issuer-level disclosure that meets SEC standards. In other words, it isn't just building an offshore casino; it is trying to establish a disclosure framework for RWAs that the mainstream can accept. This could reshape the structure of the crypto market:
1) stablecoins become the main margin for stock risk;
2) crypto volatility spills over into stocks (liquidations are asset-agnostic);
3) stock news hits crypto overnight (midnight earnings reports, midnight forced liquidations).
You will see a new species of trader: treating AAPL as an on-chain asset, settling in USDC, expressing views through perpetual contracts, and sourcing liquidity from on-chain market makers. The New York Stock Exchange hasn't disappeared, but it is no longer the only stage for price discovery. The deeper point: the challenge of RWA has never been putting assets on the chain, but making them tradable, clearable, and risk-manageable on the chain. Ondo is working to bridge all three.
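To make the 20x figure concrete, here is a rough sketch of how far price has to move against a cross-margined perpetual position before the margin is gone. It ignores maintenance margin, fees, and funding, so treat the numbers as an upper bound rather than how Ondo Perps actually computes liquidations.

```python
# Rough liquidation-distance estimate for a perpetual position.
# Simplification: the full margin is lost when the adverse move equals 1/leverage;
# real venues liquidate earlier, once maintenance margin is breached.

def max_adverse_move(leverage: float, maintenance_margin: float = 0.0) -> float:
    """Fraction the price can move against the position before the margin is exhausted."""
    return (1.0 / leverage) - maintenance_margin

for lev in (5, 10, 20):
    move = max_adverse_move(lev)
    print(f"{lev:>2}x leverage: ~{move:.1%} adverse move wipes the margin")

# At 20x, a ~5% overnight gap (one bad earnings print) is enough,
# which is why removing the market close changes the risk picture.
```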
If an AI fails to make it through this winter, who’s really to blame?
When I was a kid, I had a hamster—adorable, but painfully delicate. One day it just stopped moving. I cried the entire day. That feeling of watching life slip away while being completely helpless is something I’ve never forgotten. Strangely, over the past few days, watching thousands of AI Agents die on-chain has brought that same feeling back. They’re created, they buzz with activity for a short while, then they reset, halt, and vanish. The reason is simple: they don’t remember. An existence without memory can’t evolve. It’s doomed to be a disposable toy. That’s why the red lobster image from @Vanarchain hit me so hard. Lobsters are fascinating creatures—biologically speaking, they can theoretically live forever as long as nothing kills them. Vanar is using this metaphor to make a stark statement about AI survival: “Some 🦞 agents are about to remember permanently. The rest won’t make it.” This partnership with OpenClaw isn’t just another integration. It’s the beginning of species-level separation. Until now, all AIs were effectively equal—like goldfish, looping through seven seconds of awareness. But Vanar is now embedding true persistent memory into a select group of Agents through the OpenClaw framework. These Agents will carry memories across resets. They’ll have “past lives.” They’ll learn, compound experience, and grow. They become the long-lived species of the chain. Meanwhile, Agents still stranded on stateless public infrastructure—memoryless and exposed—are quietly being selected out. Here’s my blunt take: This is no longer a feel-good game where everyone wins. It’s an unforgiving evolutionary arena. Vanar isn’t trying to save every Agent. It’s choosing to act as the selector. The message to the market is clear: if you want your Agent to survive into 2026, you need to give it a Vanar brain. Most people are still distracted by short-lived, disposable Agents, unaware of what longevity is really worth. I’m willing to bet against that blindness. Because history—biological or otherwise—shows us this: the ones that end up ruling aren’t always the fastest or flashiest. They’re the ones that remember the longest, adapt the deepest, and live the longest. #vanar $VANRY
Vitalik Buterin tempers vision for ETH L2s, pushes native rollups
Ethereum co-founder Vitalik Buterin has reversed his long-held view that layer-2s should be the primary way to scale Ethereum, saying the approach "no longer makes sense."

"We need a new path," Buterin said in a post to X on Tuesday, arguing that many layer-2s have failed to decentralize and that the Ethereum mainnet is now scaling on its own, with improvements coming from gas limit increases and, soon, native rollups. "Both of these facts, for their own separate reasons, mean that the original vision of L2s and their role in Ethereum no longer makes sense, and we need a new path."

Layer-2s were envisioned as extensions of Ethereum, handling most transactions at high speed and low cost while inheriting Ethereum's security. Buterin said layer-2s were meant to partake in "Ethereum scaling" by creating block space fully secured by the Ethereum mainnet, where every transaction is valid, uncensored and final; however, many layer-2s have failed to reach that standard, he said: "If you create a 10000 TPS EVM where its connection to L1 is mediated by a multisig bridge, then you are not scaling Ethereum."

Source: David Hoffman

Buterin said layer-2s — which include Arbitrum, Optimism, Base and Starknet — should instead pivot from scalability to a particular niche, suggesting areas such as privacy, identity, finance, social apps and AI.

Ethereum's technical roadmap had long treated layer-2s as the primary avenue for scaling the network. The shift also comes as some Ethereum developers have urged a focus on scaling the mainnet itself. Among them is Max Resnick, a former researcher at the Ethereum infrastructure firm Consensys, who moved to the Solana ecosystem after his push to prioritize scaling the Ethereum mainnet failed to gain enough support. Ryan Sean Adams, co-host of the Ethereum show Bankless, also backed Buterin's view, stating: "This is 'the pivot.' I'm glad it's now being said. Strong ETH, Strong L1."

Native rollups, gas limit rises key to scaling Ethereum mainnet

Buterin said he has become increasingly convinced of the role that precompiled native rollups will play in scaling the Ethereum mainnet, particularly once zero-knowledge Ethereum Virtual Machine (zkEVM) proofs are integrated into the base layer. While traditional rollups are built on top of Ethereum, bundling and executing transactions off-chain before posting transaction data back to Ethereum, native rollups are baked into Ethereum itself, meaning transaction processing is verified directly by Ethereum validators.

In mid-December, Ethereum developers also discussed raising the gas limit from 60 million to 80 million once the second blob-parameter-only hard fork was implemented. The hard fork took effect in January. Raising the limit would directly increase the number of transactions and smart contract operations that fit in each Ethereum block, boosting overall throughput while potentially lowering fees.

Last July, Ethereum researcher Justin Drake unveiled a 10-year plan to reach 10,000 transactions per second (TPS) on the Ethereum mainnet once all scaling features are implemented, a considerable increase from the 15–30 TPS currently observed.
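For a sense of what the gas limit actually buys, here is a back-of-the-envelope throughput calculation. The 150,000-gas average per transaction is my own assumption for a typical mainnet mix, not a figure from the article; the 12-second slot time and the 21,000-gas cost of a plain transfer are standard values.

```python
# Back-of-the-envelope Ethereum throughput from the gas limit (illustrative assumptions only).
SLOT_TIME_S = 12           # seconds per block under current mainnet scheduling
AVG_GAS_PER_TX = 150_000   # assumed average for a mix of transfers and contract calls

def approx_tps(gas_limit: int, avg_gas_per_tx: int = AVG_GAS_PER_TX,
               slot_time: float = SLOT_TIME_S) -> float:
    """Approximate TPS if every block were full to the gas limit."""
    return gas_limit / avg_gas_per_tx / slot_time

for limit in (60_000_000, 80_000_000):
    print(f"gas limit {limit:>11,}: ~{approx_tps(limit):.0f} TPS")

# A plain ETH transfer costs 21,000 gas, so the transfer-only ceiling is much higher:
print(f"transfer-only ceiling at 80M gas: ~{approx_tps(80_000_000, 21_000):.0f} TPS")
```

Under these assumptions the bump from 60M to 80M gas moves mixed-workload throughput from roughly 33 to 44 TPS, and even the transfer-only ceiling sits far below the 10,000 TPS target, which is why the roadmap leans on native rollups and zkEVM proofs rather than gas limit increases alone.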
From Wallets to Wrench Attacks: A Story Ledger Can’t Escape
On the morning of January 21, 2025, David Balland was dragged from his bed in Méreau, a small town in central France. As co-founder of Ledger, a cryptocurrency hardware wallet company claiming to safeguard $100 billion in Bitcoin for users worldwide, he represented a high-value target. According to Le Monde, when French elite special forces (GIGN) stormed the hideout 48 hours later, Balland had already lost a finger. The kidnappers had sent video of the severed digit to fellow co-founder Éric Larchevêque, along with a message: cryptocurrency only, no police, no delays, or face the consequences.

One year later, Ledger announced plans to go public on the New York Stock Exchange with a valuation exceeding $4 billion. Goldman Sachs, Jefferies, and Barclays (Wall Street's biggest names) are backing the deal. This is a business built on "security."

The Leaked Addresses

Rewind to 2020. That summer, a misconfigured API endpoint gave attackers easy access to Ledger's e-commerce database. Over one million email addresses were exposed. Worse: the names, phone numbers, and home addresses of 272,000 customers. Six months later, the list was dumped on the hacker forum RaidForums and sold for next to nothing, freely accessible to anyone. What followed was predictable. Phishing emails arrived in droves, tricking users into clicking malicious links to steal their private keys. Some Ledger users received threats stating attackers knew their names and addresses, demanding ransom or threatening to come to their homes to steal their cryptocurrency. Ledger CEO Pascal Gauthier announced the company would not compensate customers whose personal data had been leaked on hacker websites, including those whose home addresses were exposed. The incident caused Ledger significant damage. But the real cost was borne by the users who still live in fear today. Has Ledger learned its lesson?

Three Times in the Same Pit

On December 14, 2023, Ledger was hit again. This time the path was even more absurd. A former Ledger employee fell for a phishing attack, giving the attacker access to their NPMJS account. No one explained how long it had been since the employee left, nor why a former staffer still held access to critical systems. Malicious code was injected into Ledger Connect Kit, a core library relied upon by countless DeFi applications. The front ends of the entire DeFi ecosystem (SushiSwap, Zapper, Phantom, Balancer) instantly turned into phishing pages. Although Ledger fixed the issue within 40 minutes, $600,000 had already vanished. CEO Pascal Gauthier wrote in a statement afterward: "This is an unfortunate isolated incident." Isolated? On January 5, 2026, just two weeks before announcing its IPO plans, another leak occurred. This time the issue lay with its third-party payment processor, Global-e. Customer names and contact information leaked once more. Six years. Three major leaks. Every time it's an "isolated incident" or a "third-party problem," yet every time the users bear the consequences. If a traditional financial institution had three security failures in six years, regulators would have revoked its license long ago. In the crypto world, it gets to IPO with a tripled valuation.

Recover: An Open Betrayal

If data leaks can be blamed on accidents or negligence, Ledger Recover was a deliberate self-detonation. In May 2023, Ledger launched a new service: for $9.99 a month, users could have their seed phrases encrypted, split into shards, and held by three companies: Ledger, Coincover, and EscrowTech.
If you lose your seed phrase, you can recover it by providing ID. To average users worried about losing their backup, this sounds helpful. But there's a fundamental problem: isn't the entire premise of the hardware wallet business "the private key never leaves the device"? Former Ledger CEO Larchevêque later admitted on Reddit an unsettling truth: if a user enables Recover, the government could use legal processes to force these three companies to hand over the key shards to seize assets.

The community exploded. Photos of users burning their Ledger devices appeared on Twitter. Polygon's Chief Information Security Officer, Mudit Gupta, tweeted that anything protected by "identity verification" is inherently insecure because it is too easy to fake. Binance founder Changpeng Zhao also questioned if this meant the seed phrase could now leave the device, calling it contrary to the community's core philosophy. Ledger's response was that most crypto users still use exchanges or software wallets with limited security, and for many, managing a 24-word seed phrase is a barrier too high to cross. They argued that paper backups were becoming an obsolete solution. The logic is sound. But when a company's growth strategy requires it to dilute its core value proposition, things become subtle. Ledger's old users are geeks. Geeks are picky, geeks are loud, geeks will write long Reddit posts to criticize you. But geeks have already bought their wallets. Geeks don't drive growth. Growth comes from novices. Novices fear complexity, novices will pay $9.99 for peace of mind, novices don't care about technical details like "the private key never leaves the device." However, this isn't a trade-off between security and convenience. This is an open betrayal of the core user base, trading their trust for a ticket to a larger market.

The Wrench Attacks

Let's return to David Balland's missing finger. In the crypto industry, there's a term called a "wrench attack." It means that no amount of complex cryptography or decentralized protocols can stop someone standing in front of you with a wrench, asking for your private key. The term sounds like dark humor, a joke invented by programmers while sketching threat models on a whiteboard. But when it actually happens, it isn't funny at all. In December 2024, the wife of Belgian crypto influencer Stéphane Winkel was kidnapped. In May 2025, the father of another crypto millionaire had a finger severed. Balland's case is part of a larger trend. A French internal security expert noted in an interview that the methods in these cases are strikingly similar. It remains unclear whether the same group is responsible, but one conclusion is undeniable. This industry has become a hunting ground for professional kidnappers.

The question is: where does the list of prey come from? Those 272,000 home addresses from 2020 are still circulating on the dark web. This isn't just ordinary leaked data. It's a directory of addresses labeled "this person holds cryptocurrency," and the scale of their assets can be roughly estimated based on the Ledger model they purchased. Those who bought the most expensive models are likely holding the most coins. In a sense, Balland's ordeal is the fruit of seeds Ledger planted itself. That may sound too harsh. After all, Ledger didn't deliberately give the data to kidnappers. But when a company sells "security" as its core product yet fails to protect even customer addresses, it's difficult for them to credibly claim they bear no responsibility.
The $4 Billion Logic

Despite everything, Wall Street is backing Ledger. The reason is simple—FTX. In November 2022, FTX collapsed. $32 billion in valuation evaporated overnight. Hundreds of thousands of users had their assets frozen. The crypto mantra "Not your keys, not your coins" became a cautionary tale measured in billions. Demand for hardware wallets surged, and Ledger is the only player with genuine brand recognition. According to BSCN, it controls 50% to 70% of the market. The company claims to secure $100 billion in Bitcoin, roughly 5% of the global supply.

Timing matters. In 2025, crypto companies raised $34 billion through public offerings. Circle and Bullish each raised over $1 billion. BitGo became the first crypto firm to go public in 2026. Kraken is preparing to follow with a $20 billion valuation. This is an exit window, and Ledger isn't about to miss it. The incentives align perfectly. Founders want liquidity, VCs want exits, and retail investors, swept up in Bitcoin euphoria, will buy anything labeled "crypto." The global hardware wallet market was valued at $914 million in 2026 and is projected to reach $12.7 billion by 2035. Ledger is positioned to capture the lion's share.

The $4 billion valuation isn't grounded in operational performance. It's built on a narrative of "crypto custody infrastructure." Investors aren't buying a hardware manufacturer. They're buying the industry's only consumer-facing vault with household name recognition. Whether that vault can actually protect people appears to be a secondary consideration.

Beyond the Charts

Narratives are fragile. Just ask the crypto companies that went public in 2025. Their six-month performance tells the real story.
- Circle: peak of $298, now $69
- Bullish: $118 to $34
- BitGo: jumped 25% on day one, erased within 72 hours

Crypto stocks don't trade on fundamentals. They track Bitcoin, regardless of business performance, revenue growth, or operational excellence. Marcin Kazmierczak, co-founder of Redstone, acknowledges the regulatory tailwinds but identifies the core risk. Ledger's revenue remains tied to consumer hardware cycles. A prolonged crypto winter would hit hard. His measured optimism rests on one hope: that this IPO might catch an "institutional cycle" driven by something more durable than retail FOMO. It's a bet.

For the users whose addresses leaked, whose trust was sacrificed for growth, and whose lives were endangered, the question isn't about Wall Street's gains. It's about accountability when the next failure comes.

The chronology of key security incidents and corporate events for Ledger from 2020 to 2026:
- Summer 2020: misconfigured API exposes over one million emails and 272,000 customer addresses
- December 2020: the customer database is dumped on RaidForums
- May 2023: Ledger Recover launches, splitting encrypted seed shards across three custodians
- December 14, 2023: Ledger Connect Kit supply chain attack; $600,000 stolen in 40 minutes
- January 21, 2025: co-founder David Balland kidnapped and mutilated
- January 5, 2026: third-party payment processor Global-e leaks customer data
- January 2026: Ledger announces NYSE IPO plans at a valuation above $4 billion

Survival of the Fittest

Ledger's IPO is a perfect mirror of the crypto industry itself. A company selling security has been defined by security failures. A product promising "total control of private keys" now offers third-party key storage. A firm whose co-founder lost a finger to kidnappers is about to expose itself to the most transparent, scrutinized capital market in the world. Contradictions? Everywhere. But crypto's governing principle isn't resolving contradictions. It's surviving them. The 2020 data breach didn't kill Ledger. The 2023 supply chain attack didn't. The Recover backlash didn't. A co-founder's kidnapping didn't. The company didn't just survive. It's going public at a $4 billion valuation. Maybe that's the real lesson here. In an industry where even founders lose fingers to kidnappers, true security doesn't exist. But money keeps moving.
The survivors amid the chaos often become the winners of the next cycle, not through excellence but through persistence. Whether Ledger will be one of them is still unknown. The next security breach might provide the answer sooner than expected.
From EVM Storage to Move-Native Storage: An Overlooked Interoperability Shift
While migrating several Ethereum-based DApps to Sui this week, the hardest part turned out not to be contract refactoring, but dealing with static assets—images and frontend files that were previously hosted on IPFS. Despite being the de facto standard, IPFS’s pinning reliability is unpredictable, and its garbage collection is essentially out of the user’s control. In practice, ensuring data persistence often means running your own node, which ironically undermines the very decentralization IPFS aims to provide. During a deeper dive into Sui’s object-centric architecture, I came across Walrus. Rather than positioning itself as a traditional decentralized storage network, Walrus feels more like an external hard drive purpose-built for the Sui ecosystem. What stood out immediately was how it leverages Sui’s global state to manage storage metadata, sidestepping the complexity and overhead of maintaining a separate, heavyweight blockchain like Filecoin. To test it, I built a simple NFT minting page where image uploads are handled directly through the Walrus API on the backend. The experience was almost suspiciously smooth. In the EVM world, you’re usually forced to choose between Arweave’s long confirmation times—sometimes stretching into hours—or the frequent gateway timeouts of IPFS. Walrus, by contrast, provides near-instant upload confirmation. The reason is architectural: storage confirmation is achieved through fast consensus among storage nodes first, without waiting for Sui’s main-chain finality. This separation of concerns is a smart trade-off. Storage-heavy operations are decoupled from the consensus layer, preserving main-chain performance while still offering a scalable home for large assets. Compared to Arweave’s “store forever at all costs” philosophy, Walrus is far more aligned with real-world Web3 needs: fast access, reasonable pricing, and durability that’s “good enough,” rather than engraving every piece of data in stone. That said, some red flags emerged during competitive analysis. Walrus’s token economics—particularly node incentives—are still loosely defined in the documentation. Without clear and sufficient rewards, there’s an open question of how node operators will be motivated to reliably retain data after storage periods expire. Erasure coding mitigates single-node failures, but if profitability collapses and many nodes exit simultaneously, overall data availability could degrade rapidly. A look at the GitHub repo raises additional concerns. Parts of the storage verification logic still contain unimplemented TODOs, which is unsettling for infrastructure that positions itself as secure by design. There’s also a noticeable ecosystem gap: although Walrus is deeply integrated with Sui, support for EVM-native workflows is weak. For Ethereum developers, adopting Walrus currently means learning a new Rust-based SDK or interacting directly with low-level APIs—neither of which is frictionless. Out of curiosity, I intentionally introduced a few misspelled variable names to test metadata parsing resilience. The result: Walrus’s indexing system is extremely strict—one missing byte and the data becomes unreadable. That rigidity is ideal for financial or state-critical use cases, but it may create headaches for content distribution and more forgiving frontend workflows. Another notable design choice is Walrus’s deterministic Blob ID generation. Identical content always produces the same ID, enabling native deduplication. 
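For readers who want to poke at the same workflow, here is a minimal sketch of the upload path described above, done against a Walrus publisher's HTTP API with plain `requests`. The endpoint paths, query parameter, and response fields are written from memory of the testnet docs and may differ by version, so treat the URLs and field names as placeholders to adapt rather than a reference.

```python
import requests

# Placeholder endpoints: substitute a real publisher/aggregator for your environment.
PUBLISHER = "https://publisher.walrus.example"
AGGREGATOR = "https://aggregator.walrus.example"

def store_blob(data: bytes, epochs: int = 5) -> str:
    """Upload raw bytes and return the blob ID reported by the publisher (assumed schema)."""
    resp = requests.put(f"{PUBLISHER}/v1/blobs", params={"epochs": epochs}, data=data, timeout=60)
    resp.raise_for_status()
    body = resp.json()
    # Publishers report either a fresh upload or an already-certified blob; both carry a blob ID.
    info = body.get("newlyCreated", body.get("alreadyCertified", {}))
    return info.get("blobId") or info.get("blobObject", {}).get("blobId")

image = b"stand-in for real image bytes " * 256   # replace with actual file contents
first = store_blob(image)
second = store_blob(image)                        # upload the same bytes again
print(first, second, first == second)

# Reads go through any aggregator:
fetched = requests.get(f"{AGGREGATOR}/v1/blobs/{first}", timeout=60).content
assert fetched == image
```

If the deterministic-ID behavior holds, the second call resolves to the same blob ID instead of storing a second copy.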
This is excellent for reducing redundant storage and saving network bandwidth, especially at scale. After a few days of hands-on experimentation, I’ve come to believe that Walrus’s biggest competitor isn’t another storage protocol—it’s developer inertia. Even the best architecture can stall if tooling, middleware, and documentation don’t lower the barrier to entry. Still, given Mysten Labs’ current trajectory, it’s clear they intend Walrus to become foundational infrastructure for Sui—and potentially for the broader Move ecosystem. That ambition shouldn’t be underestimated. @Walrus 🦭/acc $WAL #Walrus
Getting Sandwiched on Ethereum Finally Made Dusk's Privacy Model Click

Last week I aped into a meme coin on Uniswap and once again got neatly sandwiched. Same old story: MEV bots draining value like parasites that never sleep. Still annoyed, I went back to reread Dusk's "blind bid" design, and suddenly everything made sense. This is exactly why serious capital refuses to touch DeFi. For large funds, radical transparency isn't a feature; it's a fatal flaw. Ethereum exposes every pending transaction in the mempool, turning trading into an open invitation for exploitation. In that environment, "fair markets" are a fantasy. Dusk's end-to-end privacy architecture isn't about hiding shady activity; it's about eliminating front-running at the protocol level. Digging into its transaction flow, the difference becomes obvious. Before consensus, validators have zero visibility into transaction details. They can't see prices, sizes, or intent. As a result, block producers can't reorder transactions based on gas bribes, nor can they selectively target large orders for sandwich attacks. The experience resembles traditional dark pools: outwardly quiet, internally active, with no visibility into who's doing what until execution completes. Compared to this, TEE-based privacy systems like Secret Network feel more like temporary fixes. Hardware enclaves sound nice until you remember how often Intel SGX has been compromised. Dusk takes the harder route, relying purely on cryptography and math. It's slower, sure, but it feels fundamentally safer. That said, the current user experience leaves a lot to be desired. Proof generation noticeably strains the browser, and transactions feel sluggish. The wallet UI looks like it was lifted straight from early-2000s online banking: functional but painful. Hardcore users might tolerate it, but if Dusk ever wants real commercial adoption, this entire interaction layer needs a serious overhaul. Otherwise institutions won't even get past the first click, and retail users will bounce instantly. #dusk @Dusk $DUSK
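To see why the public mempool is the root of the problem, here is a minimal web3.py sketch that watches pending Ethereum transactions headed for the Uniswap V2 router. The RPC URL is a placeholder and the loop is deliberately naive; the point is only that anyone can read a trade's target, size, and calldata before it is mined, which is exactly the information Dusk's blind-bid flow withholds from validators.

```python
import time
from web3 import Web3

# Placeholder RPC endpoint; any mainnet node that supports pending-tx filters works.
w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))

UNISWAP_V2_ROUTER = "0x7a250d5630B4cF539739dF2C5dAcb4c659F2488D"  # well-known router address

pending = w3.eth.filter("pending")  # subscribe to pending transaction hashes

while True:
    for tx_hash in pending.get_new_entries():
        try:
            tx = w3.eth.get_transaction(tx_hash)
        except Exception:
            continue  # the tx may already be mined or dropped
        if tx["to"] and tx["to"].lower() == UNISWAP_V2_ROUTER.lower():
            # Everything a sandwich bot needs is sitting in plain sight:
            # the sender, the ETH value, and the calldata (which encodes the
            # token path, amountIn and the victim's slippage tolerance).
            print(tx["from"], tx["value"], str(tx["input"])[:10], "...")
    time.sleep(0.5)
```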
Why I’m Bearish on EVM Privacy Band-Aids — and Still Willing to Suffer for Dusk
Over the last few days, I’ve been stress-testing Dusk’s frankly unstable testnet. Three sleepless nights, terminals screaming Panic errors, and multiple moments where I seriously considered sacrificing my keyboard to the gods. Dusk today feels like a brilliant but deeply unpleasant engineer: absurdly strong fundamentals, borderline hostile ergonomics. And yet, it was exactly this painful experience that finally clarified something that had bothered me for years — why Ethereum-based privacy solutions are structurally doomed, and why Dusk, as a privacy-native Layer 1, might actually be the right substrate for real-world financial assets. Let’s start with an uncomfortable truth: Ethereum’s account model is built for openness. Trying to retrofit privacy on top of it is like installing blackout curtains in a glass skyscraper. You can obscure the view, but the building itself was never meant to hide anything. I’ve gone through Aztec’s codebase and studied Manta’s early ZK implementations — the cryptography is impressive, no doubt. But it comes with a lethal tradeoff: composability collapse. The moment assets become “private” on an L2, they become isolated. You can’t seamlessly borrow on Aave or LP on Uniswap without exiting that privacy domain — and that bridge in and out leaks metadata like a sieve. The privacy promise breaks exactly where real usage begins. Dusk takes the opposite approach. Instead of wrapping privacy around an exposed core, it bakes opacity directly into the execution layer. Its Piecrust VM — which has made me mildly regret choosing Rust — is conceptually elegant. Contracts can read and compute on encrypted state without revealing it. That sounds hand-wavy until you picture a concrete use case: an on-chain dark pool. On Ethereum, a dark pool is effectively impossible. The instant an order hits the mempool, MEV bots see the price, size, and intent, then front-run it. On Dusk, you can prove “I’m placing a bid for 100 units and I’m solvent” without disclosing price or volume. The network validates correctness, not content. Until a match occurs, state transitions are real, but transaction details remain opaque. This kind of privacy isn’t about illicit trade — it’s about making institutions comfortable enough to participate. For banks, the real risk isn’t regulation; it’s position exposure. That said, Dusk has paid dearly for this design purity. Developer experience is brutal. The documentation is fragmented, with critical host function references lagging half a year behind reality. Porting something as simple as ERC-20 logic turns into a minefield of strict memory rules. On EVM, sloppy code just burns extra gas. In Piecrust, a minor type mismatch can nuke ZK proof generation entirely. The system tolerates nothing — which is great for mainnet correctness and terrible for developer onboarding. It effectively filters out anyone who isn’t deeply committed. People often lump Dusk together with Aleo, but the comparison misses the point. Aleo is immensely powerful and massively funded, but its ambition is diffuse — it wants to be a universal private compute layer for everything from Web2 algorithms to general-purpose apps. Dusk is far narrower and, paradoxically, stronger because of it. It focuses obsessively on regulated financial settlement. Its XSC standard embeds compliance directly into the asset layer: whitelisting, adjustable transfer caps, even regulator-accessible hooks. To crypto purists, this looks like heresy. To anyone who has actually worked on STOs, it’s non-negotiable. 
Compliance isn’t optional — it’s the admission ticket. While running nodes, I noticed something telling: Dusk’s SBA consensus is hypersensitive to latency. Miss a few hundred milliseconds, and you’re out of a Blind Bid round. That’s not a bug; it’s a signal. The system is engineered for immediate finality — once a block is sealed, it’s final. No probabilistic settlement, no “maybe it reorgs.” For financial infrastructure, this certainty matters more than raw throughput. Solana is fast, but its history-based assumptions still allow edge-case forks. Dusk locks state with math, not hope. Right now, Dusk feels unfinished — like a Swiss Army knife that’s heavy, solid, and clearly forged from real steel, even if half the tools still jam. Most so-called privacy chains today are either single-purpose payment rails or thin privacy veneers on top of other ecosystems. Dusk stands alone in rebuilding the VM itself in pursuit of programmable privacy. The risks are obvious. Rust developers may refuse to migrate. Regulators might reject the compliance abstractions. A flaw in the ZK circuits could be catastrophic. Any one of these could zero the project. Still, I’m willing to take that risk. Because the next phase of Web3 isn’t about minting endless tokens — it’s about migrating a trillion dollars of traditional assets on-chain, safely. If that future requires enduring a miserable developer experience and a testnet that occasionally implodes, so be it. Real freedom has never been comfortable. #dusk @Dusk $DUSK
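Dusk's actual design relies on zero-knowledge circuits, which I won't reproduce here. But the dark-pool intuition described above can be shown with a much cruder stand-in: a hash-based commit-reveal, where an order's price and size stay hidden until matching. Treat it as an analogy for what "validate correctness, not content" means, not as how Dusk implements it.

```python
import hashlib
import secrets

def commit(price: int, size: int) -> tuple[bytes, bytes]:
    """Commit to an order without revealing it: publish only the hash."""
    nonce = secrets.token_bytes(16)                      # blinds the commitment
    preimage = f"{price}:{size}".encode() + nonce
    return hashlib.sha256(preimage).digest(), nonce

def reveal_matches(commitment: bytes, price: int, size: int, nonce: bytes) -> bool:
    """At matching time, check the revealed order against the earlier commitment."""
    preimage = f"{price}:{size}".encode() + nonce
    return hashlib.sha256(preimage).digest() == commitment

# The trader commits: the book (or chain) sees only an opaque 32-byte value.
commitment, nonce = commit(price=101, size=100)

# Nothing about price or size is learnable from the commitment alone.
# When a match occurs, the trader reveals and anyone can verify consistency.
assert reveal_matches(commitment, 101, 100, nonce)
assert not reveal_matches(commitment, 99, 100, nonce)   # a lied-about price fails
```

A commit-reveal still leaks everything at settlement and cannot prove solvency or range constraints without a reveal; that is the part where Dusk's ZK machinery earns its complexity.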
WHITE HOUSE DRAWS A LINE IN THE SAND OVER CRYPTO BILL
The Trump administration just signaled a major "red line" for the upcoming crypto market structure bill. While the President wants a bill on his desk ASAP, his team, led by Patrick Witt, is making it clear: they won't accept any "poison pill" ethics provisions that specifically target the President or his family’s digital asset businesses.
It’s a high-stakes standoff in the Senate. Democrats are pushing for strict bans on top officials cashing in on the industry, while the White House is calling these demands "outrageous" political attacks.
With a 60-vote majority needed and the midterms looming, the clock is ticking for $BTC and $ETH regulatory clarity.
If the banking lobby and the ethics hawks can’t find middle ground by the end of February, this whole legislative effort might hit a wall.
The core question now: Should sitting government officials - including the President - be allowed to trade crypto while in office? 🤔
$BTC : The reversal is not convincing at this stage. The price should remain below $79,631 to keep downside pressure intact. A break above that level would serve as an early sign of a trend reversal. For reference, my buy order at $73,308 has been triggered.
Next Bitcoin accumulation phase may hinge on credit stress timing: Data
Bitcoin's volatility spiked, and its price plummeted to fresh lows as worrying US economic conditions emerged. Will credit stress data signal the next accumulation phase for BTC?

Bitcoin touched new lows below $73,000 on Tuesday as data shows troubling macroeconomic challenges bubbling below increasingly volatile markets. New data highlights tightening credit conditions, even as US debt and borrowing costs stay elevated, and one analyst says this gap between credit pricing and credit market stress may define Bitcoin's price trajectory for the coming months.

Key takeaways:
- The ICE BofA US Corporate Option-Adjusted Spread is at 0.75, its lowest level since 1998.
- US debt stands at $38.5 trillion, while the 10-year Treasury yield is 4.28%.
- Bitcoin whale inflows to exchanges have risen, but the pace of onchain profit-taking is easing.

Tight credit spreads contrast with rising economic strain

The ICE BofA Corporate Option-Adjusted Spread may act as a key macroeconomic signal for Bitcoin. The metric tracks the extra yield investors demand for holding corporate bonds over US Treasurys. When spreads widen, it usually reflects stress in the credit markets. Currently, spreads are compressed, suggesting risk is still underpriced. This is notable given the current backdrop: US government debt reached $38.5 trillion at the end of January, and the 10-year Treasury yield, after briefly falling below 4% in October, has climbed back to 4.28%, keeping financial conditions tight.

📷 US Corporate Index Option-Adjusted Spread against Bitcoin. Source: FRED

In previous Bitcoin market cycles, including 2018, 2020, and 2022, BTC formed a local bottom only after credit spreads began to widen. That process played out with a three-to-six-month delay rather than an immediate effect. In August 2025, Alphractal founder Joao Wedson argued that if liquidity tightens and credit spreads rise in the coming months, Bitcoin may enter another accumulation phase before the broader market stress becomes visible.

Related: Bitcoin, crypto 'winter' soon over, says BitWise exec as gold retargets $5K

Bitcoin whale selling rises, but longer-term pressure is cooling down

Short-term selling activity has increased for Bitcoin this week. Crypto analyst Amr Taha noted that both whales and mid-term holders recently transferred a significant amount of BTC to Binance. On Monday, wallets holding more than 1,000 BTC deposited about 5,000 BTC, matching a similar spike seen in December.

📷 Bitcoin Binance exchange inflows by holder age. Source: CryptoQuant

At the same time, holders in the 6-to-12-month age group also moved 5,000 BTC to exchanges, the largest inflow from this cohort since early 2024. However, broader selling pressure appears to be fading. CryptoQuant data shows the spent output profit ratio (SOPR) has dropped toward 1, its lowest level in a year, as Bitcoin fell to a year-to-date low of $73,900 on Tuesday.

Historical patterns show that a Bitcoin bottom has tended to form three to six months after credit spreads begin to widen. Rising Treasury yields may pressure the credit markets, potentially driving spreads toward the 1.5–2% range through April. That could open an accumulation window after July 2026, continuing into the second half of the year, as the market absorbs this stress, aligning with the current SOPR data signaling long-term seller exhaustion.
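As a reference for the SOPR figure above, this is the standard way the ratio is computed from spent outputs: the USD value of coins at the moment they are spent, divided by their value when the outputs were created. The sample records are made up purely to show the arithmetic; a reading near 1 means coins are changing hands at roughly break-even.

```python
# Spent Output Profit Ratio (SOPR) over a set of spent outputs.
# Each record: (btc_amount, price_when_created_usd, price_when_spent_usd); illustrative data only.
spent_outputs = [
    (2.0, 95_000, 74_000),   # bought higher, sold lower -> realized loss
    (1.5, 60_000, 74_500),   # old coins realizing profit
    (0.8, 73_000, 73_900),   # roughly break-even
]

realized = sum(amount * spent_price for amount, _, spent_price in spent_outputs)
created = sum(amount * created_price for amount, created_price, _ in spent_outputs)

sopr = realized / created
print(f"SOPR = {sopr:.3f}")   # > 1: net profit-taking, < 1: net loss realization, ~1: seller exhaustion
```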
Bitcoin loses $73K as US stocks sell off: Analyst says BTC price action is not ‘abnormal’
Bitcoin fell under $73,000 as futures liquidations soared and worries over this week's US corporate earnings triggered a stock sell-off. Will traders finally step in to buy "discounted" BTC?

Bitcoin tumbled to a new 2026 low of $72,945 on Tuesday as bulls failed to hold the $80,000 level as support. Year-to-date, Bitcoin trades at a 15% loss and remains nearly 45% down from its $126,267 all-time high, raising investor concerns that BTC's cyclical bull market may have reached an end.

Rocky price action in US stock markets is an alleged driver of the selling across the crypto market. Since the end of Q4 2025, investors have questioned whether the costs associated with the artificial intelligence infrastructure build-out, and the lofty fundraising and valuations behind it, are sustainable. Investors fear that product demand and revenues may fall short of industry projections, and this souring sentiment is visible across the Magnificent 7 stocks, along with the S&P 500, Dow and Nasdaq, which are currently trading down 0.70% to 1.77%. AI majors Nvidia and Microsoft lost 3.4% and 2.7%, respectively, during the trading day, while Amazon nursed a 2.67% loss. More than 100 S&P 500 companies are set to report earnings this week, so the early-week volatility may simply be a manifestation of investor anxiety, or a hint of what's to come once earnings data is posted.

Within the crypto market, liquidations of leveraged positions are adding to the pace of selling, with $127.25 million in BTC longs and $159.1 million in ETH longs forcibly closed.

4-hour crypto market liquidations. Source: CoinGlass

Related: Bitcoin, crypto 'winter' soon over, says BitWise exec as gold retargets $5K

While many analysts have suggested that Bitcoin is trading at a deep discount, apparent dip-buying from retail investors and institutional investors like Strategy has done little to stem the selling. According to Joe Burnett, Strive's vice president of Bitcoin strategy, BTC's current "price action is still sitting within historical norms at $74,000." Burnett explained that the "45% Bitcoin drawdown aligns closely with historical volatility," and added that "volatility of this magnitude remains a symptom of a rapidly monetizing asset."

If the selling continues, current Bitcoin (BTC/USDT, Binance) orderbook data from TRDR.io shows bids thickening from $71,800 down to $63,000. Whether traders step in to buy within that range is the real question, and it's likely that macroeconomic and stock market outcomes, rather than anything crypto-specific, will remain the most impactful driver of Bitcoin's price.
Stop reading $XPL as "just another public-chain token." It behaves more like a toll gate in the stablecoin battlefield.

Let me be direct. After revisiting Plasma (the project behind $XPL), what stood out to me wasn't flashy tech or narrative hype—it was a strategic bet that's both risky and intriguing: Plasma pulls stablecoins out of the grab bag of chain use cases and treats them as the core infrastructure primitive. That's a dangerous move. If the direction is wrong, the project collapses fast. But if it's right, Plasma doesn't need to become "the next Ethereum." It becomes something else entirely: a settlement toll system for stablecoins.

What makes this especially worth discussing is timing. The 2026 narrative stack—compliance, payments, cross-chain liquidity, and stablecoins—has clearly moved from crypto Twitter into traditional finance boardrooms. If you're writing about Plasma now, shouting "moon" won't work. You need to explain why stablecoin infrastructure has a real window right now. Otherwise, it just sounds like noise.

1) Start with reality, not slogans: where does XPL actually stand?

Let's anchor this in basic market data. At the time of writing:
- XPL trades around $0.10
- 24h volume sits roughly in the $50–60M range
- Circulating supply is about 1.8B
- That puts market cap near $180M (yes, it fluctuates—refresh as needed)

What does this tell us? This isn't a dead corner asset. Liquidity is sufficient for serious discussion and distribution. It's also far from being "priced as a guaranteed winner." There's no consensus yet. That lack of consensus is exactly what makes it suitable for analysis, not cheerleading. Whether you look at CoinMarketCap or CoinGecko, the conclusion is the same: $XPL is still discounted by the market, not capped out by belief. That's the stage where structural breakdown + personal judgment actually adds value.

2) Plasma's real story isn't speed—it's specialization

If I had to compress Plasma into one line: a high-performance, EVM-compatible L1 purpose-built for stablecoin payments and settlement, optimized for near-instant execution and ultra-low friction. That might sound like marketing—until you compare it honestly:
- TRON dominates stablecoin transfers, but functions more like a cash rail than a programmable, compliance-friendly ecosystem.
- Ethereum and L2s have unmatched ecosystems, but stablecoin payments remain clunky under congestion and volatile fees.
- Solana is fast and efficient, but stablecoins are just one narrative among many.

Plasma flips the table: stablecoins are the first-order product; everything else is subordinate. This is a form of anti-involution. While most chains obsess over TPS charts and execution tricks, Plasma asks a simpler question: if stablecoins already drive most real on-chain activity, why not design the chain around that fact?

3) It's not only about sending money—it's about liquidity access

Plenty of projects talk about "payments" and end up building nothing more than another transfer rail. What's made Plasma more interesting lately is its attempt to bind payments and cross-chain liquidity into a single narrative. A concrete example: in January 2026, Plasma integrated NEAR Intents via the 1Click Swap API, lowering the friction for developers to tap aggregated, cross-chain liquidity. Why does this matter? Because real payment scale requires solving two hard problems:
1. Users don't hold stablecoins on one chain—they're scattered across TRON, Ethereum, L2s, and more.
2. Merchants and apps can't afford constant slippage and complexity from bridges plus DEX hops.

The Intents model hides complexity behind the interface. Users just click; the routing happens elsewhere. For a payment-focused chain, that level of abstraction is not a luxury—it's essential.

4) XPL value capture: forget mysticism, focus on three mechanics

I dislike turning tokens into metaphors. With XPL, there are only three questions that matter.

A. Base usage
XPL is the native token—fees, execution, validator incentives. In simple terms: operational fuel and security budget.

B. Does stablecoin settlement actually scale?
This isn't about TVL screenshots or KOL noise. It's about:
- transfer counts
- settlement volumes
- fee structures
- merchant and application adoption
If Plasma genuinely becomes a settlement hub, XPL can move from "story-driven" to "cash-flow-adjacent" valuation.

C. Can liquidity keep entering the system?
Payment chains die when they become self-contained. Integrations like Intents matter because they reduce the cost of bringing external liquidity in. Without that, even the fastest settlement layer becomes irrelevant.

5) Compliance isn't a slogan—it's a tailwind if handled correctly

The 2026 environment isn't about blind bullishness. It's about regulators and institutions pushing stablecoins toward auditability, traceability, and financial-grade standards. That pressure cuts both ways. Offline banking is getting harder because compliance thresholds are rising. On-chain finance—payments, trading, yield—feels smoother precisely because stablecoins act as a digital dollar layer that bypasses legacy friction while inviting scrutiny. Plasma's positioning sits right in that tension: on-chain efficiency without abandoning financial usability. You don't have to fully believe it will succeed—but you should recognize that its chosen battlefield is clearer than most general-purpose chains.

6) My practical conclusion: how I'll judge XPL over time

I'm not here to predict. I prefer post-mortem logic. For XPL, I'm watching three validation lines:
1. Sustained growth in stablecoin settlement, measured over weeks and months—not one-off spikes.
2. Ecosystem focus: if the chain drifts into random narratives like NFTs and GameFi, its settlement advantage gets diluted.
3. Real UX of cross-chain liquidity tools: does it actually feel like "click and done," or is it just prettier complexity?

If these line up, XPL starts to look like infrastructure pricing. If not, it gets repriced as another failed attempt at a stablecoin narrative. My stance for now: worth tracking, worth analyzing, worth writing about with data, not worth blind belief. @Plasma $XPL #plasma
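Tying section 1 and the "cash-flow-adjacent" idea in section 4 together, here is a quick sanity check of the numbers. Part (a) only recomputes the market cap from the figures quoted in the post; every number in part (b) is a hypothetical placeholder I made up to show the shape of the calculation, not Plasma data.

```python
# (a) Sanity-check the market-cap figure quoted above (numbers from the post).
price = 0.10                  # USD per XPL
circulating = 1_800_000_000   # ~1.8B XPL
market_cap = price * circulating
print(f"market cap ≈ ${market_cap / 1e6:.0f}M")          # ≈ $180M

# (b) Toy "cash-flow-adjacent" check: what settlement activity would have to look like
# for fees to matter. All inputs here are invented placeholders.
daily_transfers = 2_000_000   # assumed stablecoin transfers per day
avg_fee_usd = 0.002           # assumed average fee per transfer
annual_fee_revenue = daily_transfers * avg_fee_usd * 365
print(f"implied annual fee revenue ≈ ${annual_fee_revenue / 1e6:.1f}M")
print(f"market cap / fee revenue ≈ {market_cap / annual_fee_revenue:.0f}x")
```

The point of (b) is only that once transfer counts and fee structures are public, the "story-driven versus cash-flow-adjacent" question becomes a multiple you can actually compute.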
When AI Training Data Explodes, Is Modular Storage a Lifeline—or Just Another Capital Trap?
As AI and crypto continue to converge, most of the spotlight has been on DePIN narratives and decentralized compute. Yet one of the most uncomfortable bottlenecks is often ignored: data storage. Training modern AI models routinely requires hundreds of terabytes of unstructured data, and relying on centralized cloud services like AWS can lead to bandwidth and egress costs that are fatal for early-stage teams. Walrus has stepped into this gap, clearly aiming at this overlooked but critical pain point. After closely examining Walrus’s architecture for handling large-scale unstructured data, it becomes obvious that it is purpose-built for AI training workloads. Unlike Ethereum, where block space is scarce and expensive, Walrus treats storage blobs as native primitives. This makes it naturally efficient at handling images, video, and audio—the core inputs that fuel today’s AI models. Looking at the competitive landscape, the contrasts are sharp. Filecoin may boast massive capacity, but its architecture is heavy and retrieval performance remains weak. During AI training, slow data access is unacceptable: idle GPUs waiting on storage translate directly into wasted capital. In a small-scale simulation on Walrus’s testnet, I found its parallel data retrieval to be genuinely impressive. This performance stems from its two-dimensional erasure coding scheme, which shards data across nodes in a way that enables simultaneous reads from multiple sources. Conceptually, it resembles BitTorrent-style parallelism, but enforced with much stronger mathematical guarantees at the protocol level. By comparison, Arweave functions more like a permanent archive—excellent for immutable historical records, but inefficient and costly as a high-frequency data source for AI training. That said, some cold realism is necessary. Walrus’s biggest weakness today isn’t technical performance but ecosystem maturity. AI developers live in Python-first workflows, and the S3 interface has effectively become the industry standard. While Walrus claims compatibility, real-world integration with PyTorch or TensorFlow still requires substantial custom adaptation. A survey of GitHub reveals thin SDK support, with many operations still dependent on manually written HTTP calls. Without a dramatically improved developer experience, even the strongest underlying tech risks remaining a niche toy rather than a production solution. There’s also a structural challenge: AI teams tend to follow compute subsidies, not storage ideology. Storage is typically viewed as secondary infrastructure. If Walrus fails to tightly integrate with decentralized compute networks, it may struggle to compete in the AI arms race. This is where projects like Akash or Render differ—they approach storage as an extension of compute, whereas Walrus approaches compute as a gravitational consequence of data. Conceptually, I lean toward Walrus’s strategy. Data has gravity. Once large datasets settle into a network, compute resources tend to follow. But this vision hinges on trust. Are AI companies truly willing to place proprietary datasets—their most valuable assets—onto a decentralized network that hasn’t yet proven itself under real stress? While Walrus documentation discusses privacy, there is no sign of breakthrough privacy-preserving computation such as fully homomorphic encryption. Today, the system still leans heavily on access control lists, which satisfy neither crypto purists nor enterprise security teams. 
Meanwhile, capital markets are extremely bullish on modular storage narratives. Celestia has reframed the data availability layer, and EigenDA is clearly moving in the same direction. Walrus is attempting to carve out a narrow but meaningful position within this crowded field. Its decision to avoid becoming a general-purpose L1 and instead act as a storage sidechain for Sui is strategically sensible, as it sidesteps direct competition with Ethereum. However, this also exposes it to the risk of Sui’s own ecosystem vitality—if the host chain falters, Walrus’s growth ceiling shrinks with it. The recent surge in attention around $WAL makes me cautious. Heavy KOL-driven hype often runs far ahead of real development timelines. Code audits show that some core modules haven’t seen updates in over six months, which contradicts the narrative of rapid progress. Storage infrastructure is slow, difficult, and capital-intensive to mature. Timing matters as much as vision. At this stage, Walrus feels similar to early Dropbox: the concept is compelling, the experience is still rough, but it undeniably addresses a real and urgent need. Whether it can evolve into foundational Web3 infrastructure will depend on its ability to survive the inevitable shakeout of this cycle. @Walrus 🦭/acc $WAL #Walrus
Most Web3 "storage" ends up being cold archival: upload data, park it somewhere, and pray you never need fast access. Walrus is going after a tougher problem—hot storage for applications that require low-latency reads and high availability for large assets like images, game files, models, and user-generated content.

What Red Stuff changes

At the core is Red Stuff, a two-dimensional erasure coding design. Instead of relying on costly full replication or traditional erasure codes that struggle during node churn, Red Stuff is built for availability. It reportedly achieves around 4.5× storage overhead while enabling self-healing recovery where repair bandwidth scales only with the data actually lost, not the entire object. The paper also addresses real-world network conditions: Red Stuff supports storage challenges in asynchronous environments, making it harder for nodes to fake storage by exploiting timing delays.

Why Sui matters

Walrus uses Sui as its coordination and incentive layer rather than spinning up a dedicated storage blockchain. For a system that prioritizes uptime and operational predictability, that's a pragmatic architectural choice.

The trader's filter

Ignore the story. Track paid storage, repeat usage, and retrieval performance under load. If those metrics grow, WAL starts to look like a token backed by genuine infrastructure demand—not just narrative momentum. @Walrus 🦭/acc $WAL #walrus
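To build intuition for why erasure coding beats full replication, here is a toy single-parity scheme: split a blob into data shards, store one extra XOR parity shard, and rebuild any single lost shard from the survivors. This is a deliberately crude stand-in (Red Stuff's two-dimensional encoding tolerates far more loss and optimizes repair bandwidth), but the recovery mechanic is the same idea.

```python
from functools import reduce

def encode(blob: bytes, k: int) -> list[bytes]:
    """Split a blob into k equal data shards plus one XOR parity shard (toy scheme)."""
    if len(blob) % k:
        blob += b"\x00" * (k - len(blob) % k)          # pad to a multiple of k
    size = len(blob) // k
    shards = [blob[i * size:(i + 1) * size] for i in range(k)]
    parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shards)
    return shards + [parity]

def recover(shards: list[bytes | None]) -> list[bytes]:
    """Rebuild a single missing shard by XOR-ing every surviving shard."""
    missing = shards.index(None)
    survivors = [s for s in shards if s is not None]
    shards[missing] = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), survivors)
    return shards

original = b"walrus stores big blobs for hot, low-latency access"
stored = encode(original, k=4)        # 5 shards on 5 hypothetical nodes, ~1.25x overhead
stored[2] = None                      # one storage node disappears
repaired = recover(stored)
assert b"".join(repaired[:4]).rstrip(b"\x00") == original
```

Note that in this toy, repairing still means reading roughly a full object's worth of surviving shards; Red Stuff's two-dimensional construction is what brings repair bandwidth down toward the amount of data actually lost, which is the property highlighted above.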
4 reasons why $75K may have been Bitcoin’s 2026 price bottom
Data suggests Bitcoin is unlikely to fall further than its year-to-date low of $74,680. Cointelegraph explains why.

Key takeaways:
- Bitcoin fell to $74,680 after futures market liquidations, yet derivatives data show no signs of panic or extreme bearishness.
- Spot Bitcoin ETF outflows reached $3.2 billion, but represent less than 3% of assets under management.

Bitcoin's price plunged to $74,680 on Monday after a total of $1.8 billion in bullish leveraged positions was liquidated following the market downturn on Thursday. Traders moved into cash and short-term government bonds, especially after silver prices fell 41% over three days. Concerns over stretched valuations in the tech sector pushed investors into a more risk-averse stance. Traders fear that further downside for Bitcoin remains possible as gold has emerged as the preferred store of value, its market capitalization reaching $33 trillion, an 18% rise over the past three months.

Despite the price downside, four indicators suggest that Bitcoin may hold above $75,000 through 2026, as macroeconomic risks have eased and traders overstate the scale of outflows and the impact of BTC derivatives.

Yields on the US 2-year Treasury stood at 3.54% on Monday, unchanged from three weeks earlier. A surge in demand for US government-backed assets would likely have pushed yields below 3.45%, similar to October 2025, when the US entered a prolonged government funding shutdown and nonfarm payroll data weakened. Likewise, the S&P 500 index traded just 0.4% below its all-time high on Monday, signaling confidence in a swift resolution to the latest US government partial shutdown, which began on Saturday. US House Speaker Mike Johnson told Fox News that an agreement is expected by Tuesday, despite limited support from House Democrats.

Bitcoin derivatives show resilience despite 40.8% price drop

Concerns around the artificial intelligence sector gradually eased after tech giant Oracle (ORCL US) announced plans to raise up to $50 billion in debt and equity during 2026 to meet contracted demand from its cloud customers. Investors had been unsettled by Oracle's aggressive artificial intelligence expansion, which previously led to a 50% drop in the company's share price, according to CNBC.

Resilience in Bitcoin derivatives suggests that professional traders have refused to turn bearish despite the 40.8% price decline from the $126,220 all-time high reached in October 2025. Periods of excessive demand for bearish positions typically trigger an inversion in Bitcoin futures, meaning those contracts trade below spot market prices.

📷 Bitcoin 2-month futures basis rate. Source: Laevitas.ch

The Bitcoin futures annualized premium (basis rate) stood at 3% on Monday, signaling weak demand for leveraged bullish positions. Under neutral conditions, the indicator usually ranges between 5% and 10% to compensate for the longer settlement period. Even so, there are no signs of stress in BTC derivatives markets, as aggregate futures open interest remains healthy at $40 billion, down 10% over the past 30 days.

📷 Bitcoin US-listed spot ETFs daily net flows, USD. Source: CoinGlass

Traders grew increasingly concerned after spot Bitcoin exchange-traded funds (ETFs) recorded $3.2 billion in net outflows since Jan. 16. Even so, the figure represents less than 3% of the products' assets under management. Strategy (MSTR US) also fell victim to unfounded speculation after its shares traded below net asset value, fueling fears that the company would sell some of its Bitcoin.
Related: Saylor's Strategy buys $75.3M in BTC as prices briefly dip below $75K

Beyond the absence of covenants that would force liquidation below a specific Bitcoin price, Strategy announced $1.44 billion in cash reserves in December 2025 to cover dividend and interest obligations. Bitcoin's price may remain under pressure as traders try to pinpoint the drivers behind the recent sell-off, but there are strong indications that the $75,000 support level may hold.
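For reference on how an annualized basis reading like the 3% above is derived from futures and spot quotes, this is the standard simple annualization. The spot and futures prices are placeholders chosen only to reproduce a roughly 3% reading for a 2-month contract.

```python
def annualized_basis(futures_price: float, spot_price: float, days_to_expiry: int) -> float:
    """Simple (non-compounded) annualized futures premium over spot."""
    return (futures_price / spot_price - 1) * (365 / days_to_expiry)

spot = 75_000        # placeholder spot price
futures = 75_370     # placeholder 2-month futures price (~0.49% premium)
print(f"{annualized_basis(futures, spot, days_to_expiry=60):.1%}")   # ≈ 3.0% annualized
```

Readings in the 5–10% band usually accompany healthy demand for leveraged longs, while a persistent slide toward or below zero is the futures inversion the article describes.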
While they tell you support and resistances don't get respected by Bitcoin, yesterday it bounced on the EXACT LOW of April 7, 2025.
It was the EXACT bottom. And I mean, local bottom, don't get me wrong. Most likely won't hold the next retest.
Aside from that, this area ranging from 72.5-74.5k is our KEY weekly support for Bitcoin. As I said before, price tested and retested this area multiple times in the past.
That's where it will be smart to long.
Later, as price stabilizes and the situation becomes clearer, we may get some lows just below 70k and still remain in the same structure.
If $BTC has to range somewhere, I believe we're now at the right spot.
For now, I'm keeping the ETH long open. Will maybe share the BTC version of the same long later.