When Blockchain Finally Understands Who It Should Serve - A Deep Dive into Plasma's Technical Ambitions for a Stablecoin-Specific Public Chain
Ethereum aims to be the king of all chains, but the applications that actually thrive on it boil down to a few types: DEXs, lending protocols, and assorted fancy derivatives. Yet the thing with genuinely massive user demand is the simplest one: transfers and payments, especially stablecoin transfers. Just look at USDT's circulation across chains. The problem is that existing general-purpose chains handle stablecoin payments with maddening inefficiency. Ethereum is the worst offender: gas fees are outrageous, and moving a hundred dollars of USDT can cost over ten dollars in fees, which is unbearable. Solana is indeed fast, but you first have to acquire SOL for gas and live with the network's occasional outages. Base, as an L2, is cheap, but it still inherits Ethereum mainnet data-availability costs. And none of these chains is optimized for stablecoins; basic functions depend on third-party protocols or wallets, leaving the user experience fragmented.
Many people ask me whether I have sold Aster; the current price doesn't reach my valuation, so I'm still holding. Reasons for keeping Aster:

1. Different market environments. October 2024 (HYPE): early stage of a bull market, strong demand for contracts. September 2025 (Aster): a relatively rational market, where establishing trust in spot trading matters more.
2. Different competitive landscapes. HYPE's advantage: contract-DEX competition was thin at the time and its technology led the field. Aster's challenge: facing a mature, strong competitor in Hyperliquid, it must build a user base first.
3. Token distribution strategy. Aster airdropped 8.8% of supply, limiting large-scale sell-offs, and withdrawal locks keep early liquidity controllable.

Strategy assessment: Aster is not a simple replication of HYPE but a reversed strategy for a different market environment. Same goals: control liquidity, win price-discovery dominance, build a platform moat. Different path: spot first vs. contracts first. Adaptability: a rational choice given the current market and competitive landscape.

Next-step predictions. Short term (2-4 weeks): more second-tier CEX spot listings go live; completion of the APX token swap on first-tier exchanges like Binance; liquidity gradually improving but still relatively thin. Medium term (1-3 months): derivatives trading launches, prioritized on the Aster platform; mainstream CEXs start noticing ASTER contract demand; positive competition with HYPE takes shape.

Long-term risks: if the spot phase fails to build a sufficient user base, the later derivatives push will struggle; dispersed liquidity may hurt the trading experience and prove less effective than HYPE's concentrated strategy.

Aster has chosen a more conservative strategy, but possibly the right one for this environment. Hoping for its success! #空投大毛
Walrus's choice to build on Sui is not arbitrary; it rests on Sui's parallel execution and the composability of Move smart contracts. The combination enables things other storage protocols cannot do: storage resources become on-chain objects that can be transferred, split, and merged like tokens. With traditional storage, if you buy 100GB of space, only you can use it. In Walrus, that 100GB is a programmable asset: you can rent 30GB to someone else, use the remaining 70GB as lending collateral, or even wrap it in an NFT and trade it on a marketplace. That flexibility opens entirely new business models. For developers, the best part is that all metadata lives on the Sui chain. Querying a blob's status doesn't require off-chain servers; you call a smart contract directly and get a sub-second response, which makes dynamic NFTs practical: an NFT's image can update in real time from on-chain events, such as a game character's equipment changing appearance on level-up. That is far harder on other protocols. Sui's high throughput also avoids the congestion common to storage protocols; it can process thousands of storage transactions per second, and gas fees are predictable rather than spiking the way Ethereum's do. That stability matters for enterprise users uploading data in bulk. Overall, the Sui integration is not a constraint but an amplifier. @Walrus 🦭/acc $WAL #Walrus
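To make the storage-as-object idea concrete, here is a conceptual model in Python. It is not Move and not the actual Sui/Walrus API; the StorageResource type and its methods are invented purely to show how split, merge, and transfer compose:

```python
# Conceptual model (Python, not Move) of Walrus-style storage resources as
# programmable objects. All names here are illustrative assumptions, not
# the real Sui/Walrus interface.
from dataclasses import dataclass
from uuid import uuid4

@dataclass
class StorageResource:
    object_id: str
    owner: str
    size_gb: int
    expiry_epoch: int

    def split(self, size_gb: int) -> "StorageResource":
        """Carve a new resource out of this one, like splitting a coin."""
        assert 0 < size_gb < self.size_gb, "split size must be a strict portion"
        self.size_gb -= size_gb
        return StorageResource(str(uuid4()), self.owner, size_gb, self.expiry_epoch)

    def merge(self, other: "StorageResource") -> None:
        """Absorb another resource with the same expiry into this one."""
        assert other.expiry_epoch == self.expiry_epoch, "expiries must match"
        self.size_gb += other.size_gb

    def transfer(self, new_owner: str) -> None:
        """Hand the object to a new owner, e.g. a renter or a lending pool."""
        self.owner = new_owner

# The rent-out-30GB example from the post:
mine = StorageResource(str(uuid4()), "alice", 100, expiry_epoch=520)
rented = mine.split(30)
rented.transfer("bob")             # Bob now controls 30 GB
print(mine.size_gb, rented.owner)  # 70 bob
```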
Too many projects in the Web3 space use AI as a marketing prop, either bolting an AI plugin onto an existing public chain or riding LLM buzzwords for hype. Vanar Chain instead rebuilds from the ground up, making AI the core of its infrastructure; this is a genuinely AI-native blockchain. Its five-layer architecture is more than a slide-deck diagram: the underlying modular L1 provides high throughput, fast transactions, and low costs, while the semantic storage layer Neutron turns raw data such as PDF contracts and property certificates into 'knowledge objects' AI can understand directly, removing the reliance on fragile IPFS hashes. The most impressive piece is the Kayon inference engine, which lets smart contracts analyze data and validate compliance on their own, without third-party oracles or off-chain computation; that is a real step beyond competitors who only shout slogans. As the ecosystem token, $VANRY is not just a trading chip: staking yields passive income, and it supports cross-chain asset bridging and smart-application development. EVM compatibility plus multi-language SDKs mean developers don't have to learn a new framework, a practicality that leaves similar projects behind. A true AI blockchain should be like Vanar, bringing the technology to practical scenarios such as payment compliance and asset tokenization rather than milking users with concepts. @Vanarchain #Vanar
I took a careful look at XPL's token allocation. Circulating supply is currently only a bit over 20%, which to me signals long-term planning rather than the short-term playbook of high float plus a price pump to bait retail. Of the 10 billion total supply, 40% goes to ecosystem growth, 25% to the team, 25% to investors, and 10% to the public sale, a reasonably sound split by current market standards. The key is the lock-up design: both team and investors face a 1-year cliff followed by 2 years of linear release, which means no significant unlock pressure until at least September 2026. The ecosystem portion releases monthly, but proportionally, so it won't dump all at once, and US participants in the public sale are locked until the end of July. All of this is designed to keep excessive early float from causing drastic price swings. On price: XPL launched at $0.21, peaked at $1.68, and has fallen back to around $0.13. Against a bear-market backdrop that drawdown is actually not bad; plenty of coins launched last year are down over 90%. XPL has rebounded 13% from its December low of $0.115, suggesting real support rather than a pure death spiral. At a $270 million market cap it ranks outside the top 200, which looks undervalued next to more than $8 billion of TVL, leaving significant room to grow. Top exchanges including Binance, Bybit, and OKX already list spot and futures, so liquidity is not an issue, and in January Binance launched a staking reward program for 3.5 million XPL, adding further utility to the token. Longer term, as the ecosystem expands, demand for XPL as the gas token should grow steadily: USDT transfers are free, but every other transaction is paid in XPL, and a future PoS consensus transition would add staking demand on top. Combined with the controlled unlock rhythm, I think the token economics are robust rather than a short-term squeeze on investors. @Plasma $XPL #plasma
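To see what that schedule implies for float, here is a back-of-the-envelope model. The 40/25/25/10 split and the 1-year cliff plus 2-year linear vest come from the numbers above; the ecosystem drip (simple linear over 36 months) and treating the public sale as fully liquid are my simplifying assumptions, not the official schedule:

```python
# Rough unlock model using the allocation stated above. Team/investor
# terms (1-year cliff, 2-year linear) are from the post; the ecosystem
# curve and public-sale treatment are simplifying assumptions.
TOTAL = 10_000_000_000

def cliff_linear(months: int, cliff: int = 12, vest: int = 24) -> float:
    """Fraction unlocked after `months`: nothing until the cliff, then linear."""
    if months < cliff:
        return 0.0
    return min((months - cliff) / vest, 1.0)

def circulating(months: int) -> float:
    team      = 0.25 * TOTAL * cliff_linear(months)
    investors = 0.25 * TOTAL * cliff_linear(months)
    ecosystem = 0.40 * TOTAL * min(months / 36, 1.0)  # assumed monthly drip
    public    = 0.10 * TOTAL                          # assumed fully liquid
    return team + investors + ecosystem + public

for m in (0, 6, 12, 18, 24, 36):
    print(f"month {m:2d}: {circulating(m) / TOTAL:5.1%} of supply unlocked")
# Month 12 is the first point where team/investor tokens start moving,
# matching the post's "no significant pressure until September 2026".
```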
The NFT market shows signs of recovery

The number of NFT buyers has increased by 120%, and trading volume has reached $61.5 million, with on-chain activity warming up. The NFT market has been quiet for a long time, so this data is genuinely eye-catching. Still, it feels more like an early accumulation stage; a real breakout probably needs a new narrative or a star project to emerge. The prudent move is to watch whether the momentum sustains before drawing conclusions.
Decentralized storage has always faced a deadlock: cost. Traditional designs copy data hundreds of times to ensure safety, a brute-force approach that is simple but ruinously expensive. The RedStuff 2D erasure coding used by Walrus changes the situation entirely. It splits data into small blocks and, after encoding, needs only about 4.5x expansion to reach the same level of security. Concretely, that works out 99% cheaper than Arweave and about 80% cheaper than Filecoin. These aren't theoretical numbers: in practice it comes to roughly $50 per TB per year, and even without subsidies only about $250 per month. Next to Arweave's one-time-payment plans running north of $10,000, the cost advantage is simply crushing. Even better, the encoding is self-healing: only one-third of the data fragments are needed to fully recover a file, at a fraction of the bandwidth of traditional methods. When the network is congested and others are still queuing, Walrus users already have their data. For AI training or video streaming, which hit large files frequently, that speed advantage translates directly into product experience. I'll skip the technical details, but the core is that it solves an engineering problem elegantly with mathematics, turning the impossible into the possible. This is the kind of innovation that actually moves the industry forward: algorithmic breakthroughs rather than piled-up hardware. It's also why top institutions were willing to put in $140 million; they see a chance at a paradigm shift. @Walrus 🦭/acc $WAL #Walrus
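For intuition on where the savings come from, the arithmetic is simple: cost scales with the redundancy overhead. The raw per-node price below is a made-up placeholder; only the ratio between full replication and the ~4.5x erasure-coded expansion drives the comparison:

```python
# The arithmetic behind the cost claims above. The raw per-node storage
# price is a hypothetical assumption; the overhead factors (100x full
# replication vs 4.5x RedStuff-style expansion) do all the work.
RAW_COST_PER_TB_YEAR = 10.0  # hypothetical $/TB/year a single node charges

def yearly_cost(tb: float, overhead: float) -> float:
    """Total cost = data size x redundancy overhead x raw unit price."""
    return tb * overhead * RAW_COST_PER_TB_YEAR

full_replication = yearly_cost(1, overhead=100)  # "copied hundreds of times"
erasure_coded    = yearly_cost(1, overhead=4.5)  # RedStuff-style expansion

print(f"full replication: ${full_replication:,.0f}/TB/yr")
print(f"erasure coded:    ${erasure_coded:,.0f}/TB/yr")
print(f"savings:          {1 - erasure_coded / full_replication:.0%}")
# Against a 100x replication baseline the saving is ~96%; the post's "99%
# cheaper than Arweave" also folds in Arweave's pay-once-forever pricing.
```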
$700,000 of TVL is basically a joke next to the billions locked on chains like Optimism and Polygon; Dusk has essentially no presence in DeFi. PieSwap, the ecosystem's DEX, has launched, but its liquidity depth simply cannot support decent trading volume, and a flat 0.3% fee has no competitive edge when Uniswap V3 has long offered tiered rates from 0.05% to 1%. Sozu's delegated staking is a good attempt, letting users who don't want to run nodes still participate, but overall staking yields hold no meaningful advantage over other PoS chains. And while the Hyperstaking concept sounds novel, it is really just smart-contract custody with a low technical barrier. Dusk's DeFi needs to break through on its actual uniqueness: the combination of privacy and compliance. For instance, Hedger could power a confidential DEX where the order book is fully encrypted but settlement is transparent, something traditional DeFi cannot do. The problem is that the user-education cost of such products is high. Ordinary DeFi players are used to transparent AMMs and order books, and asking them to accept a fully encrypted trading interface is a significant psychological hurdle. So in the short term, Dusk's DeFi is more likely to serve institutions than retail; but institutions demand deep on-chain liquidity, which leads right back to the original deadlock. $DUSK #Dusk @Dusk
NPEX's plan to put 300 million euros of assets on-chain sounds enticing, but progress has been frustratingly slow: announced in 2024, it was still in the waiting-list phase at the start of 2026. What exactly is holding it back? Technically, Dusk has everything it needs: the XSC compliant token standard, the Citadel identity protocol, and the DuskTrade trading platform. It has also secured the regulatory licenses. The bottleneck is more likely acceptance by traditional financial institutions: asking fund managers to put hundreds of millions of assets on a chain with less than a million dollars of TVL is too high a trust threshold. And RWA is not simple tokenization; it involves legal rights confirmation, custody arrangements, and clearing processes, each of which has to be built out. As the technology provider, Dusk's leverage is limited; much depends on NPEX pushing regulatory sandboxes and institutional partnerships. By contrast, Polygon already has real funds like Franklin Templeton's operating on it, and Centrifuge has run tens of millions of dollars through real credit scenarios. Dusk's gap is not technology but ecosystem momentum. Put simply, it's a chicken-and-egg problem: without assets on the chain there is no liquidity, and without liquidity asset owners are even less willing to participate. Breaking that cycle is the biggest challenge. $DUSK #Dusk @Dusk
How to resolve the contradiction between zero-knowledge proofs and regulatory compliance

Privacy and compliance are inherently in tension. Fully anonymous designs like Zcash are simply not accepted by regulators, but Dusk has taken a middle path: the Phoenix protocol uses ZK proofs to hide amounts and senders while retaining a viewing-key mechanism that lets auditors selectively disclose information. The design is theoretically elegant, meeting user privacy needs while satisfying the regulatory requirements of MiCA and MiFID II, and the NPEX licenses were legitimately obtained. The question is whether the system is flexible enough in practice. How quickly can regulators get answers when disclosure is required? If generating a viewing key takes manual intervention every time, efficiency will be too low, and the more complex the compliance process, the worse the user experience; many institutional users might prefer to stay in the traditional financial system rather than wrestle with these technical details. Meanwhile, competitor Oasis uses TEE trusted execution environments, which are less decentralized than ZK but carry much lower performance overhead, and Aztec's client-side private execution has its own strengths. Dusk still has to prove that its ZK+HE approach is genuinely better than the alternatives in real RWA scenarios. $DUSK #Dusk @Dusk
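To make the viewing-key idea concrete, here is a toy model in Python. It substitutes ordinary symmetric encryption (Fernet) for what Phoenix actually does with zero-knowledge proofs, so treat it as an analogy for selective disclosure, not Dusk's construction:

```python
# Toy model of a "viewing key": the public ledger stores only ciphertext,
# but an auditor holding the key can decrypt on demand. Fernet stands in
# for the real ZK machinery purely for illustration.
from cryptography.fernet import Fernet  # pip install cryptography

viewing_key = Fernet.generate_key()  # held by the user, shareable with auditors
cipher = Fernet(viewing_key)

# What everyone sees on-chain: an opaque blob, no amount, no sender.
public_record = cipher.encrypt(b"sender=alice;amount=1500;asset=EUR-bond")
print("on-chain:", public_record[:32], b"...")

# Selective disclosure: the user hands the viewing key to a regulator,
# who can now read this record (and only records under this key).
auditor = Fernet(viewing_key)
print("auditor sees:", auditor.decrypt(public_record).decode())
```

The speed-of-response question in the post maps directly onto this model: if handing over `viewing_key` requires a manual ceremony each time, disclosure latency becomes the bottleneck regardless of how elegant the cryptography is.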
How Walrus Overcame Traditional Storage in the Cost War

Everyone knows blockchain storage has always been expensive, but after Walrus arrived, the game changed. The traditional on-chain playbook is to copy your data hundreds of times and scatter it across nodes. Full replication sounds safe, but the actual cost is outrageous: storing objects directly on Sui costs more than 100x. Walrus's RedStuff erasure coding is much smarter. It splits your data into k small blocks, then encodes them into n fragments; in theory you only need to retrieve about 1/3 of the fragments to restore the complete file. In practice the expansion factor is 4.5-5x, a big saving over traditional methods; the data suggests it is 80% cheaper than Filecoin and 99% cheaper than Arweave. The problem is that this advantage hasn't yet been realized at scale. The mainnet has been live for less than a year, with cumulative storage spend of only $440,000 and daily active users numbering around 15. However great the technology, it's useless if nobody uses it. And while 4.5x expansion beats 100x, it's still somewhat expensive for ordinary users: a 10MB file becomes about 45MB of stored fragments, and smaller files are even less economical because metadata overhead dominates. Another pitfall is the WAL token's wild volatility, down 82% from its ATH; users' storage costs fluctuate with the token price, an uncertainty that may deter projects needing long-term storage. The team says USD-stable pricing is planned, but it hasn't shipped yet. So the cost advantage holds in theory, but in practice it still depends on future optimization. @Walrus 🦭/acc $WAL #Walrus
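For readers who want the intuition behind "any subset of fragments restores the file", here is a minimal one-dimensional erasure-coding toy. RedStuff itself is a far more sophisticated 2D scheme; this Reed-Solomon-style sketch over GF(257) only demonstrates the core property that any k of the n fragments suffice:

```python
# Minimal 1D erasure-coding sketch: any k of the n fragments rebuild the
# file. Not RedStuff, just the underlying principle.
P = 257  # smallest prime above 255, so every byte is a field element

def _interp(points: list[tuple[int, int]], x: int) -> int:
    """Evaluate, at x, the unique polynomial through `points` (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def encode(data: bytes, n: int) -> list[tuple[int, int]]:
    """k data bytes -> n fragments; the polynomial passes through the data."""
    pts = list(enumerate(data))            # (0, b0) ... (k-1, b_{k-1})
    return [(x, _interp(pts, x)) for x in range(n)]

def decode(fragments: list[tuple[int, int]], k: int) -> bytes:
    """Rebuild the k data bytes from ANY k surviving fragments."""
    return bytes(_interp(fragments[:k], x) for x in range(k))

data = b"walrus blob"
k, n = len(data), int(len(data) * 4.5)     # ~4.5x expansion, as in the post
frags = encode(data, n)
survivors = frags[::4][:k]                 # keep a scattered k-subset
assert decode(survivors, k) == data
print(f"{k} bytes -> {n} fragments; recovered from just {k} survivors")
```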
Having watched too many public chains fall into performance traps from state explosion, Dusk's three-layer design offers a different approach. DuskDS handles settlement and consensus only, while the execution layers DuskEVM and DuskVM each play their own role. The benefit is obvious: the base layer stores only validity proofs and never touches application state, directly reducing the burden on nodes. Compare that with projects that cram everything into L1, where after years of operation a full node needs several TB of storage, which ordinary developers cannot manage. Dusk checks state transitions before they go on-chain via MIPS-driven pre-validators, a clever design that effectively adds a quality-inspection layer ahead of consensus and avoids the hassle of Optimism's 7-day challenge window. But it raises new questions: how heavy is the pre-validators' computational overhead, and could it become the new bottleneck? A smooth testnet doesn't prove the mainnet can absorb heavy transaction volume. And while modularization is elegant, cross-layer latency and complexity cannot be ignored: in a DeFi scenario where a Phoenix privacy transaction must interact with a DEX on DuskEVM, whether the user experience stays smooth will come down to real-world performance. There are always trade-offs in technology; the question is whether the team can balance performance and decentralization. $DUSK #Dusk @Dusk
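To illustrate the quality-inspection idea, here is a deliberately simplified sketch. The real pre-validator is a MIPS-driven component that proves state transitions; this Python stand-in with invented names only shows the control flow of rejecting invalid transitions before they ever reach consensus:

```python
# Sketch of the "quality inspection" idea: check a proposed state
# transition BEFORE consensus, so invalid transitions never trigger a
# challenge game. All names are hypothetical, not Dusk's actual code.
from dataclasses import dataclass

@dataclass(frozen=True)
class Transition:
    account: str
    delta: int  # signed balance change

def pre_validate(state: dict[str, int], tx: Transition) -> bool:
    """Reject transitions that would produce an invalid state
    (e.g. a negative balance) before settlement sees them."""
    return state.get(tx.account, 0) + tx.delta >= 0

def submit(state: dict[str, int], txs: list[Transition]) -> dict[str, int]:
    for tx in txs:
        if not pre_validate(state, tx):
            continue  # dropped up front; no 7-day challenge window needed
        state[tx.account] = state.get(tx.account, 0) + tx.delta
    return state

print(submit({"alice": 100}, [Transition("alice", -30), Transition("alice", -200)]))
# {'alice': 70} -- the overdraft never reaches the chain
```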
Spot BTC/ETH ETFs saw $1.9 billion of inflows last week

Spot Bitcoin and Ethereum ETFs recorded their strongest inflows since October last year, totaling $1.9 billion. The data makes an interesting contrast with the market's decline. Are institutions buying the dip, or are long-term investors seizing the chance to build positions? Either way, inflows at this scale can lend prices some support and keep them from falling too drastically.
For years, most public-chain projects have been built for the sake of building, chasing everything and focusing on nothing. What impressed me about Plasma is precisely its focus: from the very beginning it defined its goal as a stablecoin-specific chain rather than competing with general-purpose chains on ecosystem breadth. Zero-fee USDT transfers address a genuine pain point. Think about it: a transfer on Ethereum today can cost several or even dozens of dollars in gas, making small payments pointless. Plasma solves this at the protocol layer with a paymaster mechanism, achieved through custom consensus-layer design rather than subsidies or inflation. That approach is smart; optimizing specifically for stablecoin scenarios beats trying to do everything and doing nothing well. The EVM compatibility is also well judged: developers learn nothing new and can use tools like Foundry and Hardhat directly, so migrating contracts from Ethereum costs almost nothing. Look at how Aave absorbed $6.6 billion in deposits within 48 hours of going live on Plasma. What does that data say? DeFi protocols are willing to come because the barrier is low and the user experience is good, unlike new chains with bespoke development environments whose learning curves scare everyone off. TVL has now stabilized above $8 billion, ranking in the top ten only a few months after mainnet launch, growth on par with early Solana and Avalanche. Crucially, the TVL is high quality: 90% is stablecoin-related, not the false prosperity propped up by mine-and-dump farming. Top protocols like Pendle, Ethena, and Maple have all joined, and the quality of the ecosystem speaks for itself. Technically, PlasmaBFT consensus reaches sub-second finality with TPS in the thousands, entirely sufficient for payment scenarios, with no need to sacrifice decentralization like chains chasing tens of thousands of TPS. The balance is good: over 40,000 USDT transfers are processed daily. There is still a gap to Tron, but the growth trend is clear, and Plasma's EVM ecosystem advantage leaves more room for imagination long term. @Plasma $XPL #plasma
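A minimal sketch of how a protocol-level paymaster could route gas for plain USDT transfers, per the description above. The addresses, daily quota, and dispatch function are invented for illustration; Plasma's real implementation lives in its node and consensus code, not in Python:

```python
# Sketch of protocol-level fee sponsorship for simple USDT transfers.
# USDT address and the anti-spam quota are hypothetical placeholders.
USDT = "0xUSDT"                 # hypothetical token contract address
TRANSFER_SELECTOR = "a9059cbb"  # ERC-20 transfer(address,uint256) selector
SPONSOR_LIMIT_PER_DAY = 10      # assumed per-sender anti-spam quota

sponsored_today: dict[str, int] = {}

def gas_payer(tx: dict) -> str:
    """Decide who pays gas: the paymaster covers plain USDT transfers
    (within quota); everything else is paid by the sender in XPL."""
    is_usdt_transfer = (
        tx["to"] == USDT and tx["data"].startswith(TRANSFER_SELECTOR)
    )
    used = sponsored_today.get(tx["from"], 0)
    if is_usdt_transfer and used < SPONSOR_LIMIT_PER_DAY:
        sponsored_today[tx["from"]] = used + 1
        return "paymaster"  # user pays nothing and needs no XPL balance
    return "sender"         # normal path: gas priced in XPL

print(gas_payer({"from": "0xabc", "to": USDT, "data": "a9059cbb" + "00" * 64}))
print(gas_payer({"from": "0xabc", "to": "0xDEX", "data": "deadbeef"}))
```

The quota is the interesting design point: sponsorship only stays sustainable if simple transfers are cheap to verify and rate-limited, which is exactly why the post stresses that it is built into the protocol rather than funded by inflation.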
$852 million liquidated within 24 hours, longs wiped out

Over the past day, more than $852 million of positions were forcibly liquidated, $787 million of it from longs. The number alone makes one's heart race. Leveraged players have been hit hard by this volatility, and panic has intensified. I have always believed that running high leverage in uncertain times is gambling, and this liquidation wave proves the point again; the market's downward momentum gets magnified as a result.
Trump threatens to impose tariffs on the EU, rattling global markets

Trump announced that if Denmark, Finland, Germany, and 8 other EU/NATO countries do not back down on trading in Greenland, a 10% tariff will take effect on February 1, rising to 25% before June. On the news, BTC plunged to $92.3k, a drop of nearly 3%. The EU is not backing off either, preparing $100 billion in retaliatory tariffs. Gold, the traditional safe haven, surged to a record $4,664 per ounce, while BTC's "digital gold" narrative clearly failed to hold this time. To me this move exposed how fragile the market's confidence in BTC's safe-haven properties really is; when a genuine risk event hits, people still habitually flee to gold.
Where to Store AI Training Data: Why I Think Walrus Might Be an Answer
Recently there has been heated debate in the AI community over the copyright and provenance of training data. OpenAI has been sued by several media outlets and authors alleging their content was used to train models without authorization, and the cases have pushed the whole industry to scrutinize the compliance of data sources. The traditional approach is to keep training data on your own servers, but that creates two problems. First, the data is easy to tamper with: you can claim your training set is clean and compliant, but you cannot prove it. Second, storage costs are too high: a large model's training set easily runs to several TB or even PB, and the storage bill is astronomical.
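A common pattern that addresses the "you cannot prove it" half of the problem is committing content hashes of the training set. This is a generic sketch, not a Walrus-specific API; the manifest would be anchored on-chain or in a signed release so anyone can later verify the bytes never changed:

```python
# Fingerprint each training file with SHA-256 and commit the digests
# somewhere immutable. Re-hashing at audit time detects any swap or edit.
import hashlib
import json
from pathlib import Path

def fingerprint(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so multi-GB shards don't need RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def build_manifest(dataset_dir: str) -> str:
    """JSON manifest of {relative_path: digest}; this is what you'd commit."""
    root = Path(dataset_dir)
    entries = {str(p.relative_to(root)): fingerprint(p)
               for p in sorted(root.rglob("*")) if p.is_file()}
    return json.dumps(entries, indent=2)

# Usage: publish build_manifest("training_data/") at training time; at
# audit time, rebuild it and diff -- any tampered file changes its digest.
```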
After three months of testing Walrus storage, here are the pitfalls of decentralized storage that the competing products never tell you about.
Decentralized storage is a niche that runs neither hot nor cold. Since IPFS launched in 2015, everyone has been shouting about overthrowing cloud storage, yet ten years later AWS and Alibaba Cloud are still the industry giants, while a parade of decentralized storage projects has come and gone with very few survivors. Walrus is a project I've used intensively for the past three months, so let me share the actual experience. I initially chose Walrus because I was building a decentralized podcast platform that needed to store a large number of audio files, each ranging from a few MB to several hundred MB. I tried Filecoin first, but the storage-deal process was too complicated: find miners, negotiate prices, wait for packaging. The preparation alone could take hours, and if a miner goes offline or runs into trouble, your data is at risk; there is a backup mechanism, but the whole experience feels disjointed.
When Decentralized Storage Meets Erasure Coding: Why Does Walrus Dare to Claim 75 Times Cost Savings Compared to Filecoin?
Recently, while researching the Sui ecosystem, I noticed an interesting phenomenon: everyone discusses Layer 1 performance and TPS, but the real bottleneck for shipping web3 applications is the old problem of data storage. Think about it: if an NFT project stores its images on a centralized server and the provider goes bankrupt or stops paying for hosting, the JPG you spent a fortune on returns a 404. That is not a joke; many early NFT projects now face exactly this dilemma. Walrus caught my attention not because it is yet another storage protocol, but because its technical implementation takes a genuinely different path. Traditional decentralized storage either builds a storage market like Filecoin or sells one-time payment for permanent storage like Arweave. Walrus instead uses Red Stuff erasure coding, which sounds academic but actually solves a very practical problem.