In the past few days, I've been glued to the screen, working through Walrus's whitepaper and GitHub repository. I also ran their Devnet node, and there are some things I need to get off my chest. The decentralized storage space is genuinely crowded: Filecoin is over there piling complexity onto its Proof-of-Spacetime machinery, Arweave keeps shouting the permanence slogan, and Celestia has split the DA layer out into its own product. It looks like every niche is taken, yet Walrus, built by Mysten Labs, actually has some merit. It doesn't try to compete head-on for anyone else's market share; instead, it cleverly stakes out a middle ground of 'large files, low-frequency access, and high throughput.'

I tried uploading a dataset of several hundred megabytes through the CLI, and it was faster than I expected. Traditional storage networks like IPFS and Filecoin run on the same core logic: slice the file and find nodes to hold the pieces. To make sure data isn't lost, they either replicate copies aggressively or demand heavyweight zero-knowledge proofs, which pushes hardware requirements up for miners, and the cost ultimately lands on users. Walrus instead uses RaptorQ erasure codes, a technology that has been around in communications for a long time but is rarely deployed in blockchain storage. Simply put, it encodes the data into a pile of fragments, and as long as a fraction of the network's nodes are still alive, the original data can be reconstructed. That is a much lighter model than Filecoin's heavy-asset approach with its frequent slashing penalties.
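To make that concrete, here is a toy sketch of the any-k-of-n property that erasure coding gives you. This is not Walrus's actual RaptorQ encoding (that's a fountain code, far more efficient); it's just a minimal Reed-Solomon-style illustration over a prime field, with all names and numbers my own.

```python
P = 2**61 - 1  # a Mersenne prime; toy field for the arithmetic

def _lagrange_eval(points, x):
    """Evaluate the unique polynomial passing through `points` at x (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i == j:
                continue
            num = num * ((x - xj) % P) % P
            den = den * ((xi - xj) % P) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def encode(data, n):
    """Systematic encoding: the k data symbols sit at x = 0..k-1, and the
    parity shares extend the same degree-(k-1) polynomial to x = k..n-1."""
    k = len(data)
    base = list(enumerate(data))  # (x, y) pairs defining the polynomial
    return base + [(x, _lagrange_eval(base, x)) for x in range(k, n)]

def decode(shares, k):
    """Rebuild the original k data symbols from ANY k surviving shares."""
    assert len(shares) >= k, "not enough surviving shares"
    pts = shares[:k]
    return [_lagrange_eval(pts, x) for x in range(k)]

# Tiny demo: 4 data symbols spread over 7 shares; lose any 3 and still recover.
data = [42, 7, 99, 2024]
shares = encode(data, n=7)
survivors = [shares[1], shares[4], shares[5], shares[6]]  # shares 0, 2, 3 "lost"
assert decode(survivors, k=4) == data
```

The overhead math is what makes this attractive: 3x replication stores three full copies to survive two losses, while a 4-of-7 code stores only 1.75x the data and survives three.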
However, I ran into several shortcomings during testing. The current interaction experience is a disaster for non-technical users: the SDK documentation is obscure, and I had to guess many parameters from the source code. The current level of decentralization is also questionable; during the Devnet phase, most nodes are likely still run by the official team or core partners. Early-stage centralization like this is common, and whether the network genuinely transitions to community control depends on the incentive design. Compared with Arweave: Arweave wins on the permanent-storage narrative and suits NFT metadata or historical archives, but if you want to park several terabytes of video footage or AI model weights on it, the cost could bankrupt you. Walrus is clearly eyeing the Web2 cloud storage market, going after data that doesn't strictly need to be stored 'forever' but does need to be 'censorship-resistant' at a lower price.
I also looked specifically at its coupling with Sui. Sui serves as the consensus layer while Walrus acts as the storage layer, and this Lego-brick approach fits the current modular narrative well. My concern is that if Walrus becomes too tightly tied to Sui, it may lose its potential as a general-purpose storage layer; after all, most developers are still on the Ethereum side. The official line is multi-chain support, but the latency and gas costs of cross-chain calls still have no clean solution.
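For intuition, here is how I picture the split: Sui holds a small, composable object describing the blob, while the heavy bytes live as erasure-coded shards on Walrus storage nodes. This is my own simplified mental model; the field names below are invented for illustration, not the real on-chain schema or SDK types.

```python
from dataclasses import dataclass

@dataclass
class BlobCertificate:
    """Lives on Sui: a small, cheap object that contracts can compose with."""
    blob_id: str          # content-derived identifier of the blob
    size_bytes: int
    certified_epoch: int  # epoch in which storage nodes attested availability
    expiry_epoch: int     # storage is paid up to here, unlike Arweave's 'forever'

@dataclass
class BlobShards:
    """Lives on Walrus storage nodes: the heavy, erasure-coded bytes."""
    blob_id: str
    shards: list[bytes]

def is_still_available(cert: BlobCertificate, current_epoch: int) -> bool:
    # A dapp only needs the tiny on-chain certificate to reason about the data;
    # it never drags the bytes themselves through the consensus layer.
    return current_epoch <= cert.expiry_epoch
```

That convenience is exactly what makes the coupling sticky: the cheaper it is for Sui contracts to reference blobs natively, the harder it gets to stay chain-agnostic.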
Last night the network crashed once during my upload test, which is normal for a testnet. What I see as a genuine positive is its handling of blob storage. Right now Layer 2s are all rushing to post data back to mainnet; Ethereum's EIP-4844 blobs made that cheaper, but it's still expensive. If Walrus can capture a slice of this DA business, or even carve out a dedicated store for AI agents' long- and short-term memory, the valuation logic changes completely. What the storage track lacks isn't capacity but verifiable, efficient retrieval: Filecoin makes both storing and retrieving painful, and Arweave is unaffordable at scale. If Walrus solves even half of that dual demand, it's enough to establish a foothold in the next bull market.
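To put rough numbers on why blobs alone don't cover this niche, here's a back-of-the-envelope sketch. The constants are from the EIP-4844 spec (131,072 bytes and 2**17 blob gas per blob); the blob fee floats on its own fee market, so I take the price as an input rather than guess one.

```python
BLOB_BYTES = 4096 * 32        # 131,072 bytes per blob
USABLE_BYTES = 4096 * 31      # ~124 KiB usable if you pack 31 bytes per field element
GAS_PER_BLOB = 2**17          # blob gas consumed by one blob

def blobs_needed(payload_bytes: int) -> int:
    return -(-payload_bytes // USABLE_BYTES)   # ceiling division

def blob_cost_eth(payload_bytes: int, blob_gas_price_gwei: float) -> float:
    return blobs_needed(payload_bytes) * GAS_PER_BLOB * blob_gas_price_gwei * 1e-9

# A 1 GiB model checkpoint needs ~8,500 blobs -- hours of the entire chain's
# blob capacity at the original 6-blobs-per-block cap -- which is exactly the
# kind of payload a dedicated storage/DA layer is pitching for.
print(blobs_needed(1 << 30))   # -> 8457
```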
But I keep reminding myself to stay calm: good technology doesn't mean a good coin price. I still haven't seen a detailed release schedule for the $WAL token economics. Storage projects face heavy inflation pressure, because miners need to sell coins to pay for electricity, and if demand doesn't pick up it's easy to slide into a death spiral. The team's rhythm also feels a bit slow right now: the code update frequency is fine, but community operations are clearly understaffed, and plenty of technical questions sit in Discord for ages without a response.

The real potential here is the chance to become the default hard drive for the next generation of dapps. Current dapps are too 'light'; anything slightly heavier still has to lean on AWS, which isn't Web3 at all. If Walrus can deliver a Web2-level experience, even at the cost of a bit of its commitment to decentralization, I'm on board. As a user, I only care whether my images load instantly and whether my videos get deleted; whether the underlying tech is erasure codes or full replicas, who cares?


