
Last night, Polkadot's OpenDev Developer Call was held as scheduled. This is the monthly meeting where the core protocol developers and Technical Fellowship members share real engineering progress with the outside world.
Here, the discussion is not about 'what might be done in the future', but about which code has already been written, which is being audited, which is ready to be merged, and which will take effect in the next runtime.
It is precisely for this reason that many of Polkadot's most important yet easily overlooked changes often surface first, and most clearly, in this meeting.
In this article, based on last night's OpenDev Call, we have summarized the latest developments the Chinese community should pay close attention to, to help you see more clearly where Polkadot is now and which important changes are quietly advancing.

One, the new website is not just a reskin, but an entry point to the 'Second Era'.
At the beginning of the meeting, the host showcased the newly launched Polkadot website.
For this new website, the changes that struck me most immediately are:
Pink has disappeared, replaced by black / white / gray.
The page no longer stacks white papers, founder introductions, and collaboration logos.
Technical jargon has noticeably decreased, and product entry points are more prominent.
This is not a change in aesthetic preference, but a shift in narrative focus.

https://polkadot.com/
Fellowship member Basti mentioned that the new official website is better understood as an entry point reserved for the 'Polkadot Portal' that is being built. The future Polkadot Portal will be built on top of the Hub, the People Chain, and individual proofs; it will no longer just introduce the technology, but will be the unified entry point through which ordinary people can actually 'walk in and use' Polkadot. In other words, the official website is already preparing to be a 'product-level entry point', rather than just a 'protocol introduction page'.
An interesting detail is that the website has already begun to mention and showcase the 500ms block / half-second transaction experience directly, and this is precisely the feature Basti submitted a PR for last month.
Two, 500ms blocks: moving from discussion to code and auditing.
The concept of 500ms blocks has been mentioned frequently in the community recently, but without a proper explanation it is easily misread as 'higher TPS' or 'faster final confirmation'. It is not that simple.
First, it must be made clear: 500ms blocks do not mean that final confirmation happens every 0.5 seconds.
In Polkadot, the lifecycle of a block is divided into two stages:
Block production: new blocks are proposed by the designated validators within their time slots.
Block finalization: GRANDPA consensus confirms a series of blocks so that they can no longer be rolled back.
The so-called 500ms blocks mean that the block production rhythm is raised to one block every 0.5 seconds, allowing the relay chain to advance on-chain state in smaller, denser steps; the finality mechanism itself is not changed.
In other words, Polkadot's state updates no longer advance in 'jumps every few seconds', but in almost continuous, near real-time steps.
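To make the two stages concrete, here is a minimal sketch (not taken from the call; it assumes @polkadot/api is installed and uses a public RPC endpoint as a placeholder) that watches block production and block finalization as two separate streams. The gap between them is exactly the part that 500ms blocks do not change.

```ts
// Minimal sketch: watches blocks as they are *produced* and as they are *finalized* by GRANDPA.
// With 500ms blocks, only the first stream speeds up.
import { ApiPromise, WsProvider } from '@polkadot/api';

async function watchHeads() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://rpc.polkadot.io') });

  let lastProducedAt = Date.now();
  await api.rpc.chain.subscribeNewHeads((header) => {
    const now = Date.now();
    console.log(`produced  #${header.number.toNumber()} (+${now - lastProducedAt}ms since last)`);
    lastProducedAt = now;
  });

  await api.rpc.chain.subscribeFinalizedHeads((header) => {
    console.log(`finalized #${header.number.toNumber()}`);
  });
}

watchHeads().catch(console.error);
```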
So what does this mean for users and applications?
If this were merely 'the technology is a bit faster', it would matter relatively little. The real importance of 500ms blocks is that they significantly reduce the experience cost of operations that must happen on-chain.
Take the most intuitive example, transfers and payments:
In the 6-second block era, users often had to wait several seconds after submitting a transaction to see it included in a block, and wallets usually waited another 1-2 blocks before showing it as 'on-chain'.
With 500ms blocks, a transaction can usually be included in a new block within about one second, giving users near real-time feedback.
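On the wallet side, the flow described above roughly corresponds to the following hedged sketch (endpoint, keys, recipient, and amount are all placeholders, not anything shown in the call). The moment a wallet shows 'on-chain' maps to the isInBlock status, which is what arrives much sooner under 500ms blocks, while isFinalized still follows GRANDPA.

```ts
// Hedged sketch of the wallet-side feedback loop for a simple transfer.
import { ApiPromise, WsProvider, Keyring } from '@polkadot/api';

async function transferWithFeedback() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://rpc.polkadot.io') });
  const keyring = new Keyring({ type: 'sr25519' });
  const sender = keyring.addFromUri('//Alice');      // placeholder dev key
  const dest = keyring.addFromUri('//Bob').address;  // placeholder recipient
  const amount = 10n ** 10n;                         // 1 DOT expressed in plancks (10 decimals)

  const start = Date.now();
  await api.tx.balances
    .transferKeepAlive(dest, amount)
    .signAndSend(sender, ({ status }) => {
      if (status.isInBlock) {
        // This is the moment a wallet can reasonably show "on-chain".
        console.log(`included in a block after ${Date.now() - start}ms`);
      } else if (status.isFinalized) {
        console.log(`finalized after ${Date.now() - start}ms`);
      }
    });
}
```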
This change is particularly critical for applications aimed at ordinary users.
During the live stream, Basti repeatedly emphasized one principle: do not cram everything into global consensus. Chat, content delivery, and P2P interaction do not belong on-chain; but the steps that must be on-chain, such as identity verification, permission changes, and state anchoring, become a bottleneck for product experience if each of them has to wait several seconds.
The significance of 500ms blocks lies in preventing these key steps from 'slowing down the entire application'.
As Polkadot enters the 'Second Era', the execution layers (EVM, PolkaVM, Runtime primitives) begin to cooperate deeply within the same network, and applications will increasingly involve combined execution across multiple modules and VMs.
Shorter block times mean:
Finer-grained execution scheduling.
Lower interaction latency.
More suitable for building Web3 products close to Web2 experiences.
This is why the 500ms block is not a standalone performance optimization, but an important infrastructure upgrade for Polkadot to move towards a truly 'usable product platform'.
6-second blocks are enough to support a powerful blockchain; 500ms blocks are what it begins to take to support 'systems that ordinary people use continuously'.
Regarding the 500ms blocks (also jokingly called Basti Blocks, since Basti developed them), Basti gave a very clear status update:
The core PR has been fully submitted.
Audits are underway.
Preliminary review feedback has been received, and the overall situation is good.
If progress goes smoothly:
Through the end of the year, the main focus will be on review and auditing.
In early 2026, it is expected to move into the actual rollout stage.
This is also a key step for Polkadot in terms of 'performance experience'.
Three, Hard Pressure: The issuance limit is becoming a protocol fact.
This was one of the most closely watched and most clearly updated items of the meeting. Fellowship member Dom stated explicitly:
The code for Hard Pressure has been completed.
The related audit has been started and completed.
Final checks are now being re-run, in preparation for the merge.
The next key question is which runtime release it lands in:
The target is runtime 2.04.
If time does not allow, it will land in 2.05 at the latest.
Once the code is merged and shipped in a runtime, Polkadot's issuance will officially have an upper limit! Look forward to March 14, 2026!
This is an important turning point for Polkadot to move from 'inflation narrative' to 'long-term predictable economic model'.
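For readers who want to watch this happen on-chain, here is a minimal sketch (assuming @polkadot/api and a public endpoint, both placeholders). It does not encode the cap itself, which lives in the runtime logic; it simply subscribes to DOT's total issuance, the number that should stop growing past the ceiling once Hard Pressure is live.

```ts
// Subscribe to total DOT issuance; the callback fires on every change.
import { ApiPromise, WsProvider } from '@polkadot/api';

async function watchIssuance() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://rpc.polkadot.io') });

  // Total DOT issuance in plancks (10 decimals).
  await api.query.balances.totalIssuance((total) => {
    console.log(`total issuance: ${total.toString()} plancks`);
  });
}

watchIssuance().catch(console.error);
```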
Four, OpenGov governance experience is evolving.
Another significant change that has been completed is RFC-150: allowing users to vote and delegate simultaneously.
The code for this feature has been completed and is currently in a 'waiting for more reviews' status.
Its significance lies in:
You no longer have to choose between 'voting yourself' and 'fully delegating to a representative'.
Closer to the way governance participation works in reality.
For ordinary DOT holders, this is a very practical, yet long-overlooked experience upgrade.
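For context, these are the two separate entry points that exist in the conviction-voting pallet today, sketched with placeholder values. Currently, delegating a governance track and voting directly on that same track are effectively mutually exclusive; RFC-150 is about letting the two coexist, and its exact new call shape was not spelled out in the meeting, so it is not shown here.

```ts
// Today's two separate calls: direct vote vs. delegation (all values are placeholders).
import { ApiPromise, WsProvider, Keyring } from '@polkadot/api';

async function voteAndDelegateToday() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://rpc.polkadot.io') });
  const keyring = new Keyring({ type: 'sr25519' });
  const voter = keyring.addFromUri('//Alice');            // placeholder dev key
  const delegateTo = keyring.addFromUri('//Bob').address; // placeholder representative

  // Direct vote on referendum #123 (placeholder) with 1 DOT at 1x conviction.
  await api.tx.convictionVoting
    .vote(123, { Standard: { vote: { aye: true, conviction: 'Locked1x' }, balance: 10n ** 10n } })
    .signAndSend(voter);

  // Delegate track 0 (placeholder) to a representative with 1 DOT at 1x conviction.
  await api.tx.convictionVoting
    .delegate(0, delegateTo, 'Locked1x', 10n ** 10n)
    .signAndSend(voter);
}
```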
In addition, as the Decision Deposit in OpenGov has been raised, a practical question has emerged: will ordinary proposers be locked out by the higher threshold?
For this, Dom took over and completed a key mechanism:
Decision Deposit crowdfunding logic.
Allowing multiple participants to share the deposit.
This not only reduces the financial pressure on individual proposers but also makes the governance process more democratic, collaborative, and closer to the real process of community consensus formation.
Currently, this part of the logic has been completed and is entering the migration, benchmarking, and testing stage.
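For reference, this is how the Decision Deposit is paid today: one account places the whole deposit in a single call (referendum index and key are placeholders). The crowdfunding mechanism described above would let several accounts share this amount; its extrinsics were not named in the call, so they are not sketched here.

```ts
// Today's single-payer Decision Deposit call.
import { ApiPromise, WsProvider, Keyring } from '@polkadot/api';

async function placeDepositToday() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://rpc.polkadot.io') });
  const proposer = new Keyring({ type: 'sr25519' }).addFromUri('//Alice'); // placeholder key

  // Pays the full Decision Deposit for referendum #123 from this one account.
  await api.tx.referenda.placeDecisionDeposit(123).signAndSend(proposer);
}
```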
Five, Social Recovery: account security starts to work 'the way the human world does'.
Oliver and Clara updated the progress of Social Recovery:
The backend pallet is basically completed.
Deployed on the private test chain.
The front end is being integrated.
This mechanism allows you to:
Pre-designate friends, family, or multiple other roles as guardians.
If a key is lost or compromised, the account can be recovered through these social relationships.
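As a rough illustration of the flow, here is a sketch that assumes an interface similar to FRAME's long-standing recovery pallet; the new pallet demoed on the private test chain may differ in its details, and the endpoint, accounts, threshold, and delay below are all placeholders.

```ts
// Sketch of a social-recovery lifecycle, following the shape of FRAME's existing recovery pallet.
import { ApiPromise, WsProvider, Keyring } from '@polkadot/api';

async function socialRecoverySketch() {
  const api = await ApiPromise.create({ provider: new WsProvider('wss://example-testnet') }); // placeholder
  const keyring = new Keyring({ type: 'sr25519' });
  const protectedAcc = keyring.addFromUri('//Alice');   // the account being protected
  const friend = keyring.addFromUri('//Bob');           // one pre-designated guardian
  const rescuer = keyring.addFromUri('//AliceNew');     // the new key used after losing the old one

  // 1. Ahead of time: register guardians, a vouch threshold, and a delay period.
  await api.tx.recovery
    .createRecovery([friend.address], 1, 100) // 1-of-1 guardians, 100-block delay (placeholders)
    .signAndSend(protectedAcc);

  // 2. After losing the key: the new key opens a recovery for the old account...
  await api.tx.recovery.initiateRecovery(protectedAcc.address).signAndSend(rescuer);

  // 3. ...guardians vouch for that recovery attempt...
  await api.tx.recovery.vouchRecovery(protectedAcc.address, rescuer.address).signAndSend(friend);

  // 4. ...and once the threshold and delay are met, the new key claims control.
  await api.tx.recovery.claimRecovery(protectedAcc.address).signAndSend(rescuer);
}
```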
This is a very important step for Polkadot in terms of 'usability' and 'ordinary user experience'.
Six, Snowbridge v2: A cheaper, faster Ethereum bridge is on the way.
In this OpenDev Developer Call, Fellowship member Clara systematically updated the latest progress of Snowbridge. If you only understand Snowbridge as 'the official bridge between Polkadot and Ethereum', the significance of these changes may be underestimated.
In fact, this round of updates has a single core goal: to move cross-chain transfers from 'expensive, slow, and only suitable for large transfers' to 'usable every day'.
1️⃣ Snowbridge v2 is live on-chain, and cross-chain costs are dropping significantly.
First, the cost issue that everyone cares about the most.
The on-chain logic of Snowbridge v2 has been deployed, but the official front-end has not yet enabled v2 by default; it is mainly waiting for a critical runtime fix to go fully live, so as to avoid user-experience problems in edge cases.
Once the front-end switch is flipped, Snowbridge's cross-chain costs will change very visibly: fees will drop from the level of a traditional security-first bridge to something close to a multi-signature bridge.
This point is very important. In the past, Snowbridge was more suitable for large and infrequent asset transfers, but under the v2 model, it has begun to possess the realistic possibility of supporting more frequent cross-chain operations.
2️⃣ Cross-chain confirmation time: from 30 minutes → 1-2 minutes.
Beyond cost, the other long-standing complaint is speed.
Currently, Snowbridge's confirmation time is about 30 minutes in each direction, which is reasonable from a security standpoint but not user-friendly. Clara mentioned during the live stream that the team has introduced a new technical path: Fiat-Shamir + BEEFY.
Under this new protocol:
Cross-chain confirmation time is expected to be shortened to 1-2 minutes.
No longer needing to wait for a complete long confirmation window.
More suitable for interactive, product-level applications.
This path has already passed its audit; the team is currently fixing remaining details, and it is expected to go live within 1-2 months.
3️⃣ The L2 ↔ Polkadot bridge is progressing, targeting Q1 2026.
The third progress may be the easiest to overlook but has the most significant long-term impact.
Clara mentioned that the team is promoting the bridging solution between Ethereum L2 and Polkadot.
Currently, there is already a testnet Demo.
Using dual bridge paths (Ethereum → L2 → Polkadot).
Target release date: Q1 2026.
This means that in the future, Polkadot will no longer just connect to the Ethereum mainnet, but will begin to directly integrate into the actual use scenarios of Ethereum L2.
For developers, this will significantly lower the psychological and technical barriers to entering Polkadot; for users, it is an important step toward 'using applications wherever the assets already are'.
If you look at these three points together, you will find that Snowbridge is completing a role transformation: from 'secure but heavy official bridge' → 'truly usable cross-chain infrastructure for daily interaction'. This is a critical underlying support for Polkadot's subsequent Hub, EVM, DeFi, stablecoins, and application ecosystem.
Seven, the execution layer has not been abandoned: EVM / Revive remains on the main line.
In the open discussion near the end of the meeting, the host directly addressed one of the questions the community has raised most often recently: is Polkadot's EVM still moving forward? Has Revive replaced EVM?
This doubt is not unfounded.
Over the past while, the community has seen more and more work around Revive, PolkaVM (PVM), and the rebuilding of the execution layer in the code repositories and technical discussions, while talk of 'EVM' has decreased. This has given many people the impression that Polkadot is shifting toward a 'non-EVM route'.
In this OpenDev Developer Call, the developers provided very clear and critical clarifications.
First, the conclusion is straightforward: EVM has not been abandoned, and the execution path of Solidity still exists and is being continuously advanced.
Polkadot is not 'building yet another EVM public chain', but that does not mean abandoning EVM. On the contrary, the current focus is to upgrade EVM from the 'hosted, isolated execution environment' of the past into a protocol-level, system-level component.
The second easily misunderstood point is Revive.
Revive itself is not meant to be 'a new virtual machine that replaces EVM', but rather an underlying reconstruction of Polkadot's execution layer.
Scheduling the execution of different virtual machines.
Unifying gas / weight accounting logic.
Providing consistent state access and call entry points.
On top of Revive, it can simultaneously carry:
EVM (Solidity / Ethereum toolchain)
PolkaVM (RISC-V, multi-language, high-performance computing)
And other execution environments that may be connected in the future.
In other words, what Revive solves is not 'whether to use EVM', but rather: how should the execution layer work when more than one VM exists simultaneously?
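Seen from a Solidity developer's chair, the practical promise is simply that the familiar Ethereum toolchain keeps working. Here is a minimal sketch with ethers.js, where the RPC URL is an assumption standing in for whatever Ethereum-compatible endpoint the Revive-based execution environment exposes.

```ts
// Probe an assumed Ethereum-compatible endpoint with the standard Ethereum toolchain.
import { JsonRpcProvider } from 'ethers';

async function probeEvmEndpoint() {
  const provider = new JsonRpcProvider('https://example-hub-evm-rpc'); // placeholder endpoint

  const network = await provider.getNetwork();
  const blockNumber = await provider.getBlockNumber();
  console.log(`chainId=${network.chainId}, latest block=${blockNumber}`);
}

probeEvmEndpoint().catch(console.error);
```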
You can also learn more in the latest article on PolkaWorld (How far has Polkadot's EVM compatibility come?).
What has truly changed is not the technical route, but the narrative approach.
In the past, blockchains often built an entire chain's identity around 'a single VM': EVM chains, Wasm chains, Move chains...
Polkadot has now clearly chosen a different path - no longer defining itself around a specific VM, but rather building a unified base that accommodates multiple execution environments.
In this system:
Solidity developers can still use familiar tools.
But are no longer limited to 'only doing what EVM can do'.
Rather, they can work in conjunction with native assets, governance, cross-chain capabilities, and PVM execution environments.
Therefore, EVM is still an important part of Polkadot, but it is no longer 'the only main character', but rather a member of a unified execution layer.
This is also the direction that Polkadot Hub is building towards: a protocol-level platform that can accommodate multiple execution models and allow them to cooperate with each other.
Summary: Polkadot is becoming 'usable, governable, and sustainable'.
If you only look at the price, you might find it hard to feel the changes happening in Polkadot. But if you follow these developer meetings, you will discover a very clear main line:
The execution layer is getting faster.
The economic model is converging.
The governance mechanism is becoming more realistic.
The product entry is taking shape.
Polkadot's 'Second Era' is not a slogan. It is being written out gradually, line of code by line of code, one runtime upgrade at a time!
Complete video of the meeting: https://www.youtube.com/watch?v=ZiNfJlN-u2M
PolkaWorld Telegram group: https://t.me/+z7BUktDraU1mNWE1
PolkaWorld Youtube channel: https://www.youtube.com/c/PolkaWorld
PolkaWorld Twitter: @polkaworld_org
