A few days ago, I watched what looked like a simple delay in an ordinary financial process. A cross-border payment had already been initiated, the sender's balance was sufficient, and the receiving party had been verified more than once in the past. Yet the transaction did not complete on time. It was not rejected, and it was not technically blocked. Instead, it sat in limbo while further verifications were triggered, verifications that had already been done.
On the surface, this looks like an operational inefficiency. Look closer, though, and it becomes a structural issue that pervades most digital and financial systems today. These systems are rarely limited by raw capacity to process transactions or move data. More often, they are limited by their inability to rely on previously verified information. Each system acts as if it must establish trust for itself, even when that trust has already been established somewhere else.
The result is verification that is repetitive but not reusable. Identity is confirmed multiple times, the legitimacy of a transaction is evaluated at every step, and compliance checks are run at multiple layers of the same process. The outcome is not just delay but a recurring form of friction that grows with complexity. As systems become more connected, the absence of shared trust mechanisms means that instead of building on each other's checks, they duplicate the effort.
This is where the approach introduced by Sign becomes structurally important. Rather than focusing only on faster execution or lower transaction costs, it tackles how trust is created and reused between systems. The core idea is to put verification into a form that can be validated externally without being repeated. This is done through attestations: a trusted entity verifies a given claim and produces a cryptographically anchored proof of that claim.
In practical terms, once a piece of information has been verified and attested to by a recognized party, other systems do not need to repeat the process. Instead, they assess the trustworthiness of the person or organization that issued the attestation. If the issuer is considered reliable, the system can accept the claim without reprocessing the underlying data. Verification changes from a local, repetitive task into a distributed, reusable mechanism.
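To make this concrete, here is a minimal sketch of the pattern in TypeScript. It assumes a simple signed-claim format and an in-process trust registry; all names, interfaces, and identifiers are illustrative and are not Sign's actual API or data model.

```typescript
// Sketch of attestation reuse: instead of re-verifying the underlying data,
// a consumer checks (1) that the attestation is signed by its issuer and
// (2) that the issuer is on a locally trusted list.
import { generateKeyPairSync, sign, verify, KeyObject } from "node:crypto";

interface Attestation {
  claim: string;     // e.g. "account X passed KYC on 2024-01-15"
  issuer: string;    // identifier of the attesting entity
  signature: Buffer; // issuer's signature over the claim
}

// Trust registry: issuers this system accepts without re-checking the data.
const trustedIssuers = new Map<string, KeyObject>();

function issueAttestation(claim: string, issuer: string, privateKey: KeyObject): Attestation {
  // The issuer performs the expensive verification once, then signs the result.
  return { claim, issuer, signature: sign(null, Buffer.from(claim), privateKey) };
}

function acceptAttestation(att: Attestation): boolean {
  const issuerKey = trustedIssuers.get(att.issuer);
  if (!issuerKey) return false; // unknown issuer: would need local re-verification
  // Valid signature from a trusted issuer => reuse their work instead of repeating it.
  return verify(null, Buffer.from(att.claim), issuerKey, att.signature);
}

// Usage: a KYC provider attests once; a payment system reuses the result.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");
trustedIssuers.set("kyc-provider-A", publicKey);
const att = issueAttestation("account X passed KYC", "kyc-provider-A", privateKey);
console.log(acceptAttestation(att)); // true: no repeated KYC check needed
```

The design choice worth noticing is that the consuming system never touches the underlying KYC data; its only decision is whether to trust the issuer, which is exactly the shift from local verification to distributed reuse described above.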
This shift has important implications. In many real-world processes, especially cross-border payments, business compliance, and financial approvals, the source of delay is not execution but validation. Transactions themselves can be processed quickly; what takes time is the approval path, because multiple participants are involved and each one verifies the transaction independently. By allowing verification to be reused, systems can spend less time on redundant checks and concentrate on making decisions based on already validated inputs.
However, this model raises a new set of issues that cannot be ignored. The success of attestation-based systems depends heavily on the credibility and acceptance of the bodies that issue the attestations. Without agreement on which issuers can be trusted, the system risks fragmenting. Different platforms may recognize different attestors, recreating the very trust silos the system is supposed to eliminate.
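A tiny sketch makes the fragmentation risk concrete. The issuer identifiers and registries below are hypothetical; the point is only that two platforms with different trusted-issuer sets disagree about the same attestation.

```typescript
// Two platforms that recognize different issuer sets will disagree about
// the same attestation, recreating trust silos. Identifiers are illustrative.
type IssuerId = string;

const platformA: Set<IssuerId> = new Set(["kyc-provider-A"]);
const platformB: Set<IssuerId> = new Set(["kyc-provider-B"]);

const attestationIssuer: IssuerId = "kyc-provider-A";

// The same attestation is reusable on one platform and worthless on the
// other, so the claim must be re-verified from scratch on platform B.
console.log(platformA.has(attestationIssuer)); // true
console.log(platformB.has(attestationIssuer)); // false
```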
Then there is the problem of adoption. For this model to work at scale, institutions, platforms, and service providers need to incorporate it into their workflows, not only technically but also in regulatory and operational terms. If it is not used consistently by enough participants, the value of reusable verification stays limited: the model ends up serving isolated cases rather than becoming a commonly recognized infrastructure layer.
From a market point of view, evaluation here is more nuanced. Price movements and trading volume may measure interest, but they do not show whether the system is being used in a meaningful way. More relevant indicators are how often attestations are issued and reused, how many participants return to the system repeatedly, and how much institutions rely on these verification mechanisms in real operations.
Ultimately, the importance of this approach lies in how it reframes the problem. Instead of asking how systems can verify data more efficiently, it asks whether systems can use verification that has already been completed elsewhere.
This is a fine but important distinction. If trust can be made portable and reusable, many of today's inefficiencies may slowly disappear. If not, verification will remain a bottleneck, no matter how advanced transaction processing becomes.
The outcome will depend not only on technology but on whether different parts of the ecosystem are willing to move away from isolated trust models toward a more shared and interoperable structure. Until that happens, systems may keep moving faster, but they will not necessarily become more efficient.
@SignOfficial #SignDigitalSovereignInfra $SIGN
