Scaling blockchain technology always involves a tradeoff between the competing priorities of decentralization, security, and performance. In the current conversation around scaling, with modular architectures and rollups taking center stage, the most pressing topic is inevitably Data Availability (DA).
Many assume that simply running a node, or browsing a block explorer, resolves these questions and provides clarity. But what if what you are looking at isn’t the whole story? What if a key piece of data about a transaction is missing? That is the small but crucial detail that DA solves for.
Data availability guarantees that any party wishing to verify the integrity of the blockchain, not just the validators, has access to the complete data set. Without data availability, even the best consensus protocol is all but meaningless.
The post that follows looks at why data availability is now the single most important piece of modular blockchain security, and how Celestia, EigenDA, and Avail are at the front lines of this race to innovate.
Data availability (DA) means that the data needed to verify a set of transactions can be accessed at any point in time. Without access to the transaction data, a user or node cannot independently determine that the state of the chain is accurate.
To put it simply, a blockchain may display blocks, hashes, and transactions transparently, but if only a summary of a transaction is published, the user cannot independently verify that transaction. This is like auditing a company's books while being shown only the totals, never the underlying entries.
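The difference between a summary and verifiable data can be made concrete with a toy Merkle-tree check, a common way blockchains commit to transaction data. This is a minimal sketch, not any specific chain's implementation: a light client can verify that a transaction is included under a block's root only if it can obtain the actual transaction bytes and the sibling hashes along the path.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Build a Merkle tree bottom-up; duplicate the last node on odd levels.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    # Collect the sibling hash at each level on the path from leaf to root.
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    # Recompute the path; this requires the real leaf data, not a summary.
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

txs = [b"alice->bob:5", b"bob->carol:2", b"carol->dave:1", b"dave->alice:4"]
root = merkle_root(txs)
proof = merkle_proof(txs, 1)
assert verify(txs[1], proof, root)            # works only with the actual tx data
assert not verify(b"forged tx", proof, root)  # a substituted tx fails verification
```

If the leaf data is withheld, the root alone is just a promise; nothing in it can be independently checked.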
Consider a simple example: if, during an upgrade of a DeFi protocol, some portion of the transaction data were hidden, users could not verify whether the integrity of a liquidity pool changed or whether funds were misallocated. Everything might look fine on the surface while corruption happens in the background.
This is why DA is an essential foundation for trustless verification. Without it, decentralization is a charade: an illusion of transparency that we are later unable to verify.
Data unavailability is not just a technology issue; it is a security-critical failure. When data is unavailable, users cannot determine the state of the blockchain, detect fraud, or deter bad behavior.
Consider a rollup whose sequencer publishes only summaries of transactions without sharing the actual calldata. Users cannot verify their balances or rebuild the state trees. If the sequencer acts maliciously, it is no longer possible to submit fraud proofs or challenge states, and users must rely on faith instead of code for security.
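A minimal sketch illustrates why withheld calldata disarms verifiers. The state model, transaction format, and root function below are invented for the example; the point is that a verifier can reproduce the sequencer's claimed state root only by replaying the full batch of transactions.

```python
import hashlib, json

def state_root(balances: dict) -> str:
    # Toy state commitment: hash of the canonically serialized balances.
    return hashlib.sha256(json.dumps(balances, sort_keys=True).encode()).hexdigest()

def apply_tx(balances, tx):
    sender, receiver, amount = tx
    assert balances.get(sender, 0) >= amount, "insufficient funds"
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
    return balances

genesis = {"alice": 10, "bob": 5}
calldata = [("alice", "bob", 3), ("bob", "alice", 1)]

# The sequencer's claimed post-state root after executing the full batch.
claimed = state_root(apply_tx(apply_tx(dict(genesis), calldata[0]), calldata[1]))

# A verifier holding the full calldata can independently reproduce the root...
replayed = dict(genesis)
for tx in calldata:
    apply_tx(replayed, tx)
assert state_root(replayed) == claimed

# ...but with one transaction withheld, the recomputed root no longer matches,
# and the verifier cannot tell whether the claimed root is honest or fraudulent.
partial = apply_tx(dict(genesis), calldata[0])
assert state_root(partial) != claimed
</```

A fraud proof is exactly this kind of replay; without the calldata, there is nothing to replay.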
In the modular blockchain architecture, data availability is typically what connects execution layers and consensus layers; it is the often-invisible foundation that makes rollups safe.
When data availability fails, fraud proofs cannot be constructed, states cannot be reconstructed, and verification gives way to blind trust. Ultimately, data availability isn't an application; it is the security model behind all scalable blockchains.
Celestia was the first modular blockchain to separate consensus from execution. It provides its own data availability layer that any rollup can connect to, which means that not every node must re-execute all transactions.
Its two main techniques are erasure coding and data availability sampling (DAS). In short, block data is erasure-coded into pieces and spread across many nodes, and a light client randomly samples pieces to gain confidence that the full dataset is available, without having to download it in its entirety.
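The erasure-coding idea can be shown with the simplest possible code: a single XOR parity chunk. Production systems like Celestia use Reed–Solomon coding, which tolerates many missing chunks, so this is only a toy to convey the principle that redundancy lets honest nodes reconstruct data a publisher tries to withhold.

```python
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks):
    # Append one parity chunk: the XOR of all equal-length data chunks.
    parity = reduce(xor, chunks)
    return chunks + [parity]

def recover(coded, missing_index):
    # Any one missing chunk equals the XOR of all remaining chunks,
    # because every data byte appears an even number of times otherwise.
    present = [c for i, c in enumerate(coded) if i != missing_index]
    return reduce(xor, present)

data = [b"chunkA__", b"chunkB__", b"chunkC__"]
coded = encode(data)
assert recover(coded, 1) == b"chunkB__"  # withheld chunk, recovered anyway
```

With Reed–Solomon instead of a single parity, the network can rebuild the block even if a large fraction of chunks goes missing, which is what makes sampling-based guarantees meaningful.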
This design is scalable and auditable, but it has a probabilistic dimension: it relies on a threshold of honest participants. If too few nodes are sampling, such as during a period of low activity or a contentious governance vote, liveness risks can arise.
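The probabilistic nature of sampling is easy to quantify. Assuming independent, uniformly random samples and a publisher that withholds a fraction f of the (erasure-extended) chunks, the chance a light client hits at least one missing chunk after k samples follows directly; the parameter values below are illustrative, not Celestia's actual configuration.

```python
# Probability that a light client detects withheld data after k random samples,
# assuming a fraction f of chunks is withheld and samples are independent.
def detection_probability(f: float, k: int) -> float:
    return 1 - (1 - f) ** k

# Under a 2x erasure code, an attacker must withhold at least ~25% of the
# extended data to block reconstruction, and that is caught quickly:
assert detection_probability(0.25, 10) > 0.94
assert detection_probability(0.25, 30) > 0.999
```

This is also why low participation matters: the guarantee compounds across many independent samplers, and a quiet network supplies fewer of them.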
All in all, Celestia is great for throughput, but participants should understand that it trades deterministic evidence for scalable, probabilistic assurance. This type of compromise could set the bar for modular security for years to come.
EigenDA, in contrast, operates differently. It was built on Ethereum within EigenLayer's restaking framework, with the primary goals of making throughput customizable and trust assumptions tunable.
EigenDA gives rollups the flexibility to set their own DA capacity while keeping economic incentives aligned with the performance needs of their use case. A DeFi rollup will prioritize low latency and high throughput, while a governance chain will want redundancy and strong verification guarantees.
EigenDA's architecture builds on Ethereum's validator set and restaked operators, creating trust assumptions that are tunable but layered. This differentiates it from Celestia's probabilistic model and offers a valuable blend of Ethereum's trust model and modular scalability.
The downside of this flexibility is added operational complexity. Rollup developers will have to balance that flexibility against stability; tremendous freedom can become brittle very quickly.
EigenDA is therefore best used in concert with other systems, as an extension of Ethereum's modular stack. It should be treated not as a standalone trust layer but as a DA marketplace operating within Ethereum's broader trust layer.
Polygon Labs has developed Avail, a universal, ecosystem-agnostic data availability layer designed to interface with multiple ecosystems. Avail uses KZG polynomial commitments to produce short proofs of data availability. Whereas Celestia samples probabilistically, Avail produces small, succinct proofs that data is available, reducing the workload of the light clients that verify data and improving usability.
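The intuition behind polynomial commitments can be sketched without the heavy machinery. Real KZG uses elliptic-curve pairings and a trusted setup; the toy below only demonstrates the underlying polynomial-binding idea (a consequence of the Schwartz–Zippel lemma): if data chunks are treated as the coefficients of a polynomial, then a single evaluation at a random field point pins down the entire dataset, because two distinct low-degree polynomials can agree on only a handful of points.

```python
import random

P = 2**61 - 1  # a large prime field; real KZG works over elliptic-curve groups

def poly_eval(coeffs, x):
    # Horner evaluation of the data-as-polynomial at point x, mod P.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

data = [17, 42, 7, 99, 3]      # data chunks interpreted as coefficients
tampered = [17, 42, 8, 99, 3]  # one chunk silently altered

r = random.randrange(P)        # random evaluation point (the "challenge")
# Two distinct degree-4 polynomials agree on at most 4 of the ~2^61 points,
# so one random evaluation exposes tampering with overwhelming probability.
assert poly_eval(data, r) != poly_eval(tampered, r)
```

This is why a constant-size commitment can stand in for a whole block: checking one well-chosen evaluation is, probabilistically, as good as checking every chunk.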
Terseness, however, has its downsides: KZG-based systems require a structured reference string and a trusted setup procedure, and careless handling of these can reduce decentralization. The design is thus an exercise in balance: remain open while staying efficient, and make sure that streamlined verification does not create hidden dependencies.
By combining these properties, Avail opens up possibilities for projects across many ecosystems and positions itself as a cross-chain DA backbone for an increasingly fragmented modular environment.
Despite all three projects trying to scale DA, they take drastically different approaches: Celestia relies on erasure coding and probabilistic sampling; EigenDA leans on Ethereum restaking and tunable trust assumptions; Avail uses KZG commitments to produce succinct availability proofs.
One other less discussed danger is data persistence. If a rollup migrates to another data provider, what happens to the historical data on the legacy system? Without a standardized model of DA interoperability, blockchain history itself may become fragmented; this is a rarely articulated but potent concern for composability.
Ultimately, which DA layer a chain employs will be determined by its priorities: scalability versus proof strength, flexibility versus simplicity, and ecosystem loyalty versus neutrality.
As rollups go mainstream and modular ecosystems mature, competitive pressure among DA providers will only increase. Much like the battles in cloud infrastructure, DA will become a competitive and increasingly nuanced market in which buyers and sellers compete on everything from performance to cost to service reliability.
This could mean a spectrum of differentiated DA offerings. But even that carries risk: differentiated DA service layers could fragment security assumptions, and switching costs could discourage moves between services that are supposed to be permissionless and interoperable.
Nevertheless, innovation will continue here. The blockchain of the future may be constructed from a set of DA networks that offer different, complementary trade-offs yet work together to maintain global trust.
Data availability is blockchain's biggest current challenge. It is the hidden link between security and scalability: when DA is not secure, transparency breaks down, and the idealized vision of decentralization collapses into reliance. Celestia, EigenDA, and Avail each take a different approach: probabilistic sampling, flexible restaking, and succinct proofs. Each is partial; none is perfect. Together they are developing something new: a model in which data is not only stored but verifiable by everyone. Getting this right is crucial to the future of trustless computation. DA is now a must-have consideration for anyone building, auditing, or investing in blockchain infrastructure; it is the base layer.
Disclaimer: The content created by LBank Creators represents their personal perspectives. LBank does not endorse any content on this page. Readers should do their own research before taking any action related to the company and carry full responsibility for their decisions. This article should not be considered investment advice.