Private states never go on-chain, but contracts can still use them: how Midnight does it
Last week a friend asked me something: he had signed a commercial contract on a certain chain protocol, and the other party's wallet address, the signing amount, and the terms hash were all sitting in plain text on-chain; anyone could search for them and map out his business layout. He asked me: can privacy and being on-chain coexist? I said it wasn't possible before, but now someone is seriously solving this problem. That is @MidnightNetwork .

The blockchain industry has a structural contradiction that has never been truly resolved: the value of public chains comes from being open and transparent, verifiable, and immutable, but these three advantages are also three nightmares for users, because every transaction and every interaction is broadcast to the whole world. People have been shouting about 'privacy' for years, yet the vast majority of solutions are either complete hiding or superficial obfuscation. Very few strike a balance between privacy and verifiability.
I once participated in an on-chain vote, and afterwards I casually checked the block explorer: what I voted for, when, and from which wallet address, all laid out there, clear as day. So-called anonymous voting is nothing more than a public bulletin board where nobody bothers to sign your name for you.
This is the other side of on-chain transparency: it is visible to everyone, including yourself. @MidnightNetwork 's Kachina protocol addresses exactly this issue.
Kachina splits the contract state in two: the public state resides on-chain, while the private state stays on your local machine, never going on-chain. The challenge is: how can a private state that never goes on-chain be legally used by the contract logic?
The answer is ZK SNARKs. You generate a zero-knowledge proof locally, proving "my private state meets the conditions for triggering this on-chain update," but the proof itself contains no private content. The on-chain verifier only checks whether the proof is valid, and if so, updates the public state; the private data never leaves your device.
Take the voting example again: there is no need to record "who voted for what" on-chain; the chain only needs to verify one proof: "this voter is legitimately registered and has not voted before." The tally increases by 1, and no one knows who it was.
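The flow can be sketched as a toy in TypeScript. Everything here is an illustrative assumption, not Midnight's actual API: a salted hash stands in for the real ZK-SNARK, and a real system would prove registry membership without revealing which commitment belongs to the voter.

```typescript
// Toy sketch of commitment + nullifier voting -- NOT Midnight's API.
import { createHash } from "node:crypto";

const h = (s: string) => createHash("sha256").update(s).digest("hex");

// Public on-chain state: a tally plus the set of spent nullifiers.
const publicState = { tally: 0, nullifiers: new Set<string>() };

// Registry of commitments to eligible voters (identities stay private).
const registered = new Set<string>();

function register(secret: string) {
  registered.add(h("commit:" + secret)); // a commitment, not an identity
}

// Stand-in for a ZK proof check: a real SNARK would prove "my commitment
// is in the registry and my nullifier is unspent" without revealing
// which commitment is mine. Here we look it up directly, for shape only.
function vote(secret: string): boolean {
  const commitment = h("commit:" + secret);
  const nullifier = h("null:" + secret);
  if (!registered.has(commitment)) return false;          // not eligible
  if (publicState.nullifiers.has(nullifier)) return false; // double vote
  publicState.nullifiers.add(nullifier);
  publicState.tally += 1; // the only thing the public state learns
  return true;
}
```

The nullifier is what makes "has never voted before" checkable without linking the vote back to a voter: it is deterministic per secret, so a second attempt collides, but it cannot be mapped back to the commitment.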
The more challenging aspect of $NIGHT 's underlying protocol is concurrent processing: Kachina uses transcripts to track state operations, allowing non-conflicting transactions to be reordered to maximize throughput while controlling information leakage. It's not a trade-off of "either fast or private"; the design is proven in the Universal Composability (UC) framework, a formal security model showing that the security guarantees still hold when components are composed into complex systems.
Most privacy solutions give you a black box and say not to worry. @MidnightNetwork gives you a set of commitments that can be mathematically verified. #night
My junior colleague stared at the white paper for three days and concluded that almost nobody actually understands this track.
Currently, more than 130 countries worldwide are advancing CBDC research or pilot programs, covering 95% of global GDP. But read through these projects' progress reports and one word appears with odd frequency: fragmented. Identity verification is handled separately by different departments, payment processes are not interconnected, subsidy disbursements lack end-to-end documentation, and cross-departmental data does not match. Hundreds of millions of dollars were invested, only to discover in the end that the underlying foundation had never been properly laid. I think that sentence in the @SignOfficial white paper is the most honest diagnosis in the entire industry: "Most national digital programs fail at scale due to fragmented foundations."
Last year, I helped a friend handle cross-border compliance, and a single qualification certificate ran through five departments, dragging on for three weeks. It's not a people problem; it's that the underlying trust infrastructure hasn't been built: what was signed, who authorized it, whether it has been tampered with, and there is no system that can settle any of this.
That line from the @SignOfficial white paper hits the mark: the mismatch between systems is the number one cause of death for national digital projects worldwide.
$SIGN 's solution is not to rebuild the chain but to insert a sovereign-grade Attestation protocol layer between government systems and the chain. The technical points here are not simple: an Attestation is not an ordinary signature; it is a structured evidence record bound to a DID and a Schema template, which must simultaneously prove who approved this, under which authority framework, according to which version of the rule set, and when. Missing any dimension renders it invalid.
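As a sketch of what "missing any dimension renders it invalid" means in practice, here is a hypothetical record shape in TypeScript. The field names are my own illustration, not Sign's actual schema:

```typescript
// Hypothetical structured attestation record -- field names are
// illustrative assumptions, not Sign's schema. The point: every
// dimension the text lists must be present, or verification fails.
interface Attestation {
  subjectDid: string;     // who/what the attestation is about
  issuerDid: string;      // who approved it
  authority: string;      // under which authority framework
  schemaId: string;       // which Schema template it is bound to
  ruleSetVersion: string; // which version of the rule set
  issuedAt: string;       // when (ISO 8601 timestamp)
  signature: string;      // cryptographic binding over the fields above
}

// A verifier rejects any record that is missing a required dimension.
function isComplete(a: Partial<Attestation>): a is Attestation {
  const required: (keyof Attestation)[] = [
    "subjectDid", "issuerDid", "authority",
    "schemaId", "ruleSetVersion", "issuedAt", "signature",
  ];
  return required.every((k) => typeof a[k] === "string" && a[k] !== "");
}
```

The design choice to encode is that completeness is checked structurally, before any cryptography runs: a record that cannot answer "who, under what authority, by which rules, when" is not a weaker attestation, it is no attestation at all.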
More striking is the dual-track architecture: the public-chain mode operates with transparent liquidity, while the private-chain mode (Arma BFT consensus, 100,000+ TPS, UTXO privacy model) specifically serves CBDC, satisfying both retail privacy and regulatory lawful access. The bridge between the two tracks carries atomicity guarantees, AML checks, and evidence records; it is not the kind of bridge where you burn on one side, mint on the other, and are left on your own. This is the core logic of #sign地缘政治基建 .
Central banks around the world are competing for digital-currency sovereignty, and the contest is not about TPS but about whose evidence format gets written into other countries' laws. ISO 20022 compatibility, W3C VC/DID, OIDC4VCI: $SIGN has packed in every international standard it can be compatible with, paving the way for multilateral negotiations.
Most people focus on how many dApps the ecosystem has. But the standard for sovereign-level infrastructure is only one: whether your standard has been adopted by other sovereign nations.
Next-generation trusted infrastructure relies on proof, not transparency
She once told me about a case: a patient wanted to sue a hospital and needed to prove that the diagnostic records had not been tampered with. But the hospital system is centralized; whoever manages the records can modify them. The "original record" she obtained had almost no evidential power in court. She said: it's not that the truth doesn't exist; it's that the truth has no carrier. I didn't think much of it at the time, until I later came across @MidnightNetwork and realized this is not an isolated case; it is a structural gap in data credibility across the entire digital age. The data exists, but authenticity is lost; the records exist, but evidential power is lost.
Midnight uses ZK-SNARKs, and the core is one thing: allowing a prover to convince a verifier that something is true without revealing any additional information.
Last month, a friend of mine who works in private equity told me that they are afraid to go on-chain because once they do, the entire investment portfolio gets exposed. This statement made me think for a long time.
@MidnightNetwork is aimed precisely at resolving the most fundamental contradiction of Web3:
The core of Midnight is the Kachina proof system. Traditional ZK solutions mostly only provide numerical commitments, encrypting amounts, while the contract logic remains public. Kachina goes deeper because it is based on the UC (Universal Composability) framework, transforming the entire execution process of smart contracts with private states into verifiable proofs. Nodes validate the correctness of the proof rather than the computation itself, and input data, intermediate states, and private variables remain local throughout. The execution logic is private, and the on-chain results can be verified; both things hold true simultaneously at the level of cryptography.
Additionally, Midnight has a dual ledger architecture: the public ledger handles global consensus and on-chain visible states, while the private ledger stores UTXO extended states that only key holders can read through local encryption. Selective disclosure is based on ZK proof derivation, allowing you to provide a separate proof to regulators that "this source of funds is compliant" without having to expose the entire wallet. AML checks pass, and the investment portfolio is still your own.
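The selective-disclosure shape can be mimicked with per-field salted commitments. This is a minimal sketch assuming nothing about Midnight's real format; real deployments derive ZK proofs rather than revealing salts, but the reveal-one-field, keep-the-rest-opaque structure is the same.

```typescript
// Toy selective disclosure via per-field salted commitments.
// Real systems use ZK proof derivation; this shows only the shape:
// reveal one field + its salt, keep the rest as opaque commitments.
import { createHash } from "node:crypto";

const h = (s: string) => createHash("sha256").update(s).digest("hex");

type Field = { name: string; value: string; salt: string };

// Wallet owner publishes a commitment for every field of the record.
function commit(fields: Field[]): Record<string, string> {
  const out: Record<string, string> = {};
  for (const f of fields) out[f.name] = h(`${f.name}|${f.value}|${f.salt}`);
  return out;
}

// A regulator checks one disclosed field against the published
// commitments, learning nothing about the undisclosed fields.
function verifyDisclosure(
  commitments: Record<string, string>,
  disclosed: Field,
): boolean {
  return commitments[disclosed.name] ===
    h(`${disclosed.name}|${disclosed.value}|${disclosed.salt}`);
}
```

This is exactly the "AML check passes, portfolio stays yours" split: the verifier binds the one disclosed claim to the committed record, and everything else remains a hash.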
Midnight decouples compliance and privacy at the architectural level; it is not a compromise but compatibility achieved at the level of cryptography. That is why $NIGHT is the true breakthrough point: Kachina is not just a privacy upgrade; it is the last technical prerequisite for institutions to enter the market. Without it, compliance stays tied to transparency, and institutions always have a reason not to go on-chain.
$SIGN is filling in the last blank of Gulf digitalization
Last year I helped a friend with something. He had worked in supply chain in the Gulf for two years and wanted to secure a business loan after returning home. He had all the materials prepared: bank statements, contracts, platform data. But the credit officer took one glance and said: we cannot cross-verify the sources of these materials; they do not meet our risk-control requirements. My friend was stunned on the spot. It's not that the data was fabricated; it's that the banking system has no mechanism to acknowledge his reality. Sitting next to him, my first thought was: this isn't one person's problem; it's the most absurd structural contradiction of the entire digital-economy era: you have the data, but you can't prove it's true.
Last time I applied for financing from Gulf investors, I ended up gathering more than twenty documents: bank statements, tax records, contract scans, platform screenshots... The other side's compliance team took one glance and said: we cannot cross-verify these; they do not meet our due-diligence standards. I didn't get the money, and three months were wasted. It's not that I faked the data; it's that my real data had no credible foothold in their system.
This is the structural problem @SignOfficial is solving. The core logic of $SIGN 's New Capital System: use on-chain Attestations to stamp every business operation with verifiable evidence, encapsulated in the W3C VC standard, anchored by ECDSA signatures, with sensitive fields handled via ZK selective disclosure. You don't need to expose all your data, yet the key proofs can be verified by any third party at any time. The investors' compliance team doesn't need to make verification calls; a GraphQL query can hit the on-chain records directly, compressing the due-diligence cycle from months to days.
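The "anchored by ECDSA signatures, verifiable by any third party" part can be sketched with Node's built-in crypto. The payload fields and the DID are hypothetical examples; this shows only the sign-once, verify-anywhere shape, not Sign's actual attestation format.

```typescript
// Sketch: sign and verify an attestation payload with ECDSA (P-256).
// Field names and the DID below are illustrative assumptions.
import { createSign, createVerify, generateKeyPairSync } from "node:crypto";

const payload = JSON.stringify({
  type: "BusinessOperation",
  subject: "did:example:supplier-123", // hypothetical DID
  claim: "invoice-settled",
  timestamp: "2024-06-01T00:00:00Z",
});

// Issuer's EC key pair (prime256v1 == NIST P-256).
const { privateKey, publicKey } = generateKeyPairSync("ec", {
  namedCurve: "prime256v1",
});

// The issuer signs the canonical payload once.
const signer = createSign("SHA256");
signer.update(payload);
const signature = signer.sign(privateKey, "base64");

// Any third party can verify against the public key, without ever
// contacting the issuer; tampering with one byte breaks the check.
function verify(doc: string, sig: string): boolean {
  const v = createVerify("SHA256");
  v.update(doc);
  return v.verify(publicKey, sig, "base64");
}
```

The due-diligence claim in the text reduces to this property: verification is a local computation over the signed record, so "making calls for verification" is replaced by checking a signature anyone can fetch.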
On a national level, the logic is completely consistent. Gulf countries are promoting digital government, CBDC, and cross-border settlement; the bottleneck has never been payment technology but who can prove that the money was issued compliantly. With a blank evidence layer, sovereign credit is discounted. S.I.G.N. directly fills this gap; it's not just an enhancement, but a fundamental infrastructure need.
The growth of $SIGN does not follow market sentiment; it follows the sovereign procurement cycle. Once one country adopts it, it spreads through diplomatic relations. And once the evidence layer is embedded in a national system, the migration cost is an order of magnitude higher than swapping out a payment system.
How precise can subsidy disbursement get? Let's talk about $SIGN 's programmable-capital logic.
Last year, my junior colleague helped a friend with a digital transformation project. The situation was this: a Middle Eastern country wanted to roll out digital welfare distribution, and from identity verification to disbursing funds to post-audit, it took nine months. Identity verification lived in one system, eligibility review in another, and funding records in a third; when issues arose, no one could say which link had gone wrong. At the time I thought: the problem isn't a lack of people, it's the lack of infrastructure that lets the three systems speak the same language.
These past few days I have been studying @SignOfficial , and it reminded me of that case. The S.I.G.N. white paper (the name stands for Sovereign Infrastructure for Global Nations) has a chapter that discusses exactly this: Money, ID, Capital, three systems, one proof language. Isn't that precisely what my friend needed most back then?
My friend's company has recently been working on a digital payment project for a certain Middle Eastern government. Just proving that this money was issued in compliance involves three departments, two systems, and seven seals. My immediate feeling was: this isn't an efficiency issue; this is a gap at the infrastructure level.
This gap is exactly the layer @SignOfficial aims to fill. The core proposition of S.I.G.N. is a reusable digital infrastructure stack at the sovereign-state level, with a single evidence layer running through its three main lines; that is also its value logic for Middle Eastern digital infrastructure.
Many Middle Eastern countries are pushing CBDC pilots, and demand for cross-border settlement is accelerating. The pain points are clear: who approved this money, what rules were applied, and whether there are verifiable records; in current systems these are almost a black box. SWIFT transmits instructions, not verifiable proof.
The answer provided by @SignOfficial is called inspection-ready evidence, meaning that each of its G2P allocations can generate cryptographically bound Evidence Artifacts in real-time: allocation batch ID, rule set version hash, identity qualification proof reference, settlement transaction hash, verifiable across systems, without relying on any single institution's endorsement. Technically, it relies on two primitives: Schema and Attestation, supporting three modes: fully on-chain, off-chain with verifiable anchors, and hybrid mode, with sensitive citizen data kept off-chain and compliance proofs publicly verifiable; this separation is a necessity rather than an option for sovereign states.
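A minimal sketch of what such an Evidence Artifact could look like, with field names assumed from the dimensions listed above (not Sign's actual schema):

```typescript
// Hypothetical Evidence Artifact for a G2P allocation -- field names
// are illustrative assumptions based on the dimensions in the text.
import { createHash } from "node:crypto";

interface EvidenceArtifact {
  batchId: string;             // allocation batch ID
  ruleSetHash: string;         // hash of the rule-set version applied
  eligibilityProofRef: string; // reference to the identity-qualification proof
  settlementTxHash: string;    // settlement transaction hash
}

// Cryptographically bind all dimensions into one verifiable digest.
function bind(a: EvidenceArtifact): string {
  return createHash("sha256")
    .update([a.batchId, a.ruleSetHash, a.eligibilityProofRef, a.settlementTxHash].join("|"))
    .digest("hex");
}

// Cross-system check: any party recomputes the digest and compares;
// no single institution's endorsement is required.
function check(a: EvidenceArtifact, digest: string): boolean {
  return bind(a) === digest;
}
```

This also illustrates the hybrid-mode separation the text describes: the artifact's fields can be references and hashes, so the compliance proof is publicly verifiable while sensitive citizen data stays off-chain behind those references.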
Overlay the dual-track architecture of the New Money System: public-chain mode is transparent and composable; private-chain mode is based on Hyperledger Fabric, with 100,000+ TPS, Arma BFT consensus, ZK privacy namespaces, and compatibility with the ISO 20022 cross-border payment standard. It slots into the existing financial system, turning "who approved what under which version of the rules" from a matter of manual endorsement into something mathematically verifiable.
So my junior colleague believes $SIGN has hit a structural gap. More and more sovereign governments are asking the same question: how do we build national-level digital infrastructure on foundations we control, rather than outsourcing it to others? What SIGN provides is a technology stack that can integrate with the existing financial system while letting the sovereign entity hold the keys. Once a national deployment is up and running, it becomes an infrastructure dependency at the state level, and that is the position that is truly hard to replace.
What if one day I become a giant whale and get targeted by others?
Today my junior colleague casually looked up a wallet address, and it listed, clear as day, when the person bought what, how much they held, and who they transferred to. Her first reaction was not "wow, how convenient" but "what if this were my own address..." The more carefully you think about it, the scarier it gets: if she became a whale one day, wouldn't her every move be exposed for everyone to see? This is the underlying design logic of public chains: verifiability and privacy are in conflict. If you want others to verify that your transactions are legitimate, you have to disclose the details; if you want to protect your privacy, you can't prove your innocence.
Two years ago, I wanted to add a layer of data protection to a blockchain application. After searching around, I found that I had to implement ZK circuits myself: first, I needed to understand the R1CS constraint system, elliptic curve pairing, and the Groth16 proof structure. The threshold was too high... I could only give up.
Recently, I saw the part about the Compact language in the white paper @MidnightNetwork , and my first reaction was: it would have been great if this had come out two years earlier...
In the past, when doing ZK privacy contracts, developers had to maintain two sets of logic: one for the on-chain verification circuit and another for the off-chain proof generation circuit. These two had to be strictly aligned, and this alignment process required developers to deeply understand the structure of ZK circuits, which was the real threshold.
But Compact handed this task over to the compiler: its underlying syntax is based on TypeScript, but a single compilation outputs two completely different sets of code at once. One set is the on-chain verification logic, deployed on the Midnight network, which only receives ZK proofs and verifies their validity without touching any raw data; the other set is the off-chain proof generation logic, running on the user's local device, using raw inputs to generate proofs and then sending them to the chain, with the raw data never leaving the user's end. Developers only need to write one source file, and the compiler automatically splits it. The alignment of the two sets of logic is ensured by the compiler, so there's no need to manually maintain consistency, nor do they need to know what the underlying circuit looks like.
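The split can be illustrated in plain TypeScript. To be clear, this is deliberately not real Compact syntax, and the salted hash is a stand-in for the SNARK; it only shows the shape of the two artifacts a single source produces.

```typescript
// Conceptual sketch of the Compact-style split -- NOT Compact syntax.
// One source predicate; two derived artifacts: an off-chain prover
// and an on-chain verifier that never sees raw inputs.
import { createHash } from "node:crypto";

const h = (s: string) => createHash("sha256").update(s).digest("hex");

// --- the single "source" predicate the developer writes ---
const isAdult = (age: number) => age >= 18;

// --- artifact 1: off-chain prover, runs on the user's device ---
// The raw input (age) never leaves the caller.
function prove(age: number): { claim: string; proof: string } | null {
  if (!isAdult(age)) return null;
  const claim = "age>=18";
  return { claim, proof: h("witness-ok|" + claim) }; // stand-in for a SNARK
}

// --- artifact 2: on-chain verifier, sees only the proof ---
// It checks validity of the proof, never the underlying data.
function verifyOnChain(p: { claim: string; proof: string }): boolean {
  return p.proof === h("witness-ok|" + p.claim);
}
```

What the real compiler guarantees, and this sketch can only gesture at, is that the two artifacts stay aligned automatically because both are generated from the same source; the developer never maintains them separately.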
Back then, I gave up because I couldn't maintain the two circuits for on-chain verification and off-chain proving. Now Compact from @MidnightNetwork has lifted that burden off developers: not merely simplified it, but handed it to the compiler outright.
There’s another detail that needs to be clarified: off-chain proof generation runs locally on the user’s device, and the node only ever receives the proof results, with no access to the raw inputs. Even if all verification nodes on Midnight are compromised, all the attacker has is a bunch of mathematical proofs, with no path to restore the original transaction content.
Privacy contracts used to be an exclusive tool for cryptography PhDs; after Compact, they become the daily work of TypeScript developers. The real demand for $NIGHT ultimately comes from the DUST consumed when those developers deploy contracts, and with mainnet going live in a few days, it's worth keeping an eye on.
These past few days I have been running a tool on-chain. Gas fees increased sixfold in two days, the budget collapsed, and I was forced to shut it down. Only today did I realize that my operating costs are not in my own hands; they are determined by market heat.
For this problem, @MidnightNetwork offers what I think is currently the most straightforward solution, through the design of $NIGHT and DUST: turning gas fees from a market variable into a holdings variable.
The logic is simple: holding $NIGHT passively generates DUST, and DUST is the only consumable resource for all operations on the network. The key is a term in the white paper called non-pegged: the supply of DUST is linked to your NIGHT holdings and decoupled from market prices. How much NIGHT you hold determines how much DUST you produce, at a protocol-fixed rate that does not fluctuate with bull or bear markets.
In other words, if my tool runs on Midnight, how much DUST it needs and how much NIGHT to hold is a math problem that can be solved before going live, not something to decide daily by checking market conditions. A sixfold gas spike in a bull market simply cannot happen under this mechanism, because your costs are anchored to your holdings, not to how congested the chain is that day.
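As a back-of-envelope sketch, the sizing really is just arithmetic. All numbers here are made-up placeholders, not Midnight's actual parameters:

```typescript
// Hypothetical protocol-fixed generation rate (placeholder, not the
// real Midnight parameter): DUST produced per NIGHT held, per day.
const DUST_PER_NIGHT_PER_DAY = 2;

// Required NIGHT is a deterministic function of planned usage,
// independent of market price or chain congestion.
function requiredNight(txPerDay: number, dustPerTx: number): number {
  const dustPerDay = txPerDay * dustPerTx; // planned daily consumption
  return dustPerDay / DUST_PER_NIGHT_PER_DAY;
}
```

With these placeholder numbers, a tool doing 1,000 transactions a day at 4 DUST each needs 4,000 DUST per day, hence 2,000 NIGHT held: a figure you can compute before launch and budget against.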
Midnight also has a design called Babel Station that lets new users without NIGHT pay for network resources with other assets. This is a workaround for the cold start: you cannot expect enterprise clients to understand NIGHT before using your product, and if the threshold is too high, the ecosystem cannot flourish.
A 50% block utilization rate is the dynamic anchor of the whole mechanism: above the threshold, per-transaction consumption automatically rises; at or below it, consumption stays stable. The protocol layer adjusts on its own, so developers don't need to think about this layer and can focus on the product.
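A toy version of such a utilization-anchored rule, with an assumed linear scale-up (the actual curve is not specified in the text and the parameters here are hypothetical):

```typescript
// Toy utilization-anchored fee rule -- parameters and curve shape are
// assumptions for illustration, not Midnight's actual formula.
function dustPerTx(baseCost: number, utilization: number): number {
  const ANCHOR = 0.5; // the 50% block-utilization anchor
  if (utilization <= ANCHOR) return baseCost; // at or below: stable
  // Above the anchor: scale cost linearly with excess utilization.
  return baseCost * (1 + (utilization - ANCHOR) / ANCHOR);
}
```

The point of the anchor is that cost pressure responds to sustained block fullness rather than to token price, which is what keeps the holdings-based budgeting in the previous posts intact.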
When people discuss Midnight, they talk about privacy, ZK, compliance... All valid, but I believe Babel Station is the key capability that lets it build a genuinely enterprise-grade ecosystem. Because no matter how strong the privacy features are, if operating costs are uncontrollable, enterprises will not come.
For real business on the blockchain, this is the first hurdle.
I have a lawyer friend who complained about a very specific issue last month, and the first thing that came to mind after talking to him was @MidnightNetwork . He was negotiating a contract with a foreign-funded enterprise, and after several rounds the other party wanted to keep the contract terms and records on the blockchain, reasoning that immutability lets both sides verify at any time and reduces disputes. Sounds reasonable, but here's the problem: some terms in the contract involve supply-chain pricing and trade secrets, and once that information is on-chain it becomes completely transparent; anyone can glance at it, which the other party absolutely could not accept. The negotiation got stuck there, the proposal was abandoned, and they went back to traditional paper and notarization.