Binance Square

胖鸟

Not a fan of the rat race
141 Following
1.4K+ Followers
3.1K+ Liked
137 Shared
Posts
·
--
$BASED The recent airdrop quality is quite good, let's see what happens.
BASED price: 0.11093
·
--
🔥🔥🔥Alpha|March 30th Morning Report

The U Airdrop is back!!! The quality has really been high recently!

I wonder if 245 points will be enough to claim both today; many brothers have come back again.

⏰BASED
Time: 18:00
Background: Fundraising of $18.2 million, pool price 0.075U

⏰R2
Time: 16:00
Background: FDV of $30 million, pool price 0.03, relatively low circulation

PS: Blockchain trivia: Why does Sign seem to be selling a kind of long-term governance cost?

Recently, while researching @SignOfficial , I noticed that, unlike many projects, it deliberately writes down the unglamorous dirty work. Many projects love to talk about the flashy front end, about what they have done and what they will do, but hardly anyone mentions what happens down the road. Sign, however, does not hide governance behind vague statements; it separately lists change management, key custody, and audit readiness.

This point is really critical: plenty of projects sell functionality, but few like Sign take a step further and touch on long-term accountability.

$SIGN even carefully writes about who decides when rules change, who is responsible if keys are lost, and who provides materials when audits arrive. This shows that it is no longer just about functionality but about who will take the blame in the future as the system grows. Because what is truly valuable is not just making the functions work, but the ongoing daily costs of rules, permissions, audits, incidents, and responsibilities once the system grows.

And what $SIGN presents now is not just the Sign Protocol, but also connects to the New Money System, New ID System, New Capital System—all of this. This means it is selling not just a tool, but a system that someone will have to continuously support in the future.

So what I care about most regarding Sign is no longer just whether it can issue more proofs, but whether there will be someone willing to continuously support this system in the future. #Sign Geopolitical Infrastructure
·
--

Why is offline so important for Sign?

In the past few days, I specifically looked into the New ID System of @SignOfficial . I found that wallets need to support offline storage and offline display, which includes methods like QR and NFC. This isn't just my own speculation; the official documentation lists offline presentation patterns as a common requirement for identity systems, and the white paper separately mentions offline capabilities within digital wallet functionalities.

I initially didn't take this point seriously.
A project that talks about VC, DID, ZK, and verifiable identity every day—shouldn't it focus on more 'advanced' areas like on-chain verification, privacy proof, and cross-agency interoperability? Why is it emphasized so much even for something like 'what to do when there's no internet'?
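One way to see why offline presentation matters is to sketch what a wallet would actually put inside a QR code: a signed claim bundle the verifier can check against a locally cached issuer key, with no network round trip. This is only an illustrative sketch; the function names, the HMAC-based signature stand-in, and the payload layout are my own assumptions, not Sign's format.

```python
import base64
import hashlib
import hmac
import json

ISSUER_KEY = b"issuer-demo-key"  # stand-in for the issuer's real signing key

def make_offline_payload(claims):
    """Serialize claims plus a MAC into one base64 string compact enough for a QR code."""
    body = json.dumps(claims, sort_keys=True)
    tag = hmac.new(ISSUER_KEY, body.encode(), hashlib.sha256).hexdigest()
    envelope = json.dumps({"body": body, "tag": tag})
    return base64.b64encode(envelope.encode()).decode()

def verify_offline(payload):
    """Verify against a locally cached issuer key: no internet needed at the door."""
    envelope = json.loads(base64.b64decode(payload))
    expected = hmac.new(ISSUER_KEY, envelope["body"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["tag"]):
        return None  # tampered payload, or a key the verifier does not trust
    return json.loads(envelope["body"])

payload = make_offline_payload({"holder": "did:example:123", "over18": True})
print(verify_offline(payload))
```

The design point is that everything the verifier needs (the claims, the tag, the issuer key cached earlier) is available locally, which is exactly the property QR/NFC presentation depends on.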
·
--
In the past two days, I followed the query process of @SignOfficial and looked through it. I originally thought that issuing the attestation was the end of it, but the more I looked, the more I felt something was off.

I found that the Sign Protocol actually has a dedicated indexing & querying section, which clearly states that SignScan aggregates attestations from different chains, different storage layers, and different execution environments. Developers can query directly via REST, GraphQL, or the SDK, which is quite interesting.

Why is this point so critical?

Because once attestations truly enter a process, the first thing that goes wrong is usually not that they can't be issued, but rather: it's issued, and then what? Who goes to find it? How do they find it? After retrieving it, who takes over?

If an attestation can only exist but cannot be easily retrieved, referenced, or acted on by other systems, then it ends up more like an electronic filing cabinet than infrastructure. And $SIGN has singled out exactly the backend dirty work of indexing, querying, and aggregation, the least glamorous tasks that people most often overlook. This suggests that what it really wants to solve is not whether facts can be written down, but whether facts, once written down, can still be used by the system.

So now, the most interesting aspect of $SIGN for me is no longer whether it can issue proof, but that the attestation does not end after issuance; it must also ensure that it can be found, retrieved, and connected in the future.

If it really smooths out cross-chain, cross-storage, and cross-environment querying, then what @SignOfficial faces is not just how to issue proofs, but whether the facts can still be found by the system after being written on-chain. #Sign Geopolitical Infrastructure
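The aggregation idea described above can be sketched as a toy in-memory index: records from several chains normalized into one store that answers queries by schema. This is only a conceptual illustration; SignScan's real interface is the REST/GraphQL/SDK surface, and the field names here (`schema_id`, `subject`) are my own placeholders.

```python
from collections import defaultdict

class AttestationIndex:
    """Toy aggregator: one queryable store fed from several chains."""
    def __init__(self):
        self._by_schema = defaultdict(list)

    def ingest(self, chain, attestation):
        # Normalize an attestation from any chain into the shared index.
        self._by_schema[attestation["schema_id"]].append({"chain": chain, **attestation})

    def query(self, schema_id, subject=None):
        # Look up by schema, optionally filtered down to one subject.
        hits = self._by_schema.get(schema_id, [])
        return [a for a in hits if subject is None or a["subject"] == subject]

index = AttestationIndex()
index.ingest("ethereum", {"schema_id": "kyc-v1", "subject": "0xabc"})
index.ingest("bnb-chain", {"schema_id": "kyc-v1", "subject": "0xdef"})
print(len(index.query("kyc-v1")))  # one index spanning both chains
```

The point of the sketch is the shape of the problem: without this layer, each consumer would have to know which chain and which storage backend every attestation lives on.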
·
--

Why is Sign placing so much emphasis on offline?

These past few days, I specifically followed the New ID System of @SignOfficial and initially wanted to look at the old issues, but found a particularly inconspicuous statement. It mentions that wallets need to support offline storage and offline presentation, including methods like QR and NFC. The white paper has separately included this in the capabilities of digital wallets, and the document's homepage lists 'offline presentation patterns' as a common requirement for identity systems.
I really didn't take this point seriously at first.
For a project discussing attestation, VC, and DID, shouldn't the emphasis, by common sense, be on the advanced parts like on-chain verification, privacy proofs, and cross-system interoperability? Why put so much emphasis on what to do when there is no internet?
·
--
Recently, I have been digging into @SignOfficial on-chain. At first, I didn't pay much attention to the trust registry part. I thought that as long as the format was correct, the signature was correct, and it could be checked on-chain, that would be enough. However, Sign goes much heavier: when verifying a proof, it checks not only the signature but also whether the issuer is a legitimate organization, whether they have been authorized, and whether their status has expired. In other words, issuance authority is not a side condition but a prerequisite for whether the entire system can run.

Why is this point important?

Most people's default picture of on-chain proofs is very clean, but real processes usually hit a wall first not on how to verify the content, but on why I should trust that you are qualified to issue it.

Diplomas, licenses, qualification documents, identity credentials, once they enter the real world, the problem immediately is not the format, but who can register as an issuer, who authorizes, who revokes, and who confirms that this qualification is still valid. This is also why I find $SIGN increasingly interesting. It doesn't first make the proof content very flashy and then assume the issuer exists. Instead, it first nails down the issuance authority.

At this point, Sign encounters not just the proof layer but actually something more interesting: making the issuance authority itself a verifiable object.

Because many systems ultimately do not fail on proof formats, but rather on the fact that no one universally recognizes the issuance authority. No matter how beautiful the content or how standard the signature, if someone says, 'Why should I recognize this issuer?' the whole thing comes to a halt.
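The check order described above can be sketched in a few lines: a verifier that refuses an attestation unless the issuer is registered, authorized for this schema, and still in good standing, with authority checked before content. The registry structure and field names are my own illustration, not Sign's actual data model.

```python
import time

# Hypothetical trust registry: who may issue what, and until when.
REGISTRY = {
    "issuer:univ-a": {
        "authorized_schemas": {"diploma-v1"},
        "active_until": time.time() + 86_400,
    },
}

def verify(attestation, signature_ok):
    """Check authority before content: registration, authorization, standing."""
    if not signature_ok:
        return "bad signature"
    issuer = REGISTRY.get(attestation["issuer"])
    if issuer is None:
        return "issuer not registered"            # authority is a prerequisite
    if attestation["schema"] not in issuer["authorized_schemas"]:
        return "issuer not authorized for this schema"
    if time.time() > issuer["active_until"]:
        return "issuer status expired"
    return "accepted"

print(verify({"issuer": "issuer:univ-a", "schema": "diploma-v1"}, signature_ok=True))
```

Note that three of the four rejection paths have nothing to do with the proof's content; that is exactly the "who is qualified" layer the post is describing.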

So now, what I care about with $SIGN is not just whether it can issue more attestations. What I care more about is whether it can make the trust registry layer a step that will be checked by default across institutions.

If this is achieved, the nature changes. At that point, $SIGN would face not only what the proof says, but also who is qualified to have that proof accepted by the world. #Sign Geopolitical Infrastructure
·
--
In the past few days, I went through the distribution examples in the @SignOfficial documentation. I initially just wanted to confirm how detailed a distribution record would be. However, I got caught on a very small detail: not the amount, not the address, but the ruleset version.

I thought that a distribution record would simply show who received how much and when it was sent out for reconciliation. However, the example of $SIGN did not present it this way.

In the distribution list, it included not only the amount, recipient, and time, but also packed in the ruleset version, a compliance check log, and a settlement reference. This means $SIGN does not just want to keep the result; it also wants to retain the rules under which that result was generated at the time.

Why is this small detail important?

Because in real processes, the most confusing question is never whether something was distributed, but being unable to explain which version of the rules applied at the time, whether the list was updated later, or whether a freeze was added midway; in the end it becomes hard to determine which logic this batch of results should be traced back to. Once these things cannot be clarified, even the most automated systems fall back to spreadsheets, emails, and manual judgment. The TokenTable documentation's emphasis on details like pause and rollback, review, and replay logic essentially acknowledges that if distribution retains only results without the rules, problems will inevitably surface later.
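The difference the post points at can be made concrete with a record shape that carries not just the result (who, how much, when) but the rule context it was produced under. The field names below are my own illustration, not TokenTable's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class DistributionRecord:
    recipient: str
    amount: int
    timestamp: int
    ruleset_version: str              # which version of the rules produced this result
    compliance_checks: list = field(default_factory=list)
    settlement_ref: str = ""          # pointer back to the settlement batch

rec = DistributionRecord(
    recipient="0xabc",
    amount=1000,
    timestamp=1_700_000_000,
    ruleset_version="v3.2",
    compliance_checks=["sanctions:pass", "freeze-list:pass"],
    settlement_ref="settle-batch-7",
)
# Months later, replay is possible: which logic applied? v3.2, not "whatever is latest".
print(rec.ruleset_version)
```

Without `ruleset_version`, an auditor can only see the output; with it, the output can be replayed against the exact rules that produced it.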

So what I find most interesting about $SIGN is not just whether it distributes, but that it forces the system to close a loophole: if you keep the results, you must also keep the rules behind them. If this step is merely about making the documentation look good, its significance is limited.

I will continue to monitor one very specific signal. If more and more real scenarios start treating ruleset versions as part of distribution records, then what Sign faces will not just be at the execution level, but the question of whether it will be possible, later on, to fully explain why things were sent the way they were. #Sign Geopolitical Infrastructure
SIGN/USDT price: 0.04183
·
--

What Sign recently added is not just rewards, but a position for $SIGN

Recently, during the $SIGN pullback, I initially wanted to check the token release schedule, but I accidentally clicked into the updated OBI page. At first, I thought it was just an update to the event details: some incentives for holders, a reason for the market to hold on. However, as I went through the rules page, I increasingly felt this was not so simple; @SignOfficial is not just posting a reward number, it repeatedly emphasizes the importance of self-custody.
If this were just an ordinary event, mentioning holding once would be enough, but it insists on repeating the point. In my view, this is not merely an operational detail; it reads more like an answer to where $SIGN stands in the ever-growing system narrative.
·
--
An unprecedentedly big airdrop freebie ('wool'), already at 100u

Yesterday I misremembered the opening time, otherwise I could have added more today😭

I heard there's a TGE and boost tomorrow, things have really been looking up lately
#ALPHA $PRL
·
--
I have pulled yet another all-nighter; at this rate I'm practically living on American hours.

I stayed up late going through the document of @SignOfficial , initially wanting to find a less conspicuous small detail to tackle, but I ended up getting tripped up by a parameter, maxValidFor.

At first, I didn't take it seriously; I thought it was just about the validity period, like casually adding a deadline when filling out a form, at most a minor configuration. But as I looked further down the Schema page, I realized it was completely different.

In $SIGN , maxValidFor is not a note added after the fact, but a constraint written into the Schema from the beginning: how long this type of proof is allowed to remain valid.

Hmm... this is quite interesting: in many scenarios, being qualified today does not mean you are still qualified three months later, and passing verification today does not mean it will still hold six months from now. If "when does this become invalid" is not built into the structure from the outset, things eventually fall back to the old path of backend lists and manual updates, with everyone checking on their own.

And $SIGN addresses exactly this: it not only defines how to issue a proof, but also handles up front how long such a proof remains valid, whether it can be revoked, and how other systems will interpret that uniformly in the future.
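The idea can be sketched in a few lines: the validity lifetime lives in the schema, so every attestation of that type inherits an expiry the verifier computes uniformly, rather than each credential carrying its own ad-hoc note. `max_valid_for` mirrors the parameter name the docs use; everything else in the sketch is my own illustration.

```python
import time

DAY = 86_400
SCHEMAS = {"kyc-v1": {"max_valid_for": 90 * DAY}}  # lifetime is a schema property

def is_still_valid(attestation, now=None):
    """Expiry is derived from the schema, not from a per-credential backend list."""
    now = time.time() if now is None else now
    limit = SCHEMAS[attestation["schema"]]["max_valid_for"]
    return (now - attestation["issued_at"]) <= limit

stale = {"schema": "kyc-v1", "issued_at": time.time() - 100 * DAY}
fresh = {"schema": "kyc-v1", "issued_at": time.time() - 10 * DAY}
print(is_still_valid(stale), is_still_valid(fresh))  # False True
```

Because the limit is attached to the schema, every verifier that knows the schema reaches the same conclusion without anyone maintaining a revocation spreadsheet.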

This changes how I read Sign: it is no longer just about whether it can issue more proofs. I care more about whether it can make this kind of proof naturally carry a time-validity boundary, one that systems respect by default across the board.

If that cannot be achieved, many proofs will remain mere electronic certificates; but if it can, what Sign faces won't just be the proof layer, but the question of when a proof should automatically lose its validity. #Sign Geopolitical Infrastructure
·
--

Sign|It not only helps the system keep accounts, but may also want to help the system guard the door

In the past few days, I specifically went through the hook tutorial of @SignOfficial . I initially thought this was just an additional feature like many protocols that have an advanced option, but as I followed the official documentation, I found it increasingly strange.
Because at $SIGN the hook is not a decoration: the official documentation states directly that a schema hook can run custom logic when a proof is created or revoked. More importantly, if the hook reverts, the entire call reverts with it. This means it is not a post-hoc check; your proof has to pass this gate to enter the system at all. Whitelists, fees, and all kinds of custom business logic can be hung on this gate.
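The gate-not-audit semantics above can be modeled simply: if the hook raises, nothing is committed. On-chain this is a transaction revert; the sketch below models it as an exception thrown before any write. Class and field names are my own illustration, not Sign Protocol's interface.

```python
class WhitelistHook:
    """Custom logic that runs inside the attest call; raising aborts everything."""
    def __init__(self, allowed):
        self.allowed = allowed

    def on_attest(self, attestation):
        if attestation["attester"] not in self.allowed:
            raise PermissionError("attester not whitelisted")

class Registry:
    def __init__(self, hook):
        self.hook = hook
        self.store = []

    def attest(self, attestation):
        self.hook.on_attest(attestation)  # the gate runs first...
        self.store.append(attestation)    # ...the write happens only if it passed

reg = Registry(WhitelistHook(allowed={"0xabc"}))
reg.attest({"attester": "0xabc", "claim": "ok"})
try:
    reg.attest({"attester": "0xbad", "claim": "nope"})
except PermissionError:
    pass
print(len(reg.store))  # the rejected call left no trace
```

Contrast this with a post-hoc check, where the bad attestation would land in the store first and need cleanup later; the whole point of the hook design is that rejection and write are atomic.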
·
--

Night|The threshold of Midnight may not be on the chain, but on your local Docker.

I have been tinkering with Midnight's development environment for the past couple of days. I originally thought it would be simple: deploy the most basic contract and just see how the Compact and TypeScript experience goes. But after getting the environment up and running, I was completely bewildered.
I suddenly realized that the heaviest part might not be on the chain, but in the proof server on the developer's own machine. The official Preview / Preprod tutorials require the proof server to be run locally first and kept online at all times; whether deploying contracts or submitting transactions, the process cannot proceed without it. Even Docker is not an optional install; it is a hard prerequisite. You haven't even started writing business logic, and the environment has already sat you down for a lesson.
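Given that the whole workflow stalls without the proof server, a small preflight check before deploying saves a confusing failure later. The sketch below just probes a TCP port; 6300 is the port commonly shown in the proof server's Docker instructions, but treat it as an assumption and adjust to your setup.

```python
import socket

def proof_server_up(host="127.0.0.1", port=6300, timeout=1.0):
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not proof_server_up():
    print("proof server not reachable; start it (e.g. via Docker) before deploying")
```

This doesn't prove the service is healthy, only that something is listening, but it catches the most common failure mode (forgetting to start the container) in under a second.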
·
--
Oh my, it's been a while since we've seen a fat airdrop: 60u.

The cost was 6u to subscribe for 485 PRL, which peaked at 65u.

Definitely a fat one. I hear several more are headed to Alpha later, and with the NIGHT event on top, this month should be satisfying.

I was happily tallying the profits and thought I'd check the DApp Connector documentation of @MidnightNetwork to study how the wallet feeds configuration to the front end. As I read, it suddenly occurred to me: wouldn't Midnight's privacy system end up concentrated in the hands of a few infrastructure gateways? That would be a bit uncomfortable.

The $NIGHT documentation is quite straightforward: the wallet can configure the URIs of the node, indexer, and proving server itself; the DApp should follow the wallet's configuration wherever possible; and it specifically notes that this is a privacy matter.

I even dug into why the connector asks the DApp to follow the wallet's URIs as much as possible, and the more I read, the more it felt like this is not an ordinary configuration habit; it is part of the privacy design itself.
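The pattern that paragraph describes can be sketched roughly like this. The connector interface and method name (`serviceUriConfig`) are hypothetical stand-ins for illustration, not the exact Midnight DApp Connector API.

```typescript
// The three service endpoints the wallet is described as controlling.
interface ServiceUris {
  nodeUri: string;
  indexerUri: string;
  provingServerUri: string;
}

// Hypothetical wallet connector surface.
interface WalletConnector {
  serviceUriConfig(): Promise<ServiceUris>;
}

// Privacy rationale: if every DApp follows the wallet, the USER decides
// which node/indexer/prover sees their traffic, not each individual site.
async function resolveServices(
  wallet: WalletConnector,
  fallback: ServiceUris,
): Promise<ServiceUris> {
  try {
    return await wallet.serviceUriConfig(); // prefer the wallet's choice
  } catch {
    return fallback; // only if the wallet exposes nothing
  }
}
```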

At this point, I felt something was wrong.

Everyone understands the reasoning: in theory, anyone can run their own services, configuring nodes, indexers, and proving servers themselves. But realistically, who tweaks the underlying configuration every day? If wallets still point at a few public services by default, actual traffic will naturally pool around a few providers.

$NIGHT wants to solve the privacy problem, but if most users end up crowding onto the same batch of nodes, indexers, and proving servers, then transaction content looks hidden on the surface while the path underneath keeps centralizing.

To put it bluntly, the content not being exposed doesn't mean the entrance hasn't been captured by someone.

If things really head in that direction, the question is no longer whether transactions are hidden, but who ultimately holds the entrance to this privacy chain. #night
·
--
Sign|I was tripped up by a small detail: why is Schema given so much weight?

Every night I stay up late with this monotonous material, and I finally found something that made my eyes light up: Schema.
At first I really didn't take it seriously; I thought a Schema was just a template, like defining fields before filling out a form, at most a project detail. But as I followed the documentation, the more I read, the more uneasy I felt.
Here at Sign, a Schema is not something you throw together casually; the docs directly define Schemas and Attestations as the two core components of the Sign Protocol: the former defines the structure, and the latter produces signed data according to that structure.
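That structure-versus-data split can be sketched in a few lines: the schema fixes the shape, and an attestation is only meaningful if it fills that shape in. The field names and types below are invented for illustration; they are not Sign's actual SDK types.

```typescript
// A schema fixes the structure: which fields, of which types.
interface Schema {
  id: string;
  fields: { name: string; type: "string" | "uint256" | "bool" }[];
}

// An attestation fills the structure in and is signed by someone.
interface Attestation {
  schemaId: string;
  data: Record<string, string | number | boolean>;
  signer: string;
}

// Why the schema is core, not a detail: an attestation that doesn't match
// its schema's structure is meaningless to any downstream verifier.
function conformsTo(att: Attestation, schema: Schema): boolean {
  if (att.schemaId !== schema.id) return false;
  return schema.fields.every((f) => f.name in att.data);
}
```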
·
--
Stayed up late again, this time specifically to walk through @SignOfficial 's attestation creation flow. I had assumed where the data lives was just a cost question. But when I clicked into the hybrid attestation section, I realized it's far from that simple.

It's not just about storing it; APIs, indexes, and query paths all get involved. Whether this proof can keep being used by others later is tied directly to where it is stored, which is quite sobering.

The $SIGN documentation lays the routes out plainly: a schema can live fully on-chain or use hybrid storage like Arweave/IPFS, and some attestations even have to be initiated through APIs and rely on indexing services for queries. I first read this as a storage preference; the more I look, the less I think so.

Fully on-chain is the cleanest but expensive, heavy, and clumsy; fully off-chain is the lightest, but others won't necessarily follow your path to retrieve the data later. Hybrid looks like a compromise, but read further and it drags in APIs, indexes, and query links of its own. Which means what Sign is really handling is not where to store it, but whether this proof can still be called by other systems in the future.
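The trade-off above can be made concrete with a toy model. The docs do name on-chain and Arweave/IPFS as the options, but the types and helper below are my own illustration, not Sign's API.

```typescript
// The storage routes described above, as a discriminant.
type DataLocation = "onchain" | "arweave" | "ipfs";

interface StoredAttestation {
  location: DataLocation;
  // Fully on-chain data is self-contained; hybrid data is a pointer that
  // some indexer or API must later resolve on the reader's behalf.
  payloadOrPointer: string;
}

// The question the post raises: does verifying this attestation later
// depend on an external resolver (indexer/API/gateway) still existing?
function needsExternalResolver(a: StoredAttestation): boolean {
  return a.location !== "onchain";
}
```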

This changes the perspective entirely.

Many projects talk about attestation very simply. Sign's approach, by contrast, reads like someone who has been through it: what if the data is too large, what if it's too privacy-sensitive, how will other systems query, verify, and keep using it down the road?

What interests me most about $SIGN now is not whether it can issue more proofs, but whether its proofs can be conveniently consumed by other systems later.

If that link can't be smoothed flat, hybrid is just a clever-looking compromise. But if it really does run smoothly, then what Sign is touching isn't just the proof layer; it's the foundation of how proofs survive between systems. #sign地缘政治基建
·
--
The most dangerous aspect of Midnight might not be that the contract cannot be upgraded, but that it must be upgraded.

I've been going through @MidnightNetwork 's contract update documents these past two days. I just wanted to confirm whether a privacy contract on Midnight works like contracts on other chains: once deployed, best not to touch it; if you can avoid upgrading, don't.
As I read through the material, something started to feel off. The most troublesome part here may not be whether the contract can be upgraded, but that it must be.
We're all familiar with public chains, where upgrading is generally optional. Want flexibility? Keep the proxy. Want simplicity? Lock it down. Many projects even treat non-upgradability as a selling point.
·
--
After a day of slogging through @MidnightNetwork 's Preview environment, I originally just wanted to see how their privacy contracts generate proofs, but the more I ran, the more I felt that what's truly hard about Midnight may not be hiding data on-chain, but how users go on living once that data stays local.

The official line is clear: public state goes on-chain while private state stays local, and Preview even defaults to a local proof server and encrypted private storage. At first glance I wanted to applaud; finally, not one of those projects that preaches privacy while swimming naked on-chain.

But the problem lies in the word 'local'.

Public chains have trained us into a reflex: change devices, import the mnemonic, and the assets are still there. With Midnight, what you need to recover isn't just public keys and balances, but private state, witness data, proof context, and a pile of things tied to the local environment.

After reinstalling my environment today, the on-chain contracts were still there, the address was still there, but the moment that local private state was disconnected, every call immediately went haywire. In that moment I felt that the way this chain protects privacy essentially dumps a pile of responsibilities you never used to worry about back onto your own computer.

It's genuinely strange. The problem with traditional public chains is excessive transparency; $NIGHT has removed that transparency, but at the cost of turning everything into complex local state management. Want multi-device sync? Seamless recovery? It's overwhelming. This wallet doesn't feel like a wallet; it feels like a small laboratory strapped to an encrypted database and a proof engine.

I increasingly feel Midnight's biggest challenge right now is whether ordinary people can safely carry this chunk of private state over the long term. If that doesn't go smoothly, privacy won't be a moat; it will be a portable hell.

Next, watch whether private-state recovery and multi-device sync get standardized, and whether the proof server stays tied to the local machine forever. Otherwise #night 's hardest part won't even be on-chain; it will be you going crazy on your own computer.
·
--
$BSB is a classic chicken rib: tasteless to eat, yet a pity to throw away.
Brothers probably didn't participate in this, right?
#ALPHA
·
--
While reviewing the @MidnightNetwork white paper recently, what first stumped me wasn't privacy or ZK. It was a very basic but seriously important question: why are enterprises still reluctant to run their business on-chain?

The answer might not be as romantic as everyone thinks.

Often it's not that they don't want to go on-chain; the budget simply doesn't allow it. On ordinary public chains, gas fees and token price are bound too tightly: the token rises today, network costs swing tomorrow. For retail, that just means things get a bit expensive. For teams genuinely running a business, that uncertainty is a deal-breaker.

The official $NIGHT economic design repeatedly stresses operational predictability: running costs must be forecastable. It separates NIGHT and DUST not to show off, but so that asset price and network usage cost are not completely coupled. NIGHT is the underlying capital and governance asset; DUST is the execution resource, and DUST is non-transferable and decays, resembling capacity more than money.
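A toy model of that NIGHT/DUST relationship: holding NIGHT generates DUST capacity, and unused DUST decays instead of piling up like money. The generation and decay rates here are invented purely for illustration; the whitepaper defines the real parameters.

```typescript
// Illustrative capacity model, NOT Midnight's actual formula: each hour,
// existing DUST decays by a fixed fraction and held NIGHT generates more.
function dustAfter(
  nightHeld: number,
  currentDust: number,
  hours: number,
  genPerNightPerHour = 0.01, // assumed rate
  decayPerHour = 0.05,       // assumed rate
): number {
  let dust = currentDust;
  for (let h = 0; h < hours; h++) {
    dust = dust * (1 - decayPerHour) + nightHeld * genPerNightPerHour;
  }
  return dust;
}

// The point of the design: capacity converges to an equilibrium set by
// holdings (here 100 * 0.01 / 0.05 = 20), so a team's operating budget
// is predictable regardless of what the token price does.
```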

This changes the perspective.

Because many chains are fundamentally trading markets: you buy tokens, sell tokens, pay fees, and endure volatility. What Midnight wants to be may not be a more exciting market but a set of infrastructure that enterprises can budget, operate, and integrate. The official homepage positions it as a rational privacy blockchain, arguing not just for hiding data but that utility and privacy need not be mutually exclusive. That utility, bluntly, isn't only about whether functions run, but whether costs can be managed.

So now when I look at $NIGHT again, I feel what it truly wants to fix is probably not how to make privacy chains cooler, but an old institutional problem that most public chains have never truly solved.

When chains start to resemble infrastructure rather than just trading venues, who guarantees that the right to use them is never held hostage by token price? #night
·
--
What Sign really wants to manage is not 'proof,' but 'when the proof expires.'

After all this reading, this time I didn't keep staring at @SignOfficial 's big words. Identity, attestation, TokenTable: if I keep writing those, I'll annoy even myself.
So I flipped it around and zoomed in on something particularly small and fragmented, the kind of thing a normal reader skims past in three seconds in the white paper: revocation.
Yes, that's revocation.
Plus expiration, status lists, trust registries; at a glance it all looks like appendix-grade odd jobs, deeply unsexy, the kind of thing you go 'uh-huh, got it, next' on.
Then I actually looked, and realized it's the opposite. This is not an auxiliary feature at all; it's the core. The Sign white paper doesn't just mention revocation support in passing and move on; it forces the matter into the identity system's entire lifecycle.
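That lifecycle point can be sketched as a verifier check: a signature alone is never the whole answer, because the verifier must also consult expiration and a revocation status list. The shapes below are illustrative, not Sign's actual data model.

```typescript
// Minimal credential shape for the example.
interface Credential {
  id: string;
  expiresAt: number; // unix seconds
}

// Why revocation is lifecycle-core: validity is a function of time and of
// the issuer's current status list, not just of the original signature.
function isCurrentlyValid(
  cred: Credential,
  revokedIds: Set<string>,
  now: number,
): boolean {
  if (revokedIds.has(cred.id)) return false; // the issuer withdrew it
  if (now >= cred.expiresAt) return false;   // it aged out
  return true;
}
```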