make the transfer and issuance metadata available in the db #1034
base: main
Conversation
Hi @arner, thanks for this PR. I will look at it ASAP.
I have only used Liquibase before, for Java. We need a tool that:
Here are some options (I'll keep on investigating):
I think that overall, so far, the best solution is golang-migrate. It's straightforward, and that would make its use with Kubernetes or Ansible scripts easier. It's more manual, but I think we don't have any complex changes in the schemas.
I have no experience with golang-migrate, but I agree that its simplicity and the fact that it just uses SQL files are good things: it gives the user flexibility and transparency.

One additional requirement is that it must be possible (or even the default / mandatory?) to manage the schema outside of the application. At least for security: the database user of the Token SDK application should not have the rights to change schemas and drop tables. But also for versioning, rollbacks, and possible data updates, which would be risky and difficult to do through application deployments and configuration. And finally, in larger companies the application/deployment team is often not the same as the database administration team, which may have specific audit requirements, so in some cases it would not even be allowed to let the application handle it.

That means that the content of the "CreateTable" calls should be externalized to the SQL files that the tool requires. All in one place would be easiest for an administrator, but it would break the independence of the different data stores a bit. And we would likely need a separate directory for Postgres and one for SQLite.

Edit: the dynamic table names are a complication here. I know that some of the tests rely on them, and I don't know if there are any features that require the namespacing. The names are based on the network and chaincode.
I can see an epic to handle all this, honestly. There are many things to streamline along the lines of what you say, @arner.
Signed-off-by: Arne Rutjes <[email protected]>
LGTM modulo the comments. Thanks.
Anchor string
Inputs *InputStream
Outputs *OutputStream
Attributes map[string][]byte
Please update the comment on the struct, and maybe add some information about what Attributes is supposed to contain.
@@ -120,6 +120,9 @@ type TransactionRecord struct {
// ApplicationMetadata is the metadata sent by the application in the
// transient field. It is not validated or recorded on the ledger.
ApplicationMetadata map[string][]byte
// PublicMetadata is the metadata that is stored on the ledger as part
// of an Issuance or Transfer Action (for instance the HTLC hash).
PublicMetadata map[string][]byte
Do we test that we get back what we store?
This PR makes the public metadata of Issuance and Transfer actions, which is stored on the ledger, available to the user via the database (on the sender, recipient, and auditor side). At the moment this includes the hash and preimage of an HTLC lock and claim, which can be displayed to the end user or used for reporting.
The user can already add this kind of metadata to a Transfer with the opt:
tkn.WithTransferMetadata("user.x."+tx.ID(), []byte("this is metadata"))
At the moment this causes the validation to fail, but that can be fixed if we add a new ValidateTransferFunc.

It would add some useful tools to the toolbox. For instance, you could share metadata privately with the counterparty and optionally an auditor in the ApplicationMetadata, and a hash of that data on the public ledger, all tied to the same transaction. Or link it to external data, or create another custom validator for additional proofs. It would be the responsibility of the user to make sure that privacy is protected.
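A hypothetical sketch of what such a validation rule could check, reduced to plain Go with no SDK types; the "user." prefix convention, the "htlc." prefix, and the function name are assumptions derived from the example opt above, not the actual ValidateTransferFunc signature:

```go
package main

import (
	"fmt"
	"strings"
)

// allowUserMetadata sketches the kind of check a new validator could add:
// accept user-supplied metadata under a dedicated key prefix, leave known
// protocol keys (here, hypothetically, "htlc.") to their own validators,
// and reject anything else.
func allowUserMetadata(metadata map[string][]byte) error {
	for k := range metadata {
		switch {
		case strings.HasPrefix(k, "user."):
			continue // user-supplied metadata is allowed as-is
		case strings.HasPrefix(k, "htlc."):
			continue // assumed to be validated by the HTLC rules
		default:
			return fmt.Errorf("unexpected metadata key: %s", k)
		}
	}
	return nil
}

func main() {
	md := map[string][]byte{"user.x.tx1": []byte("this is metadata")}
	fmt.Println(allowUserMetadata(md)) // prints <nil>
}
```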
Important
This change is not backwards compatible because it changes the database schema. It would require a manual
ALTER TABLE ... ADD COLUMN public_metadata JSONB NOT NULL;
before upgrading. Until we have a common method for database migrations, I couldn't find a good way to handle this.

Looking forward to feedback on this PR and the idea for the follow-up.