Track created_by and updated_by in Node and Revisions#1255
jmaruland wants to merge 19 commits into bluesky:main
Conversation
Closes #1076
danielballan
left a comment
Nice to see this coming together! I have a couple of questions inline.
Also, this needs a database migration to add the new columns to existing databases.
tiled/server/router.py (Outdated)
```python
data_sources=body.data_sources,
access_blob=access_blob,
created_by=(
    principal.identities[0].id if len(principal.identities) > 0 else ""
```
This design consideration deserves some thought:
- What should we put here when the principal has no identities? Service Principals have no identities. Should we put `service:{principal.uuid}`? Or nothing?
- Should the fallback value be `""` or `None`/`NULL`?
I actually had to add this conditional because `test_access_control.py::test_service_principal_access_control` was failing here. It seems that, at some point during the handshake with the server, `principal.identities` is passed as an empty list.
Suggestions for various scenarios. To test a service principal:

```python
# Log in as admin. Then...
# Create a new Service Principal. It is identified by its UUID.
# It has no "identities"; it cannot log in with a password.
sp = admin_client.context.admin.create_service_principal()
# Create an API key for this Service Principal.
key_info = admin_client.context.admin.create_api_key(sp["uuid"])
# Then, to log in as the Service Principal, log out of the admin session.
admin_client.logout()
# For clarity, use a new variable name.
sp_client = admin_client
del admin_client
sp_client.context.api_key = key_info["secret"]
```
Run a command like this to generate a template for a database migration. See other similar files in this directory for examples. Finally, add the unique ID for this migration (the first part of the filename; it also appears inside the file) to this list: Lines 7 to 26 in 2cf4c80
@danielballan I ran the command for the database migration, but pre-commit is a bit mad because of flake8. This is an auto-generated file. Should I ignore these messages?
Is it ok if I add an inline comment with
False alarm. I understand what the next step is with the database migration.
```python
op.add_column(
    "revisions",
    sa.Column("updated_by", String, nullable=True, server_default="NULL"),
)
```
I believe your changes to the ORM also drop the created_by column from the revisions table, and that change should be made here as well.
It should correspondingly be added in the downgrade function, acknowledging that the information has been lost so the column will have to be initialized with 0 or something as a placeholder.
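A sketch of what that symmetric pair might look like in the Alembic migration. This is a template, not the PR's actual migration: the column types, nullability, and the `"0"` placeholder are assumptions taken from the discussion above.

```python
from alembic import op
import sqlalchemy as sa


def upgrade():
    # Add the new audit column and drop created_by from revisions,
    # matching the ORM changes in this PR.
    op.add_column("revisions", sa.Column("updated_by", sa.String, nullable=True))
    op.drop_column("revisions", "created_by")


def downgrade():
    # The created_by data is lost on upgrade, so re-create the column
    # with a placeholder default (e.g. 0) rather than real values.
    op.add_column(
        "revisions",
        sa.Column("created_by", sa.Integer, nullable=True, server_default="0"),
    )
    op.drop_column("revisions", "updated_by")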
```python
metadata=None,
specs=None,
access_blob=None,
updated_by="",
```
Similar to `created_by`, I think `None` (`NULL` in SQL) may be a more appropriate default than `""`.
```python
"0b033e7fbe30",
"83889e049ddc",
"6825c778aa3c",
"8fd6ac88f2ec",
```
Wrong position, I think. See comment at top of list.
```python
specs=None,
data_sources=None,
access_blob=None,
created_by="",
```
I think None (NULL in SQL) may be a more appropriate default than "".
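A tiny pure-Python illustration of the distinction (the rows here are made up for the example): `None` (`NULL`) cleanly marks "no actor recorded", whereas `""` is an actual value that a query cannot tell apart from a deliberately empty one.

```python
# Rows as plain dicts, standing in for the nodes table.
rows = [
    {"id": 1, "created_by": "alice"},
    {"id": 2, "created_by": None},  # written before auth info existed
    {"id": 3, "created_by": ""},    # empty string looks like a real value
]

# Only the None sentinel lets us query "rows with no recorded actor".
unknown = [row["id"] for row in rows if row["created_by"] is None]
print(unknown)  # [2]
```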
```python
access_blob_modified = access_blob != entry.access_blob
access_blob = entry.access_blob

if principal is None:
```
Since this logic occurs in three places, and is a bit fiddly, I think there's an argument for refactoring it into a function.
```
!.pixi/config.toml

# copilot files
copilot-instructions.md
```
If we have useful copilot instructions, maybe we should commit them!
Checklist
This PR introduces created_by and updated_by columns in the database for forensic purposes.