preprocessing-sfa provides two Enduro child workflows for SFA SIPs: a preprocessing child workflow and an AIS poststorage child workflow. Despite the project name, the worker binary starts two independent Temporal workers, one for each child workflow. These workers must share the filesystem with Enduro's a3m or Archivematica workers, connect to the same Temporal server, and be registered with Enduro using the correct namespace, task queue and workflow names.
An example configuration for the worker binary:

```toml
debug = false
verbosity = 0
sharedPath = "/home/preprocessing/shared"
checkDuplicates = false

[persistence]
dsn = "user:password@tcp(mysql.enduro-sdps:3306)/preprocessing_sfa"
driver = "mysql"
migrate = true

[temporal]
address = "temporal-frontend.enduro-sdps:7233"
namespace = "default"
taskQueue = "preprocessing"
workflowName = "preprocessing"

[worker]
maxConcurrentSessions = 1

[bagit]
checksumAlgorithm = "md5"

[apis]
enabled = true
url = "http://apis-mock.enduro-sdps:8080"
timeout = "10s"
pollInterval = "1s"
token = "mock-token"

[apis.oidc]
enabled = false
providerURL = "http://keycloak:7470/realms/artefactual"
tokenURL = ""
clientID = "enduro-s2s"
clientSecret = "uSh7f2r4j2U5wA9d7mJ3xP6nQ8cT1vL0"
scopes = ""
audience = ""
tokenExpiryLeeway = "30s"
retryMaxAttempts = 3
retryInitialInterval = "500ms"
retryMaxInterval = "2s"
retryBackoffCoefficient = 2.0

[ais]
workingDir = "/tmp"

[ais.temporal]
address = "temporal-frontend.enduro-sdps:7233"
namespace = "default"
taskQueue = "ais"
workflowName = "ais"

[ais.worker]
maxConcurrentSessions = 1

[ais.amss]
url = "http://ambox.enduro-sdps:64081"
user = "test"
key = "test"

[ais.bucket]
endpoint = "http://minio.enduro-sdps:9000"
pathStyle = true
accessKey = "minio"
secretKey = "minio123"
region = "us-west-1"
bucket = "ais"

[fileFormat]
allowlistPath = "/home/preprocessing/.config/allowed_file_formats.csv"

[filevalidate.verapdf]
path = "/opt/verapdf/verapdf"
```

The child workflow sections for Enduro's configuration:
```toml
[[childWorkflows]]
type = "preprocessing"
namespace = "default"
taskQueue = "preprocessing"
workflowName = "preprocessing"
extract = true
sharedPath = "/home/enduro/preprocessing"

[[childWorkflows]]
type = "poststorage"
namespace = "default"
taskQueue = "ais"
workflowName = "ais"
```

This project provides two child workflows for the Enduro development environment. The supported development workflow is to run `tilt up` from the Enduro repository and load this repository through Enduro's `CHILD_WORKFLOW_PATHS` mechanism.
Bring up the Enduro environment by following the Enduro development manual.
The specific requirements for preprocessing-sfa are:
- clone this repository as a sibling of the Enduro repository
- configure `CHILD_WORKFLOW_PATHS=../preprocessing-sfa`
- configure `MOUNT_PREPROCESSING_VOLUME=true`
- run `tilt up` from the Enduro repository
All other development workflow details, including `.tilt.env`, live updates, starting, stopping, and clearing the environment, are documented in Enduro. This repository can also provide local overrides through its own `.tilt.env` file, including settings such as `TRIGGER_MODE_AUTO`.
While the services run inside a Kubernetes cluster, we recommend installing Go and other tools locally to ease the development process.
The Makefile provides developer utility scripts via command-line make targets. Running `make` with no arguments (or `make help`) prints the help message.
Dependencies are downloaded automatically.
The debug mode produces more output, including the commands executed. E.g.:

```console
$ make env DBG_MAKEFILE=1
Makefile:10: ***** starting Makefile for goal(s) "env"
Makefile:11: ***** Fri 10 Nov 2023 11:16:16 AM CET
go env
GO111MODULE=''
GOARCH='amd64'
...
```

Most of the activities documented below belong to the preprocessing child workflow.
- Calculate SIP checksum
- Check for duplicate SIP
- Unbag SIP
- Identify SIP structure
- Validate SIP structure
- Validate SIP name
- Verify SIP manifest
- Verify SIP checksums
- Validate SIP files
- Validate logical metadata
- Create premis.xml
- Restructure SIP
- Create identifiers.json
- Other activities
Part 1 of a 2-part activity around duplicate checking - see also: Check for duplicate SIP
Generates and stores a checksum for the entire SIP, so it can be used to check for duplicates
- Generate a SHA256 checksum for the incoming package
- Read SIP name
- Store SIP name and checksum in the persistence layer (`sips` table)
- A SHA256 checksum is successfully generated for the SIP
- The SIP name and generated checksum are stored in the persistence layer
Part 2 of a 2-part activity around duplicate checking - see also: Calculate SIP checksum
Determines if an identical SIP has previously been ingested
- Use the generated checksum from part 1 to search for an existing match in the `sips` database table
- If an existing match is found, return a content error for a duplicate SIP and terminate the workflow
- Else, continue to next activity
- The activity is able to read the generated checksum and the `sips` database table
- No matching checksum is found in the `sips` database table
Extracts the contents of the bag.
Only runs if the SIP is a BagIt bag; otherwise this activity is skipped.
- Check if SIP is a bag
- If yes, extract the contents of the bag for additional ingest processing
- Else, skip
- Bag is successfully extracted
Determines the SIP type by analyzing the name and distinguishing features of the package, based on eCH-0160 requirements and other internal policies.
Package types include:
- BornDigitalSIP
- DigitizedSIP
- BornDigitalAIP
- DigitizedAIP
- Base type is BornDigitalSIP; assume this is the SIP type unless other conditions are met
- Check if the package contains a `Prozess_Digitalisierung_PREMIS.xml` file
  - If yes, it is a Digitized package - either DigitizedSIP or DigitizedAIP
- Check if the package contains an `additional` directory
  - If yes, it is a migration AIP - either BornDigitalAIP or DigitizedAIP
- Compare check results and determine package type
- Package is successfully identified as one of the 4 supported types
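The decision table above reduces to two boolean checks. A minimal sketch (the helper name is ours, not the activity's):

```go
package main

import "fmt"

// identifyType maps the two checks described above (a digitization PREMIS
// file, and an "additional" directory) to the four supported package types.
func identifyType(hasDigitizationPremis, hasAdditionalDir bool) string {
	switch {
	case hasDigitizationPremis && hasAdditionalDir:
		return "DigitizedAIP"
	case hasDigitizationPremis:
		return "DigitizedSIP"
	case hasAdditionalDir:
		return "BornDigitalAIP"
	default:
		// Base type: assumed unless another condition is met.
		return "BornDigitalSIP"
	}
}

func main() {
	fmt.Println(identifyType(false, false)) // BornDigitalSIP
	fmt.Println(identifyType(true, false))  // DigitizedSIP
	fmt.Println(identifyType(false, true))  // BornDigitalAIP
	fmt.Println(identifyType(true, true))   // DigitizedAIP
}
```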
Ensures that the SIP directory structure conforms to eCH-0160 specifications, that no empty directories are included, and that there are no disallowed characters used in file and directory names.
Note: Character restrictions for file and directory names are based on some of the requirements of the tools used by Archivematica during preservation processing - at present, the file name cleanup steps in Archivematica cannot be modified or disabled without forking. To ensure that SFA package metadata matches the content, this validation check ensures that no disallowed characters are included in file or directory names that might be automatically changed once received by Archivematica.
- Read SIP type from previous activity
- Check for presence of `content` and `header` directories
- Check all file and directory names for invalid characters
- Check for empty directories
- Files and directories only contain valid characters: `A-Z`, `a-z`, `0-9`, or `-_.()`
- SIPs contain `content` and `header` directories
  - If content type is an AIP, it also contains an `additional` directory
- No empty directories are found
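The character restriction can be expressed as a single regular expression built from the allowed set above. This is an illustration of the rule, not the repository's implementation:

```go
package main

import (
	"fmt"
	"regexp"
)

// validName reports whether a file or directory name uses only the allowed
// characters: A-Z, a-z, 0-9, and -_.()
var validName = regexp.MustCompile(`^[A-Za-z0-9\-_.()]+$`).MatchString

func main() {
	fmt.Println(validName("d_0000001.pdf"))       // true
	fmt.Println(validName("bericht (final).pdf")) // false: contains a space
	fmt.Println(validName("über.txt"))            // false: non-ASCII character
}
```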
Ensures that submitted SIPs use the required naming convention for the identified package type.
- Read SIP type from previous activity
- Use regular expression to validate SIP name based on identified type
- SIP follows expected naming convention for package type:
  - BornDigitalSIP: `SIP_[YYYYMMDD]_[delivering office]_[reference]`
  - DigitizedSIP: `SIP_[YYYYMMDD]_Vecteur_[delivering office]_[reference]`
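The naming check can be sketched with regular expressions. The patterns below are illustrative guesses at the convention; the actual expressions live in the activity code and may constrain the office and reference fields differently:

```go
package main

import (
	"fmt"
	"regexp"
)

// Illustrative patterns only: date is 8 digits, office and reference are
// assumed alphanumeric, and the reference is treated as optional here.
var (
	bornDigital = regexp.MustCompile(`^SIP_\d{8}_[A-Za-z0-9]+(_[A-Za-z0-9]+)?$`)
	digitized   = regexp.MustCompile(`^SIP_\d{8}_Vecteur_[A-Za-z0-9]+(_[A-Za-z0-9]+)?$`)
)

func main() {
	fmt.Println(bornDigital.MatchString("SIP_20201021_NAS_case1"))      // true
	fmt.Println(digitized.MatchString("SIP_20201021_Vecteur_attempt1")) // true
	fmt.Println(bornDigital.MatchString("sip-2020-10-21"))              // false
}
```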
Checks if all files and directories listed in the metadata manifest match those found in the SIP, and that no extra files or directories are found.
- Load SIP metadata manifest into memory
- Parse the manifest contents and return a list of files and directories
- Parse the SIP and return a list of files and directories
- Compare lists
- Return a list of any missing files found in the manifest but not the SIP
- Return a list of unexpected files found in the SIP but not the manifest
- There is a matching file or directory for every entry found in the `metadata.xml` (or `UpdatedAreldaMetadata.xml`) manifest
- No unexpected files that are not listed in the manifest are found
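The comparison step amounts to a two-way set difference between the manifest listing and the files found on disk. A minimal sketch, with hypothetical function and path names:

```go
package main

import "fmt"

// compareLists returns entries in the manifest but not on disk ("missing")
// and entries on disk but not in the manifest ("unexpected").
func compareLists(manifest, found []string) (missing, unexpected []string) {
	onDisk := make(map[string]bool, len(found))
	for _, p := range found {
		onDisk[p] = true
	}
	inManifest := make(map[string]bool, len(manifest))
	for _, p := range manifest {
		inManifest[p] = true
		if !onDisk[p] {
			missing = append(missing, p)
		}
	}
	for _, p := range found {
		if !inManifest[p] {
			unexpected = append(unexpected, p)
		}
	}
	return missing, unexpected
}

func main() {
	manifest := []string{"content/d_0000001/file1.pdf", "header/metadata.xml"}
	found := []string{"header/metadata.xml", "content/d_0000001/extra.tmp"}
	missing, unexpected := compareLists(manifest, found)
	fmt.Println(missing)    // [content/d_0000001/file1.pdf]
	fmt.Println(unexpected) // [content/d_0000001/extra.tmp]
}
```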
Confirms that the checksums included in the metadata manifest match those calculated during validation.
- Check if a given file exists in the manifest
- If yes, calculate a checksum - else skip
- Compare calculated checksum to manifest checksum
- A checksum calculated using the same algorithm as the one used in the metadata file returns the same value as the one included in the metadata manifest for each file listed
Ensures that files included in the SIP are well-formed and match their format specifications.
- For PDF/A files, use veraPDF to validate against the PDF/A specification
- Note: additional format validation checks will be added in the future
- All files pass validation
Ensures that a logical metadata file is included for AIPs being migrated from DIR and validates the file against a PREMIS schema file.
Note: this activity uses some custom workflow code and a locally stored copy of the PREMIS schema to run the general temporal activity `xmlvalidate`.
- Read package type from memory
- If package type is BornDigitalAIP or DigitizedAIP, check for an XML file in the `additional` directory
- If found, validate the XML file against a locally stored copy of the PREMIS schema; fail ingest if any errors are returned
- Logical metadata file is found in the `additional` directory of the package
- Logical metadata file validates against PREMIS 3.x schema
Generates a PREMIS XML file that captures ingest preservation actions performed by Enduro as PREMIS events for inclusion in the resulting AIP METS file.
NOTE: This activity is broken up into 3 different activity files in `/internal/activities`: `add_premis_agent.go`, `add_premis_event.go`, `add_premis_objects.go`.
The XML output is then assembled via `/internal/premis/premis.go`.
- Review event details for all successful tasks
- Create `premis.xml` file in a new `metadata` directory
- Write PREMIS objects to file
- Write PREMIS events to file
- Write PREMIS agents to file
- A `premis.xml` file is successfully generated with ingest events
Reorganizes SIP directory structure into a Preservation Information Package (PIP) that the preservation engine (Archivematica) can process.
- Check if `metadata` directory exists, else create a new `metadata` directory
- Move the `Prozess_Digitalisierung_PREMIS.xml` file to the `metadata` directory
- For AIPs, move the `UpdatedAreldaMetadata.xml` and logical metadata files to the `metadata` directory
- Create an `objects` directory, and in that directory create a sub-directory with the SIP name
- Delete `xsd` directory and its contents from `header` directory
- Move `content` directory into the new `objects` directory
- Create a new `header` directory in `objects`
- Move the `metadata.xml` file into the new `header` directory
- Delete original top-level directories
- XSD files are removed
- Restructured package now has `objects` and `metadata` directories immediately inside parent container
- All content for preservation is within the `objects` directory
- Enduro-generated PREMIS file is in the `metadata` directory
- For Digitized packages, `Prozess_Digitalisierung_PREMIS.xml` file is in the `metadata` directory
Extracts original UUIDs from the SIP metadata file and adds them to an `identifiers.json` file in the `metadata` directory of the package, for parsing by the preservation engine.
- Parse SIP metadata file
- Extract persistent identifiers and write to memory
- Convert manifest file paths to the restructured PIP file paths
- Exclude any files in the manifest that aren't found in the PIP
- Using extracted identifiers, generate an `identifiers.json` file that conforms to Archivematica's expectations
- Move generated file to package `metadata` directory
- An `identifiers.json` file is added to the `metadata` directory of the package
- UUIDs present in the original SIP metadata are maintained and used by the preservation engine during preservation processing
The preprocessing child workflow that invokes the activities listed above (see the `preprocessing.go` file) also uses a number of other more general Enduro temporal activities, including: `archiveextract`, `bagcreate`, `bagvalidate`, `ffvalidate`, `xmlvalidate`.
The AIS poststorage child workflow uses one custom workflow activity maintained in this repository:
Extracts all relevant metadata from the SIP and resulting AIP and delivers it to the AIS for synchronization.
- Generate a new XML document that combines the contents of the two source files (the SIP `metadata.xml` or `UpdatedAreldaMetadata.xml` file, and the AIP METS file)
- ZIP the generated file and deposit it in an `ais` MinIO bucket
- Metadata bundle is successfully generated and deposited
- AIS is able to receive and ingest the metadata