The folder is organised in the following way:
proposal/
: contains LaTeX files of the initial proposal of this project

style-reference.md
: contains a list of useful LaTeX tips and styling guides
The project relies on the cryptobib submodule. After cloning the repository for the first time, please run:
git submodule update --init
To update the cryptobib submodule to its latest upstream version:
cd cryptobib && git pull origin master
Then, every time you update from main, you should also update cryptobib if git status
reports differences for that folder:
git submodule update
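You can check whether the submodule has been initialised and which commit it currently points to with:
git submodule status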
Please open a PR with your changes and ask for review. To do so, first create a new branch to hold your changes (assuming you are on main):
- Stash any changes you have locally (optional)
git add . && git stash
- Update your local main branch:
git pull --rebase origin main
- Switch to a new branch:
git checkout -b "your-branch-name"
- Unstash your changes (optional; conflicts might arise, in which case you need to resolve them. You can check the offending files using git status):
git stash pop
- Commit your changes to the branch and push
git add . && git commit -m "your message" && git push
- Follow the link displayed in the terminal to open a PR against main.
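Alternatively, if the repository is hosted on GitHub and you have the GitHub CLI installed, you can open the PR directly from the terminal (the flags below pre-fill the title and body from your commits):
gh pr create --base main --fill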
Install the Rust toolchain using rustup:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
You will also need to install wasm-pack to compile some of the crates (common, baseline, ssf) into WebAssembly:
cargo install wasm-pack
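Once wasm-pack is installed, a crate can be compiled to WebAssembly along these lines; the crate path and target here are assumptions, so adjust them to the actual build setup:
wasm-pack build ssf --target web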
Currently the project has been tested and compiled successfully using:
rustc
: 1.72, 1.78.0

wasm-pack
: 0.12.1
You will also need to install some JS tools:

nodejs
: 20.13.1, 21.5.0

npm
: 10.5.0

If you download the official installer, you will install both of them at once.
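You can check which versions you have installed with:
node --version
npm --version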
Mocked external dependencies are available and can be activated using the provided docker-compose.yaml. The project uses LocalStack to simulate AWS S3.
To start the dependencies locally:
docker compose -f services/docker-compose.yaml up
To stop them:
docker compose -f services/docker-compose.yaml down --remove-orphans
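Assuming the compose file exposes LocalStack on its default edge port 4566 and you have the AWS CLI installed, you can check that the mocked S3 is reachable with something like the following (the bucket name is only illustrative, and LocalStack accepts dummy credentials such as AWS_ACCESS_KEY_ID=test and AWS_SECRET_ACCESS_KEY=test):
aws --endpoint-url http://localhost:4566 s3 mb s3://test-bucket
aws --endpoint-url http://localhost:4566 s3 ls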
You can install a complete version of LaTeX from the LaTeX Project.
Please install the extensions suggested by the editor:
- LaTeX Workshop
- You should add the LaTeX binaries to your PATH environment variable. If you installed LaTeX as described above on macOS, add the following line to your ~/.zshrc:
export PATH=$PATH:/Library/TeX/texbin
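After reloading your shell (for example with source ~/.zshrc), you can check that the LaTeX binaries are found:
which pdflatex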
License headers are applied automatically using licensure. You can install it using:
cargo install licensure
and run it using:
licensure --project
www
: contains the website, loading the ssf implementation compiled to WebAssembly (probably not in scope)

ssf
: the actual core protocol of the Secure Shared Folder scheme

cli
: the Command Line Interface for the Secure Shared Folder scheme

baseline
: the naive implementation of the Shared Folder system to compare SSF against (for performance testing)

services
: dockerized dependencies to be used while developing. It contains databases, cloud services and so on. It also contains the code for the backends:

pki
: our Public Key Infrastructure service, basically a local Certificate Authority server

ds
: our SSF server (called DS, from Delivery Service, although we might change the name if we use a pre-built delivery service), providing the required infrastructure to store and retrieve files from the Cloud Storage Provider, and performing ACL checks on folder access.
As the S3 storage is not publicly available, we organise the system into three main components:
- CA server (PKI). To address security concerns we could also use KeyTransparency, but that is out of scope for the thesis. The server also provides a verify endpoint, but clients can perform the verification by themselves using the public CA certificate (this would implement the Authorization Service verify in our case).
- Client application. Each client creates a key pair for asymmetric encryption and registers with the CA. We want to re-use this as a form of authentication as well, instead of having a password, so we would like to use mTLS (see the sketch after this list).
- The SSF server, which is basically the company providing the storage. In our case, we offload the storage to S3. All the endpoints of the SSF are authenticated using mTLS. The SSF server is further divided into different logical components:
- DS: delivery service
- Storage service (manages the folders and ACLs to them)
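As a rough illustration of the registration and mTLS flow described above, the commands below show the generic OpenSSL/curl equivalent; this is not the project's actual PKI API, and the file names, subject, and port are hypothetical:
# client generates a key pair and a certificate signing request
openssl req -new -newkey rsa:2048 -nodes -keyout client.key -out client.csr -subj "/CN=alice"
# the CA (here a local ca.crt/ca.key pair) signs the request, "registering" the client
openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key -CAcreateserial -out client.crt -days 365
# the client certificate and key are then presented to the SSF server for mTLS-authenticated requests
curl --cert client.crt --key client.key --cacert ca.crt https://localhost:8443/folders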