Example scripts and integration code for Akave documentation.
This repository contains reference implementations demonstrating how to integrate various tools and frameworks with Akave's S3-compatible storage.
The urandom repository provides production-ready example scripts that showcase Akave's capabilities across different use cases. Each directory contains standalone examples with detailed documentation.
Integration examples for using Hugging Face datasets with Akave's S3-compatible storage.
📖 Full Documentation
Files:
- `huggingface_s3.py` - Core `HuggingFaceS3` class providing methods to (a usage sketch follows this file list):
  - Transfer Hugging Face datasets to Akave storage
  - Save and load datasets from S3-compatible endpoints
  - List buckets and manage dataset paths
- `huggingface_test.py` - Comprehensive test suite with 7 examples:
  - Creating and saving custom datasets
  - Dataset transformations with tokenization
  - Transferring existing datasets (MNIST)
  - Dataset streaming
  - Dataset versioning
  - Different dataset formats (Arrow, Parquet)
  - Loading and saving pre-existing datasets
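The snippet below is a minimal sketch of the kind of round trip these files exercise, assuming a recent `datasets` release that accepts fsspec-style `storage_options`; it is not the repository's `HuggingFaceS3` API, and the dataset path is illustrative.

```python
# Sketch only: save a tiny dataset to Akave's S3-compatible storage and load it back.
import os
from datasets import Dataset, load_from_disk

# Credentials are assumed to come from the .env file described under Configuration below.
storage_options = {
    "key": os.environ["AKAVE_ACCESS_KEY"],
    "secret": os.environ["AKAVE_SECRET_KEY"],
    "client_kwargs": {"endpoint_url": os.environ["AKAVE_ENDPOINT_URL"]},
}

bucket = os.environ["AKAVE_BUCKET_NAME"]
path = f"s3://{bucket}/examples/tiny-dataset"  # illustrative path

ds = Dataset.from_dict({"text": ["hello", "world"], "label": [0, 1]})
ds.save_to_disk(path, storage_options=storage_options)

reloaded = load_from_disk(path, storage_options=storage_options)
print(reloaded)
```

The test suite covers richer scenarios (streaming, versioning, Arrow vs. Parquet), but they all build on this same save/load pattern against an S3-compatible endpoint.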
Configuration:
A template `.env.example` file is included in the `huggingface/` directory:
```
AKAVE_ACCESS_KEY=your_access_key
AKAVE_SECRET_KEY=your_secret_key
AKAVE_ENDPOINT_URL=your_endpoint_url
AKAVE_BUCKET_NAME=your_bucket_name
```

To use the examples, copy `.env.example` to a new `.env` file in the `huggingface/` directory and fill in your actual credentials.
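As a rough illustration (not code from this repository), these variables can be loaded with `python-dotenv` and handed to `s3fs`:

```python
# Sketch: load Akave credentials from .env and build an s3fs filesystem.
import os
from dotenv import load_dotenv
import s3fs

load_dotenv()  # reads huggingface/.env when run from that directory

fs = s3fs.S3FileSystem(
    key=os.environ["AKAVE_ACCESS_KEY"],
    secret=os.environ["AKAVE_SECRET_KEY"],
    client_kwargs={"endpoint_url": os.environ["AKAVE_ENDPOINT_URL"]},
)

# Quick connectivity check: list objects in the configured bucket.
print(fs.ls(os.environ["AKAVE_BUCKET_NAME"]))
```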
Direct S3FS integration examples for file operations with Akave O3 storage.
📖 Full Documentation
Files:
- `s3fs_test.py` - Complete test script demonstrating (see the sketch after this list):
  - Bucket listing and content exploration
  - File upload/download operations
  - Directory creation and management
  - File copying and deletion
  - Pandas DataFrame integration (CSV/JSON)
  - Reading file contents directly from storage
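The sketch below mirrors a few of those operations in isolation. It is illustrative rather than an excerpt from `s3fs_test.py`; the bucket and file names are placeholders, and the filesystem is built from the AWS CLI profile described under Configuration below.

```python
# Illustrative s3fs operations against Akave (placeholder names throughout).
import pandas as pd
import s3fs

fs = s3fs.S3FileSystem(
    profile="akave-o3",
    client_kwargs={"endpoint_url": "https://o3-rc2.akave.xyz"},
)

bucket = "your-bucket-name"

# Create a small local file to work with.
with open("local.txt", "w") as f:
    f.write("hello from Akave example\n")

print(fs.ls(bucket))                                 # list bucket contents
fs.put("local.txt", f"{bucket}/uploads/local.txt")   # upload
fs.get(f"{bucket}/uploads/local.txt", "copy.txt")    # download
fs.copy(f"{bucket}/uploads/local.txt", f"{bucket}/backup/local.txt")
fs.rm(f"{bucket}/uploads/local.txt")                 # delete

# Pandas integration: write and read a CSV directly from storage.
df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
with fs.open(f"{bucket}/data/example.csv", "w") as f:
    df.to_csv(f, index=False)
with fs.open(f"{bucket}/data/example.csv", "r") as f:
    print(pd.read_csv(f))
```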
Configuration:
Uses the AWS CLI profile `akave-o3` with the endpoint https://o3-rc2.akave.xyz.
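One way to confirm that the profile and endpoint work together (an illustrative check, not part of the repository) is via boto3:

```python
# Sketch: verify the akave-o3 profile against the Akave O3 endpoint.
import boto3

session = boto3.Session(profile_name="akave-o3")
s3 = session.client("s3", endpoint_url="https://o3-rc2.akave.xyz")

# List buckets visible to these credentials.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```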
Usage:
```bash
# Run all tests
python s3fs_test.py your-bucket-name --operation all

# List buckets and contents
python s3fs_test.py your-bucket-name --operation list

# Upload a file
python s3fs_test.py your-bucket-name --operation upload --file path/to/file

# Download a file
python s3fs_test.py your-bucket-name --operation download --file remote/path
```
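If you want to adapt the script, a command-line wrapper along these lines would reproduce the flags shown above. This is a hypothetical sketch, not the actual contents of `s3fs_test.py`.

```python
# Hypothetical CLI skeleton matching the flags used above (not the real script).
import argparse

def main():
    parser = argparse.ArgumentParser(description="Exercise s3fs operations against Akave")
    parser.add_argument("bucket", help="target bucket name")
    parser.add_argument("--operation", default="all",
                        choices=["all", "list", "upload", "download"],
                        help="which test to run")
    parser.add_argument("--file", help="local path for upload / remote path for download")
    args = parser.parse_args()

    # Dispatch to the relevant s3fs helpers here, e.g. fs.ls, fs.put, fs.get.
    print(f"Would run {args.operation!r} against bucket {args.bucket!r}")

if __name__ == "__main__":
    main()
```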
Requirements:

For the Hugging Face examples:

```bash
pip install s3fs datasets transformers pandas numpy python-dotenv
```

For the S3FS examples:

```bash
pip install s3fs pandas
```

Getting Started:

1. Clone the repository

   ```bash
   git clone <repository-url>
   cd urandom
   ```

2. Set up credentials

   - For Hugging Face examples: create a `.env` file in the `huggingface/` directory
   - For S3FS examples: configure the AWS CLI profile `akave-o3`

3. Install dependencies

   ```bash
   # Install packages individually as listed above
   ```

4. Run examples

   ```bash
   # Hugging Face
   cd huggingface
   python huggingface_test.py

   # S3FS
   cd s3fs
   python s3fs_test.py your-bucket-name
   ```
For complete documentation on Akave's S3-compatible storage and additional integration examples, visit docs.akave.xyz.