# Cross-Level Fusion for Rotating Machinery Fault Diagnosis under Compound Variable Working Conditions
Official implementation of our paper: "Cross-Level Fusion for Rotating Machinery Fault Diagnosis under Compound Variable Working Conditions."
The project has been restructured with the help of a large language model. If you run into any issues with the code, please contact me.
## Citation

- Paper: Cross-Level fusion for rotating machinery fault diagnosis under compound variable working conditions
- Journal: Measurement
- Volume: 199
- Pages: 111455
- Year: 2022
- Publisher: Elsevier
```bibtex
@article{wang2022cross,
  title={Cross-Level fusion for rotating machinery fault diagnosis under compound variable working conditions},
  author={Wang, Sihan and Wang, Dazhi and Kong, Deshan and Li, Wenhui and Wang, Huanjie and Pecht, Michael},
  journal={Measurement},
  volume={199},
  pages={111455},
  year={2022},
  publisher={Elsevier}
}
```

## Overview

This repository provides a unified implementation of multiple few-shot learning methods for rotating machinery fault diagnosis under compound variable working conditions. The supported approaches include:
- Relation Networks: Traditional relation-based few-shot learning
- Prototypical Networks: Distance-based prototypical learning
- Cosine Similarity: Image-to-class metric learning
- KL Divergence: Multi-level metric learning with KL divergence
- Wasserstein Distance: Multi-level metric learning with Wasserstein distance
## Features

- Unified Framework: Single codebase supporting multiple few-shot learning methods
- Multi-Domain Support: Handles both 1D time series and 2D image data
- Flexible Configuration: Easy-to-use configuration system
- Comprehensive Evaluation: Built-in testing and evaluation metrics
- Clean Architecture: Well-organized, maintainable code structure
## Requirements

- Python 3.7+
- PyTorch 1.8+
- NumPy
- Matplotlib
- SciPy
- PIL (Pillow)
## Installation

```bash
# Clone the repository
git clone https://github.com/your-username/multi-domain-few-shot-learning.git
cd multi-domain-few-shot-learning

# Install dependencies
pip install torch torchvision numpy matplotlib scipy pillow
```

## Usage

Train a relation network model:
```bash
python train.py --method relation --modeltype 1d --class_num 5 --sample_num_per_class 3
```

Train a prototypical network:
```bash
python train.py --method proto --modeltype 1d --class_num 5 --sample_num_per_class 1
```

Fine-tune a pre-trained model:
```bash
python finetune.py --sample_num_per_class 10 --unfrozen 2
```

Run the full batch of experiments:

```bash
python run_experiments.py
```

## Configuration

The config.py file contains all experiment configurations. You can modify parameters such as:
- Model architecture (1D/2D)
- Number of classes and samples
- Training episodes
- Learning rates
- Data types
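As an illustration, here is a minimal sketch of the kind of settings config.py might expose. The key names below are assumptions for illustration; check the file itself for the actual parameters.

```python
# Hypothetical configuration sketch -- key names are illustrative,
# not the repository's actual identifiers.
CONFIG = {
    "modeltype": "1d",          # "1d" for time series, "2d" for images
    "class_num": 5,             # N in N-way classification
    "sample_num_per_class": 3,  # K shots per class in the support set
    "episodes": 10000,          # number of training episodes
    "learning_rate": 1e-3,
    "data_type": "PU",          # dataset identifier
}

def get_config(**overrides):
    """Return a copy of CONFIG with per-experiment overrides applied."""
    cfg = dict(CONFIG)
    cfg.update(overrides)
    return cfg
```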
## Project Structure

```
multi-domain-few-shot-learning/
├── train.py              # Main training script
├── finetune.py           # Fine-tuning script
├── run_experiments.py    # Batch experiment runner
├── config.py             # Configuration settings
├── utils/                # Utility modules
│   ├── models.py         # Neural network models
│   ├── data_generator.py # Data loading and preprocessing
│   └── logger.py         # Logging utilities
├── results/              # Experiment results
└── README.md             # This file
```
## Methods

### Relation Networks
Traditional few-shot learning using relation networks to compare query and support samples.

### Prototypical Networks
Distance-based learning using class prototypes computed from support samples.

### Cosine Similarity
Image-to-class metric learning using cosine similarity between local descriptors.

### KL Divergence
Multi-level metric learning incorporating KL divergence between feature distributions.

### Wasserstein Distance
Multi-level metric learning using Wasserstein distance for distribution matching.
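To illustrate the prototype idea, here is a minimal NumPy sketch (not the repository's implementation, which operates on learned PyTorch embeddings): each class prototype is the mean support embedding for that class, and each query is assigned to its nearest prototype.

```python
import numpy as np

def prototypes(support, labels, n_classes):
    """Mean support embedding per class -> array of shape (n_classes, feat_dim)."""
    return np.stack([support[labels == c].mean(axis=0) for c in range(n_classes)])

def classify(query, protos):
    """Assign each query to the prototype with the smallest squared Euclidean distance."""
    dists = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)
```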
## Dataset

The code expects the PU (Paderborn University) bearing dataset with the following structure:

```
data/
├── KA01/
│   └── KA01/
│       ├── N09_M07_F10_KA01_1.mat
│       ├── N09_M07_F10_KA01_2.mat
│       └── ...
├── KA03/
└── ...
```
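For reference, a minimal sketch of how files in this layout could be read with SciPy. The `var_name` argument is a placeholder: the actual MATLAB variable name inside PU `.mat` files depends on the recording, so this is an assumption, not the repository's loader.

```python
import numpy as np
from pathlib import Path
from scipy.io import loadmat

def load_mat_signals(root, bearing_code, var_name):
    """Collect the named variable from every .mat file under root/<code>/<code>/."""
    folder = Path(root) / bearing_code / bearing_code
    signals = []
    for mat_path in sorted(folder.glob("*.mat")):
        mat = loadmat(str(mat_path))          # dict of MATLAB variables
        signals.append(np.asarray(mat[var_name]).squeeze())
    return signals
```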
## Results

Training results are saved in the results/ directory, including:
- Model checkpoints
- Training logs
- Accuracy plots
- Performance metrics
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Submit a pull request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- Original paper authors for the research work
- PyTorch team for the deep learning framework
- Contributors to the few-shot learning community