Cross-Level Fusion for Rotating Machinery Fault Diagnosis under Compound Variable Working Conditions

Official implementation of our paper: "Cross-Level Fusion for Rotating Machinery Fault Diagnosis under Compound Variable Working Conditions."

The project has been restructured with the assistance of a large language model. If you run into any issues with the code, please contact me.

Paper Information

Citation

@article{wang2022cross,
  title={Cross-Level fusion for rotating machinery fault diagnosis under compound variable working conditions},
  author={Wang, Sihan and Wang, Dazhi and Kong, Deshan and Li, Wenhui and Wang, Huanjie and Pecht, Michael},
  journal={Measurement},
  volume={199},
  pages={111455},
  year={2022},
  publisher={Elsevier}
}

Overview

This repository provides a unified implementation of multiple few-shot learning methods for rotating machinery fault diagnosis under compound variable working conditions. The code supports various approaches including:

  • Relation Networks: Traditional relation-based few-shot learning
  • Prototypical Networks: Distance-based prototypical learning
  • Cosine Similarity: Image-to-class metric learning
  • KL Divergence: Multi-level metric learning with KL divergence
  • Wasserstein Distance: Multi-level metric learning with Wasserstein distance

Features

  • Unified Framework: Single codebase supporting multiple few-shot learning methods
  • Multi-Domain Support: Handles both 1D time series and 2D image data
  • Flexible Configuration: Easy-to-use configuration system
  • Comprehensive Evaluation: Built-in testing and evaluation metrics
  • Clean Architecture: Well-organized, maintainable code structure

Installation

Requirements

  • Python 3.7+
  • PyTorch 1.8+
  • NumPy
  • Matplotlib
  • SciPy
  • PIL (Pillow)

Setup

# Clone the repository
git clone https://github.com/wongsihan/multi-domain-few-shot-learning.git
cd multi-domain-few-shot-learning

# Install dependencies
pip install torch torchvision numpy matplotlib scipy pillow

Usage

Quick Start

  1. Train a relation network model:
     python train.py --method relation --modeltype 1d --class_num 5 --sample_num_per_class 3
  2. Train a prototypical network:
     python train.py --method proto --modeltype 1d --class_num 5 --sample_num_per_class 1
  3. Fine-tune a pre-trained model:
     python finetune.py --sample_num_per_class 10 --unfrozen 2

Run All Experiments

python run_experiments.py

Configuration

The config.py file contains all experiment configurations. You can modify parameters such as:

  • Model architecture (1D/2D)
  • Number of classes and samples
  • Training episodes
  • Learning rates
  • Data types
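The README does not reproduce config.py itself; as a rough sketch (the parameter names below are illustrative, the real ones live in the repository's config.py), the options above might be grouped like this:

```python
# Hypothetical sketch of the kinds of parameters config.py exposes;
# actual names and defaults are defined in the repository's config.py.
CONFIG = {
    "modeltype": "1d",          # "1d" for time series, "2d" for images
    "class_num": 5,             # N in N-way classification
    "sample_num_per_class": 3,  # K in K-shot learning
    "episode": 10000,           # number of training episodes
    "learning_rate": 1e-3,
    "data_type": "PU",          # dataset identifier
}

def get(key):
    """Look up a configuration value, failing loudly on typos."""
    if key not in CONFIG:
        raise KeyError(f"Unknown config key: {key}")
    return CONFIG[key]
```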

Project Structure

multi-domain-few-shot-learning/
├── train.py                # Main training script
├── finetune.py             # Fine-tuning script
├── run_experiments.py      # Batch experiment runner
├── config.py               # Configuration settings
├── utils/                  # Utility modules
│   ├── models.py           # Neural network models
│   ├── data_generator.py   # Data loading and preprocessing
│   └── logger.py           # Logging utilities
├── results/                # Experiment results
└── README.md               # This file

Methods Supported

1. Relation Networks

Traditional few-shot learning using relation networks to compare query and support samples.
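A minimal sketch of what a relation module computes (an assumed architecture for illustration, not the exact network from the paper): each class-level support embedding is paired with each query embedding, the pair is concatenated, and a small network maps it to a relation score in [0, 1].

```python
import torch
import torch.nn as nn

class RelationModule(nn.Module):
    """Toy relation module: score how well a query matches each class."""

    def __init__(self, feature_dim=64, hidden_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim * 2, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
            nn.Sigmoid(),  # relation score between 0 and 1
        )

    def forward(self, support, query):
        # support: (num_classes, feature_dim) class-level embeddings
        # query:   (num_query, feature_dim)
        n_q, n_c = query.size(0), support.size(0)
        pairs = torch.cat(
            [support.unsqueeze(0).expand(n_q, -1, -1),
             query.unsqueeze(1).expand(-1, n_c, -1)], dim=2)
        return self.net(pairs).squeeze(-1)  # (num_query, num_classes)
```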

2. Prototypical Networks

Distance-based learning using class prototypes computed from support samples.
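The standard prototypical-network scoring rule can be sketched as follows (this is the textbook formulation, not necessarily the exact code in train.py): each class prototype is the mean of its support embeddings, and queries are scored by negative squared Euclidean distance to each prototype.

```python
import torch

def prototype_logits(support, support_labels, query, num_classes):
    """Return per-class logits for each query embedding."""
    # support: (n_support, d); support_labels: (n_support,); query: (n_query, d)
    prototypes = torch.stack(
        [support[support_labels == c].mean(dim=0) for c in range(num_classes)])
    distances = torch.cdist(query, prototypes) ** 2  # (n_query, num_classes)
    return -distances  # higher logit = closer to the class prototype
```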

3. Cosine Similarity

Image-to-class metric learning using cosine similarity between local descriptors.
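An image-to-class cosine measure can be sketched like this (the pooling and top-k details are assumptions; the repository's implementation may differ): each local descriptor of a query sample is compared against all descriptors pooled from a class's support set, and the best matches are summed into a class score.

```python
import torch
import torch.nn.functional as F

def image_to_class_score(query_desc, class_desc, k=3):
    """Sum the top-k cosine similarities of each query descriptor to a class pool."""
    # query_desc: (n_q, d) local descriptors of one query sample
    # class_desc: (n_c, d) descriptors pooled from the class's support set
    q = F.normalize(query_desc, dim=1)
    c = F.normalize(class_desc, dim=1)
    sims = q @ c.t()  # (n_q, n_c) cosine similarities
    topk = sims.topk(k=min(k, sims.size(1)), dim=1).values
    return topk.sum()  # scalar image-to-class score
```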

4. KL Divergence

Multi-level metric learning incorporating KL divergence between feature distributions.
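A sketch of a KL-divergence metric between softmax-normalized feature vectors (an assumed formulation for illustration, not the paper's exact multi-level construction): features are turned into probability distributions and queries are scored by KL(query ‖ class).

```python
import torch
import torch.nn.functional as F

def kl_distance(query_feat, class_feat):
    """Pairwise KL divergence between softmax-normalized feature vectors."""
    # query_feat: (n_q, d), class_feat: (n_c, d)
    log_q = F.log_softmax(query_feat, dim=1)  # (n_q, d)
    log_p = F.log_softmax(class_feat, dim=1)  # (n_c, d)
    q = log_q.exp()
    # KL(q_i || p_j) = sum_d q_i[d] * (log q_i[d] - log p_j[d])
    kl = (q.unsqueeze(1) * (log_q.unsqueeze(1) - log_p.unsqueeze(0))).sum(-1)
    return kl  # (n_q, n_c); smaller means more similar
```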

5. Wasserstein Distance

Multi-level metric learning using Wasserstein distance for distribution matching.
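For equal-sized 1-D samples, the Wasserstein-1 (earth mover's) distance reduces to the mean absolute difference of sorted values; a simplified sketch of that special case (the paper's multi-level metric is more involved):

```python
import torch

def wasserstein_1d(a, b):
    """W1 distance between two equal-sized 1-D empirical distributions."""
    a_sorted, _ = torch.sort(a)
    b_sorted, _ = torch.sort(b)
    return (a_sorted - b_sorted).abs().mean()
```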

Data Format

The code expects the Paderborn University (PU) bearing dataset, organized as follows:


data/
├── KA01/
│   └── KA01/
│       ├── N09_M07_F10_KA01_1.mat
│       ├── N09_M07_F10_KA01_2.mat
│       └── ...
├── KA03/
└── ...
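A sketch of walking this tree with scipy.io.loadmat (the variable names inside the PU .mat files depend on the recording, so inspect mat.keys() on your copy of the dataset):

```python
from pathlib import Path
from scipy.io import loadmat

def load_pu_signals(root="data"):
    """Load every .mat file under root, keyed by file name (without extension)."""
    signals = {}
    for mat_path in sorted(Path(root).rglob("*.mat")):
        mat = loadmat(str(mat_path))
        # keys starting with "__" are MATLAB metadata, not measurement data
        data_keys = [k for k in mat if not k.startswith("__")]
        signals[mat_path.stem] = {k: mat[k] for k in data_keys}
    return signals
```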

Results

Training results are saved in the results/ directory, including:

  • Model checkpoints
  • Training logs
  • Accuracy plots
  • Performance metrics

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Original paper authors for the research work
  • PyTorch team for the deep learning framework
  • Contributors to the few-shot learning community
