Official repository for "Exploiting Multimodal Knowledge Graph for Multimodal Machine Translation", TMM 2025.

Exploiting Multimodal Knowledge Graph for Multimodal Machine Translation

Our implementation is based on FairSeq. The datasets used in the experiments are Multi30k and IKEA.

🚀 Getting Started

Install the dependencies using pip:

pip install -r requirements.txt

The method is designed to be plug-and-play, so it can be applied to various low-resource corpora. A multimodal knowledge graph matched to a given corpus can be built by running either crawl_direct.py or crawl_indirect.py:

python crawl_direct.py

python crawl_indirect.py
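The README does not document the on-disk format of the crawled graph, so as a purely hypothetical illustration (names and structure are assumptions, not the repository's actual schema), a multimodal knowledge graph entry might link a textual entity to related entities and an image reference:

```python
# Hypothetical sketch only: the repository does not specify how the
# crawled multimodal knowledge graph is stored. Here each entity maps
# to textual neighbors plus an associated image path.
mmkg = {
    "dog": {"related": ["puppy", "hound"], "image": "images/dog.jpg"},
    "ball": {"related": ["toy"], "image": "images/ball.jpg"},
}

def neighbors(entity: str) -> list[str]:
    """Return the textual neighbors of an entity, or [] if unknown."""
    return mmkg.get(entity, {}).get("related", [])
```

A lookup such as `neighbors("dog")` would then return the substitutable entities used downstream for data augmentation.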

Subsequently, data augmentation can be performed with generate_pseudo_data.py:

python generate_pseudo_data.py
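The pseudo-data generation step is not detailed here; a minimal sketch of one plausible approach, entity substitution guided by the knowledge graph, is shown below. The function name `make_pseudo_sentences` and the `kg` mapping are illustrative assumptions, not the repository's API, and a real multimodal pipeline would also pair each pseudo sentence with the matching image.

```python
# Hypothetical sketch of entity-substitution pseudo-data generation.
# `kg` maps an entity to substitutable neighbors (an assumed structure).
kg = {"dog": ["puppy"], "ball": ["toy"]}

def make_pseudo_sentences(sentence: str, kg: dict[str, list[str]]):
    """Yield pseudo sentences by swapping one known entity for a neighbor."""
    for entity, subs in kg.items():
        if entity in sentence.split():
            for sub in subs:
                yield sentence.replace(entity, sub)
```

For example, `make_pseudo_sentences("a dog plays with a ball", kg)` yields one variant per substitutable entity, enlarging the training corpus.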

For reference, the Multi30k and IKEA datasets augmented with our method are available here (password: yUxF).

📖 Citation

If you find our paper and code useful in your research, please consider giving the repository a star ⭐ and citing our paper 📖.

@article{mmkg_mmt,
  author  = {Xu, Tianjiao and Liu, Xuebo and Wong, Derek F. and Zhang, Yue and Chao, Lidia S. and Zhang, Min and Gan, Tian},
  title   = {Exploiting Multimodal Knowledge Graph for Multimodal Machine Translation},
  journal = {IEEE Transactions on Multimedia},
  year    = {2025},
}

License

The code is released under the Apache-2.0 License. The dataset is released under the CC BY-NC-SA 4.0 license.
