Hi @NVIDIA-NeMo 🤗
Niels here from the open-source team at Hugging Face. I discovered your work through Hugging Face's daily papers, where it was featured: https://huggingface.co/papers/2604.12374.
The paper page lets people discuss your paper and find its associated artifacts (your models, datasets, or demos, for instance). You can also claim
the paper as yours, which will show it on your public HF profile, and add GitHub and project page URLs.
It'd be great to make all the Nemotron 3 Super checkpoints and the "Nemotron-Super-Post-Training-Data" dataset available on the 🤗 Hub to improve their discoverability/visibility.
We can add tags so that people find them when filtering https://huggingface.co/models and https://huggingface.co/datasets.
I found the `NVIDIA-Nemotron-3-Super-120B-A12B-BF16` and `Nemotron-3-Super-49B-v1` models, and the `Nemotron-Pretraining-Specialized-v1.1` dataset already on the Hub through your GitHub repository, which is fantastic!
However, the paper mentions additional Nemotron 3 Super 120B-A12B checkpoints in various quantization formats:
- Nemotron 3 Super 120B-A12B NVFP4
- Nemotron 3 Super 120B-A12B FP8
- Nemotron 3 Super 120B-A12B Base BF16
And also a reward model:
- Qwen3-Nemotron-235B-A22B-GenRM-2603
Could you please provide the Hugging Face links for these specific model variants and the reward model?
Additionally, the paper mentions the "Nemotron-Super-Post-Training-Data" dataset. I couldn't find a direct link for this specific dataset ID in the GitHub repository. Could you clarify if this dataset is available on Hugging Face and provide its URL, or if it's planned for release?
Uploading models
See here for a guide: https://huggingface.co/docs/hub/models-uploading.
In this case, we could leverage the `PyTorchModelHubMixin` class, which adds `from_pretrained` and `push_to_hub` to any custom `nn.Module`. Alternatively, one can leverage the `hf_hub_download` one-liner to download a checkpoint from the Hub.
We encourage researchers to push each model checkpoint to a separate model repository, so that things like download stats also work. We can then also link the checkpoints to the paper page.
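As a minimal sketch of the mixin approach described above (the class name, repo id, and `hidden_size` parameter are illustrative placeholders, not anything from your codebase):

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin, hf_hub_download

# Inheriting from PyTorchModelHubMixin alongside nn.Module gives the
# class from_pretrained() and push_to_hub() for free.
class MyModel(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, x):
        return self.linear(x)

model = MyModel()
# Upload weights + config to the Hub (requires authentication):
# model.push_to_hub("your-org/your-model")
# Re-instantiate from the Hub later:
# model = MyModel.from_pretrained("your-org/your-model")

# Alternatively, fetch a single checkpoint file directly:
# path = hf_hub_download(repo_id="your-org/your-model", filename="model.safetensors")
```

The commented-out calls hit the network, so they are left as references; the key point is that no custom loading code is needed once the mixin is in place.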
Uploading dataset
Would be awesome to make the dataset available on 🤗, so that people can do:

```python
from datasets import load_dataset

dataset = load_dataset("your-hf-org-or-username/your-dataset")
```
See here for a guide: https://huggingface.co/docs/datasets/loading.
Besides that, there's the dataset viewer which allows people to quickly explore the first few rows of the data in the browser.
Let me know if you're interested/need any help regarding this!
Cheers,
Niels
ML Engineer @ HF 🤗