Commit 3c786ae

committed: formatted blog better.

1 parent 742b968 commit 3c786ae

File tree

1 file changed: +1 −1 lines changed

  • src/routes/blogs/deepseek-r1-on-device


src/routes/blogs/deepseek-r1-on-device/+page.svx

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ url: 'https://onnxruntime.ai/blogs/deepseek-r1-on-device'
 ---
 Are you a developer looking to harness the power of your users' local compute for AI inferencing on PCs with NPUs, GPUs, and CPUs? Look no further!
 
-With the new release you can now run these models on CPU and GPU. Additionally you can also these models on NPU: [Running Distilled DeepSeek R1 models locally on Copilot+ PCs, powered by Windows Copilot Runtime - Windows Developer Blog](https://blogs.windows.com/windowsdeveloper/2025/01/29/running-distilled-deepseek-r1-models-locally-on-copilot-pcs-powered-by-windows-copilot-runtime/) You can now download and run the ONNX optimized variants of the models from [Hugging Face](https://huggingface.co/onnxruntime/DeepSeek-R1-Distill-ONNX).
+With the new release you can now run these models on CPU and GPU. Additionally you can also these models on NPU: [Windows Developer Blog](https://blogs.windows.com/windowsdeveloper/2025/01/29/running-distilled-deepseek-r1-models-locally-on-copilot-pcs-powered-by-windows-copilot-runtime/). You can now download and run the ONNX optimized variants of the models from [Hugging Face](https://huggingface.co/onnxruntime/DeepSeek-R1-Distill-ONNX).
 
 
 
