Thanks for your work! I noticed that the inference script in [UltraChat/UltraLM/inference_cli.py at main · thunlp/UltraChat · GitHub](https://github.com/thunlp/UltraChat/blob/main/UltraLM/inference_cli.py) is still a vanilla implementation. Do you plan to provide deployment scripts for low-resource devices such as a MacBook?