Replies: 2 comments
- Bro, have you solved this yet?
- Has anyone solved this? I deliberately ran docker commit first, but after pulling the image into an offline environment it still tries to reach the network for a check, so the service fails to start.
-
C:\Users\Admin>paddlex --serve --pipeline OCR
No model hoster is available! Please check your network connection to one of the following model hosts:
HuggingFace (https://huggingface.co),
ModelScope (https://modelscope.cn),
AIStudio (https://aistudio.baidu.com), or
BOS (https://paddle-model-ecology.bj.bcebos.com).
Otherwise, only local models can be used.
Creating model: ('PP-LCNet_x1_0_doc_ori', None)
Using official model (PP-LCNet_x1_0_doc_ori), the model files will be automatically downloaded and saved in C:\Users\Admin\.paddlex\official_models.
No available model hosting platforms detected. Please check your network connection.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Scripts\paddlex.exe\__main__.py", line 6, in <module>
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\__main__.py", line 26, in console_entry
    main()
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\paddlex_cli.py", line 481, in main
    serve(
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\paddlex_cli.py", line 380, in serve
    pipeline = create_pipeline(
               ^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\__init__.py", line 166, in create_pipeline
    pipeline = BasePipeline.get(pipeline_name)(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\utils\deps.py", line 202, in _wrapper
    return old_init_func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 103, in __init__
    self._pipeline = self._create_internal_pipeline(config, self.device)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 158, in _create_internal_pipeline
    return self.pipeline_cls(
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\ocr\pipeline.py", line 76, in __init__
    self.doc_preprocessor_pipeline = self.create_pipeline(
                                     ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\base.py", line 138, in create_pipeline
    pipeline = create_pipeline(
               ^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\__init__.py", line 166, in create_pipeline
    pipeline = BasePipeline.get(pipeline_name)(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\utils\deps.py", line 202, in _wrapper
    return old_init_func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 103, in __init__
    self._pipeline = self._create_internal_pipeline(config, self.device)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 158, in _create_internal_pipeline
    return self.pipeline_cls(
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\doc_preprocessor\pipeline.py", line 69, in __init__
    self.doc_ori_classify_model = self.create_model(doc_ori_classify_config)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\pipelines\base.py", line 105, in create_model
    model = create_predictor(
            ^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\models\__init__.py", line 69, in create_predictor
    model_dir = official_models[model_name]
                ~~~~~~~~~~~~~~~^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\utils\official_models.py", line 577, in __getitem__
    return self._get_model_local_path(model_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Admin\AppData\Local\Programs\python\Python311\Lib\site-packages\paddlex\inference\utils\official_models.py", line 552, in _get_model_local_path
    raise Exception(msg)
Exception: No available model hosting platforms detected. Please check your network connection.
I have started the program several times already.
Because the runtime environment at our workplace has no internet access, I would appreciate an expert taking a look.
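From the log above, the error message says "Otherwise, only local models can be used" and the models are cached under C:\Users\Admin\.paddlex\official_models, so one possible offline workaround is to populate that cache on a machine with internet access and copy it into the same location on the offline host. The helper below is only a hedged sketch of a pre-flight check, not part of the PaddleX API; the cache path and the model name PP-LCNet_x1_0_doc_ori are taken from the log, everything else is an assumption.

```python
# Hedged sketch: before running `paddlex --serve --pipeline OCR` on an
# offline machine, verify that the official-model cache already contains
# the models the pipeline needs. This is a plain filesystem check, not a
# PaddleX API call.
from pathlib import Path


def missing_models(cache_dir: str, model_names: list[str]) -> list[str]:
    """Return the model names that have no directory in the local cache."""
    root = Path(cache_dir)
    return [name for name in model_names if not (root / name).is_dir()]


# Cache path and model name as they appear in the log above.
missing = missing_models(r"C:\Users\Admin\.paddlex\official_models",
                         ["PP-LCNet_x1_0_doc_ori"])
if missing:
    print("Copy these model folders from an online machine first:", missing)
```

If the check reports missing models, running the same paddlex command once on a connected machine to trigger the downloads, then copying the whole official_models folder into the identical path on the offline host, should let PaddleX fall back to the local copies instead of contacting HuggingFace, ModelScope, AIStudio, or BOS.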