Description
The server starts normally:
Going to Run Comand
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle_serving_server/serving-cpu-avx-mkl-0.9.0/serving -enable_model_toolkit -inferservice_path workdir_9393 -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 4 -port 9393 -precision fp32 -use_calib=False -reload_interval_s 10 -resource_path workdir_9393 -resource_file resource.prototxt -workflow_path workdir_9393 -workflow_file workflow.prototxt -bthread_concurrency 4 -max_body_size 536870912
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDetectionOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDistKVInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralDistKVQuantInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralFeatureExtractOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralInferOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralPicodetOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralReaderOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralRecOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralRemoteOp
I0100 00:00:00.000000 13001 op_repository.h:68] RAW: Succ regist op: GeneralResponseOp
I0100 00:00:00.000000 13001 service_manager.h:79] RAW: Service[LoadGeneralModelService] insert successfully!
I0100 00:00:00.000000 13001 load_general_model_service.pb.h:333] RAW: Success regist service[LoadGeneralModelService][PN5baidu14paddle_serving9predictor26load_general_model_service27LoadGeneralModelServiceImplE]
I0100 00:00:00.000000 13001 service_manager.h:79] RAW: Service[GeneralModelService] insert successfully!
I0100 00:00:00.000000 13001 general_model_service.pb.h:1650] RAW: Success regist service[GeneralModelService][PN5baidu14paddle_serving9predictor13general_model23GeneralModelServiceImplE]
I0100 00:00:00.000000 13001 factory.h:155] RAW: Succ insert one factory, tag: PADDLE_INFER, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 13001 paddle_engine.cpp:34] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine->::baidu::paddle_serving::predictor::InferEngine, tag: PADDLE_INFER in macro!
--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_graph_clean_pass]
--- Running analysis [ir_analysis_pass]
--- Running analysis [ir_params_sync_among_devices_pass]
--- Running analysis [adjust_cudnn_workspace_size_pass]
--- Running analysis [inference_op_replace_pass]
--- Running analysis [memory_optimize_pass]
--- Running analysis [ir_graph_to_program_pass]
C++ Serving service started successfully!
When running a prediction over HTTP, an error is reported:
aistudio@jupyter-3310911-5426973:/Serving$ curl -XPOST http://0.0.0.0:9393/GeneralModelService/inference -d '{"x":"/home/aistudio/000004.png"}'
[10.36.12.114:9393][E-5100]InferService inference failed!
Are HTTPS calls supported?