Can LLM extraction use the OpenAI interface, or LangChain's OpenAI interface, to support more models? #1188
yf123s5
started this conversation in
Feature requests
What needs to be done?
Replace the LiteLLM interface with the OpenAI interface, or with LangChain's OpenAI interface. That way the application is not limited to specific LLM models; other strong models, including those from China, become usable as well.
What problem does this solve?
1. The OpenAI interface is compatible with more models, especially those from China.
2. The OpenAI interface lets us pass the parameters appropriate to each model, so extraction accuracy can be improved by tuning each model's parameters.
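The second point above can be sketched as a small helper. This is only an illustration, not the project's actual API: the dictionary of per-model defaults and the function name are assumptions, but the idea is that each model carries its own default sampling parameters, overridable at the call site.

```python
# Illustrative sketch (not the project's API): per-model default
# sampling parameters, merged with call-site overrides, so extraction
# accuracy can be tuned for each model independently.

MODEL_DEFAULTS = {
    "gpt-4o-mini": {"temperature": 0.0},
    "qwen-plus": {"temperature": 0.2, "top_p": 0.8},
}

def sampling_params(model: str, **overrides) -> dict:
    """Merge a model's default parameters with call-site overrides."""
    params = dict(MODEL_DEFAULTS.get(model, {}))
    params.update(overrides)
    return params

print(sampling_params("qwen-plus", temperature=0.0))
# → {'temperature': 0.0, 'top_p': 0.8}
```

The resulting dict can be splatted into a chat-completion call, so swapping models does not require touching the extraction code itself.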
Target users/beneficiaries
Developers in more countries, using a wider range of models.
Current alternatives/workarounds
No response
Proposed approach
While extracting data from the web, I found that other models can produce better, more accurate extraction results. For example, the Chinese Qwen3 model is very capable and relatively inexpensive. It is therefore worth using the OpenAI interface to support more models.
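A minimal sketch of the approach, assuming OpenAI-compatible endpoints: the provider names, env-var convention, and helper function below are illustrative, not part of this project. With the official `openai` Python SDK, the same call shape works against any compatible server (DashScope exposes Qwen models this way) simply by changing `base_url`.

```python
# Sketch only: provider table and helper are assumptions, not the
# project's API. The point is that targeting a different vendor is a
# configuration change, not a code change.
import os

PROVIDERS = {
    "openai": "https://api.openai.com/v1",
    # DashScope serves Qwen models behind an OpenAI-compatible API.
    "dashscope": "https://dashscope.aliyuncs.com/compatible-mode/v1",
}

def client_config(provider: str) -> dict:
    """Build kwargs for openai.OpenAI(**client_config(provider))."""
    if provider not in PROVIDERS:
        raise ValueError(f"unsupported provider: {provider}")
    return {
        "base_url": PROVIDERS[provider],
        # Env-var naming convention is an assumption for this sketch.
        "api_key": os.environ.get(f"{provider.upper()}_API_KEY", ""),
    }

# Usage sketch (requires `pip install openai`, hence commented out):
# from openai import OpenAI
# client = OpenAI(**client_config("dashscope"))
# client.chat.completions.create(model="qwen-plus", messages=[...])
```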