Replies: 3 comments
-
The OpenAI Batch API is not real-time; it can take up to 24 hours to get the results back. https://platform.openai.com/docs/guides/batch
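For context, a minimal sketch of what one request line in a batch input file looks like, per the linked docs (the custom_id, model name, and note text here are placeholders, not from this thread); the only completion window the Batch API accepts is "24h", which is why results can take up to a day:

```python
import json

# One line of the batch input JSONL: a single embedding request keyed by a custom_id.
line = {
    "custom_id": "note-0001",                  # placeholder id for one vault note
    "method": "POST",
    "url": "/v1/embeddings",
    "body": {
        "model": "text-embedding-3-small",     # placeholder embedding model
        "input": "Contents of the note to embed...",
    },
}
print(json.dumps(line))
```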
-
Exactly because it's not real-time, it can be useful the first time you index a large vault: that's going to take a long time anyway, so you might as well make it cost a lot less.
-
@Mahgozar are you okay with waiting up to 24 hours for your index to be ready? Most people are not. You might as well use a local model for free, and it's much faster as well.
-
The Batch API offers significantly reduced costs, especially for embedding models when you have a lot of content to index.
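A minimal sketch of how a one-time bulk indexing job could be submitted with the Batch API, using the official openai Python client; the file name and the assumption that requests.jsonl was already written (one embedding request per note) are illustrative, not from this thread:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload the pre-built JSONL of embedding requests as a batch input file.
batch_file = client.files.create(
    file=open("requests.jsonl", "rb"),
    purpose="batch",
)

# Start the batch job against the embeddings endpoint.
# Batch requests are billed at a discount relative to the synchronous API,
# which is what makes this attractive for a large initial index.
job = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/embeddings",
    completion_window="24h",   # the only window the Batch API offers
)
print(job.id, job.status)      # poll later; results arrive as an output file
```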