Conversation
```diff
 readme = "README.md"
 requires-python = ">=3.12"
 dependencies = [
-    "gel>=0.0.1",
+    "gel[ai]>=0.0.1",
```
Oh yes, I will update these version pins at the end. I couldn't even install the deps from pyproject.toml; I saw an error about having both `app` and `dbschema` at the root level, but I didn't try to fix it. @deepbuzin, were you able to install the deps for the FastAPI tutorial? You have a similar folder structure there.
Don't remember running into any issues in either case. I'm using uv. Tell me what to do to reproduce and I'll look into it.
@deepbuzin, my question is: how do you install the deps in the FastAPI tutorial without explicitly listing all of them? The other examples have a Makefile, but since the FastAPI tutorial doesn't, maybe it's not mandatory?
For users who just want to clone the example and start using it, without following the tutorial or installing the deps one by one, what is the single command that installs everything when there's no Makefile?
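For reference, with a pyproject.toml-based project and uv (which is mentioned in this thread), the one-shot install would typically look like the following. These are generic commands, not something confirmed for this repo's layout:

```shell
# Install everything listed under [project.dependencies] into a local .venv
uv sync

# Alternatively, plain pip can install straight from pyproject.toml:
pip install .
```

Either of these replaces a Makefile target for the "clone and run one command" use case.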
```diff
 requires-python = ">=3.12"
 dependencies = [
     "gel>=0.0.1",
-    "fastapi>=0.109.0",
+    "fastapi[standard]>=0.109.0",
```
vectorstore-image-search/app/main.py (outdated)
I think it would be better to replace this CLIP implementation with Hugging Face Transformers. It's better maintained, and it would also unlock SigLIP, a slightly more modern variation of the same idea.
I'm also in favor of replacing the OpenAI library dependency with a raw HTTP request (or, better yet, an HTTP request to Ollama to go fully local).
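A rough sketch of what the raw-HTTP route might look like. This only builds the request (nothing is sent), and the endpoint path and model name are assumptions about a local Ollama setup, not anything pinned down in this thread:

```python
import json
import urllib.request

# Assumed local Ollama endpoint and embedding model -- adjust to your setup.
OLLAMA_URL = "http://localhost:11434/api/embeddings"

def build_embedding_request(text: str, model: str = "nomic-embed-text") -> urllib.request.Request:
    """Build (but do not send) a POST request asking Ollama for an embedding."""
    payload = json.dumps({"model": model, "prompt": text}).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_embedding_request("a photo of a cat")
print(req.get_method(), req.full_url)
```

Sending it would just be `urllib.request.urlopen(req)` and reading the embedding out of the JSON response, so the OpenAI client dependency could be dropped entirely.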
Using OpenAI and CLIP is easier for the tutorial, and there's a bigger chance people are already familiar with them; we can note somewhere in the tutorial that readers can also use these alternatives. I can use raw HTTP for the OpenAI call (even though I don't think it makes a big difference; we also use FastAPI for HTTP requests). But of course, if you and others think it would be better, I'll update it :)
On the other hand, if Transformers and Ollama are the better/more popular way to do it nowadays, I will update. I just hadn't heard of them and didn't come across them when I researched the topic :D
I will leave this one up to you :) FWIW, I think a mild sprinkling of unfamiliar things would only add to the excitement, especially because both are very user-friendly (and Ollama is free of charge, unlike OpenAI). Anyway, you decide.
vectorstore-image-search/README.md (outdated)
Some guide to get this example running would be nice.
Oh yes, this will be added.
```python
import os
import torch
from pathlib import Path
from dotenv import load_dotenv
```
(no need to fix) FYI, there's the FastAPI-suggested pydantic-settings, as well as the already-included Starlette `Config` class. Such high-level config infrastructures are more popular in real applications.
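For comparison, the core pattern those libraries provide (typed settings read once from the environment, with defaults) can be approximated with nothing but the stdlib. A minimal sketch; the setting names here are made up, not taken from this app:

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    """Typed app settings, read from the environment with fallbacks."""
    openai_api_key: str = os.environ.get("OPENAI_API_KEY", "")
    images_dir: str = os.environ.get("IMAGES_DIR", "images")

settings = Settings()
print(settings.images_dir)
```

pydantic-settings layers validation and `.env` file loading on top of this same idea, which is why it tends to win out over scattered `os.getenv` calls.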
(trivial) It's usually good practice to sort the imports and put them into groups (builtins, third-party libraries, the app's own package).
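Applied to the imports quoted above, the grouping might look like this (the third-party and local lines are commented out so the snippet stands alone, and the local module name is hypothetical):

```python
# --- standard library ---
import os
from pathlib import Path

# --- third-party ---
# import torch
# from dotenv import load_dotenv

# --- local package ---
# from app import constants   # hypothetical module name

print(Path(os.curdir).resolve())
```

Tools like isort or Ruff's `I` rules can enforce this ordering automatically.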
(no need to fix) I would leave the images out of version control (e.g., by using GitHub release attachments) and download them during setup.
vectorstore-image-search/app/main.py
Outdated
(no need to fix) We usually import just the Python modules and use e.g. `constants.IMAGES_DIR`, which makes mocking easier in tests.
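A toy illustration of why that helps. The `constants` module here is fabricated inline to keep the snippet self-contained; in the real app it would be a module file imported with `import constants`:

```python
import types

# Stand-in for `import constants`; in the app this is an actual module.
constants = types.ModuleType("constants")
constants.IMAGES_DIR = "images/"

def list_images() -> str:
    # The attribute is looked up through the module at call time,
    # so a test can monkeypatch constants.IMAGES_DIR without touching this code.
    return f"scanning {constants.IMAGES_DIR}"

assert list_images() == "scanning images/"
constants.IMAGES_DIR = "/tmp/fixtures"   # what a test's monkeypatch would do
assert list_images() == "scanning /tmp/fixtures"
```

Had the code done `from constants import IMAGES_DIR` instead, the function would have captured the original value and the patch would be invisible to it.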