Integrating ML Model with OpenEx Backend #32
-
Great work on the Admin Moderation Model, @lendrik-kumar! However, we still need to address a few things. The current setup for verifying that products are appropriate requires logging in as an admin to approve or reject each item, which feels redundant and labor-intensive. We should aim for a system that ensures items are appropriate without manual admin moderation.

Regarding input, I am not concerned whether the model validates based on images alone or also takes additional information such as item names. If it can validate automatically using all available information, that would be ideal. If it only processes images, we need to address that, and if the model recognizes an item but fails to validate it, that needs fixing as well. @lendrik-kumar, could you clarify this?

I also want to encourage everyone to share thoughts on how to make the item approval process more seamless for sellers. Should we implement a 10-minute limit for admins to validate products, with items automatically sent to the machine learning model if they don't respond in time? Or should we consider a different flow, where we validate first and then send to the model?

Finally, we need to establish a way for the two repositories in production to communicate. I would appreciate everyone's input on this; your thoughts are essential for moving forward.
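For concreteness, here is a minimal sketch of the time-limited flow being asked about; `moderate_item`, `admin_decision_available`, `send_to_model`, and the 10-minute constant are all hypothetical names, not anything that exists in the repositories yet.

```python
# Minimal sketch of the proposed time-limited approval flow.
# All names here are hypothetical placeholders, not existing OpenEx code.
import time

ADMIN_REVIEW_TIMEOUT = 10 * 60  # the 10-minute limit floated above, in seconds

def moderate_item(item, admin_decision_available, send_to_model):
    """Wait for an admin decision; fall back to the ML model after the timeout."""
    deadline = time.time() + ADMIN_REVIEW_TIMEOUT
    while time.time() < deadline:
        decision = admin_decision_available(item)  # True/False, or None if no response yet
        if decision is not None:
            return decision                        # admin answered within the window
        time.sleep(5)                              # simple polling; a real system would use events
    return send_to_model(item)                     # no admin response: let the model decide
```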
-
I believe a capable LLM with image input should be enough to read labels and determine whether an item is acceptable.
For this, I believe the flow should be inverted: the model makes the decision first, and the admins get the option to "approve" or overturn the model's decision.
And yes, this is necessary, as discussions are slow and happen less often than needed.
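A small sketch of that inverted flow, purely illustrative (none of these function names exist in the project):

```python
# Sketch of the inverted flow: the model decides first, admins keep an override.
# model_decide, publish, and open_admin_review are hypothetical callbacks.
def model_first_flow(item, model_decide, publish, open_admin_review):
    approved = model_decide(item)                      # model makes the initial call
    if approved:
        publish(item)                                  # item goes live immediately
    open_admin_review(item, model_verdict=approved)    # admins can later confirm or overturn
    return approved
```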
-
I am working on this. I worked through the problem and decided to host the model somewhere behind an API, with the backend making API requests to it. Is this approach OK, or should we work out something else?
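If it helps the discussion, here is a minimal sketch of that approach, assuming a FastAPI service in front of the model and a plain HTTP call from the backend; the endpoint path, payload fields, and the `predict()` stub are assumptions, not the actual OpenEx code.

```python
# Model-hosting side: a small FastAPI service wrapping the moderation model.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    description: str = ""
    image_url: str = ""

def predict(name: str, description: str, image_url: str) -> bool:
    # Placeholder for the real moderation model; always approves in this sketch.
    return True

@app.post("/validate")
def validate(item: Item):
    return {"approved": predict(item.name, item.description, item.image_url)}

# Backend side: a plain HTTP request to the hosted model (URL is a placeholder).
#
#   import requests
#   resp = requests.post(
#       "https://model.example.com/validate",
#       json={"name": "desk lamp", "description": "used, works fine", "image_url": ""},
#       timeout=10,
#   )
#   approved = resp.json().get("approved", False)
```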
-
Yes, good work @lendrik-kumar
-
If we are using the LLM to validate product images, what parameters are we using to decide whether a product can be considered acceptable? Alternatively, if users provide a text description for their products, we could use NLP to categorize the products as well.
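As a rough illustration of the NLP idea, assuming a generic zero-shot classifier from the `transformers` library; the category labels and the 0.8 threshold below are placeholders, not agreed parameters.

```python
# Sketch: categorize a product from its text description with zero-shot classification.
from transformers import pipeline

classifier = pipeline("zero-shot-classification")

def categorize(description: str):
    labels = ["electronics", "books", "clothing", "prohibited item"]  # placeholder categories
    result = classifier(description, candidate_labels=labels)
    top_label, top_score = result["labels"][0], result["scores"][0]
    # Only block items the classifier confidently maps to the prohibited category
    approved = not (top_label == "prohibited item" and top_score > 0.8)
    return top_label, approved
```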
-
We can reduce the admin workload by only sending low-confidence cases for manual review and automating the rest, or by using a higher-parameter model and collecting a slightly larger dataset about the product from the seller.
On this, I completely agree with Aarav's suggestion of giving the user a time-bound window to validate the AI's decision.
For communication between the repositories, we can use webhooks, event-driven messaging, or a solid API-based communication system to provide smooth and seamless data exchange.
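A minimal sketch of that confidence-based routing plus a webhook-style notification between the two services; the threshold, review queue, and hook URL are all placeholders.

```python
# Sketch: auto-decide confident cases, queue the rest for admins, notify the backend.
import requests

CONFIDENCE_THRESHOLD = 0.85  # placeholder cut-off for "low confidence"

def route_item(item_id: str, approved: bool, confidence: float, review_queue: list):
    if confidence >= CONFIDENCE_THRESHOLD:
        decision = {"item_id": item_id, "approved": approved, "source": "model"}
    else:
        review_queue.append(item_id)  # low confidence: hold for manual admin review
        decision = {"item_id": item_id, "approved": None, "source": "pending_admin"}
    # Webhook-style POST so the other repository/service hears about the decision
    requests.post("https://backend.example.com/hooks/moderation", json=decision, timeout=5)
    return decision
```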