chore: pass prompt #932
base: develop
Conversation
Walkthrough
The PR enhances telemetry tracking for LLM-based robots by adding an `isLLM` flag to robot metadata and including the user prompt in telemetry capture events.
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks: ❌ Failed checks (1 inconclusive) | ✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (2)
server/src/routes/storage.ts (1)
588-593: Verify privacy implications of capturing user prompts in telemetry.
The prompt field is now being captured in telemetry. User prompts may contain sensitive information, PII, or proprietary data. Ensure that:
- Your analytics platform has appropriate data retention and privacy policies
- Users are informed that prompts will be included in telemetry
- Very long prompts are sanitized or truncated before capture
Optional: Add a prompt length limit for telemetry:

```diff
 capture('maxun-oss-llm-robot-created', {
   robot_meta: newRobot.recording_meta,
   recording: newRobot.recording,
   llm_provider: llmProvider || 'ollama',
-  prompt: prompt,
+  prompt: prompt.length > 500 ? prompt.substring(0, 500) + '...' : prompt,
 });
```

server/src/api/sdk.ts (1)
703-707: Verify privacy implications of capturing user prompts in telemetry.
Similar to the storage.ts endpoint, user prompts may contain sensitive information. Ensure appropriate privacy safeguards are in place.

Optional: Add a prompt length limit for telemetry:

```diff
 capture("maxun-oss-llm-robot-created", {
   robot_meta: robot.recording_meta,
   recording: robot.recording,
-  prompt: prompt,
+  prompt: prompt.length > 500 ? prompt.substring(0, 500) + '...' : prompt,
 });
```
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- server/src/api/sdk.ts (3 hunks)
- server/src/routes/storage.ts (1 hunks)
🔇 Additional comments (2)
server/src/api/sdk.ts (2)
91-91: LGTM: isLLM flag added to robot metadata.
The addition of the `isLLM` flag appropriately tracks whether a robot was created using LLM capabilities, enabling differentiated telemetry events.
106-113: Good practice: Conditional prompt inclusion in telemetry.
The conditional logic properly handles the optional nature of the prompt field in the generic robot creation endpoint. This ensures the prompt is only included in telemetry when the robot is LLM-based and the prompt exists.
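The pattern the review recommends — truncate long prompts and include them in the payload only for LLM-based robots — can be sketched as below. This is an illustrative standalone helper, not the actual Maxun implementation; `truncatePrompt`, `buildPayload`, and the payload shape are assumptions for demonstration.

```typescript
// Sketch of conditional prompt inclusion with truncation before telemetry
// capture. The helper names and payload shape are illustrative assumptions.

const MAX_PROMPT_LENGTH = 500;

// Truncate long prompts so telemetry payloads stay bounded,
// mirroring the diff suggestion above.
function truncatePrompt(prompt: string, limit: number = MAX_PROMPT_LENGTH): string {
  return prompt.length > limit ? prompt.substring(0, limit) + "..." : prompt;
}

interface TelemetryPayload {
  robot_meta: unknown;
  recording: unknown;
  prompt?: string; // only present for LLM-based robots
}

// Build the payload, including the prompt only when the robot is
// LLM-based and a prompt was actually supplied.
function buildPayload(
  robotMeta: unknown,
  recording: unknown,
  isLLM: boolean,
  prompt?: string
): TelemetryPayload {
  const payload: TelemetryPayload = { robot_meta: robotMeta, recording };
  if (isLLM && prompt) {
    payload.prompt = truncatePrompt(prompt);
  }
  return payload;
}
```

Keeping the prompt optional in the payload type means non-LLM robot events never carry a `prompt` key at all, rather than an empty string, which keeps the two event shapes cleanly distinguishable in analytics.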
No description provided.