update helicone docs + examples #20208
base: main
Conversation
| "dev:api": "cd api_reference && poetry install && poetry run mkdocs build -d ../.mkdocs/ && npx http-server ../.mkdocs/" | ||
| "dev:api": "cd api_reference && poetry install && poetry run mkdocs build -d ../.mkdocs/ && npx http-server ../.mkdocs/", | ||
| "dev:all": "sh -c 'pnpm dev & pnpm run dev:api & wait'" | 
Convenience change; can be removed on request.
```diff
-response = llm.complete("Hello World!")
+message: ChatMessage = ChatMessage(role="user", content="Hello world!")
+response = helicone.chat(messages=[message])
```
This docs change was made because Helicone does not support the legacy (non-chat) completions API, only the Chat Completions and Responses APIs.
I didn't force `is_chat_model`, in case the `OpenAILike` class changes its default call or adds support for the Responses API.
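For context, here is a minimal sketch of how the updated example could be wired up end to end. It assumes the `OpenAILike` class from `llama-index-llms-openai-like`; the gateway URL, API key, and model name are placeholders and are not taken from this PR:

```python
# Hypothetical sketch: route an OpenAI-compatible LLM through a Helicone-style
# gateway and call it with chat messages instead of the legacy complete() path.
# The api_base, api_key, and model values are placeholders, not from this PR.
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai_like import OpenAILike

helicone = OpenAILike(
    model="gpt-4o-mini",                    # placeholder model name
    api_base="https://oai.helicone.ai/v1",  # assumed gateway base URL
    api_key="sk-...",                       # your provider API key
    is_chat_model=True,  # the PR leaves this at the class default; set here only
                         # to make the sketch explicitly use chat completions
)

message = ChatMessage(role="user", content="Hello world!")
response = helicone.chat(messages=[message])
print(response)
```

Setting `is_chat_model=True` just makes the sketch explicit; as noted above, the PR itself does not force the flag, so the documented example tracks whatever default `OpenAILike` ships with.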
```diff
+# Globally ignore example notebooks under docs/examples/
+exclude: ^docs/examples/
```
Pre-commit was failing for everything under docs/examples/, so I excluded that path for future devs.
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes # (issue)
New Package?
Did I fill in the `tool.llamahub` section in the `pyproject.toml` and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the `pyproject.toml` file of the package I am updating? (Except for the `llama-index-core` package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
`uv run make format; uv run make lint` to appease the lint gods