
Add Claude 3.7 Sonnet #7391

Closed

Conversation

BurhanCantCode

Add Claude 3.7 Sonnet Support

This PR adds support for the new Claude 3.7 Sonnet model to the Anthropic provider.

Changes:

  • Added Claude 3.7 Sonnet model to Anthropic provider's static models list
  • Ensured Anthropic is in the enabled providers list

Testing:

  • Verified the model appears in the dropdown
  • Tested code generation with the new model
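The change described above can be sketched as follows. This is a minimal illustration, not the actual diff: the `ModelInfo` shape, field names, and the exact model identifier are assumptions based on how static model lists in providers like this are commonly structured.

```typescript
// Hypothetical sketch of adding Claude 3.7 Sonnet to the Anthropic
// provider's static model list. Field names and the model identifier
// are assumptions for illustration, not taken from the real diff.
interface ModelInfo {
  name: string;            // API model identifier sent to the provider
  label: string;           // display name shown in the UI dropdown
  provider: string;        // owning provider key
  maxTokenAllowed: number; // context budget enforced by the app
}

const anthropicStaticModels: ModelInfo[] = [
  // ...existing Anthropic entries...
  {
    name: 'claude-3-7-sonnet-20250219', // assumed identifier
    label: 'Claude 3.7 Sonnet',
    provider: 'Anthropic',
    maxTokenAllowed: 8000, // assumed default cap
  },
];
```

With an entry like this in the static list, the model appears in the dropdown without any further wiring, which matches the testing notes above.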

dustinwloring1988 and others added 30 commits December 15, 2024 16:37
mkdoc consistent style
new section heading added in index
Made links clickable in docs
Fix clickable links docs
Added and set a default provider icon
added and set a default provider icon
fix: Added auto detect branch name and version tag for Debug Tab
feat: Show token usage on LLM call assistant message
fix: UI bug debug tab : System Information
Stijnus and others added 29 commits January 18, 2025 19:25
- Enhanced text for the bolt.diy docs section and improved its visibility, to guide people there instead of to the GitHub README, which is aimed more at devs
- Added NodeJS-based applications, as this was not clear and some people asked about it in the community
…patch-readme-changes-v1

docs: update README.md
docs: replace docker-compose with docker compose
…VIEW_V3

fix: for Open preview in a new tab.
…abs#1139)

* updated system prompt to have correct indentations

* removed a section
fix: get environment variables for docker
…mplementing summary generation (stackblitz-labs#1091) #release

* feat: add context annotation types and enhance file handling in LLM processing

* feat: enhance context handling by adding chatId to annotations and implementing summary generation

* removed useless changes

* feat: updated token counts to include optimization requests

* prompt fix

* logging added

* useless logs removed
* fix: docker prod env variable fix

* lint and typecheck

* removed hardcoded tag
* feat: better push to GitHub button

* added url update on push to github
* fix: import bolt on bolt fix

* added escape on folder import

* type fix
…labs#1187)

This PR introduces a new model, deepseek-r1-distill-llama-70b, to the staticModels array and ensures compatibility with the Groq API. The changes include:

Adding the deepseek-r1-distill-llama-70b model to the staticModels array with its relevant metadata.

Updating the Groq API call to use the new model for chat completions.

These changes enable the application to support the deepseek-r1-distill-llama-70b model, expanding the range of available models for users.
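The `staticModels` addition described above might look like the following sketch. The field names and the token cap are assumptions; only the model identifier `deepseek-r1-distill-llama-70b` and the Groq provider come from the PR description.

```typescript
// Hypothetical staticModels entry for the DeepSeek distill model on Groq.
// Field names mirror a typical ModelInfo shape; the token cap is assumed.
const deepseekGroqModel = {
  name: 'deepseek-r1-distill-llama-70b', // identifier from the PR
  label: 'DeepSeek R1 Distill Llama 70B',
  provider: 'Groq',
  maxTokenAllowed: 8000, // assumed default
};
```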
…pport (stackblitz-labs#1202)

Added the new gemini-2.0-flash-thinking-exp-01-21 model to the GoogleProvider's static model configuration. This model supports a significantly increased maxTokenAllowed limit of 65,536 tokens, enabling it to handle larger context windows compared to existing Gemini models (previously capped at 8k tokens). The model is labeled as "Gemini 2.0 Flash-thinking-exp-01-21" for clear identification in the UI/dropdowns.
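An entry matching that description could be sketched like this; the model name, label, and the 65,536-token limit come from the text above, while the surrounding field names are assumed.

```typescript
// Sketch of the GoogleProvider static model entry described above.
// The raised maxTokenAllowed of 65,536 is stated in the PR; earlier
// Gemini entries were capped at 8k tokens.
const geminiThinkingModel = {
  name: 'gemini-2.0-flash-thinking-exp-01-21',
  label: 'Gemini 2.0 Flash-thinking-exp-01-21',
  provider: 'Google',
  maxTokenAllowed: 65536, // raised from the previous 8k cap
};
```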
…labs#1191)

* fix: docker prod env variable fix

* lint and typecheck

* removed hardcoded tag

* better summary generation

* improved summary generation for context optimization

* remove think tags from the generation
- Implement Codestral provider class with Mistral integration
- Configure API endpoints and model parameters
- Add documentation for Codestral usage
- Fix TypeScript type definitions
- Resolve linting and formatting issues
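The Codestral provider described in the commits above could be sketched roughly as follows. This is an assumption-laden illustration: the base endpoint, class shape, and method names are hypothetical, not read from the actual implementation; only "Codestral provider with Mistral integration" comes from the commit messages.

```typescript
// Hypothetical sketch of a Codestral provider class. Endpoint URL,
// model identifier, and the class/field layout are assumptions.
interface CodestralModelInfo {
  name: string;
  label: string;
  provider: string;
  maxTokenAllowed: number;
}

class CodestralProvider {
  name = 'Codestral';

  // Assumed dedicated Mistral endpoint for Codestral
  apiBaseUrl = 'https://codestral.mistral.ai/v1';

  staticModels: CodestralModelInfo[] = [
    {
      name: 'codestral-latest', // assumed identifier
      label: 'Codestral',
      provider: 'Codestral',
      maxTokenAllowed: 8000, // assumed default
    },
  ];

  // Build the chat-completions URL from the configured base endpoint
  chatEndpoint(): string {
    return `${this.apiBaseUrl}/chat/completions`;
  }
}
```

Keeping the endpoint on the provider class (rather than hardcoding it at call sites) is what lets a Mistral-hosted model slot into the same dispatch path as the other providers.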