### Required prerequisites

- [x] I have searched the [Issue Tracker](https://github.com/camel-ai/camel/issues) and [Discussions](https://github.com/camel-ai/camel/discussions) and confirmed this hasn't already been reported. (+1 or comment there if it has.)
- [ ] Consider asking first in a [Discussion](https://github.com/camel-ai/camel/discussions/new).

### Motivation

Following #3587, which added prompt caching for Anthropic and OpenAI, we should extend support to:

- **Google Gemini** - [Context caching](https://ai.google.dev/gemini-api/docs/caching) with implicit/explicit modes
- **AWS Bedrock** - [Prompt caching](https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-caching.html) for Claude and Nova models
- **Azure OpenAI** - [Prompt caching](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/prompt-caching) for GPT-4o+

Related: #3586, #3587

### Solution

_No response_

### Alternatives

_No response_

### Additional context

_No response_
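For context on why these providers need separate handling: per the linked docs, Azure OpenAI and Gemini cache implicitly for sufficiently long prompt prefixes (Gemini also offers explicit cached-content resources), while Bedrock requires an explicit cache checkpoint marker in the request body. A minimal sketch of the Bedrock Converse shape, assuming we mark the boundary after a large reusable system prompt (the helper name and the CAMEL integration point are hypothetical; the `cachePoint` block follows the AWS docs linked above):

```python
def bedrock_converse_body(system_prompt: str, user_msg: str) -> dict:
    """Build a Bedrock Converse request body with a cache checkpoint
    placed after the large, reusable system prompt."""
    return {
        "system": [
            {"text": system_prompt},
            # Bedrock marks the cache boundary with a cachePoint content block;
            # everything before it becomes eligible for prompt caching.
            {"cachePoint": {"type": "default"}},
        ],
        "messages": [
            {"role": "user", "content": [{"text": user_msg}]},
        ],
    }


body = bedrock_converse_body("You are a helpful agent...", "Hello")
```

This mirrors how #3587 injects Anthropic's `cache_control: {"type": "ephemeral"}` marker, so a provider-agnostic "cache this prefix" flag in CAMEL could translate to the right per-provider shape.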