
Conversation

Contributor

@freshystar freshystar commented Oct 13, 2025

Implement Rust Sidecar and Enhanced PHP Communication for OpenAI Assistant

Rust Sidecar Implementation:

  • Built openai-moodle-sidecar using Actix-web and Tokio for async HTTP server functionality
  • Implemented dual-mode architecture: HTTP server (port 8081) with automatic stdio fallback
  • Added shared OpenAI client instances using Arc for memory efficiency
  • Created comprehensive CLI argument parsing with environment variable support
  • Integrated a health check endpoint (/health) and the main API endpoint (/ai); see the wiring sketch below
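The bullets above describe the HTTP side of this architecture. As a rough, self-contained sketch (not the PR's actual code: the client type and handler bodies are stand-ins, and the stdio fallback is omitted), the wiring could look like this:

use std::sync::Arc;
use actix_web::{web, App, HttpResponse, HttpServer};

// Stand-in for the PR's OpenAI client; one instance is shared via Arc.
struct OpenAIClient;

async fn health_check() -> HttpResponse {
    HttpResponse::Ok().json(serde_json::json!({ "status": "ok" }))
}

async fn process_request_http(_client: web::Data<Arc<OpenAIClient>>) -> HttpResponse {
    // The real handler dispatches to chat/summarize/analyze; omitted here.
    HttpResponse::Ok().json(serde_json::json!({ "status": "not_implemented" }))
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    let client = Arc::new(OpenAIClient);
    HttpServer::new(move || {
        App::new()
            .app_data(web::Data::new(client.clone()))
            .route("/ai", web::post().to(process_request_http))
            .route("/health", web::get().to(health_check))
    })
    .bind(("127.0.0.1", 8081))?
    .run()
    .await
}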

PHP Plugin Enhancements:

  • Developed stdio_communicator.php, a class with an intelligent communication strategy
  • Implemented HTTP-first approach with CURL-based requests and configurable timeouts
  • Added robust stdio fallback using proc_open() for process spawning
  • Created environment variable mapping for seamless configuration
  • Added comprehensive error handling, logging, and sensitive data masking
  • Integrated binary existence and permission validation

- Added chat interface for interacting with OpenAI API in chat.php.
- Created api_client class for handling API requests and responses.
- Developed stdio_communicator class to manage communication with the Rust sidecar.
- Defined database schema for storing AI conversations in install.xml.
- Added language strings for plugin localization in local_openai_assistant.php.
- Implemented navigation integration for the OpenAI Assistant in lib.php.
- Included CSS styles for improved UI in styles.css.
- Specified plugin versioning and requirements in version.php.
- Removed the openai crate dependency and replaced it with direct reqwest calls in OpenAIClient (see the sketch after this list).
- Updated OpenAIClient to include api_key and base_url as fields.
- Refactored message handling in the chat and summarize handlers to use JSON directly.
- Enhanced error handling for API responses to provide clearer feedback.
- Updated argument parsing to use environment variables for configuration.
- Added a new chat widget with a welcome message and improved UI interactions.
- Implemented a simple Markdown-like renderer for AI responses in the chat widget.
- Improved logging and error handling in the PHP API client and communicator classes.
- Added new JavaScript and CSS files for the chat widget functionality and styling.
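As a hedged sketch of the reqwest-based client described above: the endpoint normalization mirrors the snippet quoted later in this review, while the field names, the default max_tokens, and the response extraction are illustrative assumptions rather than the PR's exact code.

use serde_json::json;

pub struct OpenAIClient {
    http: reqwest::Client,
    api_key: String,
    base_url: String,
    model: String,
}

impl OpenAIClient {
    pub async fn send_chat_request(
        &self,
        messages: Vec<serde_json::Value>,
        max_tokens: Option<u32>,
    ) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
        // Accept either a full completions endpoint or a bare base URL.
        let base = self.base_url.trim_end_matches('/');
        let url = if base.ends_with("chat/completions") {
            base.to_string()
        } else {
            format!("{}/chat/completions", base)
        };

        let payload = json!({
            "model": self.model,
            "messages": messages,
            "max_tokens": max_tokens.unwrap_or(512) // illustrative default
        });

        let body: serde_json::Value = self
            .http
            .post(url)
            .bearer_auth(&self.api_key)
            .json(&payload)
            .send()
            .await?
            .error_for_status()?
            .json()
            .await?;

        // Pull the assistant text out of the first choice.
        let content = body["choices"][0]["message"]["content"]
            .as_str()
            .unwrap_or_default()
            .to_string();
        Ok(content)
    }
}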
…_footer function

feat: Add modal HTML structure and improve message handling in OpenAIChatWidget
…esponse Handling

- Added custom CSS for AI response formatting in chat interface.
- Refactored API client to handle empty responses and validate response structure.
- Implemented HTTP request handling in stdio communicator with fallback to stdio mode.
- Improved AI response rendering with client-side sanitization and server-side markdown processing.
- Introduced a simple markdown renderer for assistant responses, ensuring safety and compatibility.
- Updated styles for AI responses, including syntax highlighting for code blocks.
- Created deployment documentation for the Rust sidecar, detailing setup and configuration.
- Added a startup script for the Rust sidecar to streamline deployment.
- Removed build/ from .gitignore.
- Removed the init script in rust-sidecar and the README file.
@marcjazz

Please resolve merge conflicts.


@marcjazz marcjazz left a comment


I still don't get it. This implementation still seems to run the binary per request...


Is this supposed to be committed?

Contributor


Please remove this file

Contributor Author


I did

pub async fn send_chat_request(&self, messages: Vec<serde_json::Value>, max_tokens: Option<u32>) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
// Accept either a full completions endpoint or a base URL and normalize it.
let base = self.base_url.trim_end_matches('/');
let url = if base.ends_with("/chat/completions") || base.ends_with("chat/completions") {


You are being redundant here.
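A possible simplification, since any string ending in "/chat/completions" also ends in "chat/completions"; the else branch here is an assumption about the truncated original:

let base = self.base_url.trim_end_matches('/');
let url = if base.ends_with("chat/completions") {
    base.to_string()
} else {
    format!("{}/chat/completions", base)
};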

Comment on lines +38 to +42
let payload = json!({
"model": self.model,
"messages": messages,
"max_tokens": tokens
});


I don't like this because it is error-prone. Use a serde struct instead.
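A minimal sketch of the typed payload this suggests; the struct and field names are illustrative, but the serialized shape matches the json! block above.

use serde::Serialize;

#[derive(Serialize)]
struct ChatCompletionRequest<'a> {
    model: &'a str,
    messages: &'a [serde_json::Value],
    max_tokens: u32,
}

// Hypothetical usage inside send_chat_request:
// let payload = ChatCompletionRequest { model: &self.model, messages: &messages, max_tokens: tokens };
// let response = self.http.post(&url).json(&payload).send().await?;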

client: Arc<OpenAIClient>,
args: Args
) -> Result<(), Box<dyn std::error::Error>> {
info!("Starting OpenAI Moodle Sidecar in HTTP mode on 127.0.0.1:{}", args.port);


It makes more sense to print this after the server has successfully started.

.route("/ai", web::post().to(process_request_http))
.route("/health", web::get().to(health_check))
})
.bind(("127.0.0.1", args.port))?


Does it not make sense to bind it to any address?
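A hedged sketch addressing both points above, assuming the route and handler names from the quoted code: read the bind host from an environment variable (SIDECAR_BIND_ADDR is a hypothetical name, not from this PR) and log only once the bind has succeeded.

let host = std::env::var("SIDECAR_BIND_ADDR").unwrap_or_else(|_| "0.0.0.0".to_string());
let server = HttpServer::new(|| {
    App::new()
        .route("/ai", web::post().to(process_request_http))
        .route("/health", web::get().to(health_check))
})
.bind((host.as_str(), args.port))?;

info!("OpenAI Moodle Sidecar listening on {}:{}", host, args.port);
server.run().await?;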

Comment on lines +77 to +129
async fn run_stdio_mode(client: Arc<OpenAIClient>) -> Result<(), Box<dyn std::error::Error>> {
info!("Starting OpenAI Moodle Sidecar in STDIO mode");

let stdin = io::stdin();
let reader = BufReader::new(stdin.lock());

for line in reader.lines() {
match line {
Ok(input) => {
if input.trim().is_empty() {
continue;
}

match serde_json::from_str::<Request>(&input) {
Ok(request) => {
let response = match request.action.as_str() {
"chat" => handlers::chat::handle(&client, web::Json(request)).await,
"summarize" => handlers::summarize::handle(&client, web::Json(request)).await,
"analyze" => handlers::analyze::handle(&client, web::Json(request)).await,
_ => {
warn!("Unknown action: {}", request.action);
Response::error("Unknown action")
},
};

match serde_json::to_string(&response) {
Ok(json) => {
println!("{}", json);
io::stdout().flush()?;
}
Err(e) => {
error!("Failed to serialize response: {}", e);
}
}
}
Err(e) => {
error!("Failed to parse request: {}", e);
let error_response = Response::error("Invalid JSON request");
if let Ok(json) = serde_json::to_string(&error_response) {
println!("{}", json);
io::stdout().flush()?;
}
}
}
}
Err(e) => {
error!("Error reading from stdin: {}", e);
break;
}
}
}
Ok(())
}


This could be improved by breaking it down into smaller functions.
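One possible decomposition, assuming the Request/Response types, handler modules, and logging imports from this PR; note that, unlike the original, a stdin read error is propagated rather than only logged.

async fn dispatch(client: &Arc<OpenAIClient>, request: Request) -> Response {
    match request.action.as_str() {
        "chat" => handlers::chat::handle(client, web::Json(request)).await,
        "summarize" => handlers::summarize::handle(client, web::Json(request)).await,
        "analyze" => handlers::analyze::handle(client, web::Json(request)).await,
        _ => {
            warn!("Unknown action: {}", request.action);
            Response::error("Unknown action")
        }
    }
}

fn write_response(response: &Response) -> io::Result<()> {
    match serde_json::to_string(response) {
        Ok(json) => {
            println!("{}", json);
            io::stdout().flush()
        }
        Err(e) => {
            error!("Failed to serialize response: {}", e);
            Ok(())
        }
    }
}

async fn handle_line(client: &Arc<OpenAIClient>, input: &str) -> io::Result<()> {
    let response = match serde_json::from_str::<Request>(input) {
        Ok(request) => dispatch(client, request).await,
        Err(e) => {
            error!("Failed to parse request: {}", e);
            Response::error("Invalid JSON request")
        }
    };
    write_response(&response)
}

async fn run_stdio_mode(client: Arc<OpenAIClient>) -> Result<(), Box<dyn std::error::Error>> {
    info!("Starting OpenAI Moodle Sidecar in STDIO mode");
    let stdin = io::stdin();
    for line in BufReader::new(stdin.lock()).lines() {
        let input = line?;
        if input.trim().is_empty() {
            continue;
        }
        handle_line(&client, &input).await?;
    }
    Ok(())
}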


public function __construct() {
// Path to the mounted binary in Docker
$this->binary_path = '/bitnami/moodle/openai-sidecar/openai-moodle-sidecar';


You should also get it from env

@freshystar freshystar self-assigned this Oct 14, 2025
Contributor

@stephane-segning stephane-segning left a comment


Let's have a demo ;)

MAX_TOKENS: ${MAX_TOKENS}
SUMMARIZE_THRESHOLD: ${SUMMARIZE_THRESHOLD}
entrypoint: /bin/bash
command: # This script was removed from https://github.com/bitnami/containers/blob/main/bitnami/moodle/5.0/debian-12/rootfs/opt/bitnami/scripts/moodle/entrypoint.sh
Contributor


???

Contributor Author


The ticket demanded that I add summarization to optimize API calls, so I added a summarize threshold, which is the limit at which summarization starts.
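A minimal sketch of how such a threshold could gate summarization; the quantity compared (message count vs. token count) and the default value are assumptions here, not taken from this PR.

fn should_summarize(message_count: usize) -> bool {
    let threshold: usize = std::env::var("SUMMARIZE_THRESHOLD")
        .ok()
        .and_then(|v| v.parse().ok())
        .unwrap_or(20); // hypothetical default
    message_count >= threshold
}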

. /opt/bitnami/scripts/mysql-client/setup.sh
. /opt/bitnami/scripts/postgresql-client/setup.sh
. /opt/bitnami/scripts/moodle/setup.sh
. /post-init.sh
Contributor


Why remove this?

pub port: u16,
}

impl Args {
Contributor


Do you need this implementation while using clap?

Contributor Author


Yes sir, because I am also getting my configuration from env (as stated, it should configure credentials using ENV params only, not credentials configuration from the dashboard) and clap only supports command-line arguments.
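For reference, a minimal sketch (not from this PR) of how clap's derive API can fall back to environment variables when the crate's optional env feature is enabled; the variable names here are illustrative.

use clap::Parser;

#[derive(Parser)]
struct Args {
    /// OpenAI API key, read from the CLI or the OPENAI_API_KEY env var.
    #[arg(long, env = "OPENAI_API_KEY")]
    api_key: String,

    /// Port for HTTP mode; falls back to SIDECAR_PORT, then to 8081.
    #[arg(long, env = "SIDECAR_PORT", default_value_t = 8081)]
    port: u16,
}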

@freshystar freshystar marked this pull request as draft October 22, 2025 06:10
