= Developing apps with Podman Desktop AI
include::_attributes.adoc[]

So far, you containerized and deployed a Quarkus application using Podman Desktop.

Let's change this application and add a call to the local model.

As seen in the previous section, Podman Desktop AI Lab provides snippets for several languages and frameworks in addition to `curl`.

== Select Language and Framework to integrate with

Click the first drop-down and select *Java*, and in the second drop-down select *Quarkus Langchain4J*:

image::podman-desktop-ai-quarkus.png[Podman Desktop Extension AI with Quarkus Snippets, 600]

As with `curl`, you see specific snippets for interacting with the model, but this time in Java.

== Develop Code

First, open the `pom.xml` file from the previous project (`podify-quarkus-redis`) and add the following dependencies:

[.console-input]
[source,xml,subs="+macros,+attributes"]
----
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-core</artifactId>
    <version>0.21.0</version>
</dependency>
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>0.21.0</version>
</dependency>
----

Then open the `src/main/resources/application.properties` file and configure Langchain4J to connect to the model deployed in Podman. Use the base URL shown in the generated snippet; the port (`58184` below) may differ on your machine:

[.console-input]
[source,properties,subs="+macros,+attributes"]
----
quarkus.langchain4j.openai.base-url=http://localhost:58184/v1
quarkus.langchain4j.openai.api-key=sk-dummy
----

Integrating Langchain4J with Quarkus lets developers send requests to a model quickly.
You only need to create and annotate a Java interface with `@RegisterAiService`, and Quarkus will do the rest.

Create a new interface named `AiService.java` in the `src/main/java/com/redhat/developers` directory with the following content:

[.console-input]
[source,java,subs="+macros,+attributes"]
.src/main/java/com/redhat/developers/AiService.java
----
package com.redhat.developers; // <1>

import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService // <2>
public interface AiService {

    @UserMessage("{question}") // <3>
    String request(String question); // <4>
}
----
<1> Package where the interface is stored
<2> Registers the interface as an AI service so Quarkus can generate the implementation
<3> The user message sent to the model
<4> Method that sends the question and returns the model's answer

IMPORTANT: You can copy and paste the content of this file from the *Client Code* section. The only thing you need to adapt is the package of the `AiService` interface.
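
Optionally, you can steer the model's answers by adding a system prompt to the AI service. The following is an optional sketch, not part of the lab steps: a hypothetical `AssistantService` interface that combines Langchain4J's `@SystemMessage` annotation with the same `@RegisterAiService` mechanism:

[source,java,subs="+macros,+attributes"]
.Optional sketch (hypothetical AssistantService)
----
package com.redhat.developers;

import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Hypothetical variant of the AI service: the system message constrains
// the model's behavior for every call made through this interface.
@RegisterAiService
public interface AssistantService {

    @SystemMessage("You are a concise assistant. Answer in one short sentence.")
    @UserMessage("{question}")
    String request(String question);
}
----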

The last thing to create is a new endpoint to make calls to the model.

Create a class named `AiResource` with the following content:

[.console-input]
[source,java,subs="+macros,+attributes"]
.src/main/java/com/redhat/developers/AiResource.java
----
package com.redhat.developers;

import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/ai")
public class AiResource {

    @Inject // <1>
    AiService aiService;

    @POST
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(String message) {
        return aiService.request(message);
    }
}
----
<1> Injects the interface that connects to the local model

IMPORTANT: The version of Quarkus should be `<quarkus.platform.version>3.16.4</quarkus.platform.version>`.

Now, you can boot up the service and try it.

=== Running the example

Go to the terminal window at the root of the project and start the application by typing:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
./mvnw quarkus:dev
----

In another terminal window, `curl` the endpoint created in the Quarkus service:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
curl -X POST -H "Content-Type: text/plain" -d "What is the capital of France" localhost:8080/ai
----

[.console-output]
[source,bash,subs="+macros,+attributes"]
----
The capital of France is Paris
----
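
If you prefer an automated check, a minimal test is sketched below. It assumes the default Quarkus test dependencies (`quarkus-junit5` and `rest-assured`) generated with the project, and it calls the real local model, so the model service must be running; treat it as an illustration rather than a deterministic CI test.

[source,java,subs="+macros,+attributes"]
.Optional sketch: src/test/java/com/redhat/developers/AiResourceTest.java
----
package com.redhat.developers;

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.emptyString;
import static org.hamcrest.Matchers.not;

import org.junit.jupiter.api.Test;

import io.quarkus.test.junit.QuarkusTest;

@QuarkusTest
class AiResourceTest {

    @Test
    void chatEndpointReturnsAnAnswer() {
        // Send a plain-text question to the /ai endpoint and check that
        // the model replies with a non-empty answer.
        given()
            .contentType("text/plain")
            .body("What is the capital of France")
        .when()
            .post("/ai")
        .then()
            .statusCode(200)
            .body(not(emptyString()));
    }
}
----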

Thanks to Podman Desktop AI Lab, you can serve a model locally and get ready-made code snippets to develop an application that consumes it.