
Commit e063413

Adds development phase for AI
1 parent ca86018 commit e063413

File tree

4 files changed: +139 −2 lines changed

documentation/modules/ROOT/nav.adoc

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@
 * Podman Desktop AI Lab
 ** xref:ai.adoc[Podman Desktop AI]
-** xref:kubernetes-deploying.adoc[Deploying to Kubernetes]
+** xref:ai-dev.adoc[Interacting with the model]

 ////
 * More Tutorials
documentation/modules/ROOT/pages/ai-dev.adoc

Lines changed: 137 additions & 0 deletions
@@ -0,0 +1,137 @@
= Developing apps with Podman Desktop AI

include::_attributes.adoc[]

So far, you have containerized and deployed a Quarkus application using Podman Desktop.

Let's now change this application to add a call to the local model.

As seen in the previous section, Podman Desktop AI Lab provides snippets for several languages and frameworks, in addition to `curl`.
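All of these snippets target the same OpenAI-compatible REST API that Podman AI Lab serves the model on. As a rough sketch of what the `curl` variant does (the port `58184` is an assumption taken from the configuration used later in this tutorial; copy the exact URL from your own snippet):

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
# Send a chat request to the model's OpenAI-compatible endpoint
curl http://localhost:58184/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is the capital of France?"}]}'
----

Some servers also expect a `model` field in the payload; if yours does, add it with the model name shown in Podman AI Lab.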
== Select Language and Framework to integrate with

Click on the first drop-down and select *Java*, and in the second drop-down select *Quarkus Langchain4J*:

image::podman-desktop-ai-quarkus.png[Podman Desktop Extension AI with Quarkus Snippets, 600]

As with `curl`, you see snippets for interacting with the model, but this time written in Java.

== Develop Code

First of all, open the `pom.xml` file from the previous project (`podify-quarkus-redis`) and add the following dependencies:

[.console-input]
[source,xml,subs="+macros,+attributes"]
----
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-core</artifactId>
    <version>0.21.0</version>
</dependency>
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>0.21.0</version>
</dependency>
----
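If you prefer the command line, the Quarkus Maven plugin can add the same dependencies for you. This is an optional alternative to editing `pom.xml` by hand, using the same coordinates and version as the snippet above:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
# Add both Langchain4J extensions in one go
./mvnw quarkus:add-extension \
  -Dextensions="io.quarkiverse.langchain4j:quarkus-langchain4j-core:0.21.0,io.quarkiverse.langchain4j:quarkus-langchain4j-openai:0.21.0"
----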
Then open the `src/main/resources/application.properties` file and configure Langchain4J to connect to the model deployed in Podman:

[.console-input]
[source,properties,subs="+macros,+attributes"]
----
quarkus.langchain4j.openai.base-url=http://localhost:58184/v1
quarkus.langchain4j.openai.api-key=sk-dummy
----
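The port in `base-url` is assigned by Podman AI Lab when the model service starts, so it may differ on your machine. A quick sanity check is to list the models served on that URL (`/v1/models` is part of the OpenAI-compatible API; adjust the port to match your setup):

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
# Returns a JSON list of the models the service exposes
curl http://localhost:58184/v1/models
----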
Integrating Langchain4J with Quarkus lets developers send requests to a model quickly.
You only need to create a Java interface annotated with `@RegisterAiService`, and Quarkus will do the rest.

Create a new class named `AiService.java` in the `src/main/java/com/redhat/developers` directory with the following content:

[.console-input]
[source,java,subs="+macros,+attributes"]
.src/main/java/com/redhat/developers/AiService.java
----
package com.redhat.developers; // <1>

import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

@RegisterAiService // <2>
public interface AiService {

    @UserMessage("{question}") // <3>
    String request(String question); // <4>
}
----
<1> Package where the class is stored
<2> Registers the interface as an AI service
<3> The user message sent to the model
<4> Method that sends the question and returns the model's answer

IMPORTANT: You can copy and paste the content of this file from the *Client Code* section. The only thing you need to adapt is the `AiService` class package.

The last thing to create is a new endpoint that makes calls to the model.

Create a class named `AiResource` with the following content:

[.console-input]
[source,java,subs="+macros,+attributes"]
.src/main/java/com/redhat/developers/AiResource.java
----
package com.redhat.developers;

import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/ai")
public class AiResource {

    @Inject // <1>
    AiService aiService;

    @POST
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(String message) {
        return aiService.request(message);
    }
}
----
<1> Injects the interface that connects to the local model

IMPORTANT: The version of Quarkus should be `<quarkus.platform.version>3.16.4</quarkus.platform.version>`.
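If you are not sure which platform version your project resolves to, you can print it with the standard Maven Help plugin (a quick check, not part of the original steps; run it from the project root):

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
# Prints the value of the quarkus.platform.version property
./mvnw help:evaluate -Dexpression=quarkus.platform.version -q -DforceStdout
----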
Now you can boot up the service and try it.

=== Running the example

Go to a terminal window at the root of the project and start the application by typing:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
./mvnw quarkus:dev
----

In another terminal window, `curl` the endpoint created in the Quarkus service:

[.console-input]
[source,bash,subs="+macros,+attributes"]
----
curl -X POST -H "Content-Type: text/plain" -d "What is the capital of France" localhost:8080/ai
----

[.console-output]
[source,bash,subs="+macros,+attributes"]
----
The capital of France is Paris
----

Thanks to Podman Desktop AI Lab, you can run inference against a model locally and get precise steps to develop an application that consumes it.

documentation/modules/ROOT/pages/ai.adoc

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ Podman AI Lab is an open-source extension for Podman Desktop to work with LLMs (
 To install Podman AI extension, click on extensions icon (placed at left of the screen) represented by a puzzle piece:
-image::podman-desktop-ai-extension.png[Podman Desktop Extension AI, 600]
+image::podman-desktop-ai-extensions.png[Podman Desktop Extension AI, 600]
 If you already have the extension installed, you should see it there, if not, click on the *Catalog* section, search for the *Podman AI Lab Extension*, and push the install icon: