diff --git a/en/docs/assets/attachments/connectors/kafka-connector.zip b/en/docs/assets/attachments/connectors/kafka-connector.zip
index 5da885c22..e8b44c5ac 100644
Binary files a/en/docs/assets/attachments/connectors/kafka-connector.zip and b/en/docs/assets/attachments/connectors/kafka-connector.zip differ
diff --git a/en/docs/assets/img/integrate/connectors/kafka-avro-example-2.png b/en/docs/assets/img/integrate/connectors/kafka-avro-example-2.png
deleted file mode 100644
index 8d5f3cf50..000000000
Binary files a/en/docs/assets/img/integrate/connectors/kafka-avro-example-2.png and /dev/null differ
diff --git a/en/docs/assets/img/integrate/connectors/kafka/kafka-avro-create-connection.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-avro-create-connection.png
new file mode 100644
index 000000000..da5321796
Binary files /dev/null and b/en/docs/assets/img/integrate/connectors/kafka/kafka-avro-create-connection.png differ
diff --git a/en/docs/assets/img/integrate/connectors/kafka-avro-example-1.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-avro-example-1.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-avro-example-1.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-avro-example-1.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-avro-example-3.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-avro-example-3.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-avro-example-3.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-avro-example-3.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-conn-add-api.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-api.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-conn-add-api.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-api.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-conn-add-new-connection.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-new-connection.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-conn-add-new-connection.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-new-connection.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-conn-add-operation.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-operation.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-conn-add-operation.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-operation.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-conn-add-resource.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-resource.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-conn-add-resource.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-conn-add-resource.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-conn-config-operation.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-conn-config-operation.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-conn-config-operation.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-conn-config-operation.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-create-new-inbound-endpoint.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-inbound-endpoint.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-create-new-inbound-endpoint.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-inbound-endpoint.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-create-new-project.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-project.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-create-new-project.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-project.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-create-new-sequence.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-sequence.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-create-new-sequence.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-create-new-sequence.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-custom-endpoint-config-1.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-custom-endpoint-config-1.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-custom-endpoint-config-1.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-custom-endpoint-config-1.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-custom-endpoint-config-2.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-custom-endpoint-config-2.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-custom-endpoint-config-2.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-custom-endpoint-config-2.png
diff --git a/en/docs/assets/img/integrate/connectors/kafka-store.png b/en/docs/assets/img/integrate/connectors/kafka/kafka-store.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafka-store.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafka-store.png
diff --git a/en/docs/assets/img/integrate/connectors/kafkaconnector.png b/en/docs/assets/img/integrate/connectors/kafka/kafkaconnector.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafkaconnector.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafkaconnector.png
diff --git a/en/docs/assets/img/integrate/connectors/kafkaconnectorpublishmessage.png b/en/docs/assets/img/integrate/connectors/kafka/kafkaconnectorpublishmessage.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafkaconnectorpublishmessage.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafkaconnectorpublishmessage.png
diff --git a/en/docs/assets/img/integrate/connectors/kafkainboundendpoint.png b/en/docs/assets/img/integrate/connectors/kafka/kafkainboundendpoint.png
similarity index 100%
rename from en/docs/assets/img/integrate/connectors/kafkainboundendpoint.png
rename to en/docs/assets/img/integrate/connectors/kafka/kafkainboundendpoint.png
diff --git a/en/docs/reference/connectors/kafka-connector/kafka-connector-avro-producer-example.md b/en/docs/reference/connectors/kafka-connector/kafka-connector-avro-producer-example.md
index 2b629412c..f40ca4339 100644
--- a/en/docs/reference/connectors/kafka-connector/kafka-connector-avro-producer-example.md
+++ b/en/docs/reference/connectors/kafka-connector/kafka-connector-avro-producer-example.md
@@ -8,67 +8,96 @@ Given below is a sample API that illustrates how you can connect to a Kafka brok
API has the `/publishMessages` context. It publishes messages via the topic to the Kafka server.
-## Set up Kafka
+## Prerequisites
-Before you begin, set up Kafka by following the instructions in [Setting up Kafka](setting-up-kafka.md).
+- Set up Kafka by following the instructions in [Setting up Kafka](setting-up-kafka.md).
+- Set up Confluent by following the [Confluent documentation](https://docs.confluent.io/platform/current/installation/overview.html).
## Set up the integration project
-Follow the steps in the [create integration project]({{base_path}}/develop/create-integration-project/) guide to set up the Integration Project.
-## Create the integration logic
+Follow the steps below to set up the integration project using the WSO2 Integrator: MI Visual Studio Code extension.
-1. Click `+` on the Extension panel APIs to create the REST API.
+### Create a new project
-2. Specify the API name as `KafkaTransport` and API context as `/publishMessages`. Click create.
+Follow the steps in the [create integration project]({{base_path}}/develop/create-integration-project/) guide to set up WSO2 MI and create a new integration project. Use a suitable Project Name for your integration.
-
+### Create a connection
-3. Click the `/resource` default endpoint to open the **Resource View**. Then click the `+` arrow below the Start node to open the side panel. Select **Externals** and click **Add new Connection**. Search `kafkaTransport` and click.
-
-4. Provide values as below and click **Add**.
+1. In the Design View, click the **+** button and select **Connection**.
+
+2. In the search bar, type `Kafka` and select the `Kafka connector` from the list.
+
+
+
+3. In the Connection Configuration pane, enter the following required information:
- **Connection Name** - Sample_Kafka
- **Connection Type** - kafka
- **Bootstrap Servers** - localhost:9092
- **Key Serializer Class** - io.confluent.kafka.serializers.KafkaAvroSerializer
- **Value Serializer Class** - io.confluent.kafka.serializers.KafkaAvroSerializer
- **Schema Registry URL** - http://localhost:8081
- - **Max Pool Size** - 100
+ - **Max Active Connections** - 100
+
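For intuition, the values above map directly onto standard Kafka producer client properties. The following is an illustrative sketch only (property names come from the standard Kafka/Confluent producer configuration, not from MI itself):

```python
# Illustrative only: the client-side properties implied by the connection above.
# Property names are the standard Kafka/Confluent producer configuration keys.
producer_props = {
    "bootstrap.servers": "localhost:9092",
    "key.serializer": "io.confluent.kafka.serializers.KafkaAvroSerializer",
    "value.serializer": "io.confluent.kafka.serializers.KafkaAvroSerializer",
    "schema.registry.url": "http://localhost:8081",
}

for name, value in sorted(producer_props.items()):
    print(f"{name}={value}")
```
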
+### Create an API
+
+1. Click the **API** button in the **Create an integration project** pane.
+
+
+
+2. Enter the API Name as `KafkaTransport` and the Context as `/publishMessages`, then click **Create**.
+
+
+
+3. To add the Kafka connector:
+ - In the **Design View**, click the **+** button.
+ - In the **Mediator** section, search for `Kafka`.
+ - Select the **Kafka** connector and click **Download**.
+
+### Implement the API
+
+1. Go to the **Source View** of the API by clicking on the **<>** icon in the top right corner of the **Design View**.
+
+
+2. Copy and paste the following code in the **Source View** of the API.
-
-5. You can go to the XML configuration of the API (source view) and copy the following configuration.
+??? note "Source view of the implemented resource"
    ```xml
-   <api context="/publishMessages" name="KafkaTransport" xmlns="http://ws.apache.org/ns/synapse">
-      <resource methods="POST">
-         <inSequence>
-            <kafkaTransport.publishMessages configKey="Sample_Kafka">
-               <topic>{$ctx:topic}</topic>
-               <partitionNo>0</partitionNo>
-               <key>{$ctx:key}</key>
-               <value>{$ctx:value}</value>
-               <valueSchema>{$ctx:valueSchema}</valueSchema>
-               <keySchemaSoftDeleted>false</keySchemaSoftDeleted>
-               <valueSchemaSoftDeleted>false</valueSchemaSoftDeleted>
-            </kafkaTransport.publishMessages>
-            <respond/>
-         </inSequence>
-      </resource>
-   </api>
+    <api context="/publishMessages" name="KafkaTransport" xmlns="http://ws.apache.org/ns/synapse">
+        <resource methods="POST">
+            <inSequence>
+                <kafkaTransport.publishMessages configKey="Sample_Kafka">
+                    <topic>{$ctx:topic}</topic>
+                    <partitionNo>0</partitionNo>
+                    <key>{$ctx:key}</key>
+                    <value>{$ctx:value}</value>
+                    <valueSchema>{$ctx:valueSchema}</valueSchema>
+                    <keySchemaSoftDeleted>false</keySchemaSoftDeleted>
+                    <valueSchemaSoftDeleted>false</valueSchemaSoftDeleted>
+                </kafkaTransport.publishMessages>
+                <respond/>
+            </inSequence>
+        </resource>
+    </api>
    ```
## Deployment
-Follow these steps to deploy the exported CApp in the Enterprise Integrator Runtime.
-
-**Deploying on WSO2 Integrator: MI**
To deploy and run the project, refer to the [Build and Run]({{base_path}}/develop/deploy-artifacts/#build-and-run) guide.
-
+
## Test
-Invoke the API (http://localhost:8290/publishMessages) with the following payload,
+Invoke the API as shown below using the MI VSCode Extension.
+
+
+
+### Sample request:
+
+- Content-Type: application/json
+- Request body:
````json
{
@@ -91,19 +120,18 @@ Invoke the API (http://localhost:8290/publishMessages) with the following payloa
````
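The payload above is truncated in this diff. As a rough sketch (the top-level field names `topic`, `key`, `value`, and `valueSchema` are assumed from the publishMessages operation parameters, and the Avro record schema shown is illustrative rather than the exact one in the docs), the request body can be assembled and sanity-checked like this:

```python
import json

# Hypothetical Avro record schema; the actual schema in the docs may differ.
value_schema = {
    "type": "record",
    "name": "myrecord",
    "fields": [{"name": "f1", "type": ["string", "null"]}],
}

# Field names mirror the publishMessages operation parameters
# ($ctx:topic, $ctx:key, $ctx:value, $ctx:valueSchema).
payload = {
    "topic": "myTopic",
    "key": "sampleKey",
    "value": {"f1": "sampleValue"},
    "valueSchema": value_schema,
}

body = json.dumps(payload, indent=2)
print(json.loads(body)["valueSchema"]["name"])
```
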
**Expected Response**:
-!!!info
- Refer to the [confluent documentation](https://docs.confluent.io/platform/current/installation/overview.html) for installing confluent.
-
+
Run the following command to verify the messages:
````bash
[confluent_home]/bin/kafka-avro-console-consumer --topic myTopic --bootstrap-server localhost:9092 --property print.key=true --from-beginning
````
+
See the following message content:
````json
{"f1":{"string":"sampleValue"}}
````
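The nested `{"string": "sampleValue"}` is not an artifact: Avro's JSON encoding wraps every union-typed value as `{"<branch type>": value}`, so a field declared as `["string", "null"]` prints this way. A minimal sketch of that encoding rule (the helper functions are illustrative, not part of any Kafka API):

```python
import json

# Avro's JSON encoding represents a non-null value of a ["string", "null"]
# union field as {"string": value}, and a null value as plain null.
def encode_union(value):
    """Illustrative Avro JSON union encoding for a ["string", "null"] field."""
    return None if value is None else {"string": value}

def decode_union(encoded):
    return None if encoded is None else encoded["string"]

record = {"f1": encode_union("sampleValue")}
print(json.dumps(record))  # {"f1": {"string": "sampleValue"}}
```
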
-Sample Connection configuration when the Confluent Schema Registry is secured with basic auth
+> NOTE: Sample connection configuration when the Confluent Schema Registry is secured with basic auth:
```xml
@@ -115,26 +143,22 @@ Sample Connection configuration when the Confluent Schema Registry is secured wi
 <valueSerializerClass>io.confluent.kafka.serializers.KafkaAvroSerializer</valueSerializerClass>
 <schemaRegistryUrl>http://localhost:8081</schemaRegistryUrl>
 <basicAuthCredentialsSource>USER_INFO</basicAuthCredentialsSource>
-<basicAuthUserInfo>admin:admi</basicAuthUserInfo>
-<maxPoolSize>100</maxPoolSize>
+<basicAuthUserInfo>admin:admin</basicAuthUserInfo>
+<maxActiveConnections>100</maxActiveConnections>
 <poolingEnabled>false</poolingEnabled>
 ```
In the above example, the basicAuthCredentialsSource parameter is configured as USER_INFO. Alternatively, the basicAuthCredentialsSource parameter can be set to URL as follows:
-
````xml
<basicAuthCredentialsSource>URL</basicAuthCredentialsSource>
````
-
Then, the schemaRegistryUrl parameter should be configured as shown below.
-
````xml
<schemaRegistryUrl>http://admin:admin@localhost:8081</schemaRegistryUrl>
````
Refer to the [Confluent documentation](https://docs.confluent.io/platform/current/schema-registry/serdes-develop/serdes-avro.html) for more details.
-This demonstrates how the Kafka connector publishes Avro messages to Kafka brokers.
## What's next
diff --git a/en/docs/reference/connectors/kafka-connector/kafka-connector-producer-example.md b/en/docs/reference/connectors/kafka-connector/kafka-connector-producer-example.md
index 3604cfc29..13b50f467 100644
--- a/en/docs/reference/connectors/kafka-connector/kafka-connector-producer-example.md
+++ b/en/docs/reference/connectors/kafka-connector/kafka-connector-producer-example.md
@@ -10,7 +10,7 @@ API has the context `/publishMessages`. It will publish messages via the topic t
The following diagram illustrates all the required functionality of the Kafka service that you are going to build.
-
+
If you do not want to configure this yourself, you can simply [get the project](#get-the-project) and run it.
@@ -20,64 +20,83 @@ Before you begin, set up Kafka by following the instructions in [Setting up Kafk
## Set up the integration project
-1. Follow the steps in [create integration project]({{base_path}}/develop/create-integration-project/) guide to set up the Integration Project.
+Follow the steps below to set up the integration project using the WSO2 Integrator: MI Visual Studio Code extension.
-2. Create a new Kafka connection.
- 1. Goto `Local Entries` -> `Connections` and click on the `+` sign.
- 2. Select `KafkaTransport` connector.
-
+### Create a new project
- 3. Use the following values to create the connection.
- - Connection Name - `KafkaConnection`
- - Connection Type - `kafka`
- - Bootstrap Servers - `localhost:9092`
- - Key Serializer Class - `org.apache.kafka.common.serialization.StringSerializer`
- - Value Serializer Class - `org.apache.kafka.common.serialization.StringSerializer`
- - Pooling Enabled - `false`
+Follow the steps in the [create integration project]({{base_path}}/develop/create-integration-project/) guide to set up WSO2 MI and create a new integration project. Use a suitable Project Name for your integration.
-## Create the integration logic
+### Create a connection
-1. Select WSO2 Integrator: MI and click on `+` in APIs to create a REST API. Provide `KafkaTransport` as name and `publishMessages` as context.
-
+1. In the Design View, click the **+View More** button and select **Connection**.
-2. Create a resource with the below configuration.
-
+2. In the search bar, type `Kafka` and select the `Kafka connector` from the list.
-3. Select the created resource and add the `PublishMessages` operation.
-
+
- - Use the following values to fill the appearing form.
- - Connection - `KafkaConnection`
- - Topic - `test`
- - Partition Number - `0`
+3. In the Connection Configuration pane, enter the following required information:
+ - **Connection Name** - KafkaConnection
+ - **Connection Type** - kafka
+ - **Bootstrap Servers** - localhost:9092
+ - **Key Serializer Class** - org.apache.kafka.common.serialization.StringSerializer
+ - **Value Serializer Class** - org.apache.kafka.common.serialization.StringSerializer
+ - **Pooling Enabled** - false
-
+### Create an API
-The source view of the XML configuration file of the API will be as below.
+1. Click the **API** button in the **Create an integration project** pane.
-```xml
-
+
+
+2. Enter the API Name as `KafkaTransport` and the Context as `/publishMessages`, then click **Create**.
+
+
+
+3. To add the Kafka connector:
+ - In the **Design View**, click the **+** button.
+ - In the **Mediator** section, search for `Kafka`.
+ - Select the **Kafka** connector and click **Download**.
+
+### Implement the API
+
+1. Go to the **Source View** of the API by clicking on the **<>** icon in the top right corner of the **Design View**.
+
+
+2. Copy and paste the following code in the **Source View** of the API.
+
+??? note "Source view of the implemented resource"
+    ```xml
+    <api context="/publishMessages" name="KafkaTransport" xmlns="http://ws.apache.org/ns/synapse">
+        <resource methods="POST">
+            <inSequence>
+                <kafkaTransport.publishMessages configKey="KafkaConnection">
+                    <topic>test</topic>
+                    <partitionNo>0</partitionNo>
+                    <keySchemaSoftDeleted>false</keySchemaSoftDeleted>
+                    <valueSchemaSoftDeleted>false</valueSchemaSoftDeleted>
+                    <!-- Several further operation parameters appear here in the
+                         original (including one set to None and one empty list []);
+                         their element names were lost when this page was extracted. -->
+                </kafkaTransport.publishMessages>
+                <respond/>
+            </inSequence>
+        </resource>
+    </api>
+    ```
-
-Now, we can export the imported connector and the API into a single CAR application. The CAR application needs to be deployed during server runtime.
-
-## Export integration logic as a carbon application
-
-To export the project, please refer to the [build and export the carbon application]({{base_path}}/develop/deploy-artifacts/#build-and-export-the-carbon-application) guide.
## Get the project
@@ -89,12 +108,14 @@ You can download the ZIP file and extract the contents to get the project code.
## Deployment
-To deploy and run the project, please refer to the [build and run]({{base_path}}/develop/deploy-artifacts/#build-and-run) guide.
+To deploy and run the project, refer to the [Build and Run]({{base_path}}/develop/deploy-artifacts/#build-and-run) guide.
-You can further refer the application deployed through the CLI tool. See the instructions on [managing integrations from the CLI]({{base_path}}/observe-and-manage/managing-integrations-with-micli).
-
## Test
+Invoke the API as shown below using the MI VSCode Extension.
+
+
+
**Create a topic**:
Let’s create a topic named `test` with a single partition and only one replica.
@@ -104,12 +125,15 @@ bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-fac
```
**Sample request**:
-
-Send a message to the Kafka broker using a CURL command or sample client.
-```bash
-curl -X POST -d '{"name":"sample"}' "http://localhost:8290/publishMessages" -H "Content-Type:application/json" -v
-```
+- Content-Type: application/json
+- Request body:
+ ```json
+ {
+ "name": "sample"
+ }
+ ```
+
**Expected response**:
Navigate to the `<KAFKA_HOME>` directory and run the following command to verify the messages:
@@ -120,9 +144,7 @@ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --f
See the following message content:
```bash
{"name":"sample"}
-```
-
-This demonstrates how the Kafka connector publishes messages to the Kafka brokers.
+```
## What's next