This application is a chatbot that uses the OpenAI API to hold a conversation, with an interface similar to ChatGPT itself.
Demo video: `chatbot-demo.mov`
### Chat Front-end
- Next.js
- BFF (Route Handlers)
- SES
- gRPC Client
- MySQL
- Prisma
- SWR
- Next Auth
- Keycloak
- Tailwindcss
- Docker + Docker Compose
### Chat Microservice
- Golang
- MySQL
- SQLC
- Go Chi
- Go OpenAI
- gRPC Server Stream
- REST API
- Docker + Docker Compose
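
The two items that tie this stack together are the gRPC server stream and the Go OpenAI client: the microservice opens a streaming completion against OpenAI and forwards each token delta to the caller over a server-streaming RPC. The sketch below only illustrates that flow under assumptions: the proto-generated names (`pb`, `ChatStreamRequest`, `ChatStreamResponse`, `ChatService_ChatStreamServer`), the `GetUserMessage` field, and the model are placeholders rather than the project's real definitions, and the OpenAI client is assumed to be `github.com/sashabaranov/go-openai`.

```go
package server

import (
	"errors"
	"io"

	openai "github.com/sashabaranov/go-openai"

	// Placeholder import for the project's generated gRPC code;
	// the real package, service, and message names may differ.
	pb "example.com/chatservice/proto"
)

// ChatServer relays OpenAI completion chunks to gRPC clients as they arrive.
type ChatServer struct {
	pb.UnimplementedChatServiceServer
	OpenAI *openai.Client
}

// ChatStream is a server-streaming RPC: each token delta received from the
// OpenAI stream is immediately forwarded to the caller.
func (s *ChatServer) ChatStream(req *pb.ChatStreamRequest, stream pb.ChatService_ChatStreamServer) error {
	oa, err := s.OpenAI.CreateChatCompletionStream(stream.Context(), openai.ChatCompletionRequest{
		Model:  openai.GPT3Dot5Turbo,
		Stream: true,
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: req.GetUserMessage()},
		},
	})
	if err != nil {
		return err
	}
	defer oa.Close()

	for {
		chunk, err := oa.Recv()
		if errors.Is(err, io.EOF) {
			return nil // OpenAI finished streaming the completion
		}
		if err != nil {
			return err
		}
		if len(chunk.Choices) == 0 {
			continue
		}
		// Forward the partial content so the front-end can render it incrementally.
		if err := stream.Send(&pb.ChatStreamResponse{Content: chunk.Choices[0].Delta.Content}); err != nil {
			return err
		}
	}
}
```

Streaming the reply this way is what lets the front-end render the answer token by token instead of waiting for the whole completion.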
### Chat Microservice setup

Access the microservice folder:

```bash
$ cd chat-service
```

Create the `.env` file:

```bash
$ cp .env.example .env
```

Update the variable values as needed.

Create an OpenAI secret key and update the variable below:

```env
OPENAI_API_KEY="PASTE THE OPENAI SECRET KEY VALUE"
```
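
For reference, here is a minimal sketch of how the microservice could build its OpenAI client from this variable, assuming the `github.com/sashabaranov/go-openai` package (the "Go OpenAI" item in the stack); the project's actual configuration loading may differ, and `NewOpenAIClient` is a hypothetical helper name.

```go
package config

import (
	"log"
	"os"

	openai "github.com/sashabaranov/go-openai"
)

// NewOpenAIClient builds a go-openai client from the OPENAI_API_KEY variable
// defined in chat-service/.env. Hypothetical helper: the project may wire
// this up differently (e.g. through its own config loader).
func NewOpenAIClient() *openai.Client {
	key := os.Getenv("OPENAI_API_KEY")
	if key == "" {
		log.Fatal("OPENAI_API_KEY is not set")
	}
	return openai.NewClient(key)
}
```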
### Chat Front-end setup

Access the front-end folder:

```bash
$ cd chat
```

Create the `.env` file:

```bash
$ cp .env.example .env
```

Update the variable values as needed.
Configure `/etc/hosts` so that `host.docker.internal` (used in the Keycloak URL below) also resolves on your machine:

```
# Mac and Linux: /etc/hosts
# Windows: C:\Windows\System32\drivers\etc\hosts

127.0.0.1 host.docker.internal
```

### Keycloak settings
Access the Keycloak Administration Console at http://host.docker.internal:9000 and log in with:

- Username: admin
- Password: admin

Create a Realm:

- Realm name: chat-app
- Enabled: yes
Create a User:

This user will be used to sign in to the app, so define the username, email, etc. as you wish.

After creating the user, go to the Credentials tab and set a new password, leaving the Temporary option unchecked.
Create a Client:

- Client ID: nextjs
- Valid redirect URIs: http://localhost:3000/*
- Web origins: http://localhost:3000/*
- Client authentication: On
- Authentication flow: Standard flow and Direct access grants

After creating the client, go to the Credentials tab, copy the Client secret value, and add it to the .env file:
```env
KEYCLOAK_CLIENT_SECRET="PASTE THE CLIENT SECRET VALUE"
```

### Run the Chat Microservice

Start the containers:

```bash
$ docker-compose up -d
```

Access the container:

```bash
$ docker exec -it chatservice_app /bin/bash
```

Install dependencies:

```bash
$ go mod tidy
```

Run migrations:

```bash
$ make migrate
```

Run the microservice:

```bash
$ go run cmd/chatservice/main.go
```
### Run the Chat Front-end

Start the containers:

```bash
$ docker-compose up -d
```

Access the container:

```bash
$ docker exec -it chat_app /bin/bash
```

Install dependencies:

```bash
$ npm install
```

Run migrations:

```bash
$ npx prisma migrate dev
```

Run the front-end:

```bash
$ npm run dev
```





