Contents
Some "proof of concept" code:
- poc1.py - example code showing how to create a Kafka producer using
python-kafka
, and send some messages. - poc2.py - the same thing, but using
aiokafka
to do it asynchronously. - poc3.py - me working out how to do Kafka producer and consumer in a Textual app.
The demo programs as shown in the talk:
- `demo1-cod-and-chips.py` - a till sending orders to a topic, and a food preparer consuming them.
- `demo2-cod-and-chips-3tills.py` - three tills, but still only one preparer, who can't keep up.
- `demo3-cod-and-chips-3tills-2preparers.py` - three tills and two preparers, who can keep up.
- `demo4-with-added-cook.py` - back to one till and one food preparer, but now we have a cook to prepare plaice for us.
- `demo_helpers.py` - common code used by all of the above.
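The demos pass each order between till and preparer as a Kafka message. As an illustration only (the actual message format lives in `demo_helpers.py` and may differ), an order can be serialized to the bytes a Kafka message carries, and back, like this:

```python
import json

def encode_order(order):
    """Serialize an order to UTF-8 JSON bytes, ready to send to a topic."""
    return json.dumps(order).encode("utf-8")

def decode_order(raw):
    """Deserialize the bytes of a message received from a topic."""
    return json.loads(raw.decode("utf-8"))

# A hypothetical order, not necessarily the demos' real field names
order = {"till": 1, "order": ["cod", "chips", "chips"]}
raw = encode_order(order)
assert decode_order(raw) == order
```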
Known problems:
- If you make the terminal window narrow enough that the text in panels has to wrap, then the panels won't display all the lines. My code is counting logical lines, not display lines.
I use `poetry` to manage the dependencies needed by this demo. So, for instance, in this directory I would start a new poetry shell using:

```shell
$ poetry shell
```

and then install the dependencies using:

```shell
$ poetry install
```

If you wish, you can exit the poetry shell using `exit`.
Assuming you have a Kafka service, and the SSL credentials for connecting to it are in the `creds` directory, then you can run each demo with:

```shell
./DEMO_FILE HOST_URL:SSL_PORT -d creds
```
For me, running the first demo against my Kafka service at `tibs-kafka-fish-dev-sandbox.aivencloud.com:12693`, I would use:

```shell
./demo1-cod-and-chips.py tibs-kafka-fish-dev-sandbox.aivencloud.com:12693 -d creds
```
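That command line - a service URI argument plus `-d` for the credentials directory - is the interface all the demos share. A minimal `argparse` sketch of it (the real demos may name the option and help text differently):

```python
import argparse

def parse_args(argv):
    """Parse the demo command line: a Kafka service URI plus a creds directory."""
    parser = argparse.ArgumentParser(description="Cod-and-chips Kafka demo")
    parser.add_argument("service_uri", help="HOST_URL:SSL_PORT of the Kafka service")
    parser.add_argument("-d", "--certs-dir", default="creds",
                        help="directory containing the SSL credential files")
    return parser.parse_args(argv)

args = parse_args(["kafka.example.com:12693", "-d", "creds"])
print(args.service_uri, args.certs_dir)
```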
You don't need to run the demos using Aiven services, but I think it's the easiest option if you don't already have Kafka up and running.
If you don't yet have an Aiven account, you can sign up for a free trial.
Get an authentication token, as described at Create an authentication token, copy it, and log in using the following command. You'll need to replace `YOUR-EMAIL-ADDRESS` with the email address you've registered with Aiven:

```shell
avn user login YOUR-EMAIL-ADDRESS --token
```

This will prompt you to paste in your token.
Aiven uses "projects" to organise which services you can access. You can list them with:
avn project list
Choose the project you want to use with the following command, replacing `PROJECT-NAME` with the appropriate name:

```shell
avn project switch PROJECT-NAME
```
You then need to decide what cloud you want to run the service in. Use:

```shell
avn cloud list
```

to find the clouds. Since Aiven is based out of Helsinki, I tend to choose `google-europe-north1`, which is Finland, but you'll want to make your own choice.
Normally, you'd also want to decide on a service plan (which determines the number of servers, the memory, CPU and disk resources for the service). You can find the service plans for a cloud using:

```shell
avn service plans --service-type kafka --cloud CLOUD-NAME
```

However, for these demo programs a `kafka:startup-2` plan is sufficient, and that's also the cheapest.
Note that if you want to use Kafka Connect with your Kafka service, you'll need something more powerful than the startup plan, for instance `business-4`.
Now it's time to create the actual Kafka service, using the command below. The service name needs to be unique and can't be changed - I like to put my name in it (for instance, `tibs-kafka-fish`).

The extra `-c` switches enable the REST API to the service (used to get some of the information available in the web console), the ability to create new topics by publishing to them (we definitely want this), and use of the schema registry (which we actually don't need in this demo, but it doesn't cost extra and is often useful).

Again, remember to replace `KAFKA_FISH_DEMO` with your actual service name, and `CLOUD-NAME` with the cloud name:

```shell
avn service create KAFKA_FISH_DEMO \
    --service-type kafka \
    --cloud CLOUD-NAME \
    --plan startup-2 \
    -c kafka_rest=true \
    -c kafka.auto_create_topics_enable=true \
    -c schema_registry=true
```
It takes a little while for a service to start up. You can wait for it using:

```shell
avn service wait KAFKA_FISH_DEMO
```

which will update you on the progress of the service, and exit when the service is `RUNNING`.
In order to let the demo programs talk to the Kafka service, you need to download the appropriate certificate files. Create a directory to put them into:

```shell
mkdir creds
```

and then download them:

```shell
avn service user-creds-download KAFKA_FISH_DEMO -d creds --username avnadmin
```
To connect to the Kafka service, you need its service URI. You can find that out with:

```shell
avn service get KAFKA_FISH_DEMO --format '{service_uri}'
```
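The credential files downloaded above are what the Python Kafka clients need for an SSL connection (`avn service user-creds-download` fetches `ca.pem`, `service.cert` and `service.key`). As a sketch only - check the demo source for the exact keywords it uses - those files map onto `kafka-python` connection arguments like this:

```python
from pathlib import Path

def ssl_config(certs_dir):
    """Build the SSL keyword arguments kafka-python's KafkaProducer accepts,
    from the directory that `avn service user-creds-download` populated."""
    certs = Path(certs_dir)
    return {
        "security_protocol": "SSL",
        "ssl_cafile": str(certs / "ca.pem"),
        "ssl_certfile": str(certs / "service.cert"),
        "ssl_keyfile": str(certs / "service.key"),
    }

# The producer itself would then be created with (needs kafka-python installed
# and a running service, so not executed here):
#   producer = KafkaProducer(bootstrap_servers=SERVICE_URI, **ssl_config("creds"))
print(ssl_config("creds"))
```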
And now you're ready to run the demo programs.
This is the same information as at How to run the demos earlier in this README, put here so you don't need to scroll all the way to the top again.
Given the service URI you found using `avn service get` (just above here), and assuming you saved the credentials to a directory called `creds`, then run a demo with:

```shell
./DEMO_FILE SERVICE_URI -d creds
```
For me, running the first demo against my Kafka service at `tibs-kafka-fish-dev-sandbox.aivencloud.com:12693`, I would use:

```shell
./demo1-cod-and-chips.py tibs-kafka-fish-dev-sandbox.aivencloud.com:12693 -d creds
```
To find out more information about a Kafka topic, look at the documentation for `avn service topic`.
You can also find useful information about a service using the Aiven web console, on the Services page for your Kafka service.
If you're not using your Kafka service for a while, and don't mind losing any data in its event stream, then it makes sense to power it off, as you don't get charged (real money or free trial credits) when a service is powered off.
You can power off the service (remember, this will discard all your data) with:

```shell
avn service update KAFKA_FISH_DEMO --power-off
```

and bring it back again with:

```shell
avn service update KAFKA_FISH_DEMO --power-on
```

This will take a little while to finish, so wait for it with:

```shell
avn service wait KAFKA_FISH_DEMO
```
If you've entirely finished using the Kafka service, you can delete it with:

```shell
avn service terminate KAFKA_FISH_DEMO
```
Homework projects suggested in the talk:
- Use a JDBC Kafka Connector to send orders from the main topic to a PostgreSQL® database, and then add a widget to the demo that queries that database periodically and updates a panel with some summary information (perhaps as simple as the total count of cod, chip and plaice orders).
- Use a Redis® cache to simulate the cook preparing food for the hot cabinet. There's a brief summary in the slides. For extra credit, also have the cook "wake up" periodically to check if they need to cook more cod or chips to keep the amount in the hot cabinet at the right level.
Let me know if you play with these ideas!
Other ideas:
- Fix the problem where I'm counting logical lines in the panels, and not lines as displayed. I just didn't have time to do this.
- Make a proper generator for customer orders - if you will, a queue of customers. Each till would then ask for the next customer to "come forward" and present its order, rather than asking for an order directly from the order generator. This in turn would allow having a panel to show the current customer queue, which would help visualise the incoming flow of orders.
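That customer-queue idea can be sketched with a stdlib generator: the order source becomes a stream of numbered customers, and each till asks the next customer to come forward rather than pulling an order directly. The names and menu weighting here are made up for illustration; the real order generator lives in `demo_helpers.py`.

```python
import itertools
import random

def customer_queue(seed=42):
    """Yield (customer_number, order) pairs - a queue of waiting customers."""
    rng = random.Random(seed)
    for number in itertools.count(1):
        # Each customer wants one to three items from a hypothetical menu
        order = rng.choices(["cod", "chips", "plaice"], k=rng.randint(1, 3))
        yield number, order

queue = customer_queue()
for _ in range(3):
    number, order = next(queue)  # a till asks the next customer to come forward
    print(f"customer {number}: {order}")
```

Because every till draws from the same generator, the customer numbers also give you exactly what a "current queue" panel would need to display.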
I also want to work out why I had some problems with sending to all the partitions (in demo 3) without actually specifying the partition number. I've been given the hint to research "sticky partitioning".
I'm also interested why it's only in demo 3 that I have to worry about making the producers wait for the consumers to be ready.
And, of course, there's much more to learn about Apache Kafka®, and also about Textualize and Rich.
You may also be interested in
- My Aiven blog post Get things done with the Aiven CLI
- The Aiven github repository Python Jupyter Notebooks for Apache Kafka® which is a series of Jupyter Notebooks on how to start with Apache Kafka® and Python, using Aiven managed services.
- The Aiven for Apache Kafka® section of the Aiven developer documentation
For acknowledgements of product names, see the top-level README.
The source code in this directory is dual-licensed under the MIT license (see LICENSE.txt) and Creative Commons Attribution-ShareAlike 4.0 International License. Choose whichever seems most appropriate for your use.