
@consumes basics

You can use the @consumes decorator to consume messages from Kafka topics.

In this guide we will create a simple FastKafka app that consumes HelloWorld messages from the hello_world topic.

Import FastKafka

To use the @consumes decorator, we first need to import the FastKafka class to create our application.

from fastkafka import FastKafka

In this demo we will log the messages so that we can inspect and verify that our app is consuming them properly. For that, we need to import the logger.

from fastkafka._components.logger import get_logger

logger = get_logger(__name__)

Define the structure of the messages

Next, you need to define the structure of the messages you want to consume from the topic using pydantic. For this guide we'll stick to something basic, but you are free to define any complex message structure you wish in your project, as long as it can be JSON-encoded.

Let's import BaseModel and Field from pydantic and create a simple HelloWorld class containing one string field, msg:

from pydantic import BaseModel, Field

class HelloWorld(BaseModel):
    msg: str = Field(
        ...,
        example="Hello",
        description="Demo hello world message",
    )

Create a base FastKafka app

Now we will create and define a base FastKafka app. Replace <url_of_your_kafka_bootstrap_server> and <port_of_your_kafka_bootstrap_server> with the actual values of your Kafka bootstrap server:

kafka_brokers = {
    "demo_broker": {
        "url": "<url_of_your_kafka_bootstrap_server>",
        "description": "local demo kafka broker",
        "port": "<port_of_your_kafka_bootstrap_server>",
    }
}

app = FastKafka(kafka_brokers=kafka_brokers)

Create a consumer function and decorate it with @consumes

Let's create a consumer function that will consume HelloWorld messages from the hello_world topic and log them.

@app.consumes()
async def on_hello_world(msg: HelloWorld):
    logger.info(f"Got msg: {msg}")

The function decorated with the @consumes decorator will be called when a message is produced to the corresponding Kafka topic.

The message will then be injected into the typed msg argument of the function, and its type annotation will be used to parse the message.

In this example, when a message is sent to the hello_world topic, it will be parsed into a HelloWorld object and the on_hello_world function will be called with the parsed object as the value of the msg argument.

Final app

Your app code should look like this:

from fastkafka import FastKafka
from pydantic import BaseModel, Field
from fastkafka._components.logger import get_logger

logger = get_logger(__name__)

class HelloWorld(BaseModel):
    msg: str = Field(
        ...,
        example="Hello",
        description="Demo hello world message",
    )


kafka_brokers = {
    "demo_broker": {
        "url": "<url_of_your_kafka_bootstrap_server>",
        "description": "local demo kafka broker",
        "port": "<port_of_your_kafka_bootstrap_server>",
    }
}

app = FastKafka(kafka_brokers=kafka_brokers)

@app.consumes()
async def on_hello_world(msg: HelloWorld):
    logger.info(f"Got msg: {msg}")

Run the app

Now we can run the app. Copy the code above into a file called consumer_example.py and run it with:

fastkafka run --num-workers=1 --kafka-broker=demo_broker consumer_example:app

After running the command, you should see output similar to this in your terminal:

[513863]: [INFO] fastkafka._application.app: set_kafka_broker() : Setting bootstrap_servers value to '127.0.0.1:9092'
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() starting...
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer created using the following parameters: {'bootstrap_servers': '127.0.0.1:9092', 'auto_offset_reset': 'earliest', 'max_poll_records': 100}
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer started.
[513863]: [INFO] aiokafka.consumer.subscription_state: Updating subscribed topics to: frozenset({'hello_world'})
[513863]: [INFO] aiokafka.consumer.consumer: Subscribed to topic(s): {'hello_world'}
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer subscribed.
[513863]: [INFO] aiokafka.consumer.group_coordinator: Metadata for topic has changed from {} to {'hello_world': 1}.
Starting process cleanup, this may take a few seconds...
[INFO] fastkafka._server: terminate_asyncio_process(): Terminating the process 513863...
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop(): Consumer stopped.
[513863]: [INFO] fastkafka._components.aiokafka_consumer_loop: aiokafka_consumer_loop() finished.
[INFO] fastkafka._server: terminate_asyncio_process(): Process 513863 terminated.

Send a message to the Kafka topic

Let's send a HelloWorld message to the hello_world topic and check whether our consumer application has logged the received message. In your terminal, run:

echo {\"msg\": \"Hello world\"} | kafka-console-producer.sh --topic=hello_world --bootstrap-server=<addr_of_your_kafka_bootstrap_server>

You should see "Got msg: msg='Hello world'" being logged by your consumer.

Choosing a topic

You probably noticed that you didn't define which topic you are receiving the message from. This is because, by default, the @consumes decorator determines the topic from your function name: it strips the default "on_" prefix from the function name and uses the rest as the topic name. In this example, the topic is hello_world.

You can choose a custom prefix by setting the prefix parameter of the @consumes decorator.
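For example, a minimal sketch using a hypothetical process_ prefix, which would map process_hello_world to the hello_world topic:

# "process_" is an arbitrary prefix chosen for illustration
@app.consumes(prefix="process_")
async def process_hello_world(msg: HelloWorld):
    logger.info(f"Got msg: {msg}")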

Also, you can set the topic name explicitly with the topic parameter of the @consumes decorator.
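For example, a sketch consuming from a hypothetical my_topic topic, regardless of the function name:

# the topic parameter overrides the name derived from the function name
@app.consumes(topic="my_topic")
async def on_hello_world(msg: HelloWorld):
    logger.info(f"Got msg: {msg}")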

Message data

The message received from Kafka is translated from its binary JSON representation into the class specified by the type annotation of the msg parameter in the function decorated with @consumes.

In this example, the message will be parsed into a HelloWorld object.
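Conceptually, the parsing is equivalent to this pydantic call (an illustration only, not FastKafka's internal code; parse_raw is the pydantic v1 API):

# raw message bytes are parsed into the annotated model
HelloWorld.parse_raw('{"msg": "Hello world"}')
# -> HelloWorld(msg='Hello world')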

Message metadata

If you need any of the Kafka message metadata, such as the timestamp, partition or headers, you can access it by adding an EventMetadata-typed argument to your consuming function; the metadata from the incoming message will then be automatically injected when the function is called.

Let’s demonstrate that.

Create a consumer function with metadata

The only difference from the original basic consuming function is that we now pass a meta: EventMetadata argument to the function. The @consumes decorator will register it and, when a message is consumed, pass the metadata to your function as well. You can then use the metadata inside your consuming function. Let's log it to see what it contains.

First, we need to import EventMetadata:
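A minimal sketch, assuming EventMetadata is exported from the top-level fastkafka package:

from fastkafka import EventMetadata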

Now we can add the meta argument to our consuming function:
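A sketch of the updated consuming function, logging the injected metadata:

@app.consumes()
async def on_hello_world(msg: HelloWorld, meta: EventMetadata):
    logger.info(f"Got metadata: {meta}")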

Your final app should look like this:
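Putting the pieces above together (a sketch, with the broker placeholders from before):

from fastkafka import FastKafka, EventMetadata
from pydantic import BaseModel, Field
from fastkafka._components.logger import get_logger

logger = get_logger(__name__)

class HelloWorld(BaseModel):
    msg: str = Field(
        ...,
        example="Hello",
        description="Demo hello world message",
    )

kafka_brokers = {
    "demo_broker": {
        "url": "<url_of_your_kafka_bootstrap_server>",
        "description": "local demo kafka broker",
        "port": "<port_of_your_kafka_bootstrap_server>",
    }
}

app = FastKafka(kafka_brokers=kafka_brokers)

@app.consumes()
async def on_hello_world(msg: HelloWorld, meta: EventMetadata):
    logger.info(f"Got metadata: {meta}")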

Now let's run the app and send a message to the broker to see the logged message metadata.
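As before, save the code (e.g. in consumer_example.py) and run:

fastkafka run --num-workers=1 --kafka-broker=demo_broker consumer_example:app

Then, from another terminal, send a message:

echo {\"msg\": \"Hello world\"} | kafka-console-producer.sh --topic=hello_world --bootstrap-server=<addr_of_your_kafka_bootstrap_server>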

You should see the metadata being logged in your app.

As you can see in the log, the metadata gives you information about the partition, offset, timestamp, key and headers. 🎉
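For instance, you can read individual fields off the injected object (a sketch; the attribute names mirror aiokafka's ConsumerRecord fields):

@app.consumes()
async def on_hello_world(msg: HelloWorld, meta: EventMetadata):
    # log a few commonly useful metadata fields
    logger.info(f"partition={meta.partition}, offset={meta.offset}")
    logger.info(f"timestamp={meta.timestamp}, headers={meta.headers}")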

Dealing with high-latency consuming functions

If your functions have high latency due to, for example, lengthy database calls, you will notice a big decrease in performance. This is due to how the @consumes decorator executes your consuming functions when consuming events. By default, the consuming functions for one topic run sequentially; this is the most straightforward approach and results in the least amount of overhead.

But to handle those high-latency tasks and run them in parallel, FastKafka has a DynamicTaskExecutor prepared for your consumers. This executor comes with additional overhead, so use it only when you need to handle high-latency functions.

Let's demonstrate how to use it.

@app.consumes(executor="DynamicTaskExecutor")
async def on_hello_world(msg: HelloWorld):
    logger.info(f"Got msg: {msg}")

Let's send a HelloWorld message to the hello_world topic and check whether our consumer application has logged the received message. In your terminal, run:

echo {\"msg\": \"Hello world\"} | kafka-console-producer.sh --topic=hello_world --bootstrap-server=<addr_of_your_kafka_bootstrap_server>

You should see "Got msg: msg='Hello world'" being logged by your consumer.