Commit: Update README.md
joowani committed Feb 9, 2021 · 1 parent c1bcaf3 · commit 644bf73
Showing 1 changed file (README.md) with 16 additions and 16 deletions.

# KQ: Kafka Job Queue for Python

![Build](https://github.com/joowani/kq/workflows/Build/badge.svg)
![CodeQL](https://github.com/joowani/kq/workflows/CodeQL/badge.svg)
KQ (Kafka Queue) is a lightweight Python library which lets you enqueue and
execute jobs asynchronously using [Apache Kafka](https://kafka.apache.org/). It uses
[kafka-python](https://github.com/dpkp/kafka-python) under the hood.

## Announcements

* Support for Python 3.5 will be dropped from KQ version 3.0.0.
* See [releases](https://github.com/joowani/kq/releases) for latest updates.

## Requirements

* [Apache Kafka](https://kafka.apache.org) 0.9+
* Python 3.6+

## Installation

Install using [pip](https://pip.pypa.io):

```shell
pip install kq
```

## Getting Started

Start your Kafka instance.
Example using [Docker](https://github.com/lensesio/fast-data-dev):
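
The exact command depends on your environment; one possibility (an assumption here, not visible in this diff) is the `lensesio/fast-data-dev` image referenced above:

```shell
# Start a single-node Kafka broker, advertised on localhost:9092.
docker run --rm -p 9092:9092 -e ADV_HOST=127.0.0.1 lensesio/fast-data-dev
```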
Then define your KQ worker module (e.g. `my_worker.py`):

```python
import logging

from kafka import KafkaConsumer
from kq import Worker

# Set up logging.
formatter = logging.Formatter("[%(levelname)s] %(message)s")
stream_handler = logging.StreamHandler()
stream_handler.setFormatter(formatter)
logger = logging.getLogger("kq.worker")
logger.setLevel(logging.DEBUG)
logger.addHandler(stream_handler)

# Set up a Kafka consumer.
consumer = KafkaConsumer(
    bootstrap_servers="127.0.0.1:9092",
    group_id="group",
    auto_offset_reset="latest",
)

# Set up a worker.
worker = Worker(topic="topic", consumer=consumer)
worker.start()
```
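
Conceptually, `worker.start()` is a blocking consume loop: it pulls records from the subscribed topic and runs the job carried in each message value. A simplified sketch of that loop (not KQ's actual implementation, which adds deserialization, logging, timeouts, and callbacks):

```python
def consume_loop(consumer, execute):
    """Minimal worker loop sketch: iterate over records and run each job.

    `consumer` is any iterable of records exposing a `.value` attribute
    (kafka-python's KafkaConsumer behaves this way); `execute` is whatever
    deserializes and runs a single job payload.
    """
    for record in consumer:
        execute(record.value)
```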

Then enqueue jobs from another process or module:

```python
import requests

from kafka import KafkaProducer
from kq import Queue

# Set up a Kafka producer.
producer = KafkaProducer(bootstrap_servers="127.0.0.1:9092")

# Set up a queue.
queue = Queue(topic="topic", producer=producer)

# Enqueue a function call.
job = queue.enqueue(requests.get, "https://google.com")

# You can also specify the job timeout, Kafka message key and partition.
job = queue.using(timeout=5, key=b"foo", partition=0).enqueue(requests.get, "https://google.com")
```
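
For intuition, enqueueing boils down to serializing the function call and producing it as the Kafka message value; the worker does the reverse. A toy sketch of that round trip using `pickle` (KQ's actual wire format is an internal detail and may differ):

```python
import pickle


def serialize_call(func, *args, **kwargs):
    """Pack a function call into bytes suitable for a message value."""
    return pickle.dumps((func, args, kwargs))


def execute_call(payload):
    """Unpack a serialized function call and run it."""
    func, args, kwargs = pickle.loads(payload)
    return func(*args, **kwargs)


payload = serialize_call(divmod, 7, 3)
print(execute_call(payload))  # (2, 1)
```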

The worker executes the job in the background:
```shell
python my_worker.py
[INFO] Starting Worker(hosts=127.0.0.1:9092, topic=topic, group=group) ...
[INFO] Processing Message(topic=topic, partition=0, offset=0) ...
[INFO] Executing job c7bf2359: requests.api.get("https://www.google.com")
[INFO] Job c7bf2359 returned: <Response [200]>
```
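
Because the worker is an ordinary Kafka consumer in a consumer group, one way to scale out (assuming the `my_worker.py` module above and a topic with multiple partitions) is to launch more worker processes with the same `group_id`; Kafka then divides the topic's partitions among them:

```shell
# Two workers in the same consumer group; each handles a share of partitions.
python my_worker.py &
python my_worker.py &
```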
