README.md in karafka-sidekiq-backend-1.2.0 vs README.md in karafka-sidekiq-backend-1.3.0.rc1
- old
+ new
@@ -1,8 +1,8 @@
# Karafka Sidekiq Backend
-[![Build Status](https://travis-ci.org/karafka/sidekiq-backend.png)](https://travis-ci.org/karafka/karafka-sidekiq-backend)
+[![Build Status](https://travis-ci.org/karafka/sidekiq-backend.svg)](https://travis-ci.org/karafka/sidekiq-backend)
[![Join the chat at https://gitter.im/karafka/karafka](https://badges.gitter.im/karafka/karafka.svg)](https://gitter.im/karafka/karafka?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[Karafka Sidekiq Backend](https://github.com/karafka/sidekiq-backend) provides support for consuming (processing) received Kafka messages inside of Sidekiq workers.
## Installation
@@ -43,19 +43,19 @@
```ruby
App.routes.draw do
consumer_group :videos_consumer do
topic :binary_video_details do
- controller Videos::DetailsController
+ consumer Videos::DetailsConsumer
worker Workers::DetailsWorker
interchanger Interchangers::MyCustomInterchanger
end
end
end
```
-You don't need to do anything beyond that. Karafka will know, that you want to run your controllers ```#perform``` method in a background job.
+You don't need to do anything beyond that. Karafka will know that you want to run your consumer's ```#consume``` method in a background job.
## Configuration
There are two options you can set inside of the ```topic``` block:
@@ -65,36 +65,67 @@
| interchanger | Class | Name of an interchanger class that we want to use to pass the incoming data to Sidekiq |
### Workers
-Karafka by default will build a worker that will correspond to each of your controllers (so you will have a pair - controller and a worker). All of them will inherit from ```ApplicationWorker``` and will share all its settings.
+Karafka by default will build a worker that will correspond to each of your consumers (so you will have a pair - consumer and a worker). All of them will inherit from ```ApplicationWorker``` and will share all its settings.
To run Sidekiq you should have a ```sidekiq.yml``` file in the *config* folder. An example ```sidekiq.yml``` file will be generated at config/sidekiq.yml.example once you run ```bundle exec karafka install```.
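As a rough reference, a minimal ```config/sidekiq.yml``` might look like the sketch below; the concurrency value and queue name are illustrative assumptions, so treat the generated ```sidekiq.yml.example``` as the authoritative starting point:

```yaml
:concurrency: 5
:queues:
  - default
```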
However, if you want to use a raw Sidekiq worker (without any Karafka additional magic), or you want to use SidekiqPro (or any other queuing engine that has the same API as Sidekiq), you can assign your own custom worker:
```ruby
topic :incoming_messages do
- controller MessagesController
+ consumer MessagesConsumer
worker MyCustomWorker
end
```
-Note that even then, you need to specify a controller that will schedule a background task.
+Note that even then, you need to specify a consumer that will schedule a background task.
Custom workers need to provide a ```#perform_async``` method. It needs to accept two arguments:
- ```topic_id``` - first argument is a current topic id from which a given message comes
- - ```params_batch``` - all the params that came from Kafka + additional metadata. This data format might be changed if you use custom interchangers. Otherwise it will be an instance of Karafka::Params::ParamsBatch.
+ - ```params_batch``` - all the params that came from Kafka + additional metadata. This data format might be changed if you use custom interchangers. Otherwise, it will be an instance of Karafka::Params::ParamsBatch.
**Note**: If you use custom interchangers, keep in mind that params inside the params batch might be in one of two states when passed to #perform_async: parsed or unparsed. This means that if you use custom interchangers and/or custom workers, you might want to look into Karafka's sources to see exactly how it works.
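To make the two-argument contract concrete, here is a hedged sketch of a custom worker. The class records calls inline instead of enqueuing, purely so the example is self-contained; a real implementation would push a Sidekiq (or Sidekiq Pro) job instead. The class name and the recording helper are illustrative, not part of the library:

```ruby
# Hypothetical custom worker: any class exposing this
# .perform_async(topic_id, params_batch) contract can be plugged in
# via the `worker` routing option.
class MyCustomWorker
  # topic_id     - id of the topic the messages came from
  # params_batch - messages plus metadata (a Karafka::Params::ParamsBatch
  #                unless a custom interchanger reshaped it)
  def self.perform_async(topic_id, params_batch)
    # A real worker would serialize the arguments and enqueue a job here;
    # we just record the call so the sketch runs on its own.
    enqueued << [topic_id, params_batch]
  end

  def self.enqueued
    @enqueued ||= []
  end
end

MyCustomWorker.perform_async('binary_video_details', [{ 'value' => 'msg' }])
```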
### Interchangers
Custom interchangers target issues with non-standard (binary, etc.) data that we want to store when we do ```#perform_async```. This data might be corrupted when fetched in a worker (see [this](https://github.com/karafka/karafka/issues/30) issue). With custom interchangers, you can encode/compress data before it is passed to the scheduler and decode/decompress it when it reaches the worker.
+To use a custom interchanger for a topic, declare it inside the routes like this:
+
+```ruby
+App.routes.draw do
+ consumer_group :videos_consumer do
+ topic :binary_video_details do
+ consumer Videos::DetailsConsumer
+ interchanger Interchangers::MyCustomInterchanger
+ end
+ end
+end
+```
+Each custom interchanger should define `encode` to encode params before they get stored in Redis, and `decode` to convert the params to hash format, as shown below:
+
+```ruby
+class Base64Interchanger
+ class << self
+ def encode(params_batch)
+      # Note that you need to cast the params_batch to an array in order
+      # to get it to work in Sidekiq later
+ Base64.encode64(Marshal.dump(params_batch.to_a))
+ end
+
+ def decode(params_string)
+ Marshal.load(Base64.decode64(params_string))
+ end
+ end
+end
+```
+
**Warning**: if you decide to use slow interchangers, they might significantly slow down Karafka.
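For illustration, here is a self-contained round-trip through the ```Base64Interchanger``` shown above. The interchanger class is repeated so the sketch runs on its own, and the params batch is stubbed with a plain array of hashes, standing in for ```Karafka::Params::ParamsBatch#to_a```:

```ruby
require 'base64'

# Same interchanger as in the routing example above, repeated here
# so this sketch is runnable in isolation.
class Base64Interchanger
  class << self
    def encode(params_batch)
      # Cast to an array first, then marshal and Base64-encode so binary
      # payloads survive the trip through Redis.
      Base64.encode64(Marshal.dump(params_batch.to_a))
    end

    def decode(params_string)
      Marshal.load(Base64.decode64(params_string))
    end
  end
end

# A stand-in batch containing a binary value that plain JSON would mangle
batch = [{ 'topic' => 'binary_video_details', 'value' => "\x00\x01" }]
restored = Base64Interchanger.decode(Base64Interchanger.encode(batch))
```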
## References
* [Karafka framework](https://github.com/karafka/karafka)
@@ -103,10 +134,10 @@
## Note on contributions
First, thank you for considering contributing to Karafka! It's people like you who make the open source community such a great place!
-Each pull request must pass all the rspec specs and meet our quality requirements.
+Each pull request must pass all the RSpec specs and meet our quality requirements.
To check if everything is as it should be, we use [Coditsu](https://coditsu.io), which combines multiple linters and code analyzers for both code and documentation. Once you're done with your changes, submit a pull request.
Coditsu will automatically check your work against our quality standards. You can find your commit check results on the [builds page](https://app.coditsu.io/karafka/repositories/karafka-sidekiq-backend/builds/commit_builds) of the Karafka Sidekiq Backend repository.