The purpose of this producer is load testing Google Cloud Managed Kafka, so you can use it for sizing and cost estimation.
Use it together with the Dataflow consumer to achieve the best performance and to retrieve stats information.
- Install the server: `./authserver install`
- Start the server: `./authserver start`
- Check server status: `./authserver status`

You can check `authserver.log` for runtime information and errors, or pass the `stop` and `restart` parameters for the corresponding server controls.
- `topicName`: the Kafka topic you want to send to.
- `"bootstrap.servers": "bootstrap.dingo-kafka.us-central1.managedkafka.du-hast-mich.cloud.goog:9092"`: replace this value with your own Kafka server's bootstrap address.

Optional:
- `numPublishers`: the number of concurrent Kafka publishers.
- `numDataGenThreads`: the number of data generation threads. Only increase this if the data pool is constantly empty, which can hurt publishing performance.
- `numWorkers`: best kept equal to `numPublishers`. Workers only pull data from the pool and fill the data channel dedicated to each publisher.
- Init the project (you only need to do this once): `make init`
- Build the Go code: `make build`
- Dump the data: you should see a binary named `main` in the project root directory; simply run `./main` or `make run`. Press Ctrl+C to stop it.