The Kafka data source plugin allows you to visualize streaming Kafka data from within Grafana.
- Apache Kafka v0.9+
- Grafana v9.0+
Note: This is a backend plugin, so the Grafana server must have access to the Kafka broker.
Use the grafana-cli tool to install the plugin from the command line:
```bash
grafana-cli plugins install hamedkarbasi93-kafka-datasource
```
The plugin will be installed into your Grafana plugins directory; the default is `/var/lib/grafana/plugins`. See the Grafana documentation for more information on the CLI tool.
Alternatively, you can manually download the latest release `.zip` file and unpack it into your Grafana plugins directory (default: `/var/lib/grafana/plugins`).
Add a data source by filling in the following fields:
| Field   | Description |
|---------|-------------|
| Name    | A name for this Kafka data source |
| Servers | Comma-separated list of Kafka bootstrap servers, e.g. `broker1:9092,broker2:9092` |
To query a Kafka topic, configure the following fields in the query editor.
| Field             | Description |
|-------------------|-------------|
| Topic             | Topic name |
| Partition         | Partition number |
| Auto offset reset | Starting offset to consume from: the latest message or the last 100 messages |
| Timestamp Mode    | Timestamp used for the visualized values: `Now` (the time of consumption) or `Message Timestamp` |
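For illustration, the sketch below produces a single message to partition 0 with an explicit message timestamp, which is what the `Message Timestamp` mode would pick up. This is a minimal sketch assuming the `kafka-python` package and a local broker; the topic name `test` is a placeholder, not part of the plugin:

```python
import json
import time

from kafka import KafkaProducer  # assumption: kafka-python is installed

# Connect to an assumed local broker and serialize values as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Send one message to partition 0 with an explicit timestamp (milliseconds).
# With "Timestamp Mode" set to "Message Timestamp", the panel should use this
# timestamp; with "Now", it uses the consumption time instead.
producer.send(
    "test",
    value={"value1": 1.0, "value2": 2},
    partition=0,
    timestamp_ms=int(time.time() * 1000),
)
producer.flush()
```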
- The plugin currently does not support any authentication or authorization method.
- The plugin currently does not support TLS.
The plugin supports topics that publish simple JSON-formatted messages. Only the following flat structure is supported as of now:
```json
{
  "value1": 1.0,
  "value2": 2,
  "value3": 3.33,
  ...
}
```
We plan to support more complex JSON data structures, Protobuf, and Avro in upcoming releases. Contributions are highly encouraged!
The example folder contains simple producers in different languages that generate sample JSON values in Kafka. For details on how to run them, please check the README.md in that folder.
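As a rough sketch of what such a producer does (assuming the `kafka-python` package and a local broker; the repo's own examples may differ in language and details), a loop like the following publishes random values in the supported flat JSON shape:

```python
import json
import random
import time

from kafka import KafkaProducer  # assumption: kafka-python is installed

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a flat JSON message every 500 ms so the panel has a live stream.
while True:
    producer.send("test", {"value1": random.random(), "value2": random.randint(0, 10)})
    time.sleep(0.5)
```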
Thank you for considering contributing! If you find an issue or have a better way to do something, feel free to open an issue or a PR. To set up the development environment, follow the steps below. The environment has been tested with:
- Ubuntu 24.04 LTS
- node v22.15
- go 1.24.1
A data source backend plugin consists of both frontend and backend components.
- Install dependencies:

  ```bash
  npm install
  ```
- Build the plugin in development mode and run in watch mode:

  ```bash
  npm run dev
  ```
- Build the plugin in production mode:

  ```bash
  npm run build
  ```
- Run the tests (using Jest):

  ```bash
  # Runs the tests and watches for changes; requires git init first
  npm run test

  # Exits after running all the tests
  npm run test:ci
  ```
- Spin up a Grafana instance and run the plugin inside it (using Docker):

  ```bash
  npm run server
  ```
- Run the E2E tests (using Playwright):

  ```bash
  # Spins up a Grafana instance first that we test against
  npm run server

  # If you wish to start a certain Grafana version; if not specified, latest is used by default
  GRAFANA_VERSION=11.3.0 npm run server

  # Starts the tests
  npm run e2e
  ```
- Run the linter:

  ```bash
  npm run lint

  # or
  npm run lint:fix
  ```
- Update the Grafana plugin SDK for Go dependency to the latest minor version:

  ```bash
  go get -u github.com/grafana/grafana-plugin-sdk-go
  go mod tidy
  ```
- Build backend plugin binaries for Linux:

  ```bash
  mage build:linux
  ```
To build for any other platform, use the corresponding mage targets:

```bash
mage -v build:darwin
mage -v build:windows
mage -v build:linuxArm
mage -v build:linuxArm64
mage -v build:darwinArm64
```

`mage -v build:backend` builds for your current platform, and `mage -v buildAll` builds for all platforms at once.
Grafana will be available at `localhost:3000` with the plugin already installed. Kafka will be available at `localhost:9092` for connections from the host and at `kafka:29092` for connections from other Docker containers.
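To quickly check that the dev broker is reachable from the host, a consumer sketch like this can be used (again assuming `kafka-python` and the placeholder `test` topic from the producer sketches above):

```python
from kafka import KafkaConsumer  # assumption: kafka-python is installed

# Read from the assumed "test" topic on the host-facing listener; give up
# after 10 seconds of silence instead of blocking forever.
consumer = KafkaConsumer(
    "test",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10000,
)

for message in consumer:
    print(message.partition, message.timestamp, message.value)
```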
This repository is open-sourced software licensed under the Apache License 2.0.