
Commit 52d765d

tomblench and ricellis authored

update readme, changes, version (#137)

* chore: prepare for 0.200.0 release
* docs: update README and CHANGES
  - readme: update build badge
  - readme: remove pre-release section
  - changes: reword readme change
  - changes: clarify moves and renames with instructions for changing properties file
  - changes: add entry for record key format
  - changes: add entry for dlq/errors
  - changes: add entry for batch semantics
  - changes: re-order with most important breaking changes first
  - changes: whitespace, wrap at 120 chars
* docs: fix copy/paste
* docs: implicitly sort by date and include prereleases in badge
* docs: PR feedback
* docs: PR feedback
* docs: suggested installation changes for zip distribution
* Apply suggestions from code review

Co-authored-by: Rich Ellis <[email protected]>
Co-authored-by: Thomas Blench <[email protected]>

1 parent d3aa3d6 commit 52d765d

File tree

3 files changed: +45 −28 lines changed


CHANGES.md (+29 −13)

@@ -1,17 +1,33 @@
-# UNRELEASED
-- [FIXED] README to align with the 2.8.0 Kafka version.
-- [BREAKING CHANGE] Refactor package names
-- [BREAKING CHANGE] Source and sink connector moves and renames
-- [BREAKING CHANGE] Source connector flatten, schema generation and omit design documents options have been replaced by message transforms. See README for details.
-- [BREAKING CHANGE] Source connector now emits `java.util.Map` (not `String`) record values by default. See README for details.
-- [BREAKING CHANGE] Source connector now emits tombstone events for deleted documents. See [single message transforms](README.md#single-message-transforms) section in README for details.
+# 0.200.0 (2022-11-01)
+
+- [FIXED] README and documentation have been extensively rewritten.
+- [BREAKING CHANGE] Rename source connector. Properties files should be updated to
+  use `connector.class=com.ibm.cloud.cloudant.kafka.SourceChangesConnector`.
+- [BREAKING CHANGE] Rename sink connector. Properties files should be updated to
+  use `connector.class=com.ibm.cloud.cloudant.kafka.SinkConnector`.
+- [BREAKING CHANGE] Configuration parameters have changed for url, database, authentication, and last change sequence.
+  See README for details.
+- [BREAKING CHANGE] Source connector flatten, schema generation and omit design documents options have been replaced by
+  message transforms. See README for details.
+- [BREAKING CHANGE] Source connector now emits `java.util.Map` (not `String`) event values by default. See README for
+  details.
+- [BREAKING CHANGE] Source connector now emits `org.apache.kafka.connect.data.Struct` (not `String`) event keys. See
+  README for details.
+- [BREAKING CHANGE] Source connector now emits tombstone events for deleted documents.
+  See [single message transforms](README.md#single-message-transforms) section in README for details.
+- [BREAKING CHANGE] Converter support for sink connector has changed. See README for details.
+- [BREAKING CHANGE] Preserve `_rev` field message values in sink connector.
+  See [sink connector config](README.md#converter-configuration-sink-connector) section in README for more details.
+- [BREAKING CHANGE] Semantics of `batch.size` configuration parameter changed: for sink connector this value no longer
+  affects when `flush()` is called.
+- [BREAKING CHANGE] Sink connector will correctly honour `errors.tolerance`, `errors.log.enable`,
+  and `errors.deadletterqueue.topic.name` configuration parameters.
+  See [the sample sink properties file](docs/connect-cloudant-sink-example.properties) for a recommended example of how
+  to configure these to continue processing when non-fatal errors occur.
+- [BREAKING CHANGE] Renamed from `kafka-connect-cloudant` to `cloudant-kafka-connector` and packaged as zipped directory instead of uber jar. See README for installation details.
 - [BREAKING CHANGE] Publish releases to https://github.com/IBM/cloudant-kafka-connector/releases.
-- [BREAKING CHANGE] Rename package from `com.ibm.cloudant.kafka` to `com.ibm.cloud.cloudant.kafka`. Existing `connector.class` property values must be updated to use the new package.
-- [BREAKING CHANGE] Rename module from `kafka-connect-cloudant` to `cloudant-kafka-connector`.
-- [BREAKING CHANGE] Converter support for sink connector has changed. See README.md for details.
-- [BREAKING CHANGE] Configuration parameters have changed for url, database, authentication, and last change sequence. See README.md for details.
-- [BREAKING CHANGE] Preserve `_rev` field message values in sink connector. See [sink connector config](README.md#converter-configuration-sink-connector) section in README for more details.
-- [UPGRADED] Connector now supports all authentication types via the `cloudant.auth.type` configuration parameter. When using an authentication type of "iam", the API key is configured via the `cloudant.apikey` configuration parameter.
+- [UPGRADED] Connector now supports all authentication types via the `cloudant.auth.type` configuration parameter. When
+  using an authentication type of "iam", the API key is configured via the `cloudant.apikey` configuration parameter.
 - [UPGRADED] Upgraded Gradle distribution from 4.5.1 to 7.4
 - [UPGRADED] Upgraded Kafka Connect API to 3.2.1.
 - [UPGRADED] Refactored to use the new `cloudant-java-sdk` library.
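Taken together, the rename, authentication, and error-handling entries above imply updates to existing sink properties files. A minimal sketch of a post-0.200.0 sink configuration, assuming the property names stated in the changelog; the `name`, `topics`, API key, and DLQ topic values are hypothetical placeholders (see the sample sink properties file in the repository for the recommended configuration):

```properties
# Sink connector after the 0.200.0 rename: new package and class name
connector.class=com.ibm.cloud.cloudant.kafka.SinkConnector
name=cloudant-sink
topics=example-topic

# Authentication is selected via cloudant.auth.type; with "iam" the
# API key goes in cloudant.apikey (placeholder value below)
cloudant.auth.type=iam
cloudant.apikey=example-api-key

# As of 0.200.0 the sink connector honours the standard Kafka Connect
# error-handling parameters: tolerate non-fatal errors, log them, and
# route failed records to a dead letter queue topic
errors.tolerance=all
errors.log.enable=true
errors.deadletterqueue.topic.name=cloudant-sink-dlq
```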

README.md

+15-14
Original file line numberDiff line numberDiff line change
@@ -1,6 +1,6 @@
11
# Cloudant Kafka Connector
22

3-
[![Maven Central](https://img.shields.io/maven-central/v/com.cloudant/kafka-connect-cloudant.svg)](http://search.maven.org/#search|ga|1|g:"com.cloudant"%20AND%20a:"kafka-connect-cloudant")
3+
[![Release](https://img.shields.io/github/v/release/IBM/cloudant-kafka-connector?include_prereleases)](https://github.com/IBM/cloudant-kafka-connector/releases/latest)
44

55
This project includes [Apache Kafka](https://kafka.apache.org/) [Connect](https://kafka.apache.org/documentation.html#connect) source and sink connectors for IBM Cloudant.
66

@@ -10,14 +10,6 @@ These connectors can stream events:
1010

1111
_Note:_ the connectors are also compatible with Apache CouchDB.
1212

13-
## Pre-release
14-
15-
**Note**: this README file is for a pre-release version of the
16-
connector. This means it refers to configuration options and features
17-
which are different to the currently released version. For information
18-
about the currently released version, please see the [README
19-
here](https://github.com/IBM/cloudant-kafka-connector/blob/0.100.2-kafka-1.0.0/README.md).
20-
2113
## Release Status
2214

2315
Experimental
@@ -28,10 +20,19 @@ Experimental
2820

2921
### Quick Start
3022

31-
1. Download the jar from the [releases page](https://github.com/IBM/cloudant-kafka-connector/releases). The jar file contains the plugin and the non-Kafka dependencies needed to run.
32-
2. Copy the jar to the `libs` directory _or_ the
33-
[configured `plugin.path`](https://kafka.apache.org/documentation.html#connectconfigs_plugin.path) of your Kafka installation.
34-
3. Edit the [source](docs/connect-cloudant-changes-source-example.properties) or [sink](docs/connect-cloudant-sink-example.properties) example properties files and save this to the `config` directory of your Kafka installation.
23+
1. Download the zip from the [releases page](https://github.com/IBM/cloudant-kafka-connector/releases). The zip file
24+
contains the plugin jar and the non-Kafka dependencies needed to run.
25+
2. Configure the [Kafka connect plugin path](https://kafka.apache.org/documentation.html#connectconfigs_plugin.path) for
26+
your Kafka distribution, for example: `plugin.path=/kafka/connect`.
27+
- This will be configured in either `connect-standalone.properties` or `connect-distributed.properties` in
28+
the `config` directory of your Kafka installation.
29+
- If you're not sure which to use, edit `connect-standalone.properties` and follow the standalone execution
30+
instructions below.
31+
2. Unzip and move to the plugin path configured earlier, for example:
32+
`unzip cloudant-kafka-connector-x.y.z.zip; mv cloudant-kafka-connector-x.y.z /kafka/connect`.
33+
3. Edit the [source](docs/connect-cloudant-changes-source-example.properties)
34+
or [sink](docs/connect-cloudant-sink-example.properties) example properties files and save this to the `config`
35+
directory of your Kafka installation.
3536
4. Start Kafka.
3637
5. Start the connector (see below).
3738

@@ -116,7 +117,7 @@ For the source connector:
116117
* Values are produced as a (schemaless) `java.util.Map<String, Object>`.
117118
* These types are compatible with the default `org.apache.kafka.connect.json.JsonConverter` and should be compatible with any other converter that can accept a `Struct` or `Map`.
118119
* The `schemas.enable` may be safely used with a `key.converter` if desired.
119-
* The source connector does not generate schemas for the record values by default. To use `schemas.enable` with the `value.converter` consider using a schema registry or the [`MapToStruct` SMT](docs/smt-reference.md#map-to-struct-conversion).
120+
* The source connector does not generate schemas for the event values by default. To use `schemas.enable` with the `value.converter` consider using a schema registry or the [`MapToStruct` SMT](docs/smt-reference.md#map-to-struct-conversion).
120121

121122
#### Converter Configuration: sink connector
122123
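The converter bullets in the README hunk above translate into a small worker (or connector) configuration. A sketch assuming the stock `JsonConverter` shipped with Kafka Connect; since the source connector emits schemaless `java.util.Map` values, value schemas stay disabled unless a schema registry or the `MapToStruct` SMT is in play:

```properties
# Standard Kafka Connect converter settings (connect-standalone.properties
# or per-connector overrides)
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# schemas.enable is safe on the key converter (keys are Structs);
# disable it for values because the source connector does not generate
# value schemas by default
key.converter.schemas.enable=true
value.converter.schemas.enable=false
```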

VERSION (+1 −1)

@@ -1 +1 @@
-0.200.0-SNAPSHOT
+0.200.0
