
Error when creating sink connector with BytesToString$Value transforms #94

Open
moreiravictor opened this issue Oct 24, 2022 · 9 comments


moreiravictor commented Oct 24, 2022

I installed the plugin via the manual process as described in the docs here, then tried to create a new connector with the following body (note that the important fields are the last ones):

{
  "name": "bill_events",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "bill_events",
    "connection.url": "jdbc:postgresql://postgres_sink:5432/local?user=postgres&password=123&stringtype=unspecified",
    "transforms": "unwrap,created_at_converter,updated_at_converter,deleted_at_converter,insertTS,formatTS,type_field",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false",
    "auto.create": "true",
    "insert.mode": "upsert",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "bill_events_dlq",
    "errors.deadletterqueue.context.headers.enable": true,
    "errors.deadletterqueue.topic.replication.factor": -1,
    "errors.log.enable": true,
    "transforms.created_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.created_at_converter.field": "created_at",
    "transforms.created_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.created_at_converter.target.type": "Timestamp",
    "transforms.updated_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.updated_at_converter.field": "updated_at",
    "transforms.updated_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.updated_at_converter.target.type": "Timestamp",
    "transforms.deleted_at_converter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.deleted_at_converter.field": "deleted_at",
    "transforms.deleted_at_converter.format": "yyyy-MM-dd'T'HH:mm:ss.SSS",
    "transforms.deleted_at_converter.target.type": "Timestamp",
    "transforms.formatTS.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.formatTS.format": "yyyy-MM-dd HH:mm:ss.SSS",
    "transforms.formatTS.field": "kafka_timestamp",
    "transforms.formatTS.target.type": "Timestamp",
    "transforms.insertTS.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.insertTS.timestamp.field": "kafka_timestamp",
    "transforms.type_field.type": "com.github.jcustenborder.kafka.connect.transform.common.BytesToString$Value",
    "transforms.type_field.field": "type"
  }
}

Once I execute the POST to create the connector, I receive the following error:
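For reference, the POST can be sketched roughly as follows (the host, port, and JSON filename are assumptions for this example; the Connect REST API commonly listens on port 8083):

```shell
# Sketch: create the connector via the Kafka Connect REST API.
# Host, port, and the JSON filename are assumptions, not from the thread.
curl -s -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d @bill_events_connector.json
```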

<html>

<head>
	<meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
	<title>Error 500 Request failed.</title>
</head>

<body>
	<h2>HTTP ERROR 500 Request failed.</h2>
	<table>
		<tr>
			<th>URI:</th>
			<td>/connectors</td>
		</tr>
		<tr>
			<th>STATUS:</th>
			<td>500</td>
		</tr>
		<tr>
			<th>MESSAGE:</th>
			<td>Request failed.</td>
		</tr>
		<tr>
			<th>SERVLET:</th>
			<td>org.glassfish.jersey.servlet.ServletContainer-13278a41</td>
		</tr>
	</table>
	<hr><a href="https://eclipse.org/jetty">Powered by Jetty:// 9.4.33.v20201020</a>
	<hr />

</body>

</html>

and in the Kafka Connect logs we can see the following:

error:

[screenshot]

plugin being loaded by jdbcSink:

[screenshot]

[screenshot]

It's important to note that we used the compiled files and put them into the plugin.path on Kafka Connect:

[screenshot]

Can someone please help me with this? I have no clue what to do right now (I also tried installing an older version, but the build throws errors as well).

jcustenborder (Owner) commented:

@moreiravictor What version did you use? Do you see a guava-*.jar in that directory?

moreiravictor (Author) commented:

I used version 0.1.0.58. There's no guava-*.jar here. Actually, it isn't in the pom.xml either.

felipemotarocha commented:
I'm also experiencing the same issue.

moreiravictor (Author) commented:

@jcustenborder added some more info to the issue!

jcustenborder (Owner) commented:

@moreiravictor @felipemotarocha Try adding guava-31.1-jre.jar to that directory. Let me know if that resolves the issue.

felipemotarocha commented:
What directory specifically is that? /lib?

jcustenborder (Owner) commented:

> What directory specifically is that? /lib?

Correct. It should sit right next to the kafka-connect-transform-common.jar
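In concrete terms, the fix described above amounts to dropping the guava jar next to the transform jar and restarting the Connect workers. A minimal sketch (the `PLUGIN_LIB` path is an assumption; substitute the lib directory of your own installation):

```shell
# Sketch: place guava-31.1-jre.jar next to kafka-connect-transform-common.jar.
# PLUGIN_LIB is an assumption -- point it at the lib dir of your install.
PLUGIN_LIB="${PLUGIN_LIB:-/usr/share/confluent-hub-components/jcustenborder-kafka-connect-transform-common/lib}"

# Fetch guava 31.1-jre from Maven Central if it is not already present.
if ! ls "$PLUGIN_LIB"/guava-*.jar >/dev/null 2>&1; then
  curl -fsSL -o "$PLUGIN_LIB/guava-31.1-jre.jar" \
    "https://repo1.maven.org/maven2/com/google/guava/guava/31.1-jre/guava-31.1-jre.jar"
fi
ls "$PLUGIN_LIB"/guava-*.jar  # should now list the jar; restart Connect afterwards
```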


tzahari commented Dec 19, 2022

I have the same issue, but using the com.github.jcustenborder.kafka.connect.transform.common.ChangeCase$Value class.
I installed it with the confluent-hub tool, but there is no guava-*.jar.

Sreehari2001 commented:
> @moreiravictor @felipemotarocha Try adding guava-31.1-jre to that directory. Let me know if that resolves the issue.

Hi @jcustenborder, it's working when we add this, but a small request: can we add this to the repo again? I see that in 0.1.0.54 we had this jar in the lib folder, but in the latest release it was removed. Any specific reason for removing it?
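For context, re-bundling it would mean a dependency along these lines in the project's pom.xml (a sketch only, using the version suggested in this thread; the scope and packaging details may differ from how 0.1.0.54 actually shipped it):

```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>31.1-jre</version>
</dependency>
```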
