Description
Using the Java language, DuckDB attaches DuckLake with Postgres as the catalog.
I'm using Druid to manage DuckDB connections in Java. Initially, writes recorded metadata in Postgres and generated Parquet files on disk, but after a while the generation stopped. What's the cause?
The technology stack is: Spring Boot + MyBatis + DruidDataSource + DuckDB + DuckLake + Postgres.
DuckDB connections are managed through a Druid connection pool (DruidDataSource).
At the beginning of the project, INSERT INTO statements generated Parquet files, and the corresponding metadata was visible in Postgres.
However, after a few hours, the corresponding metadata no longer appeared in Postgres, yet no errors were reported.
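For context, a minimal Spring Boot configuration for pooling DuckDB through Druid might look like the following. This is a sketch: the property names assume `druid-spring-boot-starter` is on the classpath along with the DuckDB JDBC driver (`org.duckdb:duckdb_jdbc`), and the database path is a placeholder, not the project's actual setup.

```properties
# Assumed: druid-spring-boot-starter + DuckDB JDBC driver on the classpath.
spring.datasource.druid.driver-class-name=org.duckdb.DuckDBDriver
# Placeholder file path; a URL of just jdbc:duckdb: opens an in-memory database
# whose contents are lost when the connection (or process) goes away.
spring.datasource.druid.url=jdbc:duckdb:/data/app.duckdb
spring.datasource.druid.initial-size=1
spring.datasource.druid.max-active=8
```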
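The write path described above corresponds roughly to the following DuckLake setup. Connection parameters, table names, and paths are placeholders, so treat this as a sketch rather than the project's actual configuration:

```sql
-- Load the DuckLake extension (may require INSTALL ducklake; first).
LOAD ducklake;

-- Attach a DuckLake catalog whose metadata lives in Postgres;
-- data files are written as Parquet under DATA_PATH.
ATTACH 'ducklake:postgres:dbname=ducklake host=localhost' AS lake
    (DATA_PATH '/data/lake_files/');
USE lake;

-- Each committed INSERT should produce Parquet file(s) under DATA_PATH
-- and new rows in the DuckLake metadata tables in Postgres.
CREATE TABLE IF NOT EXISTS events (id BIGINT, payload VARCHAR);
INSERT INTO events VALUES (1, 'hello');
```

If an INSERT silently stops producing Parquet files, the statement may be running against a connection where the `lake` attachment is no longer the active catalog.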
I simulated queries from two machines.
Initially, the two machines could share data for queries, because the metadata lookups went through Postgres and the writes produced Parquet files on disk.
However, after a few hours, when one machine wrote data, no Parquet files were generated and the other machine could not read the new data. I suspect the data was being written only to the local machine's in-memory DuckDB database instead of the DuckLake catalog.
What is causing this? Has DuckDB's connection to Postgres been dropped?
How can this be avoided?
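One thing worth ruling out is the pool silently handing back long-idle connections whose Postgres-backed catalog attachment has gone stale. Druid can validate and keep alive idle connections; with `druid-spring-boot-starter` the relevant settings look roughly like this (values are illustrative, and note that `SELECT 1` only validates the DuckDB connection itself, not the attachment to Postgres):

```properties
# Validate idle connections without checking on every borrow.
spring.datasource.druid.test-while-idle=true
spring.datasource.druid.test-on-borrow=false
spring.datasource.druid.validation-query=SELECT 1
# Run the evictor periodically and keep idle connections alive.
spring.datasource.druid.time-between-eviction-runs-millis=60000
spring.datasource.druid.min-evictable-idle-time-millis=300000
spring.datasource.druid.keep-alive=true
```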
Sample code:
`DuckDbContextHolder.java`