c:postgres: support Postgres Arrays in parameter binding #2191
Comments
Oof, we should definitely make that work at least.

I believe that lists can be bound after #2157 (although I don't know if this extends to the dbapi wrapper!).

Aha! Perhaps way ahead of me! Just need to wait for a new release in that case.
Although handling binding of Arrow Arrays to Postgres Arrays might also be nice. Or would the 'proper' way for a client to do this be to convert the Array to a list scalar?

Should have done this a moment ago, but I did just check:

```
>>> import pyarrow
>>> from adbc_driver_postgresql import dbapi
>>> postgres = dbapi.connect("postgresql://localhost:5432/postgres?user=postgres&password=password")
>>> with postgres.cursor() as cur:
...     cur.execute('SELECT $1', (pyarrow.array([1, 2, 3]),))
...     cur.fetch_arrow_table()
...
pyarrow.Table
?column?: list<item: int64>
  child 0, item: int64
----
?column?: [[[1,2,3]]]
```

This should almost sort out my use case and is precisely the expected behavior! 🥳 Perhaps worth a separate issue for this working, though, because I can see an argument for that being incorrect behavior: a RecordBatch or Table is arguably just a sequence of Arrays, so it is maybe odd to have two different behaviors for passing them.
What feature or improvement would you like to see?
The simplest case of this is something like the following, called from Python:
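A minimal sketch of the kind of call meant here, assuming the DB-API wrapper and the same placeholder connection URI used in the session above; the key part is passing a plain Python list as the parameter value:

```python
from adbc_driver_postgresql import dbapi

postgres = dbapi.connect(
    "postgresql://localhost:5432/postgres?user=postgres&password=password"
)
with postgres.cursor() as cur:
    # Bind a Python list and expect it to arrive as a Postgres array
    cur.execute("SELECT $1", ([1, 2, 3],))
    print(cur.fetch_arrow_table())
```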
Currently, this results in an error rather than the list being bound as a Postgres array.
Along the same lines, it would be nice if PyArrow Arrays could be bound similarly, for example:
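The same sketch, under the same assumptions, but passing a pyarrow.Array as a single parameter value; this mirrors what the REPL session earlier in the thread shows working after #2157:

```python
import pyarrow as pa
from adbc_driver_postgresql import dbapi

postgres = dbapi.connect(
    "postgresql://localhost:5432/postgres?user=postgres&password=password"
)
with postgres.cursor() as cur:
    # Bind an Arrow array as one list-valued parameter
    cur.execute("SELECT $1", (pa.array([1, 2, 3]),))
    print(cur.fetch_arrow_table())
```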
My use case for this is to be able to have a parameterized query which looks something like:
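As an illustrative guess at the shape of such a query (the table and column names here are hypothetical), a filter where the set of matching ids is supplied as one array parameter:

```python
from adbc_driver_postgresql import dbapi

postgres = dbapi.connect(
    "postgresql://localhost:5432/postgres?user=postgres&password=password"
)
with postgres.cursor() as cur:
    # Hypothetical table/columns; the array parameter stands in for a
    # hand-built IN (...) list or a string-formatted literal
    cur.execute(
        "SELECT event_id, payload FROM events WHERE event_id = ANY($1)",
        ([101, 102, 103],),
    )
    print(cur.fetch_arrow_table())
```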
Other, simpler but more common query patterns which might use this would be, for example:
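For instance (again an assumption about which patterns are meant, with hypothetical table and column names), a membership test and unnesting an array parameter into rows:

```python
from adbc_driver_postgresql import dbapi

postgres = dbapi.connect(
    "postgresql://localhost:5432/postgres?user=postgres&password=password"
)
with postgres.cursor() as cur:
    # Membership test against a hypothetical column
    cur.execute("SELECT * FROM events WHERE status = ANY($1)", (["open", "stalled"],))
    print(cur.fetch_arrow_table())

    # Expand an array parameter into rows, e.g. to join against it
    cur.execute("SELECT x FROM unnest($1::int4[]) AS t(x)", ([1, 2, 3],))
    print(cur.fetch_arrow_table())
```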