As this is an example / "Getting Started" project, I would suggest adding a security warning: users should exercise caution when granting an LLM access to their database, especially if the app will be public-facing, because of prompt injection risk.

Some of the more obvious destructive commands are blocked here, but enough are missing that in my opinion it warrants a heads-up. Users can also convince the LLM to chain multiple statements into a single query to bypass the `!startsWith("select")` check.

This sort of attack works on the demo app and could leak data you wouldn't want exposed.
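To illustrate the bypass, here is a minimal sketch (the function names and table name are hypothetical, not from the project's code). A prefix check passes any string that merely *begins* with `select`, so a chained statement sails through. A slightly more defensive guard rejects statement chaining and mutating keywords, though real protection should come from a read-only database role rather than string inspection:

```typescript
// Hypothetical reconstruction of the prefix-only guard being discussed.
function naiveGuard(query: string): boolean {
  return query.trim().toLowerCase().startsWith("select");
}

// A chained query: starts with "select", but the second statement is destructive.
const chained = "select 1; drop table users;";
console.log(naiveGuard(chained)); // true — the destructive statement slips through

// A marginally safer (still insufficient) guard: reject multiple statements
// and common mutating keywords. String filtering is a mitigation, not a fix;
// enforce permissions at the database level.
function saferGuard(query: string): boolean {
  const q = query.trim().toLowerCase();
  if (!q.startsWith("select")) return false;
  if (q.includes(";")) return false; // block chained statements
  const banned = ["insert", "update", "delete", "drop", "alter", "truncate", "grant"];
  return !banned.some((kw) => q.includes(kw));
}

console.log(saferGuard(chained)); // false
console.log(saferGuard("select * from products")); // true
```

Even `saferGuard` can be fooled (comments, string literals, dialect quirks), which is why the warning should steer users toward a read-only connection for anything public-facing.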