
ABAP LLM Client

Release Status

BETA - Expect bugs outside of the current test scope; test thoroughly before production use. Breaking changes are possible but will be documented.

Documentation

See the Overview in the docs folder.

Overview

ABAP LLM Client provides a universal, strongly opinionated interface to on-premise and cloud-based LLM APIs, focusing primarily on chat endpoints. The client is designed to integrate seamlessly with various LLM providers while maintaining a consistent interface for ABAP applications.

Currently supported providers:

  • OLLAMA
  • OpenAI
  • OpenRouter
  • Azure OpenAI (only OpenAI Service endpoint, not AI Foundry)
  • Anthropic (structured output is not supported; simulating it via tool calls is under consideration - vote in the corresponding issue)

Features

  • Chat messages
  • Structured output
  • Tool (function) calls
  • Multiple provider integration
  • Customizable

Known Limitations

  • Tested only on 7.52 & ABAP 2022
  • Supports only API-key authentication (further methods may follow based on provider requirements)
  • No advanced parsing to compensate for LLM output errors, especially in structured output and tool calls

Not planned

The following topics are currently not considered but feel free to challenge that in discussions if you have a specific use case.

  • Downport below 7.52, due to the effort required to test it
  • Port to ABAP Cloud - if you use BTP, you might as well consider using Python or other languages with better GenAI library support
  • Non-Chat (e.g. image, audio, video) support

Why ABAP LLM Client?

ABAP might not be the first environment that comes to mind for GenAI projects, but it's the foundation for critical business processes in many organizations. This client exists to bridge this gap: It provides a straightforward way to integrate LLM capabilities into existing ABAP business logic.

Core Ideas

  • Simple Integration: Focus on the most commonly needed LLM features (mainly chat endpoints) rather than trying to support every possible API feature
  • Practical Approach: Built for ABAP developers who need to enhance existing business functions with LLM capabilities, not for ML experiments
  • Stable Integration: As tools and structured output become standard features across LLM providers, integration becomes more reliable and maintainable
  • Real-World Usage: Designed for actual business scenarios where you might want to:
    • Enhance existing ABAP reports with AI-powered insights
    • Add LLM capabilities to business workflows
    • Prototype AI features in existing ABAP applications

No Frills

This is not a full-featured ML framework or a complex AI platform. It's a practical tool for ABAP developers who need to call LLM APIs and process their responses within their ABAP environment. Nothing more, nothing less. If you need to integrate LLM capabilities into your ABAP systems and want to do it with minimal overhead, this client might be useful for you.

Usage

Refer to the llm_client_test repository for examples of how to use the LLM client, and to the docs folder for further details.

⚠️ Security Notice: There are currently no built-in safeguards. Use API keys with appropriate usage limits to prevent accidental charges, e.g. from unexpected endless loops.

Extension

Custom LLM authentication implementations can be created by:

  1. Sub-classing the specific provider class
  2. Redefining the SET_AUTH method

In general, interfaces are available for most parts, so custom implementations can be plugged in easily, preferably via sub-classes.
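A minimal sketch of such an extension is shown below. Only the SET_AUTH method name comes from this README; the provider class name ZCL_LLM_CLIENT_OPENAI, the GET_GATEWAY_TOKEN helper, and the MO_HTTP_CLIENT attribute are illustrative assumptions, not actual identifiers from this repository:

```abap
" Sketch only: all identifiers except SET_AUTH are hypothetical
" placeholders - check the actual provider classes in this package.
CLASS zcl_my_gateway_client DEFINITION
  INHERITING FROM zcl_llm_client_openai  "assumed provider class
  CREATE PUBLIC.
  PROTECTED SECTION.
    METHODS set_auth REDEFINITION.
ENDCLASS.

CLASS zcl_my_gateway_client IMPLEMENTATION.
  METHOD set_auth.
    " Instead of the default API-key header, fetch a token from an
    " API gateway (hypothetical helper) and set it on the request.
    DATA(lv_token) = get_gateway_token( ).
    " mo_http_client is an assumed name for the client's HTTP handle.
    mo_http_client->request->set_header_field(
      name  = 'Authorization'
      value = |Bearer { lv_token }| ).
  ENDMETHOD.
ENDCLASS.
```

The same pattern applies to any provider class: sub-class it, redefine SET_AUTH, and keep the rest of the request handling inherited.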

Roadmap - Planned Features

With the beta release, my focus is on using this library for further AI-based topics (stay tuned), so bug fixes take priority. Open a discussion or an issue if you have a specific requirement. Further features planned mid-term:

  • More providers:
    • Azure AI Foundry
    • AWS Bedrock
    • Google Gemini Developer
    • Google Vertex AI
  • Workaround that forces a required tool call to simulate structured output for providers that lack this feature (e.g. Azure AI Foundry)
  • Better UX for the settings and customizing tasks
  • Better error- and edge-case handling
  • Optional auto-retry for expected error cases (e.g. HTTP 429 Too Many Requests)
  • Optimizations:
    • Code re-use for the clients where useful
    • Consider a common JSON-Schema builder for structured output and tool use
    • Performance Optimizations (low prio)
  • ??? - Open issues/discussions for whatever comes to your mind

Contributing

Feel free to:

  • Open issues for bug reports
  • Use discussions for feature requests
  • Request/Implement additional provider implementations
  • Suggest concept improvements

Security Recommendations

  • Use SM59-supported authentication methods, e.g. an API gateway for credential mapping
  • Set appropriate usage limits on API keys and monitor their usage
  • Implement custom authorization checks to limit who can use the LLM Client
