
feat(proxy): Add proxy configuration support#351

Open
liangzhang-keepmoving wants to merge 2 commits into NevaMind-AI:main from liangzhang-keepmoving:feature/proxy-configuration

Conversation

@liangzhang-keepmoving

Description

Adds proxy configuration support to the memU project, allowing users to route LLM and embedding client traffic through an HTTP proxy. This addresses Telegram connection difficulties in some regions by letting users supply a proxy server.

Changes Made

  1. Added proxy configuration option in LLMConfig model (src/memu/app/settings.py)
  2. Updated service initialization to pass proxy config to LLM clients (src/memu/app/service.py)
  3. Modified HTTPLLMClient to accept and use proxy parameter (src/memu/llm/http_client.py)
  4. Updated OpenAISDKClient to support proxy configuration via httpx (src/memu/llm/openai_sdk.py)
  5. Added proxy support to LazyLLMClient (src/memu/llm/lazyllm_client.py)
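The wiring described in steps 1–5 can be sketched roughly as follows. Note that `LLMConfig` and `httpx_client_kwargs` here are illustrative stand-ins, not the project's actual definitions; the key point is that `proxy` is an optional field that is only forwarded to the HTTP client when set:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMConfig:
    # Illustrative stand-in for the model in src/memu/app/settings.py;
    # `proxy` defaults to None so existing configs keep working unchanged.
    api_key: str
    base_url: str = "https://api.openai.com/v1"
    proxy: Optional[str] = None

def httpx_client_kwargs(cfg: LLMConfig) -> dict:
    """Build keyword arguments for constructing an HTTP client.

    The proxy URL is only included when the config sets one,
    preserving backward compatibility for proxy-less configs.
    """
    kwargs = {"base_url": cfg.base_url}
    if cfg.proxy:
        kwargs["proxy"] = cfg.proxy
    return kwargs

# A config without a proxy produces no proxy argument at all...
assert "proxy" not in httpx_client_kwargs(LLMConfig(api_key="k"))
# ...while a configured proxy URL is passed straight through.
cfg = LLMConfig(api_key="k", proxy="http://your-proxy-server:8080")
assert httpx_client_kwargs(cfg)["proxy"] == "http://your-proxy-server:8080"
```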

Usage Example

```python
from memu.app import MemoryService

# Create service with proxy configuration
service = MemoryService(
    llm_profiles={
        "default": {
            "api_key": "your_api_key",
            "proxy": "http://your-proxy-server:8080",  # Proxy configuration
        }
    },
    # Other configurations...
)
```
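If the proxy requires authentication, credentials can be embedded using the standard proxy-URL form `http://user:password@host:port` (whether they are forwarded depends on the underlying HTTP client). The URL anatomy, with illustrative values:

```python
from urllib.parse import urlsplit

# Standard proxy-URL form with embedded credentials (illustrative values):
proxy = "http://user:secret@your-proxy-server:8080"
parts = urlsplit(proxy)
assert parts.hostname == "your-proxy-server"
assert parts.port == 8080
assert parts.username == "user"
```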

Benefits

  • Solves Telegram connection issues in regions with restricted access
  • Provides flexible proxy settings for all LLM providers
  • Supports all client backends (httpx, SDK, lazyllm)
  • Maintains backward compatibility - proxy is optional
