Implement translate-document tool for translating documents via the DeepL API #12
Conversation
Sorry - I didn't realize that no one had looked at your PR!
So... I wonder if we could support files in a more natural way, via one of two mechanisms:
- files brought into the flow from use of a tool like https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem. In such a case, we'd get the file from their `read_text_file`, `read_media_file`, or `read_multiple_files` methods. Not sure whether this would be a common use case, but for users who'd installed a filesystem tool, we'd enable them to ask their AI client something like "Can you find that file in my Downloads folder that has Greek in it, and translate that into Spanish?"
- This resource, at least, tells us that LLMs have various ways of working with files. It does talk about Claude's way in particular.
Are you up for investigating either of these?
Hi, any news on this feature?
Hey @MarouaneZhani! Glad to hear you're interested in this, because I don't think any of us have been pursuing it further. I don't know whether there's a standard way to access files an AI client has downloaded. Can you say more about your use case, and how precisely you'd like to use this? This could help us understand the best way to implement it.
Thanks @morsssss for your quick answer.
Can you provide more details here? How would you set up this shared volume? Would you then need the MCP to have filesystem access as well? (In this case, it might be easier to compose our server with an existing filesystem server.)
My idea would be that the AI client and the DeepL MCP server share a PVC (a Kubernetes PersistentVolumeClaim), and then the LLM transmits the path to this file ...
In the meantime, @akash-joshi actually implemented a document translation tool. You just give it a local path to your file, and it does the rest. It depends on your client having another tool installed that accesses the filesystem. In my own setup, I do this with Anthropic's own …
Context
So far, only the text translation and text rephrasing functionality of DeepL is exposed via MCP tools.
This change adds a tool for translating documents via the DeepL API. This allows the user/LLM to translate documents such as *.pdf, *.docx, and more (see the "Document Formats" resource below).
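
For context, the sketch below shows roughly how document translation works with the official DeepL Node SDK (`deepl-node`), which this tool builds on. It is only an illustration, not the code from this PR; the file paths, language code, and environment-variable handling are placeholders.

```typescript
import * as deepl from "deepl-node";

// Illustrative sketch only (not the code from this PR): document translation
// with the official DeepL Node SDK. Paths and language code are placeholders.
async function translateDocumentExample(): Promise<void> {
  const translator = new deepl.Translator(process.env.DEEPL_API_KEY ?? "");

  // Uploads the document, waits for the translation to finish, then downloads the result.
  await translator.translateDocument(
    "/path/to/input.docx",  // input document (e.g. *.pdf, *.docx)
    "/path/to/output.docx", // where the translated document is written
    null,                   // source language (null lets DeepL auto-detect)
    "es"                    // target language
  );
}

translateDocumentExample().catch(console.error);
```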
Example usage
Request:
Response:
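
The original request/response content isn't reproduced above, so here is a hypothetical sketch of what calling the new tool could look like from an MCP client using the TypeScript SDK. The tool name comes from this PR's title, but the argument names (`filePath`, `targetLang`), the server launch command, and the shape of the response are assumptions rather than the exact schema implemented here.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical sketch: invoking the translate-document tool from an MCP client.
// The launch command and argument names below are assumptions, not the exact
// schema implemented in this PR.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "deepl-mcp-server"],
  env: { DEEPL_API_KEY: process.env.DEEPL_API_KEY ?? "" },
});

const client = new Client({ name: "example-client", version: "0.1.0" });
await client.connect(transport);

// Request: ask the server to translate a local document into Spanish.
const result = await client.callTool({
  name: "translate-document",
  arguments: {
    filePath: "/Users/me/Downloads/report.docx", // assumed parameter name
    targetLang: "es",                            // assumed parameter name
  },
});

// Response: the tool is expected to report where the translated file was written.
console.log(result.content);
```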
Limitations
Currently, Claude cannot transmit files via the MCP protocol that have been dragged and dropped into Claude Desktop. Hence, this implementation depends on the user specifying the file's path rather than dragging and dropping the file into Claude Desktop.
Resources