
Conversation


pacwoodson commented Mar 12, 2024

Very basic modifications to replace Ollama with the AI-Mask extension.

This is a WIP, just to validate it works.

The main issue: I had to move the code that was in the web worker back to the main app thread, since workers are isolated and can't communicate with extensions (the `chrome` object is undefined there).
The embedding phase now blocks the UI, but that's temporary (a rough sketch of a worker-to-main bridge is below the TODO list).

TODO:

  • Make it work
  • Move embedding calculations from main thread to AI-Mask
  • Create proper langchain libs for AI-Mask
  • Move Voy back into a worker?
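
For reference, here is a rough sketch (not part of this PR) of how a postMessage bridge could keep the embedding requests inside a worker while the actual extension call runs on the main thread; `embedWithAIMask` is just a stand-in for whatever the real AI-Mask client exposes.

```ts
// Rough sketch only. The worker cannot see the `chrome` object, so it forwards
// embedding requests to the main thread and waits for the reply.
// `embedWithAIMask` is a hypothetical placeholder, not the actual AI-Mask API.

// ---- main.ts (main thread, where `chrome` / the extension is reachable) ----
declare function embedWithAIMask(text: string): Promise<number[]>; // hypothetical

const worker = new Worker(new URL("./worker.ts", import.meta.url), { type: "module" });

worker.onmessage = async (e: MessageEvent<{ id: number; text: string }>) => {
  const { id, text } = e.data;
  const embedding = await embedWithAIMask(text);
  worker.postMessage({ id, embedding });
};

// ---- worker.ts (web worker, where Voy / the embedding pipeline could stay) ----
const pending = new Map<number, (v: number[]) => void>();
let nextId = 0;

// Resolve the matching pending request when the main thread answers.
self.onmessage = (e: MessageEvent<{ id: number; embedding: number[] }>) => {
  pending.get(e.data.id)?.(e.data.embedding);
  pending.delete(e.data.id);
};

// Used by in-worker code instead of calling the extension directly.
function requestEmbedding(text: string): Promise<number[]> {
  return new Promise((resolve) => {
    const id = nextId++;
    pending.set(id, resolve);
    self.postMessage({ id, text });
  });
}
```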


vercel bot commented Mar 12, 2024

Someone is attempting to deploy a commit to a Personal Account owned by @jacoblee93 on Vercel.

@jacoblee93 first needs to authorize it.

pacwoodson mentioned this pull request Mar 12, 2024
jacoblee93 (Owner) commented

Will check this out this weekend! Thank you for the PR :)

jacoblee93 (Owner) commented Mar 16, 2024

This looks really cool but yeah I think moving out of the worker/blocking on things like embeddings wouldn't be worth it yet - I'm looking into raw WebLLM support first and then will see about a protocol for a Chrome extension. I agree it's a much better form-factor.

CC @jasonmayes who had a similar idea!

@@ -0,0 +1,99 @@
import {
jacoblee93 (Owner) commented on the diff:

Would love to add this directly to LangChain.js - will open a PR to this repo first with streaming support and see how that goes.

pacwoodson mentioned this pull request Mar 22, 2024
pacwoodson (Author) commented

Closing in favour of #19

pacwoodson closed this Mar 22, 2024