This guide will help you set up and run the LangGraph PM Maestro project locally.
```sh
git clone https://github.com/PrasanKumar93/langgraph-pm-maestro.git
cd langgraph-pm-maestro
npm install
```
- Copy the example environment file to create your own configuration:

  ```sh
  cp .env.example .env
  ```
- Open the `.env` file and provide the necessary credentials.

  Mandatory fields:

  - `DEFAULT_LLM_PROVIDER`: `openai` or `aws_bedrock`
  - If `openai` is selected as `DEFAULT_LLM_PROVIDER`, the following field is required:
    - `OPENAI_API_KEY`: API key for the OpenAI LLM provider
  - If `aws_bedrock` is selected as `DEFAULT_LLM_PROVIDER`, the following fields are required:
    - `AWS_BEDROCK_MODEL_NAME`: Model name for the AWS Bedrock LLM provider
    - `AWS_REGION`: AWS region
    - `AWS_ACCESS_KEY_ID`: AWS access key ID
    - `AWS_SECRET_ACCESS_KEY`: AWS secret access key
    - `AWS_SESSION_TOKEN`: AWS session token
  - `REDIS_URL`: Redis connection URL used for checkpointers, the vector DB, the LLM cache, etc.
  - `TAVILY_SEARCH_API_KEY`: Used to search the latest web content for competitor analysis of the requested feature
  Optional field groups:

  - Salesforce configuration: to search the requested feature in Salesforce
  - Jira configuration: to search the requested feature in Jira
  - Slack configuration: to access the agent via Slack
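For reference, a minimal `.env` for the OpenAI provider might look like the sketch below. All values are placeholders, and the authoritative variable list is the project's own `.env.example`:

```shell
# Hypothetical example values -- replace each one with your own credentials.
DEFAULT_LLM_PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here
REDIS_URL=redis://localhost:6379
TAVILY_SEARCH_API_KEY=tvly-your-key-here
```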
Run the development server:

```sh
npm run dev
```

- Launches the agent locally and opens the Studio interface in your browser at:
  `https://smith.langchain.com/studio?baseUrl=http://localhost:2024`
  (test in a Chrome browser)
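The Studio URL above is simply the hosted LangSmith Studio pointed at your local server through the `baseUrl` query parameter (port 2024 being the dev server's default here). As an illustration, the URL can be assembled like this:

```python
from urllib.parse import urlencode

# The local LangGraph dev server started by `npm run dev` above.
base_url = "http://localhost:2024"

# Hosted Studio UI, told where to find the local server via `baseUrl`.
# safe=":/" keeps the scheme and slashes readable instead of percent-encoded.
studio_url = "https://smith.langchain.com/studio?" + urlencode(
    {"baseUrl": base_url}, safe=":/"
)
print(studio_url)
# https://smith.langchain.com/studio?baseUrl=http://localhost:2024
```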
To interact with the agent through Slack:

```sh
npm run start-slack-bot
```
Once running, you can communicate with the bot in your registered Slack channel. The agent will respond to your messages automatically.