STOCK ANALYST
- Create and activate conda environment:
conda create -n stock-analyst python=3.11 -y
conda activate stock-analyst
- Install dependencies:
pip install -r requirements.txt
Or install individually:
pip install pyyaml python-slugify newspaper3k google-search-results lxml_html_clean
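A quick way to confirm the installs succeeded is to import each package; this snippet is not part of the project, and the import names below are the standard ones for these distributions:
```python
# Sanity check: pip package names differ from import names for several deps.
import yaml                         # installed as pyyaml
from slugify import slugify         # installed as python-slugify
from newspaper import Article       # installed as newspaper3k
from serpapi import GoogleSearch    # installed as google-search-results
import lxml_html_clean              # installed as lxml_html_clean

print("All scraping and parsing dependencies imported successfully.")
```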
- Set API keys:
export SERPAPI_API_KEY='your_api_key_here'
export OPENAI_API_KEY='your_openai_key_here'
export ANTHROPIC_API_KEY='your_anthropic_key_here'  # Optional: for Claude models
On Windows:
set SERPAPI_API_KEY=your_api_key_here
set OPENAI_API_KEY=your_openai_key_here
set ANTHROPIC_API_KEY=your_anthropic_key_here
- Install or update vynn-core:
pip install --upgrade --force-reinstall git+https://github.com/Agentic-Analyst/vynn-core.git
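Before running any of the commands below, it can help to confirm the keys are actually visible to Python. A minimal check (illustrative only; not a script shipped with the project):
```python
import os
import sys

# SERPAPI and OpenAI keys are required; the Anthropic key only matters for Claude models.
required = ["SERPAPI_API_KEY", "OPENAI_API_KEY"]
optional = ["ANTHROPIC_API_KEY"]

missing = [name for name in required if not os.environ.get(name)]
if missing:
    sys.exit("Missing required API keys: " + ", ".join(missing))

for name in optional:
    if not os.environ.get(name):
        print(f"Note: {name} is not set, so Claude models will be unavailable.")

print("API keys look good.")
```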
- Deploy to server:
docker buildx build --platform linux/amd64,linux/arm64 \
-t fuzanwenn/stock-analyst:latest --push .
docker pull fuzanwenn/stock-analyst:latest
- Scrape articles:
python src/article_scraper.py --company "NVIDIA" --ticker NVDA --max 15
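article_scraper.py handles query generation and scraping itself. Purely as an illustration of the underlying libraries (SerpAPI's Google News engine plus newspaper3k parsing; the result fields used here are assumptions, not the project's code), a minimal fetch looks like:
```python
import os
from serpapi import GoogleSearch    # google-search-results package
from newspaper import Article       # newspaper3k

# Illustrative only: search Google News for a company, then parse a few hits.
params = {
    "engine": "google_news",
    "q": "NVIDIA NVDA stock news",
    "api_key": os.environ["SERPAPI_API_KEY"],
}
results = GoogleSearch(params).get_dict()

for item in results.get("news_results", [])[:5]:
    article = Article(item["link"])
    article.download()
    article.parse()
    print(article.title)
    print("   ", item["link"])
```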
- Filter articles:
python src/filter.py --ticker NVDA --min-score 5.0 --max-articles 8 --save-filtered --output-report
- Screen and analyze:
python src/screener.py --ticker NVDA --min-confidence 0.7 --output-report --detailed-analysis
- Scrape financial data:
python src/financial_scraper.py --ticker NVDA --statements modeling --save
- Generate financial models:
python src/financial_model_generator.py --ticker NVDA --model comprehensive --save-excel
- Complete pipeline with LLM selection:
# Default model
python main.py --ticker NVDA --company "NVIDIA" --email user@example.com --timestamp 20241003_120000
# Use specific model
python main.py --ticker NVDA --company "NVIDIA" --email user@example.com --timestamp 20241003_120000 --llm claude-3.5-sonnet
# List available models
python main.py --list-llms
The Supervisor Agent provides intelligent, LLM-powered workflow orchestration that dynamically routes between analysis agents based on the current state. This is ideal for exploratory analysis where the system adapts to available data (a rough routing sketch follows the feature list below).
Key Features:
- 🧠 LLM-powered dynamic routing (non-sequential, intelligent decisions)
- ✅ Prerequisite validation (prevents invalid routing)
- 🔄 Deterministic fallback (when LLM fails)
- 📊 Complete observability (routing decisions, agent execution, results)
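The routing loop can be pictured roughly as follows. The agent names, prerequisite table, and fallback policy in this sketch are illustrative assumptions, not the supervisor's actual implementation:
```python
from typing import Callable

# Hypothetical agents: each takes the shared state and returns an updated copy.
AGENTS: dict[str, Callable[[dict], dict]] = {
    "scrape_news": lambda state: {**state, "articles": ["..."]},
    "filter_news": lambda state: {**state, "filtered": ["..."]},
    "build_model": lambda state: {**state, "model": "dcf"},
}

# Prerequisite validation: the state keys an agent needs before it may run.
PREREQS = {
    "scrape_news": [],
    "filter_news": ["articles"],
    "build_model": ["filtered"],
}

def llm_route(state: dict) -> str | None:
    """Stand-in for the LLM routing call; here it always fails to show the fallback."""
    raise RuntimeError("LLM unavailable in this sketch")

def fallback_route(state: dict) -> str | None:
    """Deterministic fallback: first agent not yet run whose prerequisites are met."""
    for name, needs in PREREQS.items():
        if name not in state["done"] and all(key in state for key in needs):
            return name
    return None

state: dict = {"done": []}
while True:
    try:
        choice = llm_route(state)
    except Exception:
        choice = fallback_route(state)
    # Reject missing or invalid routing decisions; log each one for observability.
    if choice is None or any(key not in state for key in PREREQS[choice]):
        break
    print(f"Routing to {choice}")
    state = AGENTS[choice](state)
    state["done"].append(choice)
print(f"Finished; agents run: {state['done']}")
```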
Quick Start:
# Basic supervisor workflow
python supervisor_main.py --ticker NVDA --company "NVIDIA" --email user@example.com
# Use Claude Sonnet for routing
python supervisor_main.py --ticker AAPL --company "Apple" --email user@example.com --llm claude-3.5-sonnet
# Run in Docker
./run_supervisor_docker.sh NVDA "NVIDIA" user@example.com
See README_SUPERVISOR.md for complete documentation.
The traditional main.py pipeline runs a fixed 7-step workflow in sequence. This is ideal for production scenarios where you want predictable, repeatable analysis.
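Conceptually, the fixed pipeline is just an ordered list of steps executed once each. The step names in this sketch are placeholders rather than main.py's exact seven:
```python
# Illustrative fixed pipeline: each step runs exactly once, in order.
# Step names are placeholders, not main.py's actual seven steps.
STEPS = [
    "scrape_articles",
    "filter_articles",
    "screen_investments",
    "scrape_financials",
    "generate_model",
    "write_report",
    "send_email",
]

def run_step(name: str, context: dict) -> dict:
    print(f"[pipeline] running {name}")
    context[name] = "done"   # stand-in for the real step's output
    return context

context: dict = {"ticker": "NVDA", "company": "NVIDIA"}
for step in STEPS:
    context = run_step(step, context)
```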
Quick Start:
# Complete pipeline with LLM selection
python main.py --ticker NVDA --company "NVIDIA" --email user@example.com --timestamp 20241003_120000
# Use specific model
python main.py --ticker NVDA --company "NVIDIA" --email user@example.com --llm claude-3.5-sonnet
# List available models
python main.py --list-llms
Both workflows support multiple LLM providers (a model-registry sketch follows the list below):
Available Models:
- gpt-4o-mini (OpenAI) - Fast and cost-effective
- claude-3.5-sonnet (Anthropic) - Balanced quality/speed
- claude-3.5-haiku (Anthropic) - Fastest and cheapest
- claude-3-opus (Anthropic) - Most capable
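A --llm flag like this typically resolves through a small registry that maps the short names to providers. The sketch below is an assumption about how such a mapping could look; the provider-side model IDs are guesses, not taken from the repo:
```python
# Illustrative registry for the --llm flag. The short names mirror the list
# above; the provider-side model IDs are assumptions, not taken from the repo.
LLM_REGISTRY = {
    "gpt-4o-mini":       {"provider": "openai",    "model_id": "gpt-4o-mini"},
    "claude-3.5-sonnet": {"provider": "anthropic", "model_id": "claude-3-5-sonnet-latest"},
    "claude-3.5-haiku":  {"provider": "anthropic", "model_id": "claude-3-5-haiku-latest"},
    "claude-3-opus":     {"provider": "anthropic", "model_id": "claude-3-opus-latest"},
}

def resolve_llm(name: str) -> dict:
    try:
        return LLM_REGISTRY[name]
    except KeyError:
        available = ", ".join(sorted(LLM_REGISTRY))
        raise ValueError(f"Unknown model '{name}'. Available: {available}") from None

print(resolve_llm("claude-3.5-sonnet"))
```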
Balanced Weighting: Clear distribution of score components (a scoring sketch follows the score bands below):
- Ticker mentions: up to 30%
- Relevance keywords: up to 40%
- Quality indicators: up to 20%
- Penalties: up to 30% deduction
Score Distribution:
- 0-3: Low relevance articles
- 3-5: Medium relevance articles
- 5-7: High relevance articles
- 7-10: Extremely relevant articles
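Read together, the weights and bands amount to a simple additive 0-10 score. The sketch below illustrates the idea with made-up keyword lists and arithmetic; it is not filter.py's actual scoring:
```python
# Illustrative relevance score: ticker mentions (<=30%), relevance keywords
# (<=40%), quality indicators (<=20%), penalties (<=30% deduction), on a 0-10 scale.
def relevance_score(text: str, ticker: str) -> float:
    text_lower = text.lower()

    ticker_part = min(text_lower.count(ticker.lower()) * 0.5, 3.0)              # up to 30%
    keywords = ["earnings", "revenue", "guidance", "valuation", "growth"]
    keyword_part = min(sum(text_lower.count(k) for k in keywords) * 0.4, 4.0)   # up to 40%
    quality = ["analyst", "sec filing", "quarterly report"]
    quality_part = min(sum(k in text_lower for k in quality) * 1.0, 2.0)        # up to 20%
    spam = ["sponsored", "press release", "promotion"]
    penalty = min(sum(k in text_lower for k in spam) * 1.5, 3.0)                # up to 30% off

    return max(0.0, min(10.0, ticker_part + keyword_part + quality_part - penalty))

def band(score: float) -> str:
    if score < 3:
        return "low relevance"
    if score < 5:
        return "medium relevance"
    if score < 7:
        return "high relevance"
    return "extremely relevant"

sample = "NVDA earnings beat: revenue growth and raised guidance, analysts say."
score = relevance_score(sample, "NVDA")
print(round(score, 1), band(score))
```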
- Multi-LLM Support: Choose between OpenAI GPT-4o-mini and Anthropic Claude models
- Autonomous News Collection: Comprehensive multi-aspect news scraping with AI-powered query generation
- Smart Filtering: Advanced relevance scoring based on content analysis
- Investment Screening: Extracts growth catalysts, risks, and mitigation strategies
- Financial Data Collection: Comprehensive financial statements and market data scraping
- LLM-Powered Financial Modeling: Generates professional DCF models and valuation analysis (a minimal DCF illustration follows this feature list)
- Professional Analyst Reports: AI-generated comprehensive financial reports
- Excel/CSV Export: Professional financial models in Excel format ready for analysis
- Structured Reports: Generates comprehensive markdown reports and JSON data
- Complete Pipeline: End-to-end workflow from news analysis to financial modeling
- Configurable: Adjustable confidence thresholds, LLM selection, and output options
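As a rough illustration of what a DCF-style valuation captures, here is a minimal, self-contained discounted cash flow calculation with invented inputs; it is not the model the generator produces:
```python
# Minimal DCF illustration with invented inputs; not the generator's output.
def dcf_value(fcf: float, growth: float, discount: float,
              terminal_growth: float, years: int = 5) -> float:
    """Present value of projected free cash flows plus a Gordon-growth terminal value."""
    value = 0.0
    cash_flow = fcf
    for year in range(1, years + 1):
        cash_flow *= (1 + growth)
        value += cash_flow / (1 + discount) ** year
    terminal = cash_flow * (1 + terminal_growth) / (discount - terminal_growth)
    value += terminal / (1 + discount) ** years
    return value

# Placeholder inputs: $10B starting FCF, 12% growth, 10% discount rate, 3% terminal growth.
print(f"Illustrative enterprise value: ${dcf_value(10e9, 0.12, 0.10, 0.03):,.0f}")
```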