DeepSeek is an AI-powered code companion designed to assist with coding, debugging, and providing solutions to programming problems. It uses the Ollama API and LangChain framework to generate helpful responses based on user input.
- Python Expertise: Provides solutions, code snippets, and explanations for Python-related queries.
- Debugging Assistance: Helps identify and debug issues in your code with useful print statements and suggestions.
- Code Documentation: Automatically generates explanations and comments for code.
- Solution Design: Offers high-level design and architectural suggestions for software development.
Before using this app, make sure you have the following:
- Python 3.7+ installed on your system.
- Streamlit installed (`pip install streamlit`).
- LangChain installed (`pip install langchain`).
- Ollama's API endpoint running locally on your machine. (You need to install Ollama and run its server on `http://localhost:11434`.)
To get started, install the necessary Python libraries:
pip install streamlit langchain langchain_ollama
Make sure you have Ollama installed and running locally. Visit Ollama's website for detailed instructions on installation and configuration.
Clone the repository (or create a new file) containing your code.
git clone https://github.com/TanushreeSB/deepseek.git
cd deepseek
To start the Streamlit app, run:
streamlit run app.py
This will launch the app in your browser at `http://localhost:8501`.
- User Interaction: The user types a programming-related question in the input box.
- AI Response Generation: The system prompts the Ollama model for an appropriate response, which can be a code solution, debug suggestion, or documentation.
- Chat Interface: All interactions are logged and displayed in a chat-like interface, allowing users to have ongoing conversations with the AI assistant.
- Model Selection: Choose between two pre-configured models: `deepseek-r1:1.5b` or `deepseek-r1:3b`. These models differ in capability and performance.
- Customization: Modify the system prompt to change how the assistant responds (concise, debug-friendly, etc.).
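The selected model and system prompt together determine what is sent to Ollama on each turn. The sketch below illustrates that assembly using only the standard library; the helper name `build_messages` and the example inputs are illustrative, not the app's actual API.

```python
# Hypothetical sketch (stdlib only): how the chosen model and system
# prompt could be combined with chat history into the message list
# passed to the Ollama chat endpoint.
AVAILABLE_MODELS = ["deepseek-r1:1.5b", "deepseek-r1:3b"]  # sidebar options

def build_messages(system_prompt: str, history: list, user_input: str) -> list:
    """Prepend the system prompt, replay prior turns, then add the new one."""
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(history)
    messages.append({"role": "user", "content": user_input})
    return messages

messages = build_messages(
    "You are a concise Python coding assistant.",
    history=[],
    user_input="Why does my loop never terminate?",
)
```

Changing the system prompt (e.g. to a more debug-oriented instruction) only alters the first entry of this list, which is why the assistant's tone can be reconfigured without touching the rest of the pipeline.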
- Main UI: The main user interface displays a chat interface for interacting with the AI assistant. It uses Streamlit's layout capabilities for the chat experience.
- Sidebar: Contains options to configure the model and other settings.
- System Message: The `system_prompt` defines the AI's behavior, guiding it to act as a concise coding assistant.
- Message Log: All user inputs and AI responses are stored in `st.session_state.message_log`, ensuring the chat history is maintained across interactions.
- Chat Interface: A custom chat interface is created using Streamlit components to display both user and AI messages.
- Response Generation: Upon receiving user input, the app constructs a prompt chain using LangChain and sends it to Ollama for processing.
- Reset: A button is provided to clear the chat history and restart the interaction.
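The message-log lifecycle described above can be sketched without Streamlit: a plain dict stands in for `st.session_state`, and the helper names (`init_log`, `append_turn`, `reset_chat`) and greeting text are illustrative assumptions, not the app's actual functions.

```python
# Minimal sketch of the chat-log lifecycle, with a plain dict standing
# in for st.session_state (hypothetical helper names).
session_state = {}

def init_log(state: dict) -> None:
    # Mirrors seeding st.session_state.message_log on the app's first run.
    if "message_log" not in state:
        state["message_log"] = [
            {"role": "ai", "content": "Hi! How can I help you code today?"}
        ]

def append_turn(state: dict, user_input: str, ai_reply: str) -> None:
    # Each exchange appends a user message and the AI's response.
    state["message_log"].append({"role": "user", "content": user_input})
    state["message_log"].append({"role": "ai", "content": ai_reply})

def reset_chat(state: dict) -> None:
    # What the Reset button does: drop the history and re-seed the greeting.
    state.pop("message_log", None)
    init_log(state)

init_log(session_state)
append_turn(session_state, "Explain list slicing.", "Slicing returns a new list...")
reset_chat(session_state)  # history is cleared, only the greeting remains
```

Because Streamlit reruns the script on every interaction, keeping the log in session state (rather than a module-level variable) is what makes the history survive across reruns.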