What's New in ReactorAI
Gemini Models Support - Latest Update!
- Google Gemini Integration: Access powerful Gemini Pro and Gemini Flash models
- Cloud-Based AI: No local setup required - just add your API key
- Seamless Switching: Toggle between Ollama (local) and Gemini (cloud) models
- Easy Configuration: Simple API key setup in settings
- Enhanced Performance: Choose the best model for your specific needs
Update now to experience the power of Google's latest AI models!
Model Context Protocol (MCP)
ReactorAI fully supports the Model Context Protocol (MCP), an open standard that enables AI models to securely interact with local and remote resources. This turns your AI from a text generator into a capable agent that can work with your data and tools.
Capabilities
With MCP, ReactorAI can:
- Access Local Files: Read, write, and analyze files on your computer.
- Connect to Databases: Query SQL/NoSQL databases directly.
- Use External Tools: Execute scripts, call APIs, and automate workflows.
- Browse the Web: (Via compatible MCP servers) Search and retrieve live information.
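Under the hood, MCP exchanges JSON-RPC 2.0 messages between the client and each server. As a rough sketch (not ReactorAI's actual internals), here is what a `tools/call` request for a hypothetical filesystem `read_file` tool looks like when serialized:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 'tools/call' request, the message shape
    MCP clients send to invoke a tool on a connected server."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical call to a filesystem server's read_file tool;
# the tool name and argument keys depend on the server you connect.
req = make_tool_call(1, "read_file", {"path": "/tmp/notes.txt"})
wire = json.dumps(req)
```

The server replies with a matching JSON-RPC response containing the tool's result, which the client feeds back to the model.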
Built-in File System MCP
ReactorAI comes with a pre-installed File System MCP server, ready to use out of the box. No configuration required.
Read & Analyze
Ask ReactorAI to "Analyze the code in my project folder" or "Summarize this PDF". It can traverse directories and read file contents.
Create & Edit
Generate new files, refactor code, or write documentation directly to your disk. "Create a new Python script for..."
Adding Custom MCPs
You can extend ReactorAI with any MCP server, just like you do with Claude Desktop. The configuration is compatible and simple.
Prerequisites: Node.js & uv
Most MCP servers require a runtime environment. Ensure you have these installed:
- Node.js: Required for npx-based servers. Download it from the official Node.js website.
- uv: Required for Python-based (uv) servers. See installation below.
Install uv (macOS/Linux):
curl -LsSf https://astral.sh/uv/install.sh | sh
Install uv (Windows):
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
Install uv (via pip):
pip install uv
How to Add a Custom MCP:
- Click the MCP icon in the chat area to open the menu.
- Click the + icon next to MCP Tools.
- Enter the configuration JSON (same format as Claude).
Example Configuration (PostgreSQL):
{
  "postgres": {
    "command": "docker",
    "args": [
      "run",
      "-i",
      "--rm",
      "mcp/postgres",
      "postgresql://user:password@localhost:5432/db"
    ]
  }
}
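Before saving a configuration, it can help to sanity-check the JSON you paste in. The sketch below (an illustration, not ReactorAI's actual validation) checks the two fields every Claude-style server entry needs, a `command` string and a list of string `args`:

```python
import json

def validate_server_entry(name, entry):
    """Minimal sanity check for one MCP server entry."""
    if not isinstance(entry.get("command"), str) or not entry["command"]:
        raise ValueError(f"{name}: 'command' must be a non-empty string")
    args = entry.get("args", [])
    if not all(isinstance(a, str) for a in args):
        raise ValueError(f"{name}: every item in 'args' must be a string")
    return True

# Parse and check the PostgreSQL example from above.
config = json.loads("""
{
  "postgres": {
    "command": "docker",
    "args": ["run", "-i", "--rm", "mcp/postgres",
             "postgresql://user:password@localhost:5432/db"]
  }
}
""")
for name, entry in config.items():
    validate_server_entry(name, entry)
```

A common mistake is quoting the whole `args` array as one string; a check like this catches that before the server fails to launch.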
Quick Access from Chat
Manage your MCP servers instantly without leaving your conversation. Click the MCP icon in the chat input area to open the tools menu.
Instant Control
Toggle servers on/off with a single click. Useful for quickly enabling specific tools for a task.
At-a-Glance Status
See which servers are active and how many tools each provides (e.g., "filesystem: 14 tools").
Pro Tip: Shared Ecosystem
Since ReactorAI uses the standard MCP protocol, you can use the vast ecosystem of existing MCP servers created for Claude and other tools!
All Features
- Local & Remote Connectivity: Connect to AI models hosted locally (Ollama) or remotely (a remote Ollama server or Gemini).
- Model Selection: Choose from multiple Ollama models or Gemini models using a dropdown menu.
- Session Management: View, rename, search, and manage your chat history.
- Session Prompts: Customize system messages to guide the model’s behavior.
- Backup & Restore: Export and import chats for safekeeping.
- Security: All data stays on your device unless using a remote server.
- Multilingual Support: Use the app in multiple languages, including Arabic.
Core Functionality
- AI-Powered Chat/Conversation: Interact with AI models through a chat interface.
- AI-Powered Image Description/Captioning: Generate descriptions and captions for images using AI.
- Code Generation/Display: Generate and display code snippets within the chat.
Effective Prompting Tips
Getting the best results from AI models requires effective prompting. Here are quick tips to improve your interactions:
Quick Tips:
- Be Specific: "Write a Python function to sort a list" is better than "help with Python"
- Set Context: Explain what you're working on and your experience level
- Use Session Prompts: Set consistent behavior with ReactorAI's session prompt feature
- Iterate: Refine your prompts based on the responses you get
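The tips above boil down to how the final request is assembled: a session (system) prompt sets persistent behavior, and each user turn adds the specific, contextual ask. A minimal sketch of that message structure (the generic chat format, not ReactorAI's internal code):

```python
def build_messages(session_prompt, history, user_prompt):
    """Combine a session (system) prompt, prior turns, and the new
    request into the role/content message list chat models expect."""
    messages = [{"role": "system", "content": session_prompt}]
    messages.extend(history)  # earlier user/assistant turns, if any
    messages.append({"role": "user", "content": user_prompt})
    return messages

# A specific, contextual prompt beats "help with Python":
msgs = build_messages(
    "You are a concise Python tutor for an intermediate programmer.",
    [],
    "Write a Python function to sort a list of tuples by the second item.",
)
```

Iterating on a prompt then just means adjusting the final user message while the session prompt keeps the model's behavior consistent.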
Want to Master AI Prompting?
Check out our comprehensive prompting guide with advanced techniques, model-specific tips, and real examples.
Read Full Prompting Guide
Screens
Home Screen
Features Overview:
- Top Bar:
- ReactorAI Logo: Application branding.
- Search Bar: Search messages/sessions.
- Settings Icon: Access app settings.
- Help Icon: Access documentation/help.
- Left Sidebar:
- Send Dropdown: Select how many recent messages are sent to the model as context.
- Model Selection: Choose the AI model.
- Create New Session: Create and name new chat sessions.
- Session History: Manage and restore previous sessions.
- Chat Interface:
- Image Input/Display: Input and display images.
- Request Input: Input prompts.
- Send Button: Send messages.
- Message Display: Displays user and AI messages.
- AI Model Responses: Shows model name.
- Copy Button: Copy AI responses.
- Delete Button: Delete any message.
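The Send dropdown's behavior, keeping only the most recent messages as context, can be sketched as a simple slice (an illustration of the idea, not the app's actual code):

```python
def last_n_for_context(history, n):
    """Return the n most recent messages to send as context.
    A non-positive n means 'send the full history'."""
    if n <= 0:
        return list(history)
    return history[-n:]

# With a 10-message history and the dropdown set to 3,
# only the last 3 messages accompany the new prompt.
context = last_n_for_context(list(range(10)), 3)
```

Trimming context this way keeps prompts within the model's window and speeds up responses, at the cost of the model forgetting older turns.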
Screenshots
Home Screen
Chat Interface
Settings
Model Selector
Using the App
- Install and launch the app on your platform
- In settings, enter your local or remote Ollama URL, or configure Gemini API access
- Select a model and start a new chat session
- Use the session prompt input to customize behavior
- Manage chats via the sidebar: search, rename, delete
- Utilize image analysis by sending images to the chat
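For reference, a local Ollama server listens on port 11434 by default and accepts chat requests at `POST /api/chat`. The sketch below builds such a request body; the model name `llama3` is just an example, substitute whichever model you have pulled:

```python
import json

# Default endpoint for a local Ollama install (adjust for remote servers).
OLLAMA_URL = "http://localhost:11434"

def chat_payload(model, messages, stream=False):
    """Build the JSON body for Ollama's POST /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": stream}

payload = chat_payload(
    "llama3",  # example model name; use one you have pulled
    [{"role": "user", "content": "Summarize MCP in one sentence."}],
)
body = json.dumps(payload)
```

This is the same request ReactorAI issues on your behalf once the Ollama URL is set in settings, so it is a quick way to verify your server is reachable.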
Need Help with Ollama Setup?
Check our detailed guide for installing Ollama and downloading models via command line.
View Ollama Installation Guide