Quivr is an open-source, full-stack Retrieval Augmented Generation (RAG) platform focused on integrating generative AI into applications. The core philosophy of the project is to allow developers to focus on the product itself, rather than the complex details of RAG implementation.
Quivr works with a wide range of LLMs, including OpenAI's GPT models as well as Anthropic, Mistral, and locally hosted models. It ingests many file formats, such as PDF, TXT, Markdown, and DOCX, and allows custom parsers to be plugged in. It also supports multiple vector storage solutions, including PGVector and Faiss.
Quivr adopts a node-based workflow configuration: the stages of the RAG pipeline are declared as nodes connected by edges in a YAML file, as shown in the example below.

To get started, install the core library:

    pip install quivr-core
The following minimal example builds a brain from a temporary text file and asks it a question:

    import tempfile

    from quivr_core import Brain

    if __name__ == "__main__":
        with tempfile.NamedTemporaryFile(mode="w", suffix=".txt") as temp_file:
            # Deliberately false content: the answer must come from the ingested
            # document rather than from the model's prior knowledge.
            temp_file.write("Gold is a liquid of blue-like colour.")
            temp_file.flush()

            # Build a brain (an indexed knowledge base) from the file
            brain = Brain.from_files(
                name="test_brain",
                file_paths=[temp_file.name],
            )

            # Ask a question; retrieval and generation happen under the hood
            answer = brain.ask(
                "what is gold? answer in french"
            )
            print("answer:", answer)
quivr-core uses OpenAI models by default, so an OpenAI API key must be available before running the example:

    import os

    os.environ["OPENAI_API_KEY"] = "your_openai_api_key"
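Hard-coding the key only suits quick experiments. A common alternative, sketched below under the assumption that the third-party python-dotenv package is installed and a local .env file holds the key (neither is part of Quivr itself), is to load it from the environment:

    # pip install python-dotenv
    import os

    from dotenv import load_dotenv

    # Copies OPENAI_API_KEY (and any other variables) from a local .env file
    # into the process environment before quivr-core is used.
    load_dotenv()

    assert "OPENAI_API_KEY" in os.environ, "set OPENAI_API_KEY in your .env file"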
Create a workflow configuration file named basic_rag_workflow.yaml:
    workflow_config:
      name: "standard RAG"
      nodes:
        - name: "START"
          edges: ["filter_history"]
        - name: "filter_history"
          edges: ["rewrite"]
        - name: "rewrite"
          edges: ["retrieve"]
        - name: "retrieve"
          edges: ["generate_rag"]
        - name: "generate_rag"
          edges: ["END"]

    # Maximum number of previous conversation turns kept in the context
    max_history: 10

    # Reranker applied to the retrieved chunks
    reranker_config:
      supplier: "cohere"
      model: "rerank-multilingual-v3.0"
      # Number of chunks kept after reranking
      top_n: 5

    # LLM used to generate the final answer
    llm_config:
      # Maximum number of tokens passed to the LLM
      max_input_tokens: 4000
      temperature: 0.7
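The nodes form a single directed chain from START to END: the conversation history is filtered, the question is rewritten, relevant chunks are retrieved (and reranked according to reranker_config), and the answer is generated. As a quick sanity check, the chain can be printed with plain PyYAML (a small illustrative sketch, independent of Quivr's own configuration loader):

    import yaml  # pip install pyyaml

    # Load the workflow file created above
    with open("basic_rag_workflow.yaml") as f:
        cfg = yaml.safe_load(f)

    # Print each node together with the nodes it feeds into:
    # START -> filter_history -> rewrite -> retrieve -> generate_rag -> END
    for node in cfg["workflow_config"]["nodes"]:
        print(node["name"], "->", ", ".join(node["edges"]))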
With the workflow configured, you can build a small interactive question-answering loop on top of a brain:

    from quivr_core import Brain
    from quivr_core.config import RetrievalConfig
    from rich.console import Console
    from rich.panel import Panel
    from rich.prompt import Prompt

    # Build a brain from your own documents
    brain = Brain.from_files(
        name="my smart brain",
        file_paths=["./my_first_doc.pdf", "./my_second_doc.txt"],
    )

    # Load the retrieval workflow defined in the YAML file above
    config_file_name = "./basic_rag_workflow.yaml"
    retrieval_config = RetrievalConfig.from_yaml(config_file_name)

    console = Console()
    console.print(Panel.fit("Ask your brain !", style="bold magenta"))

    # Simple REPL: type a question, or "exit" to quit
    while True:
        question = Prompt.ask("[bold cyan]Question[/bold cyan]")
        if question.lower() == "exit":
            console.print(Panel("Goodbye!", style="bold yellow"))
            break
        answer = brain.ask(question, retrieval_config=retrieval_config)
        console.print(f"[bold green]Quivr Assistant[/bold green]: {answer.answer}")
        console.print("-" * console.width)

    brain.print_info()
In customer-support scenarios, the Quivr team claims the platform can automate up to 60% of customer-service tasks, using AI to improve customer satisfaction and the value delivered to users.
Beyond the quivr-core library, the full Quivr platform (web frontend plus backend services) can be run locally from the repository with Docker Compose:

    docker compose -f docker-compose.dev.yml up --build
Quivr provides developers with a powerful, flexible, and easy-to-use RAG platform. Whether for personal projects or enterprise applications, it enables the rapid construction of intelligent document question-answering systems. Its open-source nature and active community support make it an ideal choice for building "second brain" applications.