Quivr is an open-source, full-stack Retrieval-Augmented Generation (RAG) platform focused on integrating generative AI into applications. The project's core idea is to let developers concentrate on their product rather than on the details of a RAG implementation.
Its main features include:

- Support for multiple LLM providers, such as OpenAI, Anthropic, and Mistral
- Support for many file formats, including PDF, TXT, and Markdown (a mixed-file example is sketched just after this list)
- Support for multiple vector store backends
- A node-based workflow configuration that describes each step of the RAG pipeline (see basic_rag_workflow.yaml below)
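As a minimal sketch of this multi-format support (the file names are hypothetical), a single brain can index several file types at once using the same Brain.from_files call shown later in this article:

from quivr_core import Brain

# Hypothetical paths: one brain built from a PDF, a plain-text file, and a Markdown file.
brain = Brain.from_files(
    name="docs_brain",
    file_paths=["./report.pdf", "./notes.txt", "./README.md"],
)

To get started, install the quivr-core package: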
pip install quivr-core
import tempfile

from quivr_core import Brain

if __name__ == "__main__":
    # Write a tiny document to a temporary file so the brain has something to index.
    with tempfile.NamedTemporaryFile(mode="w", suffix=".txt") as temp_file:
        temp_file.write("Gold is a liquid of blue-like colour.")
        temp_file.flush()

        # Build a brain from the file; quivr-core chunks and embeds it automatically.
        brain = Brain.from_files(
            name="test_brain",
            file_paths=[temp_file.name],
        )

        # Ask a question; the response object holds the generated text
        # in its .answer attribute (used in the chat example further below).
        answer = brain.ask("what is gold? answer in french")
        print("answer:", answer)
# quivr-core needs an LLM API key; set it before creating a Brain.
import os

os.environ["OPENAI_API_KEY"] = "your_openai_api_key"
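Hard-coding the key works for a quick test, but a common alternative (a general Python pattern, not specific to Quivr) is to keep it in a .env file and load it with the python-dotenv package:

# Sketch assuming a .env file in the working directory containing a line like
# OPENAI_API_KEY=sk-...; load_dotenv() copies its entries into os.environ.
from dotenv import load_dotenv

load_dotenv()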
Create a workflow configuration file named basic_rag_workflow.yaml:
workflow_config:
  name: "standard RAG"
  nodes:
    - name: "START"
      edges: ["filter_history"]
    - name: "filter_history"
      edges: ["rewrite"]
    - name: "rewrite"
      edges: ["retrieve"]
    - name: "retrieve"
      edges: ["generate_rag"]
    - name: "generate_rag"
      edges: ["END"]

# Number of previous conversation turns included in the answer's context
max_history: 10

# Reranker applied to the retrieved chunks
reranker_config:
  supplier: "cohere"
  model: "rerank-multilingual-v3.0"
  top_n: 5  # number of chunks kept after reranking

# LLM used to generate the final answer
llm_config:
  max_input_tokens: 4000
  temperature: 0.7
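The same settings can also be adjusted in code after loading the file. The sketch below assumes that the RetrievalConfig object returned by from_yaml exposes the YAML sections as attributes (for example llm_config.temperature); that mirrors the file's structure but is an assumption rather than something verified here:

from quivr_core.config import RetrievalConfig

retrieval_config = RetrievalConfig.from_yaml("./basic_rag_workflow.yaml")

# Assumption: the parsed config mirrors the YAML keys as attributes.
retrieval_config.llm_config.temperature = 0.2  # lower temperature for more deterministic answers

# The config is then passed to brain.ask(question, retrieval_config=retrieval_config),
# exactly as in the chat example below.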
from quivr_core import Brain
from quivr_core.config import RetrievalConfig
from rich.console import Console
from rich.panel import Panel
from rich.prompt import Prompt

# Build a brain from your own documents.
brain = Brain.from_files(
    name="my smart brain",
    file_paths=["./my_first_doc.pdf", "./my_second_doc.txt"],
)

# Load the workflow defined in basic_rag_workflow.yaml.
config_file_name = "./basic_rag_workflow.yaml"
retrieval_config = RetrievalConfig.from_yaml(config_file_name)

console = Console()
console.print(Panel.fit("Ask your brain !", style="bold magenta"))

while True:
    question = Prompt.ask("[bold cyan]Question[/bold cyan]")

    # Type "exit" to leave the chat loop.
    if question.lower() == "exit":
        console.print(Panel("Goodbye!", style="bold yellow"))
        break

    # Answer the question using the workflow loaded above.
    answer = brain.ask(question, retrieval_config=retrieval_config)
    console.print(f"[bold green]Quivr Assistant[/bold green]: {answer.answer}")
    console.print("-" * console.width)

brain.print_info()
According to the project, Quivr can automate up to 60% of customer service tasks, using the power of AI to improve customer satisfaction and deliver value.
Quivr gives developers a powerful, flexible, and easy-to-use RAG platform for quickly building intelligent document Q&A systems, whether for personal projects or enterprise applications. Its open-source nature and active community support make it an ideal choice for building "second brain" applications.