
Whitepaper Finder

Facundo Cappella

Whitepaper Agent

A specialized AI agent designed to research and analyze whitepapers using Arxiv, built with Next.js 14, LangChain, and OpenAI.

📖 Documentation

Architecture Overview: Understand how the "Brain" (LangChain), "Library" (Arxiv), and "Interface" work together.
Challenges & Limitations: Read about the technical challenges like language barriers (Spanish/English) and static knowledge bases.

Features

🧠 Intelligent Research Agent: Uses LangChain to reason about queries and decide when to fetch external data.
📚 Arxiv Integration: Automatically searches and retrieves recent scientific papers to answer technical queries.
💬 Smart Context: Maintains conversation history in Local Storage for a seamless experience.
🚀 Streaming Responses: Real-time feedback using Server-Sent Events (SSE).
🌐 Language Bridge: Seamlessly handles Spanish queries by finding relevant English papers and synthesizing answers back in Spanish.
🎨 Modern UI: Polished interface with shadcn/ui and responsive design.
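The streaming feature above delivers tokens over SSE, where each event arrives as a `data:` line followed by a blank line. As a minimal sketch of what a client-side consumer has to do with a raw text chunk (the helper name `parseSSEEvents` and the exact framing of the app's `/api/chat` stream are assumptions, not the project's actual code):

```typescript
// Minimal SSE frame parser: extracts the payload of each `data:` line
// from a raw text chunk. Helper name and wire format are assumptions;
// the app's /api/chat stream may frame events differently.
function parseSSEEvents(chunk: string): string[] {
  return chunk
    .split("\n\n") // SSE events are separated by a blank line
    .flatMap((event) => event.split("\n"))
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length));
}
```

In practice a client would feed each decoded chunk from the response body's `ReadableStream` through a parser like this and append the payloads to the current message.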

Project Structure

src/
├── app/
│   ├── api/chat/           # Streaming chat API endpoint
│   ├── layout.tsx          # Root layout with metadata
│   └── page.tsx            # Main chat page
├── components/
│   ├── chat/               # Chat-specific components
│   │   ├── chat-container.tsx
│   │   ├── chat-input.tsx
│   │   ├── message-bubble.tsx
│   │   ├── message-list.tsx
│   │   └── typing-indicator.tsx
│   └── ui/                 # shadcn/ui base components
├── hooks/
│   └── use-chat.ts         # Chat state management hook
└── lib/
    ├── config/             # Environment configuration
    ├── langchain/          # LangChain client and services
    │   ├── client.ts       # ChatOpenAI singleton
    │   ├── chat-service.ts
    │   └── types.ts
    └── prompts/            # Decoupled prompt management
        ├── system-prompts.ts
        └── templates.ts

Getting Started

Prerequisites

Node.js 18+
npm or pnpm
OpenAI API key

Installation

1. Clone and install dependencies:

   cd chatbot-app
   npm install

2. Configure environment variables:

   cp .env.example .env.local

   Edit .env.local and add your OpenAI API key:

   OPENAI_API_KEY=sk-your-api-key-here

3. Run the development server:

   npm run dev

4. Open http://localhost:3000 in your browser.

Configuration

Environment Variables

Variable            Description                 Default
OPENAI_API_KEY      Your OpenAI API key         Required
OPENAI_MODEL        Model to use                gpt-4o
OPENAI_TEMPERATURE  Response creativity (0-1)   0.2
OPENAI_MAX_TOKENS   Maximum response length     2048
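The defaults in the table above can be applied in one place when the app boots. As a sketch of how a loader in src/lib/config might do this (the function name `loadOpenAIConfig` and its shape are assumptions; the project's actual config module may differ):

```typescript
// Hypothetical config loader mirroring the environment-variable table:
// the API key is required, everything else falls back to a default.
interface OpenAIConfig {
  apiKey: string;
  model: string;
  temperature: number;
  maxTokens: number;
}

function loadOpenAIConfig(env: Record<string, string | undefined>): OpenAIConfig {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) throw new Error("OPENAI_API_KEY is required");
  return {
    apiKey,
    model: env.OPENAI_MODEL ?? "gpt-4o",
    temperature: Number(env.OPENAI_TEMPERATURE ?? "0.2"),
    maxTokens: Number(env.OPENAI_MAX_TOKENS ?? "2048"),
  };
}
```

Failing fast on a missing key keeps the error at startup rather than on the first chat request.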

Customizing Prompts

Prompts are decoupled in src/lib/prompts/system-prompts.ts. To add a new prompt:

export const SYSTEM_PROMPTS = {
  // ... existing prompts

  custom: {
    version: '1.0.0',
    content: `Your custom system prompt here...`,
  },
} as const;

Then use it in the ChatContainer:

<ChatContainer systemPromptKey="custom" />
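Because SYSTEM_PROMPTS is declared `as const`, its keys can double as a union type, so systemPromptKey only accepts keys that actually exist. A sketch of that pattern (the `default` entry's content and the `getSystemPrompt` helper are illustrative placeholders, not the repo's real values):

```typescript
// Type-safe lookup over a prompts map declared `as const`, in the style of
// src/lib/prompts/system-prompts.ts. Entry contents are placeholders.
const SYSTEM_PROMPTS = {
  default: { version: "1.0.0", content: "You are a whitepaper research agent..." },
  custom: { version: "1.0.0", content: "Your custom system prompt here..." },
} as const;

// The union type "default" | "custom" is derived from the map itself.
type SystemPromptKey = keyof typeof SYSTEM_PROMPTS;

function getSystemPrompt(key: SystemPromptKey): string {
  return SYSTEM_PROMPTS[key].content;
}
```

With this shape, passing a typo like systemPromptKey="custon" becomes a compile-time error instead of a runtime fallback.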

Best Practices Applied

This project follows best practices from:
React Best Practices - Parallel fetching, Suspense boundaries, memoization
Agentic Patterns - Tools integration, reasoning loops, prompt engineering
UI/UX Guidelines - Accessible components, clean typography, responsive layout

Scripts

npm run dev      # Start development server
npm run build    # Build for production
npm run start    # Start production server
npm run lint     # Run ESLint

License

MIT

Posted Mar 29, 2026

AI agent for whitepaper research via Arxiv. Features LangChain reasoning, streaming chat, multilingual support. Built with Next.js 14 & TypeScript.