The Truman Project

Hackathon Journey

Our Story

The story of how we built The Truman Project in 48 hours, from concept to a working product that transcribes conversations and fact-checks claims in real time.

Backend Architecture

[Architecture diagram: WebSocket Server → Deepgram API → FactCheckOrchestrator → Real-time Results. Supporting components: deepgram.ts (speaker detection, punctuation), TranscriptManager (transcript history), ClaimDetector (Gemini model, ClaimQueue), deepresearch.ts (OpenAI analysis), and an AgentPool of FactVerifier agents (Agents 1–3).]

Key Components

  • WebSocket: Real-time communication
  • Deepgram: Speech transcription
  • ClaimDetector: Uses Gemini to extract factual claims
  • AgentPool: Parallel verification via Perplexity (see the sketch below)
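
To make the flow concrete, here's a simplified TypeScript sketch of how the orchestrator could tie these pieces together. The component names mirror the diagram; the signatures and types are illustrative, not our exact code.

```ts
interface Claim {
  text: string;
  speaker: string;
}

interface Verdict {
  claim: Claim;
  verdict: "supported" | "refuted" | "unverifiable";
  sources: string[];
}

class FactCheckOrchestrator {
  constructor(
    private detector: { extractClaims(chunk: string): Promise<Claim[]> }, // Gemini-backed
    private pool: { verify(claim: Claim): Promise<Verdict> },             // Perplexity agents
    private emit: (v: Verdict) => void,                                   // pushes results over the WebSocket
  ) {}

  // Called for every finalized transcript chunk from Deepgram.
  async onTranscript(chunk: string): Promise<void> {
    const claims = await this.detector.extractClaims(chunk);
    // Verify all detected claims in parallel across the agent pool.
    const verdicts = await Promise.all(claims.map((c) => this.pool.verify(c)));
    for (const v of verdicts) this.emit(v);
  }
}
```
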
Team starting the project

The Beginning

Our journey started with an idea to revolutionize fact-checking in real-time conversations.

Team whiteboarding and planning

The Planning Phase

We sketched out our ideas, designed the architecture, and planned the user experience flow.

The completed Truman Project

The Final Product

After 24 hours of coding, testing, and refining, we created The Truman Project.

Our Project Journey

This page describes the journey of building The Truman Project: our motivations, our challenges, and the technology stack that powers it.

Inspiration

A close friend's harrowing experience, in which ICE attempted an unlawful deportation that resulted in a civil suit against the President of the United States, highlighted the urgent need for truthful, transparent, and accessible legal and political information.

Observing consistently low voter turnout during crucial elections underscored the public's disenchantment with and mistrust of political processes, often fueled by misinformation.

Witnessing the alarming rise of political inaccuracies, deliberate misinformation, and unverified claims emphasized the importance of reliable fact-checking to maintain the integrity of public discourse.

What it does

Performs real-time fact-checking of speakers during live speeches, debates, and public statements.

Immediately alerts users to inaccuracies, providing verified sources and context to educate and inform.

Enhances civic engagement by empowering voters with accurate, unbiased information directly at their fingertips.

How we built it

We leveraged a multi-agent AI architecture, using a Gemini 2.0 Flash-Lite head agent to chunk incoming text and classify outgoing claims, and integrated three Perplexity agents to ensure nuanced analysis and accurate results.
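
At a high level, the fan-out to the verification agents looks like this: the same claim goes to three Perplexity agents in parallel, and their answers are compared downstream. This is a simplified sketch against Perplexity's OpenAI-compatible chat endpoint; the model name and prompt are placeholders.

```ts
// Ask one Perplexity agent to classify a claim (model name is assumed).
async function askPerplexity(claim: string): Promise<string> {
  const res = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "sonar",
      messages: [
        {
          role: "system",
          content: "Classify the claim as SUPPORTED, REFUTED, or UNVERIFIABLE and cite sources.",
        },
        { role: "user", content: claim },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

// Run three independent agents in parallel and collect their answers.
async function verifyWithAgents(claim: string): Promise<string[]> {
  return Promise.all([askPerplexity(claim), askPerplexity(claim), askPerplexity(claim)]);
}
```
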

We utilized Deepgram's speech-to-text technology to transcribe live conversations, feeding continuous textual data into our backend.
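
Wiring up the live stream with Deepgram's JavaScript SDK looks roughly like this; the model and option choices shown here are illustrative:

```ts
import { createClient, LiveTranscriptionEvents } from "@deepgram/sdk";

// Stub: in the real pipeline this hands chunks to the orchestrator.
function handleTranscriptChunk(text: string): void {
  console.log("transcript:", text);
}

const deepgram = createClient(process.env.DEEPGRAM_API_KEY!);

// Diarization and punctuation give us speaker-labeled, readable text.
const connection = deepgram.listen.live({
  model: "nova-2",
  punctuate: true,
  diarize: true,
  interim_results: false, // only act on finalized chunks
});

connection.on(LiveTranscriptionEvents.Transcript, (event) => {
  const alt = event.channel.alternatives[0];
  if (alt?.transcript) {
    handleTranscriptChunk(alt.transcript);
  }
});
```
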

The frontend was built with React, TypeScript, and Framer Motion.
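
As an example of how little code the UI layer needs, a feed that animates new verdicts into view takes only a few lines with Framer Motion (the Verdict shape and props here are illustrative):

```tsx
import { motion, AnimatePresence } from "framer-motion";

type Verdict = { id: string; claim: string; verdict: string };

// Each new verdict fades and slides into the feed as it arrives.
export function VerdictFeed({ verdicts }: { verdicts: Verdict[] }) {
  return (
    <AnimatePresence>
      {verdicts.map((v) => (
        <motion.div
          key={v.id}
          initial={{ opacity: 0, y: 16 }}
          animate={{ opacity: 1, y: 0 }}
          exit={{ opacity: 0 }}
        >
          <strong>{v.verdict}</strong>: {v.claim}
        </motion.div>
      ))}
    </AnimatePresence>
  );
}
```
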

Challenges we ran into

Ensuring the accuracy and timeliness of the data utilized by the AI, as political facts and contexts frequently change.

Effectively parsing and structuring live conversational text to maintain coherent, contextual continuity for analysis by our AI backend.
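
Our working approach was a sliding window over recent sentences, so each new chunk is analyzed together with what was said just before it. A simplified sketch (the window size and sentence splitting are illustrative):

```ts
// Keeps a bounded window of recent sentences for contextual analysis.
class TranscriptWindow {
  private sentences: string[] = [];

  constructor(private maxSentences = 20) {}

  push(chunk: string): void {
    // Naive sentence split; production code needs something sturdier.
    this.sentences.push(...chunk.split(/(?<=[.!?])\s+/).filter(Boolean));
    if (this.sentences.length > this.maxSentences) {
      this.sentences = this.sentences.slice(-this.maxSentences);
    }
  }

  // Context string handed to the claim detector alongside each new chunk.
  context(): string {
    return this.sentences.join(" ");
  }
}
```
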

Accomplishments that we're proud of

Successfully delivering accurate, real-time conversational context into our sophisticated multi-agent AI system.

Developing a reliable pipeline capable of managing and analyzing live speech data at scale, ensuring minimal latency.

Enhancing trust and credibility by significantly reducing misinformation during critical political dialogues.

What we learned

Understanding how chaining different AI models dramatically amplifies the capabilities and depth of analysis.

Recognizing the critical role of context in AI-driven conversations, particularly in sensitive and fast-changing scenarios like politics.

Discovering practical techniques to maintain real-time responsiveness without compromising accuracy and reliability.

What's next for The Truman Project

We'll introduce a feature for document upload, allowing users to fact-check written documents, policy proposals, and past statements.

We'll also expand the AI system to include historical conversation contexts, enabling it to detect patterns, recurring inaccuracies, and provide deeper analytical insights.

Finally, we'll implement Deepsearch to expand our coverage of sources and facts and to surface related information.