Logical Fallacy Dataset and Model Fine-tuning

Kibrom Kidane

KEY ACHIEVEMENTS
1. Dataset Development:
• Curated organic datasets from reputable sources, covering 14 fallacy categories.
• Created a synthetic data generation pipeline, expanding the dataset to over 10,000 examples per category (a generation sketch follows this list).
• Ensured diversity across multiple domains while maintaining consistency in fallacy representation.
2. Model Fine-tuning:
• Used the Anyscale platform to fine-tune Llama 2 and Llama 3 models on the custom dataset (an illustrative fine-tuning sketch follows this list).
• Implemented systematic approaches for model training and evaluation.
3. Tool Development:
• Developed Python scripts for data generation, validation, and model testing (a validation sketch follows this list).
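
For illustration, here is a minimal sketch of the synthetic generation step, assuming an OpenAI-compatible chat API. The model name, prompt wording, category subset, and the generate_batch helper are illustrative assumptions, not the project's actual pipeline:

    # Minimal sketch of a synthetic fallacy-example generator.
    # Assumes an OpenAI-compatible chat API; prompt and model are illustrative.
    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Illustrative subset of the 14 fallacy categories covered by the dataset.
    CATEGORIES = ["ad hominem", "straw man", "false dilemma", "slippery slope"]

    def generate_batch(category, n=5):
        """Ask the model for n labeled examples of one fallacy category."""
        prompt = (
            f"Write {n} short argumentative passages, each committing the "
            f"'{category}' fallacy, drawn from varied domains such as "
            f"politics, health, and sports. Return a JSON list of strings."
        )
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed generator model; any capable LLM works
            messages=[{"role": "user", "content": prompt}],
        )
        # Assumes the model returns bare JSON; production code would validate.
        texts = json.loads(resp.choices[0].message.content)
        return [{"text": t, "label": category} for t in texts]

    if __name__ == "__main__":
        with open("synthetic_fallacies.jsonl", "w") as f:
            for cat in CATEGORIES:
                for row in generate_batch(cat):
                    f.write(json.dumps(row) + "\n")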
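
The fine-tuning itself ran on Anyscale's managed platform. Since those platform specifics are not shown here, the sketch below swaps in an equivalent self-hosted approach using Hugging Face transformers with peft (LoRA); the model choice, prompt format, and hyperparameters are assumptions:

    # Illustrative LoRA fine-tuning sketch (Hugging Face transformers + peft),
    # standing in for the Anyscale-managed job used in the project.
    import json
    from datasets import Dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling,
                              Trainer, TrainingArguments)

    MODEL = "meta-llama/Llama-2-7b-hf"  # gated checkpoint; requires HF access

    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(MODEL, device_map="auto")

    # Wrap the base model with small trainable LoRA adapters.
    model = get_peft_model(model, LoraConfig(
        r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
        task_type="CAUSAL_LM",
    ))

    # Format each labeled example as an instruction-following prompt.
    rows = [json.loads(l) for l in open("synthetic_fallacies.jsonl")]
    ds = Dataset.from_list([
        {"text": f"Identify the fallacy:\n{r['text']}\nAnswer: {r['label']}"}
        for r in rows
    ])
    ds = ds.map(lambda r: tokenizer(r["text"], truncation=True, max_length=512),
                remove_columns=ds.column_names)

    Trainer(
        model=model,
        args=TrainingArguments("fallacy-lora", num_train_epochs=3,
                               per_device_train_batch_size=4,
                               learning_rate=2e-4),
        train_dataset=ds,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    ).train()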
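
The validation step is likewise sketched below against the JSONL layout used above; the specific checks the project ran are not documented, so these (non-empty text, known label, no duplicates) are typical assumptions:

    # Sketch of a dataset validation script for {"text": ..., "label": ...} rows.
    import json
    import sys

    # Illustrative subset of the 14 categories, matching the sketches above.
    ALLOWED = {"ad hominem", "straw man", "false dilemma", "slippery slope"}

    def validate(path):
        errors, seen = 0, set()
        for i, line in enumerate(open(path), start=1):
            row = json.loads(line)
            text, label = row.get("text", ""), row.get("label")
            if not text.strip():
                print(f"line {i}: empty text"); errors += 1
            if label not in ALLOWED:
                print(f"line {i}: unknown label {label!r}"); errors += 1
            if text in seen:
                print(f"line {i}: duplicate example"); errors += 1
            seen.add(text)
        return errors

    if __name__ == "__main__":
        sys.exit(1 if validate(sys.argv[1]) else 0)
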
RESULTS AND IMPACT
Fine-tuning yielded significantly improved accuracy in fallacy detection and generation relative to the base models. Potential applications include educational tools, content moderation, and argument analysis systems.
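
As a sketch of how that detection accuracy could be measured, assuming the prompt format from the fine-tuning sketch above and a hypothetical held-out file heldout_fallacies.jsonl:

    # Hedged evaluation sketch: label accuracy on a held-out split.
    import json
    from transformers import pipeline

    # Path to the fine-tuned checkpoint (assumed to match the training sketch).
    generate = pipeline("text-generation", model="fallacy-lora", max_new_tokens=10)

    def predict(text):
        prompt = f"Identify the fallacy:\n{text}\nAnswer:"
        out = generate(prompt)[0]["generated_text"]
        return out[len(prompt):].strip().lower()

    rows = [json.loads(l) for l in open("heldout_fallacies.jsonl")]
    correct = sum(predict(r["text"]).startswith(r["label"]) for r in rows)
    print(f"accuracy: {correct / len(rows):.2%}")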

Posted Jul 31, 2024

This project focuses on enhancing the ability of open-source LLMs such as Llama 2 to detect and understand logical fallacies, a crucial aspect of critical thinking.
