Wamaitha Nyamu
Let's say you publish longform videos on YouTube and want to break them down into shortform content for other platforms such as TikTok or Instagram Reels. This project implements an event-driven architecture that uses RabbitMQ, Celery, an OpenAI LLM, Whisper, FastAPI and Supabase. RabbitMQ is the message broker and Celery runs the workers. The OpenAI LLM picks out insightful clips from the transcription returned by Whisper, and all of the pipeline's state is stored in Supabase.
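Below is a minimal sketch of how such a pipeline can be wired together with Celery and RabbitMQ. The task names, the `pipeline` module name and the broker URL are illustrative assumptions rather than the project's exact code; the task bodies are elided here and sketched further down.

```python
# pipeline.py - a minimal sketch, not the project's exact code.
from celery import Celery, chain

# RabbitMQ is the message broker; this URL is RabbitMQ's default local login.
app = Celery("shorts_pipeline", broker="amqp://guest:guest@localhost:5672//")

@app.task
def download_video(youtube_url: str) -> str:
    """Download the longform video and return a local file path."""

@app.task
def transcribe(video_path: str) -> str:
    """Run Whisper on the downloaded video and return the transcript."""

@app.task
def pick_clips(transcript: str) -> list:
    """Ask the OpenAI LLM for the most insightful segments."""

@app.task
def cut_clips(segments: list) -> list:
    """Cut and reframe the chosen segments with OpenCV."""

def process(youtube_url: str):
    # Each step becomes a message on RabbitMQ, so Celery workers can scale independently.
    return chain(
        download_video.s(youtube_url),
        transcribe.s(),
        pick_clips.s(),
        cut_clips.s(),
    ).apply_async()

# A worker is typically started from a Bash script, e.g.:
#   celery -A pipeline worker --loglevel=info
```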
I made use of an event-driven architecture with the following components:
Docker - runs RabbitMQ in a container
Celery - offloads tasks from the main thread to Celery workers
FastAPI - exposes an API endpoint for submitting videos, so URLs no longer have to be added to the code by hand (see the endpoint sketch after this list)
Nginx - exposes the API endpoint on the web, so anyone with the link can send requests to have their videos processed
Supabase - records the state of every stage, so videos that were already transcribed or clipped are not processed again (see the Supabase sketch below)
Bash - scripts that start the Celery workers and the RabbitMQ consumers
Whisper - OpenAI's speech-to-text model, used for transcription (see the transcription sketch below).
OpenCV - transforms the videos (see the reframing sketch below).
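Submitting a video for processing could look roughly like this. The route, the request model and the import of `pipeline.process` from the earlier sketch are assumptions.

```python
# api.py - a rough sketch of the ingestion endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

from pipeline import process  # the Celery chain sketched earlier

api = FastAPI()

class VideoRequest(BaseModel):
    youtube_url: str

@api.post("/videos")
def submit_video(req: VideoRequest):
    # Enqueue the work on RabbitMQ and return immediately; Celery workers do the rest.
    result = process(req.youtube_url)
    return {"task_id": result.id, "status": "queued"}
```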
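The Supabase bookkeeping that prevents re-processing could be sketched as follows; the `videos` table and its columns are assumptions.

```python
# storage.py - a sketch of the Supabase state tracking.
import os
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

def already_processed(youtube_url: str) -> bool:
    # Skip videos that were already transcribed or clipped.
    rows = (
        supabase.table("videos")
        .select("id")
        .eq("url", youtube_url)
        .execute()
    )
    return len(rows.data) > 0

def mark_stage(youtube_url: str, stage: str) -> None:
    # Record each stage (downloaded, transcribed, clipped) as it completes.
    supabase.table("videos").upsert({"url": youtube_url, "stage": stage}).execute()
```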
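The transcription and clip-selection steps could look like this; the Whisper model size, the OpenAI model name and the prompt wording are assumptions.

```python
# clips.py - a sketch of transcription with Whisper and clip selection with an OpenAI LLM.
import whisper
from openai import OpenAI

def transcribe(video_path: str) -> str:
    model = whisper.load_model("base")  # larger models trade speed for accuracy
    result = model.transcribe(video_path)
    return result["text"]

def pick_clips(transcript: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system",
             "content": "You extract short, insightful clips from video transcripts."},
            {"role": "user",
             "content": f"Suggest three clip-worthy segments from this transcript:\n{transcript}"},
        ],
    )
    return response.choices[0].message.content
```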
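Finally, a sketch of what transforming the videos with OpenCV might involve. The centre 9:16 crop for vertical platforms is my assumption about the reframing, and this version ignores the audio track.

```python
# reframe.py - a sketch of an OpenCV transform from landscape to vertical video.
import cv2

def to_vertical(src_path: str, dst_path: str) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

    # Centre-crop the widest 9:16 window that fits inside the original frame.
    crop_w = min(width, int(height * 9 / 16))
    x0 = (width - crop_w) // 2

    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    out = cv2.VideoWriter(dst_path, fourcc, fps, (crop_w, height))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(frame[:, x0:x0 + crop_w])

    cap.release()
    out.release()
```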
Quick demo of the hosted version on a Linux VM on GCP