LLM-Powered Chat Application Deployment on AWS

Will Danckwerts

Automation Engineer
Cloud Security Engineer
AI Developer
AWS
Docker
Ollama
Developed and deployed a self-hosted, LLM-powered chat application to AWS ECS with autoscaling, providing a secure, isolated environment for Generative AI exercises while remaining compliant with stringent security standards.
The architecture minimised operational overhead and avoided redundant costs, supporting a strategic market entry. The project demonstrates the ability to build scalable, secure solutions tailored to business needs using cloud-native tooling and DevOps practices. A sketch of the autoscaling setup follows below.
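The project's actual infrastructure code isn't published here, but a minimal sketch of the autoscaling piece, assuming a Fargate-launched ECS service managed with boto3, might look like the following. The cluster, service, task definition, subnet, and security group identifiers are placeholders, not values from the real deployment.

import boto3

ecs = boto3.client("ecs")
autoscaling = boto3.client("application-autoscaling")

# Create the ECS service that runs the containerised, Ollama-backed chat app.
# Placeholder names throughout; the real names and network IDs differ.
ecs.create_service(
    cluster="llm-chat-cluster",
    serviceName="llm-chat-service",
    taskDefinition="llm-chat-task",
    desiredCount=1,
    launchType="FARGATE",
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "securityGroups": ["sg-0123456789abcdef0"],
            "assignPublicIp": "DISABLED",  # keep the service private/isolated
        }
    },
)

# Register the service's desired count as a scalable target, then attach a
# CPU target-tracking policy so ECS adds or removes tasks with load.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId="service/llm-chat-cluster/llm-chat-service",
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=1,
    MaxCapacity=4,
)

autoscaling.put_scaling_policy(
    PolicyName="llm-chat-cpu-target",
    ServiceNamespace="ecs",
    ResourceId="service/llm-chat-cluster/llm-chat-service",
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,  # scale to hold average CPU near 70%
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
        "ScaleInCooldown": 300,
        "ScaleOutCooldown": 60,
    },
)

In practice, a target-tracking policy like this keeps the service at a single task when idle and scales out under load, which is what keeps operational overhead and redundant costs low for an internal, self-hosted workload.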