LLM-Powered Chat Application Deployment on AWS

Will Danckwerts


Automation Engineer

Cloud Security Engineer

AI Developer

AWS

Docker

Ollama

Developed and deployed a self-hosted LLM-powered chat application to AWS ECS with autoscaling, providing a secure, isolated environment for Generative AI exercises while maintaining compliance with stringent security standards.
The architecture minimised operational overhead and avoided redundant costs, supporting a strategic market entry. This project showcased my ability to build scalable, secure solutions tailored to business needs, leveraging cloud-native tooling and DevOps practices.
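A minimal sketch of how such a stack might be containerised locally before deployment to ECS, where each service maps to a container in a task definition. The service names, ports, and `chat-app` image are illustrative assumptions, not the project's actual configuration; the `ollama/ollama` image and its default API port 11434 are real.

```yaml
# Illustrative docker-compose sketch; on ECS this corresponds to a task
# definition with two containers sharing a task network.
services:
  ollama:
    image: ollama/ollama        # official Ollama container image
    ports:
      - "11434:11434"           # Ollama's default HTTP API port
    volumes:
      - models:/root/.ollama    # persist pulled model weights across restarts
  chat-app:
    image: chat-app:latest      # hypothetical application image
    environment:
      OLLAMA_HOST: http://ollama:11434   # app reaches Ollama over the internal network
    ports:
      - "8080:8080"
    depends_on:
      - ollama
volumes:
  models:
```

Keeping the model server and the chat front end as separate containers lets ECS scale them independently, which is how autoscaling policies are typically attached per service.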

Posted Dec 4, 2024

Built and deployed a secure, autoscaling LLM chat app on AWS ECS, enabling GenAI demos in an isolated environment for strategic market entry initiatives.
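The chat app described above would talk to the self-hosted model over Ollama's HTTP API. A small sketch of building a request body for Ollama's `/api/chat` endpoint; the helper name and the `llama3` model tag are illustrative assumptions, while the `model`/`messages`/`stream` fields match Ollama's documented request shape.

```python
import json

def build_chat_request(model: str, messages: list, stream: bool = False) -> str:
    """Build the JSON body for a POST to Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts, matching
    the chat format Ollama expects. Hypothetical helper for illustration.
    """
    return json.dumps({"model": model, "messages": messages, "stream": stream})

# Example: a single-turn user prompt against an assumed "llama3" model tag.
body = build_chat_request("llama3", [{"role": "user", "content": "Hello"}])
```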


Virtual Desktop Coding Environment in AWS/Azure

Data Engineering for a Sustainability-Focused Startup