Freelancers using Keras
Istiak Ahmed Khan
Dhaka, Bangladesh
Power BI Data Analyst + ML AI Automation Expert
5.0 Rating · 94 Followers
End-to-End Machine Learning Pipeline for Telecom Customer Churn

1. The Business Problem
Customer churn is a major challenge for telecommunications companies, driven by competition, service issues, and changing consumer preferences. This project was designed to move the company from reactive support to proactive retention using data-driven strategies such as customer segmentation, personalized offers, and loyalty programs.

2. Data Exploration & Insights (EDA)
I performed a comprehensive descriptive analysis of a database of 7,043 customers with 21 distinct variables. Key findings:
- Contractual risk: customers on month-to-month contracts showed significantly higher churn than those on one- or two-year commitments.
- Service preference: Fiber Optic plans were the most popular, but also a critical segment to monitor because of their higher price points.
- Financial indicators: churned customers averaged $74.44 in monthly charges, versus $61.27 for retained customers.
- Payment behavior: the "Electronic Check" payment method was most strongly associated with service cancellation.

3. Engineering & Preprocessing Pipeline
To prepare the data for high-performance modeling, I implemented a rigorous preprocessing workflow:
- Data cleaning: removed irrelevant identifiers like customerID and addressed potential data quality issues. The dataset was verified to have zero missing or NaN values.
- Feature engineering: applied label encoding to transform categorical text variables into a numerical format suitable for machine learning algorithms.
- Data splitting: adopted a standard 80/20 train-test split to ensure the model could generalize to unseen data.

4. Model Development & Benchmarking
I developed and benchmarked eight distinct machine learning algorithms to identify the most effective solution for this application:
- Linear & probabilistic: Logistic Regression, Naive Bayes.
- Tree-based: Decision Tree, Random Forest.
- Boosting frameworks: AdaBoost, Gradient Boosting, XGBoost, and LightGBM.

5. Performance Evaluation & Results
Models were evaluated using ROC curves, confusion matrices, and detailed classification reports.
- Winner: Logistic Regression achieved the highest accuracy at 81.83%.
- Secondary performers: Gradient Boosting (81.05%) and AdaBoost (80.98%) also showed strong predictive power.

6. Technical Conclusion
This data-driven approach shows that proactive churn prediction is essential for business sustainability. By identifying that customers prioritize high-speed fiber optic services but are sensitive to pricing and contract terms, the company can now optimize its pricing and retention strategies to maximize user satisfaction and revenue.
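The label-encoding and 80/20 split steps described above can be sketched in plain Python. This is illustrative only; the project's actual pipeline presumably used scikit-learn utilities for both steps.

```python
import random

def label_encode(values):
    # Map each distinct category to an integer, in first-seen order.
    mapping = {}
    encoded = []
    for v in values:
        if v not in mapping:
            mapping[v] = len(mapping)
        encoded.append(mapping[v])
    return encoded, mapping

def train_test_split(rows, test_ratio=0.2, seed=42):
    # Shuffle and split rows into train/test partitions (80/20 by default),
    # so the model is evaluated only on rows it never saw during training.
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

# Example on a categorical column like the churn dataset's Contract field:
encoded, mapping = label_encode(
    ["Month-to-month", "One year", "Month-to-month", "Two year"]
)
train, test = train_test_split(list(range(100)))
```

One caveat worth noting: label encoding imposes an arbitrary ordering on categories, which tree-based and boosting models tolerate well but linear models can misread; one-hot encoding is the common alternative.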
The E-Commerce Orders Dashboard provides a comprehensive overview of order performance, revenue trends, and customer purchasing behavior. Designed for online businesses, this dashboard transforms transactional order data into actionable insights that support growth, operational efficiency, and strategic decision-making.
Email Marketing Analytics Dashboard – UI/UX Design

Struggling to track campaign performance across multiple channels? This dashboard is designed to give you a complete, real-time view of your marketing efforts in one clean and intuitive interface: a powerful, easy-to-use dashboard that helps you monitor email, SMS, social media, and push campaigns without the confusion of scattered data. Every key metric is presented clearly so you can make faster, smarter decisions.

Key capabilities:
- Track open rates, click rates, conversions, and revenue in real time
- Compare performance across multiple marketing channels
- Identify your top-performing campaigns instantly
- Understand audience engagement with clear visual breakdowns
- Spot trends and optimize campaigns quickly

Most businesses run campaigns but struggle to understand what's actually working. This dashboard eliminates guesswork by turning your data into clear, actionable insights, helping you improve ROI and scale winning strategies.

Perfect for: digital marketers, e-commerce brands, agencies managing multiple campaigns, and startups looking to optimize growth.

If you want a high-converting, professional dashboard that not only looks great but drives real business decisions, I can help you build it.
The Financial Performance Dashboard provides a comprehensive overview of an organization’s financial health by tracking revenue, expenses, profitability, and key financial indicators. Built using Power BI, this dashboard enables finance teams and decision-makers to monitor performance, identify trends, and make data-driven strategic decisions.
Nathanael Mbale
pro
New Jersey, USA
Connecting code with intelligence
1x Hired · 26 Followers
SMS Spam Detection Using Neural Networks
Pomodoro Study Planner
Chess AI Development with Alpha-Beta Pruning
Book Recommendation Engine with K-Nearest Neighbors
Manideep racharla
Overland, USA
Data Analyst turning Data insights into business impact.
5.0 Rating · 2 Followers
Optimizing Home Delivery in Small Grocery Stores
Parking Lot Utilization Analysis Dashboard
Predictive Analysis of Airline Delays Using Machine Learning
Stock Price and Trading Indicators Analysis
PATHAKHRK INC
Kangra, India
Creative tech solutions in AI and cybersecurity
Medical Diagnostics AI App - Health Platform
AI Sales Agent for Instagram Lead Management
Multilingual AI Sales Agent - Lead Management & CRM Automation
AI Multi-Agent Trip Planning System - Travel Intelligence
Dylan Guidry
Canada
Senior Software Engineer | 10+ Yrs Across Industries
7 Followers
AI-Powered COVID-19 Detection via Chest X-Rays
AI-Powered Produce Inspection System
Forge SoftwareHub - Applied AI Development for Web, Mobile & Bl…
AI-Powered Deposition Summaries for Legal Cases
Anurag Nagare
Mumbai, India
I’m an AI & Machine Learning engineer with expertise in deve
I recently built an AI-powered Mental Health Screening & Clinical Support System designed to assist both individuals and healthcare professionals in early detection and intervention. The system combines patient-reported questionnaires (PHQ-9 & GAD-7) with advanced NLP-based text analysis to assess depression, anxiety, and crisis risk levels. Based on the results, it generates personalized recommendations, safety plans, and professional referral letters, ensuring timely access to the right resources. This solution addresses a critical real-world problem: the growing gap in mental health care access. Many individuals experience symptoms but delay seeking professional help. Ultimately, it helps reduce the risk of untreated mental health crises and improves overall care outcomes.
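The questionnaire side of a system like the one described above can be made concrete with PHQ-9 scoring: nine items rated 0-3, summed to a 0-27 total, mapped to the standard severity bands from the clinical literature. This sketch shows only that scoring step; the system's actual cutoffs and crisis-risk logic are not described in the post, so treat the bands here as the textbook defaults rather than the app's implementation.

```python
def phq9_severity(item_scores):
    # PHQ-9: nine items, each scored 0-3, so the total ranges 0-27.
    # Bands below are the standard published cutoffs; the app's own
    # thresholds and crisis logic may differ.
    assert len(item_scores) == 9 and all(0 <= s <= 3 for s in item_scores)
    total = sum(item_scores)
    if total <= 4:
        band = "minimal"
    elif total <= 9:
        band = "mild"
    elif total <= 14:
        band = "moderate"
    elif total <= 19:
        band = "moderately severe"
    else:
        band = "severe"
    return total, band
```

In the described system, this score would be one input alongside the NLP text analysis before any recommendation or referral letter is generated.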
Everyone's talking about quantum computing. Nobody's using it to feed farmers.

India loses 20-30% of its crop yield every year to diseases and pests. Not because farmers don't care, but because early detection is hard, expensive, and inaccessible to the people who need it most. The existing solutions? Either a basic image classifier trained on lab-perfect photos that fails in real field conditions, or an agronomist visit that costs time and money most small farmers don't have.

So I built QuantumEdge AgriGuard, a hybrid Quantum Neural Network app where a farmer can photograph a diseased leaf on their phone and get an instant diagnosis in under 5 seconds.

Here's what makes it different from just another plant disease detector: instead of a pure classical CNN, I built a hybrid architecture. A ResNet/EfficientNet backbone extracts visual features, then passes them into a Variational Quantum Circuit (VQC) for the final classification. The quantum layer uses angle embedding + StronglyEntanglingLayers, which gives it a measurable edge on small, noisy datasets, exactly the kind of data you get from Indian field conditions.

The app doesn't just tell you what disease it is. It gives you:
→ Confidence score
→ Organic + chemical remedies (India-specific)
→ Yield impact estimate
→ A live classical vs quantum accuracy comparison so you can see the difference yourself

I tested the quantum advantage claim honestly: I ran both models on the same downsampled PlantVillage dataset and tracked accuracy, F1-score, and inference time side by side. The results are on the dashboard. No hand-waving.

Built with PennyLane + PyTorch + Plotly Dash. Designed to run on simulators today and on QpiAI-Indus 25-qubit hardware tomorrow.
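The angle-embedding idea behind the quantum layer can be shown in miniature without any quantum library: encode a classical feature as the rotation angle of a single qubit and read out an expectation value. For RY(θ) applied to |0⟩, the amplitudes are (cos θ/2, sin θ/2) and ⟨Z⟩ = cos θ, so the feature becomes a bounded observable the next layer can learn from. This is a one-qubit illustration of the principle only, not the app's actual PennyLane circuit (which uses multi-qubit AngleEmbedding plus StronglyEntanglingLayers).

```python
import math

def angle_embed_expectation(feature):
    # Single-qubit angle embedding: prepare RY(theta)|0> and measure <Z>.
    # Amplitudes after the rotation are (cos(theta/2), sin(theta/2)),
    # so <Z> = P(|0>) - P(|1>) = cos(theta).
    theta = feature
    amp0 = math.cos(theta / 2)
    amp1 = math.sin(theta / 2)
    return amp0 ** 2 - amp1 ** 2
```

In a real VQC, trainable entangling layers sit between the embedding and the measurement; this sketch only demonstrates how a raw feature enters the circuit.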
Most AI research tools are just a chatbot with a search button. I built something different.

Every time you ask an AI to research something, you're getting one model, one pass, no quality check. It writes confidently, cites poorly, and you have no idea whether what it produced is actually accurate. For anyone making real decisions from AI-generated research, that's a silent risk most people ignore. The problem gets worse at scale: the longer and more complex the question, the more a single model hallucinates, misses sources, and loses structure. There's no one checking its work.

So I built ResearchOS, a 5-agent pipeline where each agent has one job. A Supervisor breaks down your question. A Researcher runs parallel searches across 22+ sources. An Analyst extracts data and auto-generates charts. A Writer synthesises a cited report. A Critic fact-checks it and sends it back for revision if anything is wrong. The loop runs up to 3 times before the report is approved.

One question in. A full cited report with charts and PDF export in under 10 minutes.

I tested it live by watching the Critic catch a missing citation mid-run and send the Writer back to fix it before approval. That's the part that makes this actually usable for real work.

Built on LangGraph, Groq, Tavily, and ChromaDB, and runs entirely on free tiers.
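The Writer/Critic revision loop described above can be sketched as plain control flow. Function names and the stub agents are illustrative, not the project's API (the real agents are LLM calls wired together with LangGraph); the sketch only shows the revise-until-approved pattern with a bounded number of iterations.

```python
def run_pipeline(question, write, critique, max_revisions=3):
    # Writer drafts a report; the Critic either approves (returns None)
    # or returns feedback, which is fed back into the Writer. The loop
    # runs at most max_revisions times, mirroring the cap described above.
    draft = write(question)
    for _ in range(max_revisions):
        feedback = critique(draft)
        if feedback is None:          # Critic approved the report
            return draft
        draft = write(question, feedback=feedback)
    return draft                      # best effort after the revision cap

# Stub agents to exercise the control flow:
write = lambda q, feedback=None: "cited draft" if feedback else "first draft"
critique = lambda draft: None if "cited" in draft else "missing citations"
report = run_pipeline("What changed in EU AI law in 2024?", write, critique)
```

Capping the loop matters: without `max_revisions`, a Critic that never approves would spin forever and burn the token budget.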
It all started on a Sunday at the AWS User Group Mumbai meetup. I wasn't expecting to walk away with a new obsession, but then the speaker introduced me to Temporal and everything changed.

Temporal is a durable execution engine that solves one of the hardest problems in agentic AI: what happens when your LLM workflow crashes mid-run? Normally you lose everything.

So I went home and built this: an agent that monitors your competitors around the clock, tracking pricing changes, product launches, hiring signals, and strategic moves. Every 24 hours it uses Mistral (running fully on-device via Ollama) to analyze the data and synthesize a structured executive briefing delivered straight to your inbox.

Sometimes the best projects start with a Sunday conversation. https://github.com/AnuragNagare/Agentic-AI-.git
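The durable-execution idea Temporal provides can be shown in miniature: persist each completed step's result so that a re-run after a crash replays history instead of redoing work. Temporal does this transparently through its event history and SDK; everything below (the checkpoint dict, the step names) is an illustrative stand-in, not Temporal's actual API.

```python
def run_workflow(steps, checkpoint):
    # Each step's result is persisted before moving on. On a re-run,
    # completed steps are replayed from the checkpoint rather than
    # re-executed -- the core of durable execution, in one loop.
    results = []
    for name, fn in steps:
        if name in checkpoint:
            results.append(checkpoint[name])   # replay, don't re-execute
        else:
            checkpoint[name] = fn()            # persist before continuing
            results.append(checkpoint[name])
    return results

calls = []
steps = [
    ("scrape", lambda: calls.append("scrape") or "competitor data"),
    ("brief",  lambda: calls.append("brief") or "executive briefing"),
]
history = {}
first = run_workflow(steps, history)    # executes both steps
second = run_workflow(steps, history)   # pure replay: no new side effects
```

If the process died between "scrape" and "brief", a restart with the same `history` would skip straight to the unfinished step, which is exactly the failure mode the post describes for LLM workflows.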
Subash S
Chennai, India
Building scalable Next.js, Flutter & AI applications
New to Contra
RAG is only as good as the data you feed it. 📄➡️🤖

I'm excited to share that I've completed the Build an AI-Powered Document Retrieval System with IBM Granite and Docling lab from IBM SkillsBuild! While my previous work focused on the RAG pipeline, this lab went deeper into the most critical step: document parsing. We often forget that real-world data isn't clean text; it's locked in complex PDFs and formatted documents.

What I built in this hands-on lab:
🔹 Advanced parsing with Docling: used Docling not just to "read" text but to understand the structure of documents, preserving context for the AI.
🔹 Embeddings: leveraged IBM Granite models (granite-embedding-30m-english) to create high-quality vector embeddings.
🔹 Orchestration: used LangChain to manage the flow between the user, the database, and the model, connecting the parsed data with the retrieval engine.
🔹 Data processing: implemented document loading and chunking strategies to optimize context windows.
🔹 Synthesis: created a system that retrieves relevant data and generates accurate, fact-based summaries.

This skill allows me to build AI agents that don't just "guess" answers but can accurately retrieve information from complex business documents. The experience has given me the practical skills to build AI applications that are not just "smart" but also accurate and domain-specific.
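The chunking step mentioned above can be sketched in plain Python. Fixed-size chunking with overlap is the simplest of the common strategies: the overlap keeps a sentence that straddles a chunk boundary retrievable from at least one chunk. This is an illustrative sketch, not the lab's code; LangChain's own text splitters handle this (plus token-aware and structure-aware variants).

```python
def chunk_text(text, chunk_size=200, overlap=50):
    # Slide a fixed-size window over the text, stepping by
    # (chunk_size - overlap) so consecutive chunks share `overlap`
    # characters. Each chunk is what gets embedded and indexed.
    step = chunk_size - overlap
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
        start += step
    return chunks
```

Chunk size trades retrieval precision against context: smaller chunks match queries more precisely but can strip away the surrounding context the generator needs, which is exactly why structure-aware parsing with Docling helps.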
VitaCare 🚀

1. Immutable Health Records (Blockchain & AES-256 Encryption)
I moved beyond standard database storage to build a Tamper-Proof Medical Ledger. I learned how to implement a hybrid storage strategy where sensitive patient data is encrypted via AES-256 at the application layer before being anchored to a blockchain. This taught me how to ensure absolute data integrity, making medical histories immutable while providing a verifiable audit trail for every access request.

2. Privacy-First Consent Logic (Granular Data Sharing)
Architecting the "Time-Limited Access" protocol taught me how to handle high-stakes privacy. I engineered a system where patients can issue temporary, scoped decryption keys to doctors via smart contracts. This taught me how to implement a Zero-Trust architecture, ensuring that healthcare providers only see what they need, exactly when they need it, with access automatically revoking after a set TTL (Time-To-Live).

3. Edge-Optimized Backend & Secure Validation
By leveraging Supabase Edge Functions, I learned how to move critical business logic closer to the user while maintaining a "Thick-Client, Secure-Server" model. I architected isolated server-side environments for data validation and healthcare-specific compliance checks, which taught me how to drastically reduce latency in high-volume environments without compromising on server-side security.

4. Proactive Health Intelligence (Predictive Monitoring)
I leveled up my AI integration skills by building an Advanced Command Center for Disease Surveillance. I learned how to aggregate anonymized, real-time data from disparate sources, including IoT wearable integrations, to generate heatmaps for disease outbreaks. This taught me the complexity of Geospatial Data Engineering and how to turn passive monitoring into proactive healthcare interventions.

5. Multi-Platform Synchronization (Unified Digital Ecosystem)
Building a system that bridges Citizens, Doctors, and Government officials taught me the challenges of Cross-Stakeholder State Management. I learned how to maintain a "Single Source of Truth" across a multilingual Next.js web ecosystem and mobile interfaces, ensuring that a life-saving update on a doctor's portal is reflected on a patient's mobile dashboard in near real-time.

6. Inclusive Design & Localized Accessibility
To tackle the diversity of the Indian healthcare landscape, I implemented a Multilingual UI Framework. I learned how to architect a scalable localization layer that supports regional languages, ensuring that the platform is accessible to rural citizens. This taught me the importance of Inclusive UX Engineering, where the technical complexity is hidden behind a simple, high-impact interface for non-technical users.
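The TTL-scoped access check from the "Time-Limited Access" protocol can be sketched as plain logic. In the described system this would live in a smart contract and guard decryption keys; here it is reduced to a dict and two functions, with all field names illustrative.

```python
import time

def make_grant(doctor_id, scope, ttl_seconds, now=None):
    # Issue a temporary, scoped access grant: a named doctor, a set of
    # record types they may see, and an expiry derived from the TTL.
    issued = now if now is not None else time.time()
    return {"doctor": doctor_id, "scope": set(scope), "expires": issued + ttl_seconds}

def may_decrypt(grant, doctor_id, record_type, now=None):
    # Zero-Trust check: access requires the right doctor, the right
    # scope, AND an unexpired grant. After the TTL passes, the same
    # request is denied automatically -- no revocation step needed.
    t = now if now is not None else time.time()
    return (grant["doctor"] == doctor_id
            and record_type in grant["scope"]
            and t < grant["expires"])
```

Passing `now` explicitly makes the expiry logic deterministic and testable, which matters when the real check runs inside a contract where clock sources are constrained.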
MIT Connect 🎉

1. Hierarchical Access Control & Multi-Tenant Architecture
I moved beyond basic authentication to implement a granular Role-Based Access Control (RBAC) system. By architecting a "Portal-Switch" logic, I learned how to serve distinct frontend environments (Admin vs. Student) from a unified backend, ensuring that administrative actions like fee management and academic overrides are cryptographically isolated from student-level access.

2. Predictive Academic Logic & Real-time Analytics
Instead of static data display, I engineered a Proactive Attendance Engine. I learned how to write complex backend aggregation pipelines that don't just calculate percentages, but run "Safe-Miss" simulations. This taught me how to transform raw timestamped logs into actionable insights, helping users predict eligibility before it becomes a critical failure point.

3. Optimized Grid Scheduling & Sparse Data Handling
Building the Dynamic Timetable Matrix taught me how to manage high-density relational data with significant "empty states." I learned how to optimize frontend rendering for a 2D coordinate-based schedule (Time vs. Day), ensuring that the UI remains performant and responsive even when mapping hundreds of unique course-section combinations across a decentralized database.

4. Financial Integrity & Transactional Consistency
Handling the Invoices and Fee Administration module taught me the importance of ACID compliance. I learned how to architect transactional workflows in the database to ensure that financial records, from generation to payment status, remain immutable and consistent, preventing data drift in multi-step billing cycles.

5. Component-Driven Design & Scalable UI Systems
To maintain consistency across the Analytics and Academic modules, I developed a proprietary library of reusable UI components. I learned how to build "Data-Agnostic" widgets, such as the Stat Cards and the Weekly Trend Bar Charts, that can be hot-swapped across different dashboards, drastically reducing technical debt and ensuring a uniform brand identity.

6. High-Throughput State Management
Building the Intelligence & Analytics suite taught me how to manage global state across a complex dashboard ecosystem. I learned how to implement optimized fetching strategies (like SWR or React Query) to ensure that when an Admin updates an event or a student marks an attendance hour, the change propagates across the entire system without requiring manual refreshes or redundant API overhead.
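The "Safe-Miss" simulation from the attendance engine above can be sketched directly: given sessions attended, sessions held, and sessions remaining, how many of the remaining sessions can a student skip while still finishing at or above the eligibility threshold? The 75% threshold and the inputs are illustrative assumptions; the project's actual rules are not spelled out in the post.

```python
def safe_misses(attended, held, remaining, threshold=0.75):
    # Simulate skipping one more session at a time. Stop as soon as the
    # end-of-term attendance ratio would fall below the threshold.
    misses = 0
    while misses < remaining:
        # If the student misses (misses + 1) sessions, they attend the rest.
        future_attended = attended + (remaining - (misses + 1))
        if future_attended / (held + remaining) < threshold:
            break
        misses += 1
    return misses
```

For example, a student at 30/40 with 20 sessions left can miss 5 more: attending 45 of 60 lands exactly on 75%, and a sixth miss drops them below it.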
Cognitive Guardian

Building Cognitive Guardian was a massive undertaking that pushed me to bridge the gap between physical hardware and cloud infrastructure. Transitioning this from a conceptual idea to a fully integrated digital tether taught me invaluable lessons in full-stack architecture, IoT, and edge computing.

Architecting Resilient Systems (Failover Logic): I learned how to design a system that doesn't just fail gracefully, but adapts. Building the "Offline Handshake" protocol taught me how to seamlessly hand over session logic from a smartphone (Cellular/BLE) to a microcontroller (LoRaWAN) when entering network dead zones.

Edge AI & Hardware-Software Integration: Instead of relying on cloud-based machine learning (which introduces latency), I learned how to program Edge AI natively on a microcontroller. Writing C++ state machines to calculate 3D acceleration vector magnitudes (via an MPU6050 sensor) taught me how to achieve zero-latency anomaly detection while operating under strict hardware constraints.

Polyglot Database Strategy: I leveled up my data engineering skills by realizing one database doesn't fit all. I learned how to route high-throughput, real-time GPS telemetry into MongoDB (leveraging 2dsphere indexes for geospatial queries), while using PostgreSQL for strict relational state tracking, and Hyperledger Fabric for immutable audit logs.

Privacy by Design (Self-Sovereign Identity): Handling sensitive medical data taught me modern compliance and security. I learned how to implement Decentralized Identifiers (DIDs) on a permissioned blockchain, ensuring that user data remains encrypted and is only temporarily accessible to authorities via smart contracts during an active SOS.

Mobile Battery Optimization & Background Tasks: On the Flutter side, I learned how to handle intensive background processes without killing the user's device. Implementing dynamic location polling tied to the phone's internal accelerometer taught me deep, native-level power optimization for both Android and iOS.

Managing High-Velocity Data Streams: Building the Node.js/Express backend taught me how to handle asynchronous data spikes. I learned to implement rate-limiting and use Socket.IO to bypass standard HTTP request cycles, successfully pushing critical hardware SOS alerts to a React web dashboard in under two seconds.
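The on-device anomaly check described above (C++ state machines computing 3D acceleration vector magnitudes from an MPU6050) reduces to a simple calculation, sketched here in Python for readability. At rest the magnitude hovers around 1 g; a hard impact spikes well above it. The 2.5 g threshold is an illustrative value, not the project's tuned cutoff, and a production detector would also track post-impact stillness rather than a single sample.

```python
import math

def is_impact(ax, ay, az, threshold_g=2.5):
    # Magnitude of the 3D acceleration vector, in g. Using the magnitude
    # (rather than any single axis) makes the check orientation-independent:
    # the device can be worn at any angle.
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > threshold_g
```

Running this per-sample on the microcontroller is what makes the detection "zero-latency": no network round trip is needed before the SOS state machine can fire.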
MoonTech OneSixEight
Almelo, Netherlands
Tech Renaissance Leader: AI, FinTech, Web
Multi-MT5 Integrated, AI-Infused Financial Sentiment Engine
QuantumTrade ML Suite
AI MCQ | FIFA