Senior Project — University of Bahrain 2026

American Sign Language Translator App

"Hands that speak, The clarity we seek"

A Flutter mobile application that recognizes ASL alphabet letters in real time using MediaPipe hand landmark detection and a trained MLP classifier — building words, speaking them aloud, and translating them.

28 ASL classes · 10,731 training images · 63 landmark features · 97.08% model accuracy · Real-time detection
Overview

About the project

A graduation project from the University of Bahrain built to bridge the communication gap for the deaf and hard-of-hearing community using AI and mobile technology.

🎯

Project overview

This application recognizes American Sign Language letters through the mobile camera. The app detects hand signs, predicts the corresponding letter, builds text from confirmed letters, speaks the text aloud, and can translate the final sentence into other languages, including Arabic.

🤖

How it works

The user shows one hand clearly to the camera. MediaPipe detects the hand and extracts 21 landmark points — giving 63 coordinate features. These features are passed to a trained MLP classifier which predicts the ASL letter with a confidence score in real time.
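For readers who want to reproduce the landmark step, here is a minimal Python sketch using MediaPipe's Hands solution. The `extract_features` helper name is our own, and the real app runs the equivalent on-device inside Flutter rather than in Python:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_features(image_path: str):
    """Return the 63-value feature vector (21 landmarks x x,y,z), or None."""
    image = cv2.imread(image_path)
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None  # no hand detected; such images were filtered out
    landmarks = result.multi_hand_landmarks[0].landmark
    return [coord for p in landmarks for coord in (p.x, p.y, p.z)]
```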

💡

Why it matters

Sign language is the primary means of communication for millions of deaf and hard-of-hearing people. By instantly turning hand signs into readable, speakable text, the app lowers that communication barrier and makes everyday conversation more accessible for everyone.

🏫

Academic context

Developed as a senior graduation project at the University of Bahrain. The model was trained in Google Colab using MediaPipe landmark extraction and an MLP neural network — exported as TFLite for on-device mobile inference inside the Flutter app.
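The page does not list the exact network layout, so the sketch below is only a representative Keras MLP over the 63 landmark features: the layer sizes, dropout rate, and epoch count are illustrative assumptions, while the 63-feature input, 28-class softmax output, and TFLite export step come from the project description. Placeholder arrays stand in for the real landmark dataset:

```python
import numpy as np
import tensorflow as tf

# Placeholder data standing in for the real landmark dataset:
# 63 features per sample (21 landmarks x x,y,z), labels 0-27.
X_train = np.random.rand(7511, 63).astype("float32")
y_train = np.random.randint(0, 28, size=7511)

# Representative MLP; 128/64 units and dropout 0.3 are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(63,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(28, activation="softmax"),  # A-Z + space + delete
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=5, batch_size=32)

# Export for on-device inference inside the Flutter app.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("asl_mlp.tflite", "wb") as f:
    f.write(converter.convert())
```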

Inside the app

App screens

A clean, simple interface designed for real users — from home to live detection to settings.

Home screen
[Mockup: "ASL Translator" title, short app description, and a 📷 Start Detection button]
The welcome screen introduces the app with a single "Start Detection" button. Clean and distraction-free so users can jump straight into signing.
Detection screen
[Mockup: live camera view with "Prediction Found: O · 99.99%", Top 3: O 100% · C 0% · D 0%, and + Add Sign / 🔊 Speak buttons]
The live camera feed recognizes hand signs instantly, showing the predicted letter, its confidence percentage, and the top 3 predictions. Confirmed letters are appended to a sentence. Supports both camera and gallery image input.
Settings screen
[Mockup: Detection Settings with Confidence Threshold 65%, Show Top 3, and Front Camera toggles; Translation Language (Arabic); Appearance with Dark Mode]
Fully configurable — adjust the confidence threshold (default 65%), toggle Top 3 predictions, switch between front and back camera, choose the translation language, and enable dark mode.
ASL guide
[Mockup: ASL alphabet reference chart]
A built-in visual reference chart showing all 26 ASL alphabet hand shapes plus the Space and Delete signs. Supports pinch-to-zoom so users can study any sign up close.
About screen
[Mockup: Project Overview and How It Works summary cards]
Describes the app's purpose, how it works technically using MediaPipe and the MLP model, and the team behind it.
Navigation drawer
[Mockup: slide-out drawer with 🏠 Home, 📷 Detection, 📖 ASL Guide, ℹ️ About, ⚙️ Settings]
A slide-out drawer menu with the app logo and tagline gives access to all five sections: Home, Detection, ASL Guide, About, and Settings.
The pipeline

How it works

From your hand in front of the camera to a displayed and spoken letter — in milliseconds.

01
Camera capture
Flutter captures a live camera frame. You can use the front or back camera or pick an image from the gallery — all switchable in settings.
02
MediaPipe landmark detection
MediaPipe Hands detects the hand and extracts 21 precise landmark points. Each point has x, y, z coordinates — giving 63 numerical features per hand sign.
03
MLP classification
The 63 features are normalized and fed into a trained MLP neural network. The model classifies the sign into one of 28 ASL classes (A–Z, space, delete) with a confidence score (a Python sketch of this step follows these pipeline steps).
04
Build, speak & translate
Confirmed letters are appended to a sentence. Tap Speak to hear the text via TTS, or Translate to convert it to Arabic, French, Spanish, or German.
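Here is a minimal sketch of step 03's on-device inference using the TensorFlow Lite Python interpreter. The asl_mlp.tflite file name and the predict helper are assumptions, the normalization mentioned in step 03 is omitted, and the real app calls the equivalent through Flutter's TFLite bindings:

```python
import numpy as np
import tensorflow as tf

# 26 letters plus the two extra classes used by the app.
LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)] + ["space", "delete"]

# Load the exported model (file name is an assumption).
interpreter = tf.lite.Interpreter(model_path="asl_mlp.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def predict(features):
    """features: the 63-value landmark vector from MediaPipe."""
    x = np.asarray(features, dtype=np.float32).reshape(1, 63)
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])[0]
    top3 = probs.argsort()[-3:][::-1]  # the "Top 3" shown on screen
    return [(LABELS[i], float(probs[i])) for i in top3]
```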
AI model

Model performance

Two approaches were tested; the landmark-based MLP was selected as the final model for its accuracy and lightweight on-device performance.

Final model

Landmark MLP

Input: 63 landmark features (21 points × x, y, z)
Architecture: MLP neural network
Training: MediaPipe landmark extraction + Keras
Test accuracy: 97.08%
Dataset: 10,731 filtered images
Note: Exported to TFLite for the final app

Experimental

MobileNetV2 CNN

Input: 224×224 cropped images
Architecture: MobileNetV2 + Dense
Training: Transfer learning
Test accuracy: 96.80%
Dataset: 10,731 filtered images
Note: Not used in final app
Dataset: 14,713 original images → 10,731 usable after MediaPipe filtering (3,982 removed — no hand detected) · 28 classes · 70% train / 15% validation / 15% test split
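The page does not show the split code, so here is a minimal sketch of the 70/15/15 split using scikit-learn; the stratification, random seed, and placeholder arrays are our assumptions:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder features/labels standing in for the 10,731 filtered samples.
X = np.random.rand(10731, 63).astype("float32")
y = np.random.randint(0, 28, size=10731)

# 70% train, then split the remaining 30% in half: 15% val / 15% test.
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.50, stratify=y_tmp, random_state=42)
```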
Built with

Tech stack

A carefully chosen stack for real-time on-device mobile AI inference.

📱 Flutter · Mobile UI & camera
MediaPipe · Hand landmark detection
🧠 MLP Classifier · Sign classification
TensorFlow Lite · On-device inference
🔬 Keras · Model training
☁️ Google Colab · Training environment
The people

Meet the team

Senior project students at the University of Bahrain, 2025–2026.

Omar Abdulaziz Mohamed · Team member · University of Bahrain
Abdullah Moatazbellah · Team member · University of Bahrain
Omar Adnan Ahmed · Team member · University of Bahrain
Get in touch

Contact us

Have a question about the project or want to learn more? Reach out to the team directly.

✉️ info@asl-signlanguage.site

University of Bahrain · Computer Science · Senior Project 2025–2026