We set out to build a glove that classifies hand positions and movements, with applications in human-computer interaction. It is useful in settings where hands must be tracked without a camera. All processing happens at the edge, which keeps the user's biometric data private and preserves real-time responsiveness, while multi-sensor fusion improves resolution and classification accuracy. The glove classifies sign language gestures, letting the user interact with a virtual agent through sign language. In real-world use, the glove could translate American Sign Language (ASL) into English or other spoken languages.
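To make the fusion-and-classify idea concrete, here is a minimal sketch of how flex-sensor and IMU readings might be combined into one feature vector and classified on-device. All names, dimensions, and centroid values are illustrative assumptions, not the glove's actual firmware or model.

```python
# Hypothetical sketch of multi-sensor fusion: normalized flex-sensor
# angles and IMU readings are concatenated into one feature vector,
# then labeled with a tiny nearest-centroid classifier that can run
# on an edge device. Values below are made up for illustration.

def fuse(flex_angles, imu):
    """Concatenate flex angles (normalized from degrees) with IMU axes."""
    return [a / 90.0 for a in flex_angles] + list(imu)

def nearest_centroid(feature, centroids):
    """Return the gesture label whose centroid is closest (squared L2)."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(feature, centroids[label]))
    return min(centroids, key=dist)

# Illustrative per-gesture centroids learned offline (5 flex + 3 IMU dims).
CENTROIDS = {
    "A": [1.0, 1.0, 1.0, 1.0, 0.1, 0.0, 0.0, 1.0],  # fist-like shape
    "B": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0],  # flat, open hand
}

sample = fuse([85, 88, 82, 90, 10], (0.02, -0.01, 0.98))
print(nearest_centroid(sample, CENTROIDS))  # → A
```

The fused vector gives the classifier both finger-bend and hand-orientation information at once, which is the sense in which fusion raises effective resolution over any single sensor.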