The movements involved in speech production are difficult to observe. Only the lips and the tip of the tongue can easily be seen from the outside, while the most important movements of the tongue are hidden from view. For children and adults with a speech disorder, this makes it hard to learn how to produce different, clearer movements.
Ultrasound Biofeedback Therapy (UBT) uses ultrasound imaging to show tongue movement during speech therapy, but the raw display can be difficult for some children to understand. A clearer, more motivating way to present tongue movement is to use ultrasound to track the movements and convert them into gamified feedback: if children move their tongues the right way, something fun happens on the screen, such as a character hitting a target.
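As a rough illustration of this mapping, the sketch below turns a single tracked tongue measurement into a hit-or-miss game event. The function name, the target value, and the tolerance are hypothetical placeholders, not part of any published therapy protocol.

```python
# Minimal sketch (hypothetical names and thresholds) of converting a tracked
# tongue measurement into gamified hit/miss feedback.

def game_event(tongue_height_mm: float, target_mm: float, tolerance_mm: float = 2.0) -> str:
    """Return a game outcome based on how close the tracked tongue
    position is to the therapy target for the current sound."""
    if abs(tongue_height_mm - target_mm) <= tolerance_mm:
        return "hit"   # e.g. the on-screen character strikes the target
    return "miss"      # e.g. the character falls short, prompting another try

# Example: a tracked dorsum height of 41.5 mm against a 40 mm target counts as a hit.
print(game_event(41.5, 40.0))  # -> "hit"
```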
To track tongue movement from B-mode ultrasound images in real time, Dr. Boyce and her team developed a custom ultrasound image processing program, Tongue-PART (Tongue Profiles with Automatic Rapid Tracking). Tongue-PART comprises three processes: tongue surface identification, tongue part delineation, and recording of trajectories for each tongue part. The program processes real-time grayscale B-mode ultrasound images captured as video sequences and is adaptable to any clinical ultrasound scanner. The tracked data can be fed into game engines such as Unity to create games in which on-screen objects are controlled by tongue movement.
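The description above does not detail Tongue-PART's algorithms, so the sketch below is only a minimal illustration of a three-stage loop of the same shape, assuming a simple brightest-echo heuristic for the surface, equal-span partitioning into tongue parts, and a hypothetical input file name. It is not the team's actual implementation.

```python
"""Illustrative three-stage tracking loop: (1) identify the tongue surface in each
grayscale B-mode frame, (2) delineate tongue parts along that surface, (3) record
a trajectory for each part across the video. All names and heuristics are assumed."""

import cv2
import numpy as np

def identify_surface(frame_gray: np.ndarray) -> np.ndarray:
    """Stage 1: crude surface estimate -- for each image column, take the row of the
    brightest echo, which in B-mode images often lies near the tongue surface."""
    rows = np.argmax(frame_gray, axis=0)        # brightest pixel per column
    cols = np.arange(frame_gray.shape[1])
    return np.stack([cols, rows], axis=1)       # (x, y) points along the surface

def delineate_parts(surface: np.ndarray, n_parts: int = 3) -> list:
    """Stage 2: split the surface contour into sections (e.g. tip, blade/dorsum,
    root) by dividing it into equal spans."""
    return np.array_split(surface, n_parts)

def track_video(path: str, n_parts: int = 3) -> list:
    """Stage 3: run stages 1-2 on every frame and record the mean (x, y)
    position of each tongue part over time."""
    trajectories = [[] for _ in range(n_parts)]
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        parts = delineate_parts(identify_surface(gray), n_parts)
        for traj, part in zip(trajectories, parts):
            traj.append(tuple(part.mean(axis=0)))   # one (x, y) sample per part per frame
    cap.release()
    return trajectories

if __name__ == "__main__":
    tracks = track_video("ultrasound_clip.mp4")     # hypothetical recording
    print(f"Recorded {len(tracks[0])} samples per tongue part")
```

For gamified feedback, the per-part samples would presumably be streamed to the game engine frame by frame (for example over a local socket that a Unity scene listens on) rather than collected after the recording ends.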