Five students supported by an IEEE program built a virtual reality system that offers a fun, inexpensive way to treat a vision problem in young people.
Since its inception, EPICS in IEEE has sought to show the benefits of empowering students to work with local service organizations to engineer and implement solutions for their communities’ unique challenges. One example is a virtual reality project to treat binocular dysfunction being led by five New Jersey Institute of Technology (NJIT) students.
Binocular dysfunction is a broad term that covers a wide range of conditions. Convergence insufficiency, one of the more common ones, affects about 5% of the general population and about 50% of people diagnosed with traumatic brain injury. Poor binocular eye movement control makes it hard for young people to read comfortably, and eye movement dysfunctions can be very detrimental to a child's academic performance.
A 2008 study showed that conventional home treatments had the same efficacy as a placebo. However, it found that vision therapy performed under a therapist's supervision in an office setting produced significant visual improvements in about 75% of neurologically normal children one year after therapy.
Today's methods for treating binocular dysfunction typically ask patients to perform repetitive visual tasks on a 3D TV or in person with a vision therapist. This approach relies on the therapist visually observing the alignment of the patient's eyes while the patient tracks objects, as well as on patients confirming that they are complying with the therapy regimen. Unfortunately, vision therapy conducted at home can be ineffective due to patient non-compliance.
The NJIT students recognized a need for home therapy that effectively augments traditional office therapy. They set out to develop a home system for patients aged 8 to 18 that retained the efficacy of office therapy while being cost-effective, convenient and capable of tracking compliance.
Implementing real-time eye tracking was one of the group's biggest hurdles. Guided by Nyquist's sampling theorem, which requires sampling at least twice a signal's highest frequency, the team used a commercial-grade eye tracker sampling at 240 frames/second to characterize convergence movements, and determined that a minimum rate of 14 f/s would suffice to track them in real time. Writing code in C++ with the OpenCV library, the team attained more than 30 f/s without overstressing the processor, and even hit 40 f/s by overclocking it.
Students built a system based on a Raspberry Pi board, Oculus Rift and tablets. (Images: IEEE)
In their design (above), two infrared-sensitive cameras and light sources are mounted inside an Oculus Rift. The head-mounted display allowed the team to create a game in which certain objects are offset from each other, stimulating the eye movements that correct this visual disparity and create a perception of depth.
Each camera in the system connects to a Raspberry Pi board where an algorithm processes each frame of video to find the center of the user’s pupils. This data, as well as some calibration parameters, are displayed on a pair of tablets.
Data also is sent to a desktop computer running the game. The PC receives the vertical and horizontal location of the center of each pupil to determine the point at which the user's eyes are converging. Objects drawn farther apart on the left and right screens require the eyes to cross (move inward), giving the illusion that the object is closer to the user.
The game (below), written in the Unity game engine, is inspired by early arcade games like Galaga and Space Invaders. Named Bug Eyes, the game presents alien targets that travel toward the player, eliciting a convergence response.
The software made therapy fun by emulating a typical computer game.
The system uses the eye-tracking cameras to verify whether the user is actually converging on the target, destroying the target only when the eyes are correctly aligned. The game's difficulty increases as the user's binocular coordination improves through vision therapy.
The NJIT team wanted their project to be a less expensive alternative to traditional methods, one that can be used in the comfort of the patient's home. This is especially important for families with limited economic resources who may not be able to afford office-based vision therapy. The project won an initial grant that prompted matching funds from both NJIT and Salus University.
“EPICS in IEEE was critical to our project’s success because it funded the parts and materials for our project,” said the project’s student lead John Vito d’Antonio-Bertagnolli, a biomedical engineering graduate. “Our hope is that our project will be adopted as a tool to aid vision therapists and eye care professionals, leading to improved vision in children,” he added.
The project recently won NJIT’s TechQuest award, a yearly competition for undergraduate research and design projects. It also received funding from other NJIT and Salus programs. The project was supported by The Children’s Hospital of Philadelphia as well as academic leaders from NJIT and Salus.
--Saurabh Sinha chairs the EPICS in IEEE development committee and is professor and executive dean of engineering at the University of Johannesburg, South Africa.