CT profs and students create software that helps people communicate and move. It reads your face.

Observing a young man in a wheelchair with motor impairment struggle to communicate with his parents profoundly affected Chetan Jaiswal.

Jaiswal is an associate professor of computer science at Quinnipiac University.

After meeting the young man at an occupational therapy conference in 2022, Jaiswal decided that more could be done to help individuals connect with their loved ones using technology powered by artificial intelligence, or AI.

“We are computer scientists,” Jaiswal said. “We can do better. Technology should help people who actually need it. Technology shouldn’t just be created to be rich. You need to create technology that can help the people who can actually use it on a daily basis.”

Jaiswal joined with two Quinnipiac colleagues: Karen Majeski, associate professor of occupational therapy, and Brian O’Neill, associate professor of computer science. Together with students Michael Ruocco and Jack Duggan, the three developed the university’s first patented software, a hands-free, AI-powered input system called AccessiMove that is operated entirely through facial gestures and gives more independence to those with limited mobility.

Working from a standard webcam, AccessiMove combines head-tilt detection, wink and gesture recognition, and facial-landmark tracking to let individuals enter commands and move the cursor on a computer screen or other device.
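That description maps onto a well-known computer-vision pattern: detect facial landmarks in each webcam frame and translate their movement into cursor movement. Below is a minimal sketch of that general idea, not AccessiMove’s actual code; the OpenCV, MediaPipe and PyAutoGUI libraries and the nose-tip landmark index are assumptions made for illustration.

```python
# Minimal sketch of webcam-driven cursor control via facial landmarks.
# Library choices (OpenCV, MediaPipe, PyAutoGUI) and the landmark index
# are illustrative assumptions, not details from AccessiMove itself.
import cv2
import mediapipe as mp
import pyautogui

NOSE_TIP = 1  # MediaPipe face-mesh index near the nose tip; assumed anchor point

pyautogui.FAILSAFE = False  # don't abort when the cursor reaches a screen corner
screen_w, screen_h = pyautogui.size()
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)  # the standard built-in webcam; no special hardware

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        nose = results.multi_face_landmarks[0].landmark[NOSE_TIP]
        # Landmark coordinates are normalized to [0, 1]; scale them to pixels.
        pyautogui.moveTo(int(nose.x * screen_w), int(nose.y * screen_h))
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```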

“This benefits a lot of people, especially people with disabilities and motor impairment,” Jaiswal said. “They can actually use their face to interact with a computer and do a number of things that we take for granted.”

The professors are looking for partnerships, collaborators and investors to develop the product across communication technologies, including in the health care field.

“We have so many healthcare industry partners in the East Coast, especially in Connecticut, whether it is Yale Hospital, Hartford Hospital,” Jaiswal said. “All of us are wanting what is best for our patients, especially for those patients that can’t help themselves. They need assistance. They need help to live a better life. We are looking up to those partners who want to make a difference in the world for the people who need it.”

Facial gestures

Majeski described the software system, explaining that an individual’s facial gestures act like a computer mouse, dictating commands.

O’Neill said the camera focuses on the bridge of the nose: when an individual moves it in one direction, the software signals the computer to move the cursor in that direction.

Jaiswal said, for example, that a head tilt to the left, right, up or down can be mapped to an event such as opening Chrome or restarting the computer. Eye blinks, he said, are mapped to mouse and keyboard clicks.
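In code, a mapping like the one Jaiswal describes might look something like the sketch below. The thresholds, gesture names and bound actions are hypothetical; they stand in for whatever bindings AccessiMove actually uses.

```python
# Hypothetical gesture-to-command bindings in the spirit of Jaiswal's
# examples (tilt -> launch Chrome or restart, blink -> click). Thresholds,
# names and actions are assumptions, not AccessiMove's implementation.
import subprocess
import pyautogui

TILT_THRESHOLD = 0.08  # normalized offset from the user's neutral head pose

ACTIONS = {
    "tilt_left": lambda: subprocess.Popen(["google-chrome"]),  # open Chrome
    "tilt_right": lambda: print("restart requested"),  # safe stand-in for a restart command
    "blink": pyautogui.click,  # an eye blink acts as a mouse click
}

def classify(nose_x, neutral_x, eyes_closed):
    """Turn raw landmark readings into a named gesture, or None."""
    if eyes_closed:
        return "blink"
    if nose_x - neutral_x < -TILT_THRESHOLD:
        return "tilt_left"
    if nose_x - neutral_x > TILT_THRESHOLD:
        return "tilt_right"
    return None

gesture = classify(nose_x=0.60, neutral_x=0.50, eyes_closed=False)
if gesture:
    ACTIONS[gesture]()  # fires the bound command ("tilt_right" here)
```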

Because the technology runs on standard devices, it is adaptable to long-term care centers, education, rehabilitation, remote learning and assistive home-care environments, according to Jaiswal.

Jaiswal noted that the technology is not limited to computers. It can also be used with wheelchairs, he explained.

“A person with a disability can sit in the chair and just use gestures to move the wheelchair,” he said. “If you look up the chair goes forward, if you look down, the chair goes backward.”

He said it could also benefit seniors in assisted living facilities, helping them move from point A to point B in a wheelchair using facial gestures that the AI technology interprets.

O’Neill said the software couldn’t be used without the assistance of AI.

AI tracks the facial gestures in real time, Jaiswal said.

The system “also enhances accessibility in the gaming industry, enabling inclusive gameplay and new hands-free interaction styles for slower-paced, strategy, simulation, or narrative-driven games,” according to information on the system.

The idea, Majeski said, is for children who have mobility issues to interact with objects in their environment.

“We are talking about gaming literacy for learning,” she said. “If it is a toy and they can’t turn the button on could you help us hack that toy so he could turn that button with the whole hand grasp? If it is a computer can you help us to connect something to the computer to get him to be able to do those actions?”

Majeski said the team conducted trials with students Ruocco and Duggan to make sure the software works if someone is wearing glasses or has a stiff neck. It was successful in all those situations, she said. The software is calibrated to each person’s range of motion.
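Calibrating to a user’s range of motion typically means sampling how far the head comfortably moves and rescaling that span to the full screen. A minimal sketch under that assumption, not AccessiMove’s actual routine:

```python
# Hypothetical calibration: sample the nose position while the user looks
# toward each screen edge, then rescale that personal range of motion to
# full-screen coordinates. Details are assumptions, not AccessiMove's code.
def calibrate(samples):
    """samples: normalized (x, y) nose positions gathered while the user
    sweeps through their comfortable range of motion."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return (min(xs), max(xs)), (min(ys), max(ys))

def to_screen(nose_x, nose_y, x_range, y_range, screen_w, screen_h):
    """Map a position within the calibrated range to screen pixels."""
    (x_min, x_max), (y_min, y_max) = x_range, y_range
    # Clamp so even small head movements can cover the whole screen.
    u = min(max((nose_x - x_min) / (x_max - x_min), 0.0), 1.0)
    v = min(max((nose_y - y_min) / (y_max - y_min), 0.0), 1.0)
    return int(u * screen_w), int(v * screen_h)

# Example: a stiff-necked user whose nose only moves within a narrow band
x_range, y_range = calibrate([(0.45, 0.48), (0.55, 0.52), (0.50, 0.50)])
print(to_screen(0.55, 0.52, x_range, y_range, 1920, 1080))  # -> (1920, 1080)
```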

“There is money behind bringing it to the market, so even if there is an open-source option for people with disabilities, we still need funding to bring it to a place where it is an open-source option for people with disabilities to use it on their own,” she said.

O’Neill explained the simplicity of the software.

“This is not using special hardware,” he said. “It is not using a fancy webcam. It is using the webcam built into any tablet or any phone.”

Jaiswal said he hopes the technology becomes an everyday tool for people who need it and for people who may want to use it for convenience.

“The technology is useful in hospital settings,” he said. “Patients can use facial gestures to communicate in a hospital setting, especially for patients who can’t speak.”
