Students can learn with their mouths as well as with their eyes and hands

The first time I met a blind scientist, I was participating in an NSF Undergraduate Research Experience at the University of Delaware, a program for students with disabilities interested in STEM research. Until then, it had never occurred to me that science education excluded blind students. My daily classroom experience consisted of teachers drawing molecules on the board or writing out equations, assuming that everyone in the class could see what was written and rarely stating it aloud. But what about blind and visually impaired students in the classroom?

According to the National Science Foundation’s 2021 report on women, minorities, and people with disabilities in STEM, 9.1 percent of doctoral graduates in 2019 reported having a disability, a figure that includes people who are blind or visually impaired. Inclusive teaching methods are needed to make STEM more accessible.

A recent paper from Baylor University, led by Katelyn Baumer and published in Science Advances, was inspired by exactly this problem. Bryan Shaw, a chemistry professor at Baylor and the study’s senior author, designed the study to assess whether people could learn to recognize 3D models, like those often used to teach science, with their mouths rather than their eyes. “I probably wouldn’t be working in this field if it weren’t for my own child who is visually impaired, hard of hearing and autistic,” Shaw said in an email interview. His son’s retinoblastoma diagnosis at a young age changed Shaw’s view of science and drove him to increase accessibility for blind and visually impaired scientists in his field.

Shaw is certainly not the first person to take advantage of the mouth and lips’ ability to spatially discriminate between objects. For example, a student from Hong Kong named Tsang Tsz-Kwan learned to read braille with her lips. Although not a traditional method of learning braille, her case suggests that the mouth can recognize and distinguish patterns.

Shaw’s research builds on brain-imaging findings that touch signals, known as somatosensory input, from our tongues, lips and teeth converge on the primary somatosensory cortex to produce an image generated entirely by signals from the mouth. The primary somatosensory cortex is the region of the brain that processes the sense of touch.

A 2021 paper in Nature found that primates showed the same brain-circuit activation when they grasped objects with their hands as when they manipulated objects with their tongues. This suggests underlying similarities between physical manipulation by hand and by mouth, but much remains unknown. Signals from both hand and mouth manipulation are sent to the cerebral cortex, but as Shaw points out, “the fine structure of how everything is sorted and processed remains unknown.”

Baumer, Shaw and their colleagues found that, with these models, tactile recognition by mouth was comparable to tactile recognition by hand. College and elementary school students (grades 4 and 5) participated in the study: 365 college students and 31 elementary students. Participants were blindfolded and divided into two groups, one assigned to handle objects by hand and the other to handle them only with their mouths. Each participant received a single protein model to study and was then asked whether each of eight other protein models matched the original. Of those eight, one was a match.

Shaw expected the results to be similar in both age groups, because “oral somatosensory perception is hardwired in us, the tongue develops very early, and we probably start doing oral somatosensory perception in utero.” The research team found that students in both age groups could successfully distinguish between the models, and recall of structure was more accurate with oral-only manipulation for about 41 percent of participants.

Part of the design process was making the models portable, practical and affordable, since they would eventually have to be produced in large quantities. As Shaw noted, biochemistry textbooks often contain over a thousand illustrations, further emphasizing the need for models to be small and inexpensive. The first person Shaw gave the models to was Kate Fraser, a science teacher at the Perkins School for the Blind in Massachusetts, which is also Helen Keller’s alma mater. He chose the school because it had given his family important support, coming personally to their apartment to offer intervention assistance after Shaw’s son had an eye removed.

Although this study did not involve blind or visually impaired students, it lays the groundwork for further development. Oral manipulation of these models produced results comparable to manual manipulation, and they may offer a way to make science more accessible, which is the ultimate goal. By increasing the number of scientists with disabilities, STEM will benefit from diverse perspectives that will ultimately lead to better science.

