
Professors call for further study of potential uses of AI in special education, avoiding bans

WeGotIT! essay organization chart. Credit: Journal of Special Education Technology (2023). DOI: 10.1177/01626434231165977

Artificial intelligence is making headlines about its potentially disruptive influence in many spaces, including the classroom. A group of educators that includes a University of Kansas researcher has just published a position paper reviewing AI’s potential in special education, calling for patience and consideration of its potential uses before such technology is banned.

Most importantly, AI should be considered as a tool that can potentially benefit students with disabilities, according to James Basham, KU professor of special education, and co-authors. Tools such as ChatGPT can quickly turn out writing. And naturally, some students have used that to avoid schoolwork.

But banning it is not the answer.

“It’s really been over the last decade or so that we’ve seen AI and machine learning move from just what you might call geek culture to the bigger world,” Basham said. “We’ve been studying it, but ChatGPT made it a little more real by making it available to the public. While we think the writing process is complex, AI can do it, quickly and fairly well.

“When you think about people with disabilities in education, you often think about writing. We get referrals all the time for students who can’t or struggle to express themselves in writing. And AI can help with that. So we need to think about what questions we need to ask or issues to think about.”

In the paper, the authors provided a brief history of artificial intelligence and how it developed to its current state. They then considered ethical questions regarding its use in education and special education and how policy should address the technology’s use. Foremost, schools should not reflexively ban the technology, the authors wrote. Meanwhile, educators, researchers and others need to think about what they want students to learn and how the technology can aid that process. Additionally, teacher educators who are producing future generations of educators need to work with their students to consider how they can effectively address the topic.

Among the main ethical considerations is information literacy, the authors wrote. Students need to learn how and where to find valid information as well as how to discern true information from false, think critically and assess topics to avoid misinformation. Educators should also avoid the trap of evaluating skills like writing too narrowly.

“If we’re only having students do things in one certain way, the AI can probably do that,” Basham said. “But if we’re bringing in multiple concepts and modalities, then it’s a much different conversation. We need to think about who we are as a society and what we teach, especially when we think about students with disabilities, because they are often judged on just one aspect.”

The article, published in the Journal of Special Education Technology, was co-written with Matthew Marino, Eleazar Vasquez and Lisa Dieker, all of the University of Central Florida, and Jose Blackorby of WestEd.

The authors also urged those in education to consider whether AI is a “cognitive prosthesis” or something more. Just as a student with physical impairments might use speech-to-text to translate their thoughts more efficiently into writing, or a student with a hearing impairment can use a phone app to reduce ambient noise in the classroom, a student with cognitive disabilities could potentially use AI to improve their writing.

But while technology can help students improve writing and other skills, educators need to consider consent, the authors wrote. All students should be taught what information any AI collects, how it is stored and how it is shared. Parents have a role to play in that regard as well, in considering whether a school that uses AI is right for their child, whether it complies with an Individualized Education Plan, and whether it can be personalized while remaining respectful of diverse student backgrounds and values, the authors wrote.

The authors also noted that AI already exists in schools: Students use laptops, tablets, smartphones and other technologies unavailable to previous generations. Yet those tools are not banned from classrooms outright. Similarly, while technologies such as ChatGPT could be used to cheat or reduce student workload, they could also potentially be an effective resource for students with disabilities.

Before any such judgments are made, researchers and policymakers should continue to ask questions and ensure people who represent students with disabilities are at the table, the authors wrote.

“Technology is a societal experiment,” Basham said. “We can use it effectively or ineffectively. But the education system needs to get in front of it and figure out how to use this particular technology to further human betterment. What we need is not to be afraid of change but to think about critical thinking and problem-solving so we are teaching students to do that whether with AI or without. We need to reflect not on today on how it will change our lives, but what it means for the future.”

More information:
Matthew T. Marino et al, The Future of Artificial Intelligence in Special Education Technology, Journal of Special Education Technology (2023). DOI: 10.1177/01626434231165977

Provided by
University of Kansas


Citation:
Professors call for further study of potential uses of AI in special education, avoiding bans (2023, June 8)
retrieved 8 June 2023
from https://phys.org/news/2023-06-professors-potential-ai-special.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

