Professors are Pissed as Students Use AI to Complete their Assignments


Universities are facing a huge challenge as students use AI to write their essays

Some claim that as AI develops, it will eventually be able to write academic articles. But when does the use of artificial intelligence amount to fraud? Students are using AI to write their essays because it is easy and saves time, and professors are pissed at them for it.

“As I wait for my next class to begin in front of the lecture hall, two students next to me are debating which AI software would write their essays best. Am I marking this correctly? AI essays?” With that tweet, posted late last month, historian Carla Ionescu captured the mounting worry about what artificial intelligence could mean for conventional university assessment. “No. Not possible,” she tweeted. “Tell me we haven’t arrived at AI for writing essays.” Universities are facing an enormous challenge from this full-on robot writing, and it is not going to end any time soon.

Ben Goertzel, a computer theorist, suggested the “robot university student test” in 2012, contending that an AI capable of earning a degree in the same manner as a person should be regarded as aware.

If it weren’t for the achievements of AIs using natural language processing (NLP), most notably GPT-3, the language model developed by the OpenAI research group, Goertzel’s idea—an alternative to the more well-known “Turing test”—might have remained a thought experiment.

Nassim Dehouche, a computer scientist, published a paper two years ago showing that GPT-3 could produce credible academic writing that standard anti-plagiarism tools could not detect.

He now believes the point at which students could begin producing full essays and other kinds of writing algorithmically has long since passed, and many people now share his sense of urgency. In news and opinion pieces, GPT-3 has written persuasively on whether it poses a threat to humanity (it claims it does not) and about animal cruelty in the styles of both William Shakespeare and Bob Dylan.
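To give a concrete sense of how low the barrier is, the sketch below shows roughly what asking a GPT-3-style model for essay text can look like in Python. It is a hypothetical illustration, not something drawn from the article or from Dehouche’s paper: the model name, prompt, and parameters are assumptions, and it presumes access to OpenAI’s legacy Completions API with a valid API key.

```python
# Minimal sketch (illustrative only): prompting a GPT-3-style model for essay text.
# Assumes the pre-1.0 openai Python SDK and an API key in the OPENAI_API_KEY env var.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-002",  # an example GPT-3 model; choice is an assumption
    prompt=(
        "Write a 150-word essay paragraph on the causes of the fall of the "
        "Western Roman Empire, in a formal academic style."
    ),
    max_tokens=300,
    temperature=0.7,  # moderate randomness for fluent, varied prose
)

# Print the generated paragraph.
print(response.choices[0].text.strip())
```

The point of the sketch is simply that, under these assumptions, a passable paragraph is a single prompt away; no specialist knowledge of the essay topic is required.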

A 2021 Forbes article on using AI to write school essays ended with a dramatic mic-drop: “This post about utilizing an AI to write essays in school,” it said, “was created using an artificial intelligence content writing tool.” Of course, unjustified hype is the lifeblood of the tech sector.

Scott Graham wrote last month for Inside Higher Education about the mixed results of encouraging students to use AI for their assignments. The very best submissions, he said, would have done little more than meet the requirements. Weaker students found it hard to give the system useful prompts, because doing so demands exactly the advanced writing skills that would make the AI unnecessary.

 

Full-on robot writing, he concluded, will always and forever be “just around the corner.”

That may be the case, but only a month prior, Aki Peritz of Slate came to the exact opposite conclusion, writing that “with a little bit of skill, a student may utilize AI to write his or her paper in a fraction of the time that it would typically take.”

 

However, the difficulty facing higher education cannot be reduced to “full-on robot writing.”

Beyond essays and assignments generated outright by algorithms, universities must resolve a wide range of subtler issues. AI-powered word processors, for instance, routinely suggest corrections for our grammatical errors. But if software can algorithmically rewrite a student’s sentence, why not a paragraph? And if a paragraph, why not a page?

Prof. Phillip Dawson of Deakin University, an expert in digital assessment security, suggests viewing AI as simply the latest form of the long-standing practice of cognitive offloading.

He defines cognitive offloading as “using a technology to lessen the mental weight of a task.” It can be as simple as writing something down so you don’t have to struggle to remember it later. From Socrates’ complaint that writing lets people merely pretend to know things, to the first appearance of pocket calculators, technologies for cognitive offloading have always provoked moral panics.

Dawson argues that, as AI is increasingly built into higher-level tasks, universities should make clear to students which types and degrees of cognitive offloading are permissible for particular assessments.

“I believe we will teach students how to use these technologies,” he says. “I don’t think we will necessarily ban them.”


