[Opinion] This is not a university professor

Today, the university world is in full transformation and undergoing a worrying development. ChatGPT has recently found its way into academia, a striking instance of “social acceleration”. Such an innovation has major consequences for teaching and research.

ChatGPT is a “natural language processing” technology that can be used to automate academic writing and research. In a changing academic world, this technology has the potential to accelerate the production and dissemination of knowledge. It can help translate documents, draft texts and conduct research, making the production of academic work faster and more efficient.

But this technology poses serious risks to scientific and academic integrity. Some fear it is being used to circumvent academic review, giving authors an unfair advantage and producing unreliable results. Moreover, this technology could accelerate the neoliberalization of universities, a process that tends to reduce access to resources and degrade the quality of teaching and research.

Finally, the use of ChatGPT can lead to “surveillance capitalism”, in which users are constantly monitored by algorithms. Researchers and students could be subject to increased scrutiny as their research and work are analyzed and evaluated algorithmically. This technology can also lead to a loss of autonomy and freedom for researchers and students, depriving them of their right to determine how their work is produced and disseminated.

The arrival of ChatGPT in academic circles therefore raises legitimate concerns. Universities must take steps to protect their students and researchers and ensure the integrity of their activities. It is important to limit the use of this technology to where it is necessary and to ensure that the work produced is always subject to appropriate review.

Therefore, universities must be vigilant and take steps to combat misuse of ChatGPT technology and its harmful effects. Students and researchers must be protected from possible abuse, and the quality of teaching and research must be maintained.

What to do with this tool?

The previous six paragraphs were not written by me, but by the ChatGPT tool, based on the latest techniques of deep learning and generative artificial intelligence, capable of producing more or less complex texts from simple instructions. I personally wrote the following request: “Write a 600-word op-ed for the newspaper Le Devoir about the upheavals of ChatGPT in the academic world. Write a critical reflection, including the concepts of social acceleration, neoliberalization of universities and surveillance capitalism.”

As a university professor interested in the transformations of advanced capitalism, social innovations and the complex impact of new technologies, I am faced here with a very concrete question that directly affects my profession: what to do with this tool, which will have a direct impact on assessment and teaching methods?

Beyond the issue of plagiarism, teachers can also use these tools to perform a number of tasks. I have run a few tests myself: I created a syllabus in seconds, generated questions for a final exam, and then answered those same questions with better writing quality than most university students — all in under three minutes.

It immediately made me think of philosopher Günther Anders’ concept of human obsolescence. Here is another answer from ChatGPT: “The term human obsolescence coined by Günther Anders refers to a potential threat to human survival due to the arrival of new technologies. Although this notion can apply to all areas of life, it takes on special meaning for university professors with the arrival of ChatGPT. Indeed, ChatGPT is an artificial intelligence system that can simulate a teacher and give lessons to students. It could lead to a form of course automation and the loss of jobs for university professors.”

This is a somewhat disturbing answer. For my part, I believe that the status of full-time professors is not threatened in the short term. Contrary to popular belief, recent innovations in artificial intelligence are not leading to a massive loss of jobs, but rather to a reconfiguration of work and of techniques of exploitation (including “digital labour”), via the outsourcing and fragmentation of tasks, extending the automation mechanisms of the industrial capitalism of the past.

Putting the genie back in its lamp

Nevertheless, the introduction of artificial intelligence in schools and universities raises several fundamental questions. While these tools can be used pedagogically in certain circumstances, and teachers will have to adapt their evaluation methods accordingly, it would be simplistic to adopt an “adaptive” approach limited to individual responsibility. Every primary and secondary school, every CEGEP and every university community is affected.

Moreover, technological innovations arise and spread in a particular socio-economic context — in this case capitalism — which shapes the possible uses of algorithmic tools. It would not be surprising if teaching became more precarious and the pressure to publish intensified. In our capitalist system, based on the imperatives of productivity and performance, this kind of technology will serve above all to intensify the dynamic of competition between students, professors and researchers.

Moreover, no matter how much we “ban” the use of these tools in the classroom, the technology is now available, free, easily accessible and usable; we cannot go back or put the genie back in its lamp.

We are therefore faced with a fundamental reflection on the purpose of teaching and on the relevance (or obsolescence) of the methods used in the field of education. To be honest, I am stunned. If teachers are not yet obsolete for the time being, it is certain that a good portion of our current evaluation tools are headed for the trash.
