Fanshawe integrates Microsoft Copilot
Jim Cooper, Program Coordinator of Artificial Intelligence and Machine Learning at Fanshawe College, discusses the integration of Microsoft Copilot into the college's academic environment, highlighting both the benefits and the challenges of using AI in education.
Fanshawe College has recently added Microsoft's Copilot Chat Assistant and Copilot for Web with Commercial Data Protection (CDP) to its suite of artificial intelligence (AI) tools, marking a significant shift towards adopting AI in education. While the move is seen as a positive step, it comes with its own set of challenges.
According to the Program Coordinator of Artificial Intelligence and Machine Learning (Co-op) at Fanshawe, Jim Cooper, the college's decision to use Copilot is well thought out.
'Fanshawe made a good choice because Copilot is designed for businesses and academic institutions, so it fits well with our needs,' Cooper said.
However, Cooper acknowledged that it has its faults.
'You have to be skeptical of everything you get from it,' he warned, noting that while the system often provides reliable responses, it can also generate incorrect information with the same confidence level as accurate outputs.
In the past, students searched Google for answers and learned to sift through varying degrees of reliability from different websites. Now, with Copilot, AI-generated answers arrive wrapped in a false sense of certainty.
'It gives the illusion that whatever the AI says is good information, but unfortunately, it's not always accurate,' Cooper said.
One of Cooper's main concerns is ensuring that students stay engaged in the learning process while using AI tools like Copilot.
'The challenge for most faculty members is finding new ways to evaluate students,' Cooper explained. With AI able to generate responses for assignments, students may complete tasks without fully understanding the material. 'If we don't adapt our projects, students may be able to use these tools without actually learning.'
Cooper has been experimenting with ways to address this issue. One method he employs is providing students with pre-written code and asking them to explain how it works rather than having them write code from scratch.
'This forces them to engage with the material more deeply,' Cooper said, adding that it is harder for students to lean on Copilot in such cases. But this adaptation demands effort from faculty members to ensure that assignments still promote critical thinking and problem-solving rather than rewarding copied, AI-generated work.
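For readers unfamiliar with the format, an exercise of this kind might look something like the sketch below. It is a hypothetical illustration, not drawn from Cooper's courses: students receive a short, working function and are asked to explain, in their own words, what it does and why it is correct.

```python
# Hypothetical "read and explain" exercise: students are given this working
# snippet and asked to describe what each step does and why the sliding
# running total gives the same result as re-summing every window.
def moving_average(values, window):
    """Return the moving average of `values` over a sliding window."""
    if window <= 0 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    averages = []
    running_total = sum(values[:window])      # total of the first window
    averages.append(running_total / window)
    for i in range(window, len(values)):
        # Slide the window: add the newest value, drop the oldest one.
        running_total += values[i] - values[i - window]
        averages.append(running_total / window)
    return averages

print(moving_average([2, 4, 6, 8, 10], 3))  # [4.0, 6.0, 8.0]
```

A student could still paste such code into Copilot, but a written or oral explanation of why the shortcut works quickly reveals whether they actually understand it.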
Beyond these practical challenges, Cooper highlighted ethical concerns surrounding AI, particularly around copyright.
'There's still a lot of potential for copyright violations,' Cooper said.
AI tools like Copilot can generate images and other media by pulling elements from existing copyrighted works without proper attribution. While this may not have severe commercial implications in an academic context, Cooper stressed the importance of addressing these concerns. 'Fanshawe takes copyright issues seriously, and this is something we're still navigating.'
Despite these challenges, Cooper sees AI as an essential tool for modern education. 'You can't ignore it. It's not going away,' Cooper stated. He believes that it is necessary to integrate AI tools like Copilot into the curriculum without allowing them to replace the learning process. 'It's not going to be easy, but we're all working on it,' Cooper concluded.
Educators must continuously reconsider their teaching and assessment methods to ensure that AI enhances traditional learning.