
New BYU Computer Science Study Shows Four Ways Students Are Actually Using ChatGPT

Two female students point and look at a computer screen.

Photo by BYU Photo


Thousands of students are doubtless relying on ChatGPT to prepare for finals this time of year. Instructors less familiar with this ever-evolving AI tool might be curious: what exactly are their students doing with it?

Probably quite a lot of things, according to a recent study conducted by a group of BYU professors who had that very question. When ChatGPT burst onto the scene in early 2023, the professors noticed all kinds of confusion on social media about who should be using it and why. They decided to ask 455 BYU students to share the prompts they'd fed ChatGPT and the instructions their teachers gave them about the new tech.

The results showed that students were taking advantage of the tool's interactive, iterative nature to converse with ChatGPT as they might with an instructor. "As one of the students commented, 'It is like my 24/7 TA,'" said BYU computer science professor Amanda Hughes, a co-author of the paper, which was published in the proceedings of the 57th Hawaii International Conference on System Sciences. "I thought that was a really interesting description of the technology."

From the survey data, the professors grouped students' use of ChatGPT into four broad categories:


  • retrieving basic information ("When did this happen?", "How does this work?")
  • generating content, such as writing a piece of code or outlining a paper
  • revising, such as correcting or improving an essay in a designated way
  • evaluating, such as assessing the quality of a resume

They also found that students used ChatGPT for things like study support, which involved having the tool create questions their professor might put on an exam and then quiz them, or for getting advice.

    "The thing ChatGPT does that is obviously very different from previous tools like Google is that it allows you to have a contextual conversation, where if ChatGPT gives you a response that isn't quite right, you can give it more information and ask it to rephrase with the new information in mind," Hughes said. "With Google, you can ask it a different question, but it doesn't maintain the context of what you originally asked."

    This feature makes the tool more like a human teacher, capable of adapting to students' specific needs. For example, one student talked about how it could be used as a virtual teacher: "I'm a very shy person, but I felt like I could ask all the follow-up questions I needed to understand when I was having a conversation with ChatGPT." Of course, substituting ChatGPT for a human instructor or even a Google search has some inevitable drawbacks for student learning.

    "One of the things I suspect is that tools like ChatGPT may make us content with less accurate and less satisfying answers because it just gives the answer without forcing us to look at other sources," Hughes said. "My theory is that over time this might erode students' ability to discern when they need to keep looking for a better answer."

    Hughes is optimistic overall, though, about ChatGPT as a teaching tool. Not only can AI expedite tedious daily tasks like composing emails, it may also prepare students to learn more advanced skills in a class than they otherwise might.

    "For example, in my field of computer programming, there's a huge learning curve to get to a point where you can build something useful that isn't just a pet program you created for class," she said. "If ChatGPT can help students write code more quickly, it may allow students to imagine bigger and better solutions for a program instead of getting caught up in all the small details of how to make something work."

    Instructors' collective ambivalence about ChatGPT was apparent in the survey responses, with students reporting that some teachers prohibited ChatGPT altogether, while others encouraged using it and gave advice on how to best take advantage of the tool. Based on discussions she's had with other instructors, Hughes believes the future of AI in the classroom likely looks like teaching students how to use the technology to supplement their work.

    "If they don't know how to use tools like ChatGPT, they're going to be at a disadvantage because other people in their industries will be using them and are going to be much more effective as a result."

    The paper was additionally co-authored by Ryan Schuetzler, Justin Giboney, Taylor Wells, Benjamin Richardson, Tom Meservy, Cole Sutton, Clay Posey and Jacob Steffen.


    Computational Journalism Lab Combines Technology, News Distribution Research

Nick Diakopoulos, along with about 12 Ph.D. and undergraduate students, designs and collaborates on an array of self-curated research projects.

Communication Prof. Nick Diakopoulos grew up in a family of journalists. His father spent years working at The Providence Journal in Rhode Island, and his older brother also worked in the industry. It seemed natural for him to follow in their footsteps, he said.

    "I was kind of primed to be interested in journalism," Diakopoulos said.

Instead of crafting ledes and grafs, though, he delved into lines of code, starting as early as age six. Even so, he remained captivated by how technology could be perceived through the media's lens, a testament to his enduring connection to journalism.

While studying computer engineering at Brown University, Diakopoulos began researching the ways computers can manipulate images, video and media. In 2017, after he arrived at Northwestern, his two worlds merged when he founded the Computational Journalism Lab.

Housed under the School of Communication, the Computational Journalism Lab is an interdisciplinary research group that explores how advances in algorithms, automation and AI affect news distribution and consumption. Diakopoulos, along with about 12 Ph.D. and undergraduate students from different fields of study, designs and collaborates on an array of self-curated research projects.

    "My goal for the lab was to try to bring students together from different disciplines to work on computational journalism," Diakopoulos said. "It's kind of an expression of my research program on how those computational forces are changing the media ecosystem."

Sachita Nishal, a fourth-year computer science and communications Ph.D. candidate, joined the lab to explore how news distribution leads to the spread of misinformation. She has geared her research toward building tools that help reporters strengthen news interest online.

    Nishal said she's designed a digital helper that ranks scientific articles by categories of newsworthiness for reporters to discover new story ideas. Nishal is now working on a tool to break down complex jargon from scientific articles for readers to easily digest.

    "It's helping science communicators communicate with laypersons," Nishal said. "What we're trying to do is ground the responses of the language models in more truth.

    The computational lab members meet a few times per quarter to present on their ongoing studies, receive feedback from Diakopoulos and hold social gatherings. It's more than just a "forum of education," he said.

Lab members recently published a report with the Associated Press after developing a survey on generative AI use in news production. Almost 300 journalists and newsroom staffers responded to the survey, citing an increasing use of generative AI.

Another member, Julia Barnett, a third-year Ph.D. candidate in computer science and communications, previously worked as a senior data analyst at The Washington Post. Barnett said her research focuses on social data and AI ethics.

Barnett said the lab has provided her not only a classroom but also a social home.

    "A Ph.D. Is a pretty isolating undertaking that anyone can do, and labs are really the social component, at least at Northwestern," Barnett said. "We'll work in the same spaces. And we share the same methods and the same kind of interactions that we have."



University of Toronto Initiative Promotes Ethical Considerations in Computer Science Studies

    Computer science students at the University of Toronto are learning how to incorporate ethical considerations into the design and development of new technologies such as artificial intelligence with the help of a unique undergraduate initiative.

    The Embedded Ethics Education Initiative (E3I) aims to provide students with the ability to critically assess the societal impacts of the technologies they will be designing and developing throughout their careers. That includes grappling with issues such as AI safety, data privacy and misinformation.

    Program co-creator Sheila McIlraith, a professor in the department of computer science in the Faculty of Arts & Science and an associate director at the Schwartz Reisman Institute for Technology and Society (SRI), says E3I aims to help students "recognize the broader ramifications of the technology they're developing on diverse stakeholders, and to avoid or mitigate any negative impact."

    First launched in 2020 as a two-year pilot program, the initiative is a collaborative venture between the department of computer science and SRI in association with the department of philosophy. It integrates ethics modules into select undergraduate computer science courses – and has reached thousands of U of T students in this academic year alone.

    Malaikah Hussain is one of the many U of T students who has benefited from the initiative. As a first-year student enrolled in CSC111: Foundations of Computer Science II, she participated in an E3I module that explored how a data structure she learned about in class laid the foundation of a contact tracing system and raised ethical issues concerning data collection.

    "The modules underlined how the software design choices we make extend beyond computing efficiency concerns to grave ethical concerns such as privacy," says Hussain, who is now a third-year computer science specialist.

Hussain adds that the modules propelled her interest in ethics and computing, leading her to pursue upper-year courses on the topic. During a subsequent internship, she organized an event about the ethics of e-waste disposal and the company's technology life cycle.

    "The E3I modules have been crucial in shaping my approach to my studies and work, emphasizing the importance of ethics in every aspect of computing," she says.

    The program, which initially reached 400 students, has seen significant growth over the last four years. This academic year alone, total enrolment in computer science courses with E3I programming has exceeded 8,000 students. Another 1,500 students participated in E3I programming in courses outside computer science.

    In recognition of the program's impact on the undergraduate student learning experience, McIlraith and her colleagues – Diane Horton and David Liu, a professor and associate professor, teaching stream, respectively, in the department of computer science, and Steven Coyne, an assistant professor in the department of philosophy with a cross appointment to computer science – were recently recognized with the 2024 Northrop Frye Award (Team), one of the prestigious U of T Alumni Association Awards of Excellence.

    Horton, who leads the initiative's assessment efforts, points to the team's recently published paper showing that after participating in modules in only one or two courses, students are inspired to learn more about ethics and are benefiting in the workplace.

    "We have evidence that they are better able to identify ethical issues arising in their work, and that the modules help them navigate those issues," she says.

    Horton adds that the findings build on earlier assessment work showing that after experiencing modules in only one course, students became more interested in ethics and tech, and more confident in their ability to deal with ethical issues they might encounter.

    The team says the initiative's interdisciplinary nature is key to delivering both a curriculum and experience with an authentic voice, giving instructors and students the vocabulary and depth of knowledge to engage on issues such as privacy, well-being and harm.

    "As a philosopher and ethicist, I love teaching in a computer science department," says Coyne. "My colleagues teach me about interesting ethical problems that they've found in their class material, and I get to reciprocate by finding distinctions and ideas that illuminate those problems. And we learn a lot from each other – intellectually and pedagogically – when we design a module for that class together."

    E3I is founded upon three key principles: teach students how – not what – to think; encourage ethics-informed design choices as a design principle; and make discussions safe, not personal.

    "Engaging with students and making them feel safe, not proselytizing, inviting the students to participate is especially important," says Liu.

    The modules support this type of learning environment by using stakeholders with fictional character profiles that include names, pictures and a backstory.

    "Fictional stakeholders help add a layer of distance so students can think through the issues without having to say, 'This is what I think,'" Horton says. "Stakeholders also increase their awareness of the different kinds of people who might be impacted."

    McIlraith adds that having students advocate for an opinion that is not necessarily their own encourages empathy, while Liu notes that many have a "real hunger" to learn about the ethical considerations of their work.

    "An increasing number of students are thinking, 'I want to be trained as a computer scientist and I want to use my skills after graduation,' but also 'I want to do something that I think will make a positive impact on the world,'" he says.

    Together, the E3I team works with course instructors to develop educational modules that tightly pair ethical concepts with course-specific technical material. In an applied software design course, for example, students learn about accessible software and disability theory; in a theoretical algorithms course, they learn about algorithmic fairness and distributive justice; and in a game design course, they learn about addiction and consent.
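
To make one of those pairings concrete: in an algorithms course, a first formal notion of fairness students might meet is demographic parity, which asks whether a classifier's positive-prediction rate is the same across groups. Below is a minimal sketch with invented toy data; this is a generic textbook metric, not a reconstruction of the E3I module itself.

```python
def selection_rate(preds: list[int], groups: list[str], group: str) -> float:
    """Fraction of `group` members who received a positive (1) prediction."""
    members = [p for p, g in zip(preds, groups) if g == group]
    return sum(members) / len(members)

# Invented toy data: 1 = approved, 0 = denied.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

gap = abs(selection_rate(preds, groups, "a") - selection_rate(preds, groups, "b"))
print(f"demographic parity gap: {gap:.2f}")  # 0.75 vs 0.25 -> 0.50
```

The distributive-justice discussion then turns on whether equalizing that rate is even the right goal, which is precisely the kind of design question the modules are meant to surface.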

    Steve Engels, a computer science professor, teaching stream, says integrating an ethics module about addiction into his fourth-year capstone course on video game design felt like a natural extension of his lecture topic on ludology – in particular, the psychological techniques used to make games compelling – instead of something that felt artificially inserted into the course.

    "Project-based courses can sometimes compel students to focus primarily on the final product of the course, but this module provided an opportunity to pause and reflect on what they were doing and why," Engels says. "It forced them to confront their role in the important and current issue of gaming addiction, so they would be more aware of the ethical implications of their future work and thus be better equipped to handle it."

    By next year, each undergraduate computer science student will encounter E3I modules in at least one or two courses every year throughout their program. The team is also exploring the adoption of the E3I model in other STEM disciplines, from ecology to statistics. Beyond U of T, the team plans to share their expertise with other Canadian universities that are interested in developing a similar program.

    "This initiative is having a huge impact," McIlraith says. "You see it in the number of students we're reaching and in our assessment results. But it's more than that – we're instigating a culture change."





