Integrating AI into “Politics of Data” Course

Project Overview

We are redesigning INAF 1000: Politics of Data, a discussion-based Proseminar for twelve first-year School of Foreign Service students. INAF 1000 is unique in that, as a Proseminar, it is tasked not just with teaching content (focused on data & algorithms) but also with helping students thrive at Georgetown. This involves helping them cultivate community and forge friendships, find mentors, build critical reading, writing, and dialogue skills, and develop ways to be better students and citizens during their time on the Hilltop. In the past, we have partnered with the Red House’s In Your Shoes program to achieve some of these goals.

The proposed re-design leverages ChatGPT not just to deliver content but also to help students get to know one another, engage in conversation, and build community. Specifically, we will create five to six 30-minute learning modules, delivered every other week. Each module will cover a topic relevant to the course that involves multiple perspectives (e.g., bias in algorithms, AI harms vs. risks, gig work).

For each topic, instructors will identify the relevant perspectives and configure ChatGPT to speak from each one when responding to user prompts. During the exercise, students will be randomly assigned one of these perspectives and will use the AI system to understand their assigned perspective more deeply and to steelman it in dialogue with peers. We hypothesize that this could surface perspectives that might otherwise be avoided in regular discussions (especially sensitive or unpopular opinions) and allow students to feel more comfortable critiquing ideas, because the ideas come from an AI system and are dissociated from the speaker. Playing this ChatGPT-enabled “devil’s advocate” role could also help students find associations between ideas and move into deeper dialogue more quickly by raising perspectives the group had previously ignored. Finally, broader themes, such as “discovering bias in large language models,” will tie the modules together.
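One way the perspective assignment above could be implemented is with a system prompt that primes the model before each student question. This is a minimal sketch assuming the OpenAI chat-style message format; the perspective names and instruction texts are illustrative, not the course's actual prompts.

```python
# Sketch: priming a chat model with an assigned perspective.
# Assumes the OpenAI chat message format (system + user roles);
# perspective labels and wording below are hypothetical examples.

PERSPECTIVES = {
    "gig_worker": (
        "You are role-playing a rideshare driver in a seminar on gig work. "
        "Argue from the driver's lived experience and stay in character."
    ),
    "platform_executive": (
        "You are role-playing a platform executive in a seminar on gig work. "
        "Defend flexible contracting arrangements and stay in character."
    ),
}

def build_messages(perspective: str, student_prompt: str) -> list[dict]:
    """Assemble the message list that places the assigned perspective
    ahead of the student's question."""
    return [
        {"role": "system", "content": PERSPECTIVES[perspective]},
        {"role": "user", "content": student_prompt},
    ]

# A student assigned "gig_worker" would then send these messages to the
# chat API, e.g.:
#   client.chat.completions.create(
#       model="gpt-4o",
#       messages=build_messages("gig_worker",
#                               "Why might drivers oppose reclassification?"),
#   )
```

Keeping the perspective text in a system message (rather than the student's own prompt) lets instructors control the assigned stance while students converse freely.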

Ultimately, this course re-design is an extended version of In Your Shoes, but with perspectives augmented by an AI system.

Team

Rajesh Veeraraghavan

Science Technology and International Affairs (STIA) Program, SFS

Ashley Lin

Georgetown SFS ’26 (Major: Science, Technology, and International Affairs)
