 

Publication:

Approaching Equalized Odds by Actively Forgetting in Deep-Learning Networks

dc.contributor.advisor: Chazelle, Bernard
dc.contributor.advisor: Doğan, Irmak
dc.contributor.author: Elsheikh, Rahma
dc.date.accessioned: 2025-08-07T15:23:34Z
dc.date.available: 2025-08-07T15:23:34Z
dc.date.issued: 2025-04-28
dc.description.abstract: This work explores how Machine Learning (ML) models can learn what and how to forget. We develop a deep-learning image-classification Convolutional Neural Network (CNN) trained incrementally by class on the RAF-DB dataset. Our model addresses the Catastrophic Forgetting Problem (CFP) while simultaneously learning to actively forget biases. We propose an iterative, chained learning process for ML models that measures how well knowledge is retained and, in parallel, how effectively biased data is forgotten. Our contributions are threefold: (1) we construct a novel process that uses active forgetting to mitigate biases; (2) we present a nonlinear learning algorithm that emulates human-like behavior; and (3) we present a schema for machines to avoid the CFP in parallel with active forgetting. To the best of our knowledge, combining a continual learning approach with natural active-forgetting algorithms to mitigate biases during training is unexplored. The results are applicable to real-world scenarios such as security measures and healthcare data. In particular, we find that using active forgetting alongside regularization and replay methods improves the adaptability, efficiency, and fairness of machine learners in dynamic real-world settings. Finally, we discuss ethical considerations and potential future directions.
dc.identifier.uri: https://theses-dissertations.princeton.edu/handle/88435/dsp01zs25xc91c
dc.language.iso: en_US
dc.title: Approaching Equalized Odds by Actively Forgetting in Deep-Learning Networks
dc.type: Princeton University Senior Theses
dspace.entity.type: Publication
dspace.workflow.startDateTime: 2025-04-28T20:40:29.700Z
pu.contributor.authorid: 920251929
pu.date.classyear: 2025
pu.department: Mathematics
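The abstract pairs class-incremental training with replay methods to counter catastrophic forgetting. A common replay ingredient is a fixed-size memory of past examples that is mixed into later training batches; the minimal sketch below (a reservoir-sampling buffer) illustrates that idea only. The class name, parameters, and sampling scheme are illustrative assumptions, not the thesis's actual implementation.

```python
import random


class ReplayBuffer:
    """Fixed-capacity memory of past (example, label) pairs.

    Uses reservoir sampling so every item seen so far has an equal chance
    of remaining in the buffer, regardless of when its class was trained.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []          # stored (example, label) pairs
        self.seen = 0             # total items offered to the buffer
        self.rng = random.Random(seed)

    def add(self, example, label):
        """Offer one training item; keep it with probability capacity/seen."""
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append((example, label))
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = (example, label)

    def sample(self, k):
        """Draw up to k stored items to mix into the current task's batch."""
        k = min(k, len(self.buffer))
        return self.rng.sample(self.buffer, k)
```

During training on a new class, each fresh mini-batch would be concatenated with `buffer.sample(k)` so the network keeps rehearsing earlier classes while learning the new one.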

Files

Original bundle

Name: final_thesis___actual_final-10.pdf
Size: 2.52 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 100 B
Format: Item-specific license agreed to upon submission