Nathaniel Hudson, Ph.D.
A brief introduction.

312 John Crerar Library
5730 S Ellis Ave
Chicago, IL 60637
I am a computer scientist, currently serving as a Postdoctoral Scholar at Globus Labs in the University of Chicago’s Department of Computer Science.
At a high level, my research concerns the design of systems for serving AI on edge computing infrastructure, i.e., Edge Intelligence (EI), for smart city applications. More specifically, my research centers on the challenges posed by the limited resources available at the edge for supporting EI. Trade-offs among latency, accuracy, resource usage, and related concerns are common themes in my work.
My research touches on areas including (but not limited to):
- federated learning
- service placement and request scheduling
- lossy compression techniques
- social mining
- modeling of information diffusion processes
- interdependent and complex networks
- cyber-physical systems
recent news
| Date | News |
| --- | --- |
| Jun 30, 2025 | Research that explores how active learning methods can improve the rate of novel scientific discovery in generative AI workflows has been accepted for publication at the 2025 IEEE eScience conference. This paper specifically studies the discovery of novel metal-organic frameworks (MOFs) in the MOFA workflows presented in an earlier work. |
| Jun 26, 2025 | Our paper on Flight, a hierarchical federated learning framework, has been accepted for publication in the Future Generation Computer Systems journal. |
| May 20, 2025 | My former summer undergraduate mentee, Jordan Pettyjohn, was recently awarded 1st place in the graduate category of the ACM Student Research Competition (SRC) Grand Finals. This recognizes the work we did together investigating toxicity ablation in large language models (source, retrieved May 24, 2025). |
| May 16, 2025 | Assistant Professorship at the Illinois Institute of Technology. |
| Feb 11, 2025 | Our paper on mitigating memorization in language models was selected for presentation as a Spotlight Paper at this year’s ICLR conference. |
| Feb 7, 2025 | A paper on causal discovery over hypothesis spaces has been accepted for publication in Transactions on Machine Learning Research (TMLR). A preprint of this work can be found here. |
| Jan 22, 2025 | Thrilled to announce that a recent paper of ours investigating memorization in large language models has been accepted for publication at this year’s ICLR conference. A preprint of this work is available on arXiv, and a succinct blog post on its results is also available. |
| Nov 21, 2024 | Research awarded 1st Place in the ACM Student Research Competition at the 2024 IEEE/ACM Supercomputing conference. |
| Oct 7, 2024 | A preprint of recent work exploring mitigation strategies for memorization in language models is now publicly available on arXiv; click here for the paper. For a briefer dive into the material, please see this blog post (here) on the work. |
| Sep 19, 2024 | Very happy to announce that TaPS, an evaluation suite for execution frameworks and data management systems, received the Best Paper Award at the 2024 IEEE eScience conference. Read the preprint of the paper here. |