Pick My Brain Seminar

Fall 2021, Northeastern

Meeting quasi-weekly on Wednesdays at 2:50PM in 544 Nightingale Hall or via Zoom.

This is an informal, colloquium-style seminar that provides the opportunity to learn about others' areas of interest. Questions are encouraged!

Organizer: Vance Blankers

If you have a question, would like to receive announcements, or would like to speak at the seminar, please email v.blankers [at]

This seminar is supported in part by a Research Training Group (RTG) grant from the NSF.


Date Speaker Title
Sept 29 Matej Penciak (Northeastern University) The many faces of the Calogero-Moser system

In this talk I will introduce the notion of an integrable system and focus on one well-known example: the Calogero-Moser (CM) system. After defining the system, I will describe some of the many ways one can prove its integrability. In doing so, we will see a few of the wide range of topics the CM system touches: Cauchy matrices, Hitchin systems, Cherednik algebras, and more.
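For readers unfamiliar with the system, one standard form (the rational case; the talk may use a different variant) is the Calogero-Moser Hamiltonian for $n$ particles with positions $x_i$ and momenta $p_i$,

$H = \frac{1}{2} \sum_{i=1}^{n} p_i^2 + \sum_{i < j} \frac{g^2}{(x_i - x_j)^2},$

where $g$ is a coupling constant. Integrability means there exist $n$ functionally independent conserved quantities in involution, of which $H$ is one.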

Oct 13 Vance Blankers (Northeastern University) Compactifying the moduli space of curves

The moduli space of curves is an important object in modern algebraic geometry, both interesting in its own right and serving as a test space for broader geometric programs. These often require the space to be compact, which leads to a variety of choices for compactification, the most well-known of which is the Deligne-Mumford-Knudsen compactification by stable curves, originally introduced in 1969. Since then, several alternative compactifications have been constructed and studied, and in 2013 David Smyth used a combinatorial framework to make progress towards classifying all "sufficiently nice" compactifications. In this talk, I'll discuss some of the most well-studied compactifications, as well as two new compactifications, which together classify the Gorenstein compactifications in genus 0 and genus 1.

Oct 27 Ahmad Reza Haj Saeedi Sadegh (Northeastern University) Tangent Groupoid and Local Index Formula

In this talk I will explain the local index method of Patodi-Getzler using tangent groupoids. I will also explain other methods of extracting local index formulas in settings where the conventional methods do not apply.

Nov 10 Robin Walters (Northeastern University) Symmetry in Neural Networks

Neural networks define function spaces which are dense within the space of continuous functions $\mathbb{R}^m \to \mathbb{R}^n$ and can be fit to data $\lbrace x_i, y_i \rbrace$ more accurately than other regression methods. We'll explore two applications of representation theory to neural networks. First, consider the case in which the data represent a $G$-equivariant function. In this case, we can consider spaces of equivariant neural networks which may more easily be fit to the data using gradient descent. Second, we can consider symmetries of the function space itself. Exploiting these symmetries can lead to models with fewer free parameters, faster convergence, and more stable optimization. This is joint work with Iordan Ganev.
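As a point of reference (this definition is standard, though not spelled out in the abstract): for a group $G$ acting on both the input space $\mathbb{R}^m$ and the output space $\mathbb{R}^n$, a function $f \colon \mathbb{R}^m \to \mathbb{R}^n$ is $G$-equivariant if

$f(g \cdot x) = g \cdot f(x) \quad \text{for all } g \in G, \ x \in \mathbb{R}^m.$

When $G$ acts trivially on the output, this reduces to $G$-invariance, $f(g \cdot x) = f(x)$.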

End of Semester (Seminar resumes Spring 2022)

Previous Semesters

Fall 2020
Spring 2020
Fall 2019
Spring 2019
Fall 2018
Spring 2018
Fall 2017