Abstract: The theory of L-functions lies at the heart of the Langlands program, and in this talk we will focus on their building blocks, the local L-factors. To be more precise, we will recall in the first part of the
Neural network compression has become an increasingly important subject, not only for its practical relevance but also for its theoretical implications, since there is an explicit connection between compressibility and generalization error. In this talk, I will present