CV
General Information
Full Name | Amir Joudaki |
Tagline | Mathematical understanding of deep neural networks |
Location | Zurich, Switzerland |
Languages | English, Persian, German (A2) |
Experience
-
Nov 2024 – present
Postdoctoral Researcher
ETH Zurich
- Researching how deep nonlinear models learn by investigating module interactions, forward passes, backward gradients, and the evolution of neural kernels.
Education
-
Feb 2017 – 2024
Direct Ph.D. (M.Sc. + Ph.D.) in Artificial Intelligence
ETH Zurich
- Highly selective program (<5% admission rate) focused on the mathematical foundations of deep neural networks and AI for biomedicine.
- Ph.D. thesis: 'On a mathematical understanding of deep neural networks'.
- Authored 8 papers (6 first-authored) in top-tier venues; supervised over 10 M.Sc. projects.
- Supervisors: Gunnar Rätsch & Francis Bach.
-
Feb 2014 – Jan 2017
M.Sc. in Cognitive Neuroscience
International School for Advanced Studies (SISSA)
- Thesis: 'Modeling activity of electrophysiological recordings in vivo in rats'.
-
Sept 2008 – Sept 2011
B.Sc. in Computer Engineering
Sharif University of Technology
Awards
-
2017
- Direct Doctorate Fellowship (selected as 1 of 2 out of >100 candidates for ETH Zurich’s direct Ph.D. program)
-
2011
- Ranked 42nd in the National Higher Education Entrance Exam (among 50,000 participants)
-
2007
- Ranked 369th in the National University Entrance Exam (among 400,000 participants)
Publications
-
Mathematical Foundations of AI
- Emergence of globally attracting fixed points in deep neural networks with nonlinear activations (AISTATS, poster)
- Batch normalization without gradient explosion: Towards training without depth limits (ICLR 2024, poster)
- On the impact of activation and normalization in obtaining isometric embeddings at initialization (NeurIPS 2023, poster)
- On Bridging the Gap between Mean Field and Finite Width in Deep Random Neural Networks with Batch Normalization (ICML 2023, poster)
- Batch Normalization Orthogonalizes Representations in Deep Random Networks (NeurIPS 2021 spotlight, top 3% of submissions)
- PCA Subspaces Are Not Always Optimal for Bayesian Learning (NeurIPS 2021 DistShift workshop)
-
Genomic Sequence Analysis
- Identifying Biological Priors and Structure in Single-Cell Foundation Models (ICML 2024 workshop)
- Learning Genomic Sequence Representations using Graph Neural Networks over De Bruijn Graphs (NeurIPS 2023 workshop)
- Aligning distant sequences to graphs using long seed sketches (Genome Research, 2023)
- Fast Alignment-Free Similarity Estimation By Tensor Sketching (RECOMB 2021)
- Sparse Binary Relation Representations for Genome Graph Annotation (Journal of Computational Biology 27(4), 2020)
-
Dimensionality Reduction
- Nonlinear Dimensionality Reduction via Path-Based Isometric Mapping (IEEE TPAMI 38(7):1452–1464, 2016)
-
Functional Brain Network Analysis
- Properties of functional brain networks affect frequency of psychogenic non-epileptic seizures (Frontiers in Human Neuroscience, 6:335, 2012)
- EEG-based functional brain networks: does the network size matter? (PLoS ONE 7(4):e35673, 2012)