Assistant Professor of Computer Science, Aalto University
My research group
My Top Achievements
The overarching goal of our research is to develop a theory that tells us if (and how) machine learning is possible given the available
resources (data, computation, privacy, ...). To this end, we try to identify easily measurable quantities (analogous to temperature in physics)
that allow us to predict the behaviour of highly complex machine learning systems (e.g., a deep neural network with billions of tuneable weights applied
to massive unstructured data). We draw inspiration from physics, where a single number (the temperature) can tell a lot
about the global behaviour of incredibly complex systems, such as a lake composed of billions of water molecules.
- H. Ambos, N. Tran, and A. Jung. The Logistic Network Lasso. Preprint, May 2018.
- A. Jung. On the Complexity of Sparse Label Propagation. Frontiers in Appl. Math. and Stat., Jun. 2018.
- A. Jung, N. Tran, and A. Mara. When is Network Lasso Accurate? Frontiers in Appl. Math. and Stat., Jan. 2018.
- A. Jung. A Fixed-Point of View on Gradient Methods for Big Data. Frontiers in Appl. Math. and Stat., Sept. 2017.
- N. Tran and A. Jung. On the Sample Complexity of Graphical Model Selection for Non-Stationary Processes. Submitted, Jan. 2017.
- A. Jung, Y. C. Eldar, and N. Goertz. On the Minimax Risk of Dictionary Learning. IEEE Trans. Inf. Theory, vol. 62, no. 3, March 2016. Preprint: arXiv:1507.05498 [stat.ML]
- A. Jung. Learning the Conditional Independence Structure of Stationary Time Series: A Multitask Learning Approach. IEEE Trans. Sig. Proc., vol. 63, no. 21, Nov. 2015.
- A. Jung, G. Tauboeck, and F. Hlawatsch. Compressive Spectral Estimation for Nonstationary Random Processes. IEEE Trans. Inf. Theory, vol. 59, no. 5, May 2013.
Recent Talks and Lectures
first.last at aalto.fi