Assistant Professor of Computer Science, Aalto University
Contact information
My research group
Publications
Teaching
Short CV
My Top Achievements
The overarching goal of our research is to develop a theory that tells us if (and how) machine learning is possible based on the
data arising in a particular application. Ideally, a few easily measurable quantities would reveal how complicated machine
learning systems, e.g., deep neural networks with billions of tunable weights, will behave. Our main inspiration is
physics, where a single number (the temperature) can tell us a lot about the global behaviour of incredibly complex
systems, such as a lake containing billions of water molecules.
Fast Facts
Selected Publications

H. Ambos, N. Tran, and A. Jung. The Logistic Network Lasso. Preprint, May 2018.

A. Jung. On the Complexity of Sparse Label Propagation. Frontiers in Appl. Math. and Stat., Jun. 2018.

A. Jung, N. Tran, and A. Mara. When is Network Lasso Accurate? Frontiers in Appl. Math. and Stat., Jan. 2018.

A. Jung. A Fixed-Point of View on Gradient Methods for Big Data. Frontiers in Appl. Math. and Stat., Sept. 2017.

N. Tran and A. Jung. On the Sample Complexity of Graphical Model Selection for Non-Stationary Processes. Submitted, Jan. 2017.

A. Jung, Y. C. Eldar, and N. Goertz. On the Minimax Risk of Dictionary Learning. IEEE Trans. Inf. Theory, vol. 62, no. 3, March 2016. Preprint: arXiv:1507.05498 [stat.ML]

A. Jung. Learning the Conditional Independence Structure of Stationary Time Series: A Multitask Learning Approach. IEEE Trans. Sig. Proc., vol. 63, no. 21, Nov. 2015.

A. Jung, G. Tauboeck, and F. Hlawatsch. Compressive Spectral Estimation for Nonstationary Random Processes. IEEE Trans. Inf. Theory, vol. 59, no. 5, May 2013.
Talks

Invited talk at Royal Caribbean Cruises Limited, June 2018.

Neural-Based Machine Learning in Finland: From Kohonen to Jung, May 2018.

Compressed Sensing of Big Data Networks, at EPFL, Lausanne, Dec. 2017.

Solving the Food Waste Challenge: An Artificial Intelligence Perspective, at the Slush side event "The Future is BioDigital: The Packaging Valley", Helsinki, Nov. 2017.

Machine Learning for Big Data, at the AI Helsinki Seminar, Helsinki, Nov. 2017.

Information Theory (IT) of Machine Learning for Big Data, at the Future IT for Research seminar, Aalto University, Oct. 2017.

Backpropagation, at the Artificial Intelligence Laboratory, Vrije Universiteit Brussel, June 2017.

Machine Learning for Big Data over Networks: When is Network Lasso Accurate?, in the Complex Systems and Networks seminar, Dept. of Computer Science, Aalto University, April 2017.

When is Network Lasso Accurate?, at the Signal Processing and Speech Communication Laboratory (SPSC Lab), TU Graz, April 2017.

When is Network Lasso Accurate?, in the group seminar of Prof. Thomas Pock, TU Graz, April 2017.

Compressed Sensing for Learning from Big Data over Networks, at the International Institute for Applied Systems Analysis (IIASA), Austria, March 2017.

Compressed Sensing for Learning from Big Data over Networks, at Johannes Kepler University (JKU) Linz, Feb. 2017.

Compressed Sensing for Learning from Big Data over Networks, within the Large Structures Seminar, Aalto University, Feb. 2017.

Compressed Sensing for Semi-Supervised Learning from Big Data over Networks, within the Machine Learning Coffee Seminars of Aalto University and the University of Helsinki, Feb. 2017.

Compressed Sensing for Big Data over Networks, in the group seminar of Prof. Sjoerd Dirksen and Prof. Holger Rauhut, RWTH Aachen, Jan. 2017.

Graphical Model Selection for Big Data over Networks, within the Stochastic Sauna, Aalto University, Dec. 2016.

Graph Signal Recovery using Convex Optimization, within the Helsinki Algorithms Seminar of Aalto University and the University of Helsinki, May 2016.

An Information-Theoretic Approach to Dictionary Learning, TU Berlin, June 2015.

On the Sample-Complexity of Dictionary Learning, within the Mathematics Colloquium, University of Innsbruck, June 2015.

Performance Limits of Dictionary Learning for Sparse Coding: An Information-Theoretic Approach, University of Michigan, June 2014.

Compressive Nonparametric Graphical Model Selection for Time Series: A Multitask Learning Approach, The University of Edinburgh, Oct. 2013.
first.last at aalto.fi