Special programming and simulation methods in high energy physics and astrophysics
Course: Quantum field theory
Structural unit: Faculty of Physics
Title
Special programming and simulation methods in high energy physics and astrophysics
Code
ВБ 4.1
Module type
Elective discipline for the educational programme
Educational cycle
Second
Year of study when the component is delivered
2023/2024
Semester/trimester when the component is delivered
Semester 3
Number of ECTS credits allocated
6
Learning outcomes
Know the theory and problems of modern machine-learning models
Know the main neural-network approaches to problems in high-energy astrophysics
Be able to use Python and Wolfram Mathematica libraries for data preparation and for metric classification and regression problems (a short illustrative sketch follows this list)
Be able to apply theoretical and practical approaches to multidimensional classification problems
Demonstrate the ability to communicate freely in the language of instruction
Use knowledge of foreign languages to read technical literature on programming when preparing for seminar classes
Participate in professional discussions during classroom work
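As a hedged illustration of the Python-related outcome above (not part of the official course materials), the sketch below shows a typical workflow of data preparation, classification and a simple accuracy metric using scikit-learn; the dataset and model choices are assumptions made only for the example.

```python
# Illustrative only: a hypothetical scikit-learn workflow for data
# preparation, a classification fit and an accuracy metric.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)                     # example dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)                # data preparation
clf = LogisticRegression(max_iter=1000)
clf.fit(scaler.transform(X_train), y_train)           # classification fit

pred = clf.predict(scaler.transform(X_test))
print("accuracy:", accuracy_score(y_test, pred))      # evaluation metric
```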
Form of study
Full-time
Prerequisites and co-requisites
1. Know the basics of mathematical analysis, probability theory, statistical physics and thermodynamics.
2. Be able to apply knowledge previously acquired in the courses on mathematical analysis, probability theory, statistical physics and thermodynamics to solve practical problems.
3. Have basic computational skills from the course on differential equations.
Course content
Part 1. Basic methods
1 Basic concepts of neural networks
2 Stochastic gradient descent, the role of saddle points in the loss landscape, automatic differentiation and backpropagation (a short illustrative sketch follows Part 1)
3 Basic properties of entropy, differential entropy; Bayesian inference: motivation, Bayes' formula, physical examples of Bayesian updating, conditional entropy
4 Generative adversarial networks, recurrent neural networks
5 Natural gradient, Kullback-Leibler divergence and its properties, Fisher information, mutual information
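For topic 2 of Part 1, a minimal sketch (illustrative, not from the course materials) of stochastic gradient descent driven by automatic differentiation and backpropagation, written here in PyTorch; the synthetic linear-regression data and learning-rate choice are assumptions made for the example.

```python
# A minimal SGD loop using PyTorch autodiff; data are synthetic placeholders.
import torch

torch.manual_seed(0)
X = torch.randn(256, 3)                   # synthetic inputs
true_w = torch.tensor([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * torch.randn(256)   # noisy linear targets

w = torch.zeros(3, requires_grad=True)    # parameters to learn
lr = 0.1
for step in range(200):
    idx = torch.randint(0, 256, (32,))    # random mini-batch
    loss = ((X[idx] @ w - y[idx]) ** 2).mean()
    loss.backward()                       # backpropagation via autodiff
    with torch.no_grad():
        w -= lr * w.grad                  # stochastic gradient step
        w.grad.zero_()

print(w.detach())                         # should be close to true_w
```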
Part 2. Neural networks
6 Mutual information, renormalized mutual information, invertible neural networks (a short illustrative sketch follows Part 2)
7 Variational autoencoders, generative adversarial networks
8 Implicit layers: solving equations and neural differential equations. Hamiltonian and Lagrangian neural networks. Reinforcement learning
9 Big Data architecture, design of data warehouses
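For topic 6 of Part 2 (and the Kullback-Leibler divergence of Part 1, topic 5), a minimal illustrative sketch: the mutual information of a discrete joint distribution computed as the KL divergence between the joint and the product of its marginals. The joint probability table is an invented example, not course data.

```python
# Illustrative only: KL divergence and mutual information for a
# discrete joint distribution, using NumPy.
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical joint distribution of two binary variables X and Y.
p_xy = np.array([[0.30, 0.10],
                 [0.20, 0.40]])
p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y

# Mutual information I(X;Y) = D_KL( p(x,y) || p(x) p(y) ).
mi = kl_divergence(p_xy.ravel(), np.outer(p_x, p_y).ravel())
print(f"I(X;Y) = {mi:.4f} nats")
```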
Recommended or required reading and other learning resources/tools
1. Pattern Recognition and Machine Learning, Christopher Bishop, Springer (2006).
2. Linked: The New Science of Networks, Albert-László Barabási, Perseus Books Group (2002).
3. Deep Learning, Ian Goodfellow, Yoshua Bengio, Aaron Courville, MIT Press (2016). https://www.deeplearningbook.org/
4. An Introduction to Statistical Learning, Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani, Springer (2013).
5. Neural Networks: A Systematic Introduction, Raúl Rojas, Springer (1996).
Planned learning activities and teaching methods
Lectures
Practical classes
Individual work
Assessment methods and criteria
Oral survey
Projects
Report
Exam work
Language of instruction
Ukrainian
Lecturers
This discipline is taught by the following teachers
Departments
The following departments are involved in teaching the above discipline