Speakers

Bulka Ivan
Bulka Ivan, LNU, Senior Data Scientist / Deep Learning Engineer

LLM, Transformers, BERT, GPT models family, LLM fine-tuning techniques

This session explores transformer-based language models and their evolution, from foundational architectures like BERT and the GPT family to modern large language models. We will cover technical advances in model design, training methodologies, and practical fine-tuning approaches, including parameter-efficient methods like LoRA and preference optimization techniques. Sessions will examine both the theoretical foundations and real-world applications of adapting these models for specialized domains and tasks.
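To make the fine-tuning discussion concrete, here is a minimal sketch of attaching LoRA adapters to a causal language model, assuming the Hugging Face transformers and peft libraries; the base model name, target module, and hyperparameters are illustrative placeholders rather than recommendations from the talk.

```python
# A minimal LoRA fine-tuning setup sketch; model and settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model = "gpt2"  # hypothetical small model chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA injects small trainable low-rank matrices into selected attention
# projections while the original pretrained weights stay frozen.
lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling factor for the update
    target_modules=["c_attn"],  # combined attention projection in GPT-2
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights will train
```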

Dutsiak Oleh
Dutsiak Oleh, LNU, Senior Software Engineer

Security of computer networks and the Internet

This session provides a comprehensive look at the evolving threat landscape of modern computer networks, from low-level protocol exploits to sophisticated Internet-scale attacks. We will analyze the implementation of robust security frameworks, including encryption, firewalls, and intrusion detection systems, to safeguard digital communications.
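As one concrete illustration of encrypting network traffic, the sketch below wraps a TCP connection in TLS using Python's standard ssl module; the hostname is a placeholder and the example is not drawn from the session itself.

```python
# A small sketch, standard library only, of wrapping a TCP connection in TLS.
import socket
import ssl

hostname = "example.com"  # illustrative target, not an endpoint from the talk
context = ssl.create_default_context()  # sane defaults: certificate validation, modern protocols

with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        # The handshake has completed: traffic on tls_sock is now encrypted and
        # the server certificate has been verified against the system CA store.
        print("Negotiated protocol:", tls_sock.version())
        print("Peer certificate subject:", tls_sock.getpeercert().get("subject"))
```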

Yakubovych Maksym
Yakubovych Maksym, LNU, Engineering Manager, GlobalLogic

Database basics, relational, non-relational, distributed databases, data warehouse, ETL, data workflows

We'll run through a comprehensive overview of modern data architecture, moving from the fundamentals of relational and non-relational models to the complexities of distributed systems, and then on to data warehousing and ETL.
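A toy extract-transform-load (ETL) flow, sketched with Python's standard-library sqlite3 module, gives a feel for the kind of workflow covered; the table layout and records are invented for illustration.

```python
# Toy ETL: extract raw rows, aggregate them, load into a warehouse-style table.
import sqlite3

# Extract: pretend these rows came from an operational source system.
raw_orders = [
    ("2024-01-05", "alice", "12.50"),
    ("2024-01-05", "bob",   "7.00"),
    ("2024-01-06", "alice", "3.25"),
]

# Transform: normalise types and aggregate to a daily-revenue fact.
daily_revenue = {}
for order_date, _customer, amount in raw_orders:
    daily_revenue[order_date] = daily_revenue.get(order_date, 0.0) + float(amount)

# Load: write the aggregated facts into a warehouse-style fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_daily_revenue (order_date TEXT PRIMARY KEY, revenue REAL)")
conn.executemany("INSERT INTO fact_daily_revenue VALUES (?, ?)", sorted(daily_revenue.items()))
for row in conn.execute("SELECT * FROM fact_daily_revenue ORDER BY order_date"):
    print(row)
conn.close()
```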

Mykola Stasiuk
Mykola Stasiuk, LNU, Senior Software Engineer, GlobalLogic

ML and DL basics, supervised, reinforcement and unsupervised learning

This lecture introduces the fundamental concepts of Machine Learning (ML) and Deep Learning (DL) as key approaches in modern data-driven systems. The session covers the basic principles of learning from data, the differences between classical machine learning and deep learning models, and the role of labeled and unlabeled data in model training. Special attention is given to the main learning paradigms: supervised, unsupervised, and reinforcement learning. Supervised learning is presented through typical tasks such as classification and regression, unsupervised learning through clustering and representation learning, and reinforcement learning through agent–environment interaction and reward optimization. The lecture also discusses typical applications, strengths, and limitations of each approach, providing an intuitive understanding of when and why particular learning paradigms are used. By the end of the lecture, students will have a conceptual foundation for further study of machine learning algorithms and deep neural networks.
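For a concrete taste of supervised learning on labeled data, the short sketch below trains a classical classifier with scikit-learn (an assumption of this example, not a tool prescribed by the lecture).

```python
# Minimal supervised-learning example: classification on a small labeled dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)            # labeled data: features and class labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000)      # a classical (non-deep) classifier
clf.fit(X_train, y_train)                    # learn from labeled training examples
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```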

Markiyan Fostiak
Markiyan Fostiak, Software Engineering Lead

Ihor Rohatskyi
Ihor Rohatskyi, LNU, Engineering Manager, GlobalLogic

Git, Data Version Control (DVC), Data sources (Kaggle, etc.), ML Hubs (Hugging Face, etc.)

Overview of project and data management practices in Machine Learning, including version control for code and datasets using Git and Data Version Control (DVC). The topic also covers widely used data sources such as Kaggle and modern ML hubs like Hugging Face for accessing datasets, models, and benchmarks. Special attention is given to reproducibility, collaboration, experiment tracking, and organizing ML assets across the full project lifecycle.
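The sketch below shows how such assets might be pulled programmatically, assuming the Hugging Face datasets library and the DVC Python API are installed; the dataset name, repository URL, file path, and revision are placeholders, not real project artifacts.

```python
# Illustrative access to an ML hub and a DVC-tracked dataset; names are placeholders.
from datasets import load_dataset
import dvc.api

# Hugging Face hub: download a slice of a public dataset by name.
imdb = load_dataset("imdb", split="train[:100]")
print(imdb[0]["text"][:80])

# DVC: read a data file tracked in a Git repository at a pinned revision,
# which keeps experiments reproducible across collaborators.
csv_text = dvc.api.read(
    "data/train.csv",                           # hypothetical tracked file
    repo="https://github.com/example/ml-repo",  # hypothetical repository
    rev="v1.0",                                 # tag pinning the data version
)
print(csv_text.splitlines()[0])
```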

Vitalii Pretsel
Vitalii Pretsel, LNU, Software Engineer at Leobit

Python for data analysis, data visualization, and data mining

This session explores core Python libraries and methods for analyzing, visualizing, and mining data, covering data preparation, exploratory data analysis, and the application of statistical and machine learning methods.
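As a compact example of this workflow, the sketch below uses pandas and matplotlib on an invented in-memory dataset to show preparation, summary statistics, and a simple plot.

```python
# Compact exploratory-data-analysis sketch; the dataset is invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

# Data preparation: build a small frame and impute a missing value.
df = pd.DataFrame({
    "age":    [23, 31, 45, None, 52, 29],
    "income": [28_000, 42_000, 61_000, 38_000, 75_000, 33_000],
})
df["age"] = df["age"].fillna(df["age"].median())

# Exploratory analysis: summary statistics and a simple relationship check.
print(df.describe())
print("Correlation age/income:", df["age"].corr(df["income"]).round(2))

# Visualization: scatter plot of the two variables.
df.plot.scatter(x="age", y="income", title="Age vs. income (toy data)")
plt.show()
```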

Yatskiv Oleh
Yatskiv Oleh, Staff Software Engineer at Nexla with over 12 years of industry experience

Generative models basics

In this talk, we will explore the fundamentals of generative models, delving into their core concepts and how they create new data such as images, text, and audio. We'll highlight their connection to cutting-edge technologies like AI and deep learning, showcasing practical applications across various fields. Expect insights into the mechanics, benefits, and transformative potential of generative models in shaping innovation.
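To make the idea of generating new data concrete, here is a minimal text-generation example, assuming the Hugging Face transformers library; the model name and prompt are purely illustrative.

```python
# Sampling new text from a small autoregressive generative model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # small generative language model
samples = generator(
    "Generative models learn the data distribution so they can",
    max_new_tokens=30,
    num_return_sequences=2,
    do_sample=True,   # sample rather than always pick the most likely token
)
for s in samples:
    print(s["generated_text"])
```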

Kozynets Andrian
Kozynets Andrian, LNU, Middle Backend Developer at COXIT

TensorFlow/Keras/PyTorch basics and building your own models

The lecture provides an overview of popular deep learning frameworks used for developing neural network models, including TensorFlow, Keras, and PyTorch. The session introduces the core concepts of deep learning workflows and highlights the design philosophy and practical differences between high-level and low-level APIs, with a focus on ease of use, flexibility, and typical application scenarios. Through illustrative examples, students are introduced to the process of constructing and training simple neural network models using both Keras and PyTorch. Additionally, the lecture outlines common challenges faced when building custom models and discusses criteria for selecting an appropriate framework.
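As a rough illustration of the high-level versus low-level contrast, the sketch below trains the same tiny classifier with Keras and then with an explicit PyTorch loop, assuming tensorflow and torch are installed; the toy data, shapes, and hyperparameters are invented for this example.

```python
# The same tiny classifier in Keras (declarative) and PyTorch (explicit loop).
import numpy as np

X = np.random.rand(256, 4).astype("float32")   # toy features
y = (X.sum(axis=1) > 2.0).astype("int64")      # toy binary labels

# --- Keras: high-level, declarative API ---
import tensorflow as tf
keras_model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
keras_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
keras_model.fit(X, y, epochs=5, verbose=0)

# --- PyTorch: explicit training loop, more control over each step ---
import torch
from torch import nn

torch_model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.Adam(torch_model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs, targets = torch.from_numpy(X), torch.from_numpy(y)
for _ in range(5):                     # epochs
    optimizer.zero_grad()
    loss = loss_fn(torch_model(inputs), targets)
    loss.backward()
    optimizer.step()
print("PyTorch final loss:", loss.item())
```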