Speakers

LLMs, Transformers, BERT, the GPT model family, LLM fine-tuning techniques
Transformer-based language models and their evolution will be explored, from foundational architectures like BERT and the GPT family to modern large language models. We will cover technical advances in model design, training methodologies, and practical fine-tuning approaches including parameter-efficient methods like LoRA and preference optimization techniques. Sessions will examine both the theoretical foundations and real-world applications of adapting these models for specialized domains and tasks.
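
For a flavor of the parameter-efficient methods mentioned above, here is a minimal, hypothetical sketch of attaching LoRA adapters to GPT-2 with the Hugging Face peft library (the model choice and hyperparameters are purely illustrative):

```python
# Hypothetical sketch: wrapping a small causal LM with LoRA adapters via the
# Hugging Face `peft` library; model name and hyperparameters are illustrative.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # any small causal LM
config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor for the adapter output
    lora_dropout=0.1,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```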

Security of computer networks and the Internet
This session provides a comprehensive look at the evolving threat landscape of modern computer networks, from low-level protocol exploits to sophisticated Internet-scale attacks. We will analyze the implementation of robust security frameworks, including encryption, firewalls, and intrusion detection systems, to safeguard digital communications.

We will run through a comprehensive overview of modern data architecture, moving from the fundamentals of relational and non-relational models to the complexities of distributed systems, and onward to data warehousing and ETL.

Database usage for Data Science, Data Analysis and Machine Learning
The presentation explores the role of databases in Data Science, Data Analysis, and Machine Learning workflows. Key topics include selecting appropriate database types (SQL, NoSQL, and graph databases) for different data scenarios, optimizing data retrieval and preprocessing for analysis, and integrating databases with machine learning pipelines. Real-world examples demonstrate how effective database management enhances data-driven insights and supports scalable machine learning applications.
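
As a minimal illustration of pulling data from a relational database straight into an analysis workflow (the schema and file name are hypothetical):

```python
# Minimal illustration (assumed schema): query a relational database
# directly into a DataFrame for downstream analysis.
import sqlite3
import pandas as pd

conn = sqlite3.connect("sales.db")  # hypothetical database file
df = pd.read_sql_query(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region",
    conn,
)
conn.close()
print(df.head())
```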

ML and DL basics, supervised, reinforcement and unsupervised learning
This lecture introduces the fundamental concepts of Machine Learning (ML) and Deep Learning (DL) as key approaches in modern data-driven systems. The session covers the basic principles of learning from data, the differences between classical machine learning and deep learning models, and the role of labeled and unlabeled data in model training.
Special attention is given to the main learning paradigms: supervised, unsupervised, and reinforcement learning. Supervised learning is presented through typical tasks such as classification and regression, unsupervised learning through clustering and representation learning, and reinforcement learning through agent–environment interaction and reward optimization.
The lecture also discusses typical applications, strengths, and limitations of each approach, providing an intuitive understanding of when and why particular learning paradigms are used. By the end of the lecture, students will have a conceptual foundation for further study of machine learning algorithms and deep neural networks.
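
A small supervised-learning sketch of the classification task mentioned above, using scikit-learn's bundled iris dataset:

```python
# A minimal supervised-learning example: train a classifier on labeled data
# and evaluate it on a held-out test set.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier().fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```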

This presentation explores Amazon Web Services (AWS) and its capabilities for managing and analyzing big data. Key topics include AWS services such as S3, Redshift, EMR, and Glue, which facilitate data storage, processing, and analysis at scale. Practical use cases demonstrate how AWS empowers businesses to extract insights from large datasets, optimize workflows, and enable machine learning applications in a cost-effective and scalable manner.
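
A hedged sketch of the kind of S3 workflow the session covers, using boto3 (bucket and file names are placeholders; credentials are assumed to come from the standard AWS configuration):

```python
# Illustrative sketch: upload a dataset to S3 with boto3 and list the bucket.
# Bucket and file names are placeholders; credentials come from the usual
# AWS configuration (environment variables or ~/.aws/credentials).
import boto3

s3 = boto3.client("s3")
s3.upload_file("local_data.csv", "my-analytics-bucket", "raw/data.csv")
for obj in s3.list_objects_v2(Bucket="my-analytics-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```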

Cloud computing basics, SaaS, PaaS and IaaS
This talk introduces the fundamentals of cloud computing and the three main service models: Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). We will compare these models in terms of shared responsibility, scalability, customization, typical use cases, and practical trade-offs. Using real-world examples and best practices, participants will build a strong foundation for selecting and applying cloud solutions in modern software projects.

Git, Data Version Control (DVC), Data sources (Kaggle, etc.), ML Hubs (Hugging Face, etc.)
Overview of project and data management practices in Machine Learning, including version control for code and datasets using Git and Data Version Control (DVC). The topic also covers widely used data sources such as Kaggle and modern ML hubs like Hugging Face for accessing datasets, models, and benchmarks. Special attention is given to reproducibility, collaboration, experiment tracking, and organizing ML assets across the full project lifecycle.
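
As an illustrative sketch of the ML-hub workflow discussed in the session (the dataset and task choices here are ours, not the speaker's), pulling a public dataset and a pretrained model from the Hugging Face Hub:

```python
# Illustrative only: fetch a public dataset and a pretrained model from the
# Hugging Face Hub; both downloads happen on first use.
from datasets import load_dataset
from transformers import pipeline

dataset = load_dataset("imdb", split="test[:5]")  # small public dataset slice
classifier = pipeline("sentiment-analysis")        # default hub model
for example in dataset:
    print(classifier(example["text"][:200]))
```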

Python for data analysis, data visualization, and data mining
Exploring core Python libraries and methods for analyzing, visualizing, and mining data. Data preparation, exploratory data analysis, application of statistical and machine learning methods.
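
A compact exploratory-analysis sketch with pandas (the file name is a placeholder):

```python
# Quick exploratory data analysis with pandas; "data.csv" is a placeholder.
import pandas as pd

df = pd.read_csv("data.csv")
print(df.describe())               # summary statistics
print(df.isna().sum())             # missing values per column
print(df.corr(numeric_only=True))  # pairwise correlations of numeric columns
```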

Integration Challenges of Large Language Models in Microservices
This presentation explores how large language models can be integrated into microservice-based architectures. It introduces common architectural integration patterns for using LLMs in distributed systems and examines the key challenges that arise, including reliability, performance, latency, scalability, and cost. The talk highlights architectural trade-offs and evolving best practices, showing how traditional microservice principles adapt when probabilistic AI components are introduced into modern software systems.
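
One hypothetical integration pattern, sketched with FastAPI: wrap the model behind a small service with an explicit timeout, so callers see a bounded, typed API rather than a raw model (call_llm is a stand-in for any real model client):

```python
# Hypothetical sketch of one integration pattern: an LLM behind a small
# microservice with a request timeout. `call_llm` is a placeholder.
import asyncio
from fastapi import FastAPI, HTTPException

app = FastAPI()

async def call_llm(prompt: str) -> str:
    await asyncio.sleep(0.1)  # stand-in for a real model call
    return f"echo: {prompt}"

@app.post("/complete")
async def complete(prompt: str):
    try:
        # Bound latency: fail fast instead of letting requests pile up.
        return {"text": await asyncio.wait_for(call_llm(prompt), timeout=5.0)}
    except asyncio.TimeoutError:
        raise HTTPException(status_code=504, detail="LLM call timed out")
```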

Google Colab and Elements of Artificial Intelligence Application: Automation of Computations and Creation of AI Projects
This two-session cycle is dedicated to the practical use of Google Colab for automating analytical computations. The first session covers an overview of the Google Colab platform, its cloud computing capabilities, integration with Python libraries (TensorFlow, PyTorch, Pandas, OpenAI API), and applications in machine learning and workflow automation. The second session takes the form of a hands-on workshop, during which participants will learn how to create their own AI scripts for automatic data processing, text and image generation, and collaborative coding in Colab.
All materials (code examples, links to reference resources) are provided to participants for further independent use. The format is suitable for both beginners and users with basic Python knowledge.

Generative models basics
In this speech, we will explore the fundamentals of generative models, delving into their core concepts and how they create new data, such as images, text, and audio. We’ll highlight their connection to cutting-edge technologies like AI and deep learning, showcasing practical applications across various fields. Expect insights into the mechanics, benefits, and transformative potential of generative models in shaping innovation.

TensorFlow/Keras/PyTorch basics and building own models
The lecture provides an overview of popular deep learning frameworks used for developing neural network models, including TensorFlow, Keras, and PyTorch. The session introduces the core concepts of deep learning workflows. The lecture highlights the design philosophy and practical differences between high-level and low-level APIs, with a focus on ease of use, flexibility, and typical application scenarios. Through illustrative examples, students are introduced to the process of constructing and training simple neural network models using both Keras and PyTorch. Additionally, the lecture outlines common challenges faced when building custom models and discusses criteria for selecting an appropriate framework.
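
A minimal Keras example in the spirit of the session, training a tiny classifier on synthetic data (shapes and hyperparameters are illustrative):

```python
# Define, compile, and train a small Keras classifier on synthetic data.
import numpy as np
import tensorflow as tf

X = np.random.rand(256, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("int32")  # synthetic binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [loss, accuracy] on the training data
```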

CV, CV tasks, CV Libraries and frameworks (e.g. OpenCV)
Discover how machines "see" and interpret the visual world. Explore core Computer Vision (CV) concepts, breaking down key tasks like object detection, classification, and segmentation. The session also introduces the practical toolkit for developers, highlighting OpenCV and other essential frameworks used to build real-world vision applications.
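
A tiny OpenCV sketch of the kind of building block such applications use (the image path is a placeholder):

```python
# Load an image, convert to grayscale, and run Canny edge detection.
import cv2

img = cv2.imread("photo.jpg")  # placeholder path; returns None if missing
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, threshold1=100, threshold2=200)
cv2.imwrite("edges.jpg", edges)
```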

This lecture is a short introduction to Microsoft Azure and how it is used to work with large amounts of data. We will briefly look at what Big Data means, what Azure is, and which Azure services are commonly used for data storage and processing.

Google Cloud Platform and Big Data in GCP
This lecture introduces Google Cloud Platform (GCP) and its role in big data management and analysis. It explores key cloud services such as Compute Engine, Cloud Storage, and BigQuery, and explains how GCP supports scalable, cost-efficient big data processing through tools like Dataflow, Dataproc, and Pub/Sub. The session emphasizes core principles of cloud computing, data warehousing, and real-time analytics, providing a practical understanding of how GCP enables the development of end-to-end data solutions.
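
A hedged BigQuery sketch using the google-cloud-bigquery client against a well-known public dataset (credentials are assumed to come from the environment):

```python
# Run a query against a BigQuery public dataset with the official client.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials
query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name ORDER BY total DESC LIMIT 5
"""
for row in client.query(query).result():
    print(row.name, row.total)
```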

NLP, NLP tasks, NLP libraries and frameworks (e.g. NLTK)
This session introduces Natural Language Processing (NLP)—how machines work with human language to extract meaning from text and produce useful outputs. We will review the most common NLP tasks (e.g., text classification, sentiment analysis, named entity recognition, summarization, translation) and the typical NLP pipeline, from data cleaning and tokenization to feature/embedding creation, modeling, and evaluation. Finally, we’ll give an overview of widely used NLP libraries and frameworks such as NLTK, spaCy, and related tools, ending with a short hands-on demo to show how they are applied in practice.
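
A short NLTK sketch mirroring the pipeline above (the downloads fetch the required corpora; the sentence is invented, and newer NLTK versions may also need the "punkt_tab" resource):

```python
# Tokenize, drop stopwords, and count word frequencies with NLTK.
import nltk
nltk.download("punkt")
nltk.download("stopwords")
from nltk.corpus import stopwords
from nltk import word_tokenize, FreqDist

text = "Natural Language Processing lets machines extract meaning from text."
tokens = [t.lower() for t in word_tokenize(text) if t.isalpha()]
tokens = [t for t in tokens if t not in stopwords.words("english")]
print(FreqDist(tokens).most_common(5))
```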

Number Theory for Cryptography
This lecture explores the dynamic evolution of the relationship between number theory and modern cryptographic techniques. It examines the transition from classical algorithms to sophisticated tools in algebraic number theory and arithmetic algebraic geometry, specifically the use of elliptic curves for factorization and primality testing. Special attention is given to the practical implementation of protocols such as zero-knowledge proofs, Digital Signature Standards (DSS), and probabilistic encryption. The lecture demonstrates how abstract mathematical structures serve as the essential foundation for securing information systems.
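
As a worked toy example of the number theory involved, a Fermat primality test built on fast modular exponentiation (probabilistic, for illustration only):

```python
# Fermat primality test: if pow(a, n-1, n) != 1 for some a, n is composite.
import random

def fermat_is_probably_prime(n: int, rounds: int = 20) -> bool:
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:  # violates Fermat's little theorem
            return False
    return True                    # probably prime (Carmichael-number caveat)

print(fermat_is_probably_prime(2**61 - 1))  # a known Mersenne prime: True
```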

Diffusion transformer for the model of stress placement in the Ukrainian language
Diffusion models are a powerful modern approach in generative AI, capable of producing high-quality data by gradually adding and removing noise. A key foundation of these models is the transformer architecture, which relies on attention mechanisms instead of recurrent or convolutional layers, enabling efficient parallel training and strong context modeling. Transformers have become central to many generative systems, including diffusion models that use self-attention inside U-Net-like structures. The proposed Ukrainian Accentor’s Diffusion Transformer Model (USCDiT) applies a conditional Diffusion Transformer (DiT) to stress placement in Ukrainian text, framing the task as a gradual denoising process guided by sentence context.

Principles of Modern Cryptography and Secure Communication
This lecture provides an overview of modern cryptography methods and their applications:
The evolution of cryptosystems, from classical ancient cryptography to modern elliptic curve cryptography.
The concepts of cryptosystem efficiency and reliability, and the types of cryptanalysis. Advantages and disadvantages of symmetric and asymmetric cryptosystems.
Cryptographic hash functions.
Applications of modern cryptosystems (AES, ChaCha20, RSA, Diffie–Hellman key exchange): electronic digital signatures, cryptocurrency, the TLS/SSL and SSH cryptographic protocols, the OpenSSL library, and digital certificates.
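
A toy Diffie–Hellman exchange over a small prime, purely illustrative (real deployments use large safe primes or elliptic-curve groups):

```python
# Toy Diffie–Hellman: both parties derive the same shared secret from
# public values exchanged in the clear. Parameters are far too small for
# real security and chosen for illustration only.
import random

p, g = 0xFFFFFFFB, 5               # small prime modulus and base (toy)
a = random.randrange(2, p - 1)     # Alice's secret
b = random.randrange(2, p - 1)     # Bob's secret
A, B = pow(g, a, p), pow(g, b, p)  # public values
assert pow(B, a, p) == pow(A, b, p)  # both sides agree on the shared key
print("shared secret:", pow(B, a, p))
```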

Reverse Engineering of Embedded Systems and Protection Against It
The lecture introduces key methods for analyzing and securing modern devices. It covers primary hardware study, disassembly, PCB structures, and identification of microcontrollers, firmware, memory, and communication interfaces such as I2C, SPI, UART, USB, and JTAG.
Alongside investigative techniques like X-ray inspection and verification procedures, the lecture highlights protection strategies: checksums, cryptographic signatures, trusted boot chains, encrypted updates, tamper sensors, secure memory modules, and hardware-based security such as TPM, Secure Enclave, and ARM TrustZone.
In simple terms, it shows how devices can be studied from the inside and how they can be protected from unwanted access.

Secure Coding Basics
This lecture introduces students to the fundamentals of secure coding and explains why security must be considered from the earliest stages of software development. It covers common software vulnerabilities such as improper input validation, insecure authentication, and hardcoded credentials. Students will learn basic principles for writing safer code, including least privilege, secure error handling, and defensive programming. Real-world examples illustrate how small coding mistakes can lead to serious security incidents. By the end of the lecture, students will understand how secure coding practices help reduce risks and improve overall software quality.
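
A classic illustration of one such mistake: string-built SQL versus a parameterized query (the table and hostile input are invented):

```python
# Contrast: concatenated SQL (injectable) vs a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # hostile input

# BAD: user input concatenated into the query — returns every row.
bad = conn.execute(f"SELECT * FROM users WHERE name = '{user_input}'").fetchall()

# GOOD: a placeholder makes the driver treat input strictly as data.
good = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(bad, good)  # bad leaks the table; good returns nothing
```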

The Power of Teamwork: How Effective Collaboration Turns Ideas into Successful Projects
This session focuses on the power of collaboration in turning ideas into meaningful projects. We will discuss key team roles, team building, and effective communication as the foundation of successful teamwork. Participants will learn how to organize a team, collaborate efficiently, and overcome common challenges.
Turning Ideas into Impact: How to Present and Defend Student Projects
This session focuses on helping students present their projects in a clear, engaging, and persuasive way. We will discuss how to structure a presentation, highlight the core idea, and communicate value to the audience and judges. Participants will also learn how to confidently respond to questions and justify their project decisions. The talk empowers students to turn ideas into convincing and impactful presentations.

Media and SmartTV Development. Key platforms in a nutshell
A brief overview of the Smart TV development landscape, with a focus on the core technical expertise required to build modern multimedia applications. The session explores the leading industry platforms and examines the unique architectural challenges of delivering high-quality content to the big screen. The lecture also breaks down essential roles and skill profiles within a Smart TV engineering team, offering a roadmap for specialists looking to enter this specialized domain.

Data Analytics: How Data Turns into Solutions
From Data to Solutions: Practical Analysis
A series of two sessions is dedicated to the role of data in modern information systems and decision-making processes. The basics of Business Intelligence and Data Science are covered, as well as the importance of open data and the modern professions that work with it. Special attention is paid to the data life cycle: from collection, storage, and processing to integration and analysis. The second session is workshop-based, during which participants are introduced to the basics of visualization and data analysis methods. Program code will be implemented on the Google Colab platform.

Chatbots: History, Architectures, Ethics, and a Hands-On Prototype Building a Chatbot and Integration Basics
A series of two sessions is dedicated to chatbot technology—from its early origins in the 1960s to modern AI-driven conversational systems. The first session covers the history of chatbot development (from ELIZA to ChatGPT), core concepts and architectures, key enabling technologies (Natural Language Processing and Machine Learning), typical use cases across industries, and the state of the field in 2025. The second session is workshop-based: participants will be guided through building a simple chatbot using no-code tools for Discord and will review common approaches to integrating chatbots with external systems. The format is suitable for audiences with different levels of technical background.

Probability in Real Life
In this lecture, we will study interesting examples of probability theory in use. We will become acquainted with the basic concepts and terms that later appear in data analysis and that underpin mathematical modeling of the phenomena around us.
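
One classic example in this spirit: the birthday paradox, where a group of just 23 people already has a better-than-even chance of a shared birthday, checked here with a quick simulation:

```python
# Estimate the probability that n people share a birthday by simulation.
import random

def shared_birthday(n: int, trials: int = 100_000) -> float:
    hits = sum(
        len({random.randrange(365) for _ in range(n)}) < n  # a duplicate day
        for _ in range(trials)
    )
    return hits / trials

print(shared_birthday(23))  # about 0.507
```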

Statistics in Real Life
The lecture covers the fundamentals of mathematical statistics, including descriptive statistics, parameter estimation, and hypothesis testing. Parametric and nonparametric tests are discussed.
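
A minimal hypothesis-testing sketch with SciPy on synthetic data:

```python
# Two-sided t-test comparing two synthetic samples with different means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
a = rng.normal(loc=10.0, scale=2.0, size=50)
b = rng.normal(loc=11.0, scale=2.0, size=50)
t, p = stats.ttest_ind(a, b)
print(f"t = {t:.2f}, p = {p:.4f}")  # a small p-value suggests different means
```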

Data visualization
Key methods of data visualization in Python using Matplotlib, Seaborn, and Plotly are demonstrated for enhanced data analysis and interpretation, along with examples of plotting and customization.
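
A compact sketch of Matplotlib with Seaborn styling (Plotly offers a similar API for interactive charts; the data is synthetic):

```python
# Plot a labeled curve with Matplotlib, styled via Seaborn's theme.
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns

x = np.linspace(0, 10, 100)
sns.set_theme()                       # Seaborn styling on top of Matplotlib
plt.plot(x, np.sin(x), label="sin(x)")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.savefig("sine.png")
```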

Decision making. How to make better choices in life and work
We will consider the four enemies of great decisions, learn how to avoid them, and see how to broaden our horizons to make the best choices in business and life.

Web development
Web development basics and foundational aspects, with an introduction to back-end development in Python. Key aspects and notions of modern web development, a comparative overview of Python's most popular frameworks for this area, and the applicability of each to various development cases.
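
For a taste of Python on the back end, a minimal Flask sketch (Flask is one of several frameworks the talk compares; the route and payload are illustrative):

```python
# A minimal Flask application exposing one JSON endpoint.
from flask import Flask, jsonify

app = Flask(__name__)

@app.get("/api/hello")
def hello():
    return jsonify(message="Hello from Python on the back end")

if __name__ == "__main__":
    app.run(debug=True)
```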

Problem Solving and Creativity
The “Problem Solving and Creativity” training will help you develop critical thinking and creativity skills, which are essential for successful learning and a future career. You will learn to identify the root of a problem, analyze situations from different perspectives, and find unconventional solutions even under challenging conditions. The program includes interactive exercises, teamwork, and idea-generation techniques that foster flexibility, communication, and confidence in decision-making.

Eigen decomposition and singular decomposition in practice
This lecture explores two foundational matrix decomposition techniques in linear algebra — Eigen Decomposition and Singular Value Decomposition (SVD) — and their central role in modern data analysis and machine learning. We examine the mathematical principles behind these decompositions, the conditions under which they exist, and the structural insights they provide into matrices of various types. Special emphasis is placed on practical applications, including dimensionality reduction (PCA), signal and image processing, recommendation systems, numerical stability, and uncovering latent patterns in high‑dimensional datasets. The lecture offers a comprehensive understanding of how matrix decompositions transform complex analytical tasks into simpler, interpretable components, enabling efficient extraction of meaningful information from data.
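
Both decompositions in a few lines of NumPy:

```python
# Eigendecomposition of a symmetric matrix and SVD of a rectangular one.
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric: real eigenvalues
w, V = np.linalg.eigh(A)                 # A == V @ diag(w) @ V.T
print("eigenvalues:", w)

M = np.random.rand(5, 3)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
print("reconstruction error:", np.linalg.norm(M - U @ np.diag(s) @ Vt))
```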

Simulation and Programming in Materials Sciences and Mechanics
The lecture explores how modern materials science and mechanics rely on simulation and programming methods to bridge the gap between theory and experiment. Emphasis is placed on the systematic connection between physical models, computational implementation, and the interpretation of simulation results. Through selected examples, it demonstrates why simulation has become an indispensable tool in contemporary research and industry.

Classical regression analysis
We will learn the basic principles of building a simple and multivariate linear regression model. We will consider interesting examples of forecasting sales depending on the price of the product and wages depending on many factors (gender, age, education, part-time or full-time employment).
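
A small sketch of the price-vs-sales example in scikit-learn (the numbers are made up for illustration):

```python
# Simple linear regression: product price as the single predictor of sales.
import numpy as np
from sklearn.linear_model import LinearRegression

price = np.array([[10], [12], [15], [18], [20]])  # product price
sales = np.array([200, 180, 150, 120, 100])       # units sold (made up)
model = LinearRegression().fit(price, sales)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("predicted sales at price 16:", model.predict([[16]])[0])
```
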
Logistic regression
We will learn about the regression method used when the dependent variable is binary, i.e. it can only take two values (0 or 1). When a threshold value is introduced, this method can be used for classification.
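
And the logistic counterpart, with the threshold made explicit (synthetic data):

```python
# Logistic regression: probabilities above the 0.5 threshold become class 1.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([0, 0, 0, 1, 1, 1])
clf = LogisticRegression().fit(X, y)
proba = clf.predict_proba([[3.5]])[0, 1]
print("P(y=1 | x=3.5) =", round(proba, 3), "-> class", int(proba >= 0.5))
```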

Life on the command line: How text-based interfaces can improve data processing
Command-line interfaces are a powerful but often intimidating environment. We will dispel those fears and show the magnificent things that can be done in the blocky world of terminals.
Intro to Neurosymbolic AI
We will review common approaches to Neurosymbolic AI, and will go through specific examples using Scallop library.

Solution Discovery Framework: Seven Product Dimensions
This lecture introduces the Solution Discovery process as a structured, collaborative approach to identifying the right problems to solve and shaping solutions that deliver real business and user value. It focuses on early-stage discovery activities that help teams move beyond assumptions toward shared understanding and informed decision-making. As a practical foundation, the lecture outlines the Seven Product Dimensions framework developed by EBG Consulting, which provides a holistic lens for exploring solutions across users, actions, interfaces, data, controls, environments, and quality attributes. By applying this framework, students will learn how to ask the right discovery questions, balance functional and nonfunctional considerations, and align customer, business, and technology perspectives to design well-rounded, value-driven solutions.

Working with small and not so small data in spreadsheets
The lecture explores practical approaches to working with small and large datasets in spreadsheet environments, focusing on efficiency, accuracy, and scalability. It examines the differences between handling small datasets, where simplicity, readability, and manual control are priorities, and working with larger datasets, which require structured data organization and optimization techniques.

Introduction to financial mathematics
Finance. Subject of financial mathematics. Interest rate theory, cash flows. Calculation of loan and deposit contracts using spreadsheets.
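
A minimal Python counterpart to the spreadsheet calculation: the standard annuity formula for a fixed monthly loan payment (the loan terms are illustrative):

```python
# Fixed monthly payment on a loan of principal P at monthly rate r over
# n months: P * r / (1 - (1 + r) ** -n).
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12  # monthly interest rate
    n = years * 12        # number of payments
    return principal * r / (1 - (1 + r) ** -n)

print(round(monthly_payment(100_000, 0.07, 20), 2))  # ~775.30
```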

Computational Mechanics
Computational Mechanics is a modern interdisciplinary field that integrates mathematics, engineering, information technologies, and computer-based tools for the modeling and analysis of real mechanical processes and systems. The presentation will examine the role of computer simulations and numerical methods in the design, verification, and optimization of engineering solutions, and will demonstrate the importance of computational mechanics as a powerful tool of modern science and industry, opening new opportunities for the creation of reliable, efficient, and technologically advanced systems.
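
A minimal simulation in the spirit of the talk: a mass-spring oscillator integrated with the explicit Euler method (parameters are illustrative):

```python
# Explicit Euler integration of a frictionless mass-spring oscillator.
m, k = 1.0, 4.0   # mass and spring stiffness
x, v = 1.0, 0.0   # initial displacement and velocity
dt, steps = 0.001, 5000

for _ in range(steps):
    a = -k / m * x  # Newton's second law with Hooke's force F = -kx
    v += a * dt
    x += v * dt

# Analytic solution is cos(2t), so x(5) should be near cos(10) ~ -0.839;
# explicit Euler drifts slightly, which is itself an instructive result.
print(f"x(t=5) ~ {x:.3f}")
```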

Azure 101
What the cloud is and what it is for
Types of Clouds
Microsoft Azure basics
Basic Azure services
Azure Virtual Machine
Azure Storage Account
Learning paths in Azure

Working on big products or ambitious startups? The cost of mistakes and responsibility
• How to work in a team with dozens of developers (processes, Code Review, CI/CD).
• Why changes in enterprises (such as Nova Poshta or Kyivstar) are implemented slowly but reliably.
• The dynamics of taxi service development: working with maps, real-time geolocation, and the “dirty” reality of GPS.
• The “Air Alarm” case study.

How to rock your AI project

Emotional Intelligence in IT: Why Hard Skills Are Not Enough for Career Growth
In today's IT landscape, project success depends as much on team synergy as it does on clean code. Drawing from real-world cases at Stfalcon, we will explore how Emotional Intelligence helps engineers navigate crises, communicate effectively with clients, and accelerate their career growth. You will discover why EQ is the "invisible code" that transforms a talented specialist into a true tech leader.

Data Analysis and Visualization in Power BI
A practical workshop dedicated to the basics of data analysis and visualization in Power BI. Participants will learn about the key stages of working with data, the capabilities of Power BI and Microsoft Fabric, and see how to build a report in a few steps and speed up analytics with Microsoft Copilot.

Algebraic Codes and Automata
The lecture will be devoted to the algebraic theory of codes. We present some algebraic codes, methods of defining them, and the maximality of certain types of codes.
Languages, Grammars and Automata
The Chomsky classification of formal languages will be presented, along with different methods of defining formal languages; in particular, we shall discuss defining formal languages by means of automata.

Modern algebra in cryptography: theory, methods, and applications
This lecture explores the role of general algebra in cryptography, focusing on key concepts such as groups, fields, linear spaces, and elliptic curves. Theoretical foundations are connected to practical cryptographic methods used in real-world systems. The lecture explains how algebraic methods are used to generate cryptographic keys, create digital signatures, and secure blockchain transactions.
Special attention is given to elliptic curve cryptography and its applications in cryptocurrencies and digital transaction security.
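
A toy illustration of the algebra behind elliptic curve cryptography: point addition on a small curve over Z_97 (the curve and prime are illustrative and far too small for real security):

```python
# Point addition on the toy curve y^2 = x^3 + 2x + 3 over Z_97.
p, a = 97, 2

def ec_add(P, Q):
    """Add two curve points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                 # P + (-P) = O
    if P == Q:
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p)  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, p)         # chord slope
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

P = (3, 6)           # on-curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
print(ec_add(P, P))  # point doubling -> (80, 10)
```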

ML/LLM for CyberSecurity

AI Security Overview: Threat Landscape and Defense Layers

Competitive programming as a tool for algorithmic thinking
The talk will examine competitive programming as an approach to developing algorithmic thinking. It will explain its essence and practical value as an effective way of solving problems. Examples will be used to show how different approaches to the same problem affect the quality and effectiveness of the solution.
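
The kind of contrast the talk has in mind, sketched on a toy problem: does any pair in a list sum to a target?

```python
# Two solutions to the same problem: quadratic brute force vs a linear scan.
def has_pair_quadratic(nums, target):
    return any(
        nums[i] + nums[j] == target
        for i in range(len(nums))
        for j in range(i + 1, len(nums))
    )

def has_pair_linear(nums, target):
    seen = set()
    for x in nums:
        if target - x in seen:  # complement already encountered
            return True
        seen.add(x)
    return False

nums = list(range(2_000))
print(has_pair_quadratic(nums, 3_997), has_pair_linear(nums, 3_997))
```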

Social Engineering and Phishing: Human Factors in Modern Cyberattacks
Evolution of social engineering attacks from classic fraud schemes to modern digital phishing campaigns.
Psychological principles behind social engineering: trust, authority, urgency, fear, and curiosity.
Classification of social engineering attacks: phishing, spear phishing, whaling, vishing, smishing, pretexting, baiting.

What does it take to make a successful game?

Software Development. Software and System architecture principles

Web development trends in 2026
In 2026, web development will focus not only on creating websites, but also on combining speed, artificial intelligence, and business efficiency. In this lecture, we will look at the key technological trends that shape modern websites and web products: AI tools, new approaches to front-end development, performance, and UX. The lecture will be useful for developers and designers who want to be competitive in the digital environment.

Gamification Beyond Games: Using Game Mechanics Outside Entertainment
This report is dedicated to the phenomenon of gamification—the use of game mechanics (points, ratings, achievements) in non-gaming contexts to increase user engagement. We will look at real-life cases of gamification in education, business, cybersecurity, and product development, as well as how AI personalizes the gaming experience. Learn how to turn code into an engaging experience that changes the behavior of millions of users!

LLM-based Code Migration Technology

Finding the Core Product Vulnerability: Cybersecurity Beyond Code and Architecture

Using the OAuth and OpenID Connect Protocols for Distributed Authorization and Authentication

The Power of Teamwork: How Effective Collaboration Turns Ideas into Successful Projects

JavaScript outside the browser: Using the Node.js environment to execute JavaScript code
This presentation explores JavaScript's transition from a client-side scripting tool to a powerful server-side execution environment. By examining the fundamental architecture of Node.js, the presentation explains how the event loop and non-blocking I/O enable the creation of efficient server systems. Participants will gain a basic understanding of the ecosystem, including package management and the development of scalable web services.

Mastering Prompt Engineering: Practical Techniques for Large Language Models (LLMs)
We will look at key prompt engineering techniques that can significantly improve the quality of language model responses without retraining. Few-shot, Step-back, Chain of Thought, and Tree of Thoughts help manage the model's thinking logic and effectively solve complex tasks. Function Calling expands the capabilities of LLMs by allowing them to be integrated with real-world systems and business processes. Combining these approaches makes OpenAI models more accurate, reliable, and ready for use in production.
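
A hedged sketch of a few-shot prompt with the openai Python client (the model name is illustrative, and an API key is assumed to be set in the environment):

```python
# Few-shot prompting: demonstration pairs precede the real query so the
# model infers the task format without any retraining.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": "Classify sentiment as positive or negative."},
        {"role": "user", "content": "The food was great."},    # example 1
        {"role": "assistant", "content": "positive"},
        {"role": "user", "content": "The service was slow."},  # example 2
        {"role": "assistant", "content": "negative"},
        {"role": "user", "content": "Loved the atmosphere!"},  # real query
    ],
)
print(response.choices[0].message.content)
```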

Managing Requirements: How to Understand Your Client and Deliver the Right Product
The topic focuses on ways to clearly define and verify customer needs before and during development. It emphasizes the importance of constant communication, feedback, and prioritization of requirements. This helps reduce risks, rework, and discrepancies between the team and the customer.

Security Requirements as Part of Project Defense
This lesson introduces security requirements as an essential part of project protection. Students learn why security must be considered at all stages of development, from planning to deployment. The lesson covers basic types of security requirements and common risks they help prevent. Practical examples show how security requirements protect data, systems, and users.

From Concept Art to Game Engine: The 3D Asset Production Pipeline in AAA Games and Protection of Creative Data
This lecture introduces the full lifecycle of 3D asset creation, from concept art and modeling to texturing and integration. Alongside the technical and artistic aspects, the lecture explores why creative data is highly valuable and how game studios protect assets, builds, and internal tools from leaks and unauthorized distribution. Real-world examples of recent industry leaks are discussed, together with common preventive practices and secure work environments used in on-site and remote production. This lecture explores how creativity, technology, and responsible data handling come together in modern AAA game development.

A career without a perfect plan: how to get started and not burn out
How not to burn out at the start and what really helps you grow. A career path rarely looks like a clear plan with bullet points. Volodymyr will share his real-life experience of starting out without a perfect strategy.

Text Processing in Cybersecurity: Extracting Insights from Logs, Natural Text, and Code

AI is becoming part of how we think, work, and decide every day — but what makes AI truly trustworthy?
This session breaks down the core principles of Trustworthy AI into clear, practical concepts that apply to anyone who builds, deploys, or uses AI systems. It focuses on transparency, explainability, fairness, robustness, and privacy, showing how these principles translate into real-world practices.
You’ll gain practical insights on how to move from assumption to verification, helping ensure AI systems are reliable, responsible, and safe for both individuals and organizations.

IT 2026: Communication, Organization, and Teamwork

Deception Technology: Beyond Honeypots, Detecting Attackers In Your Infrastructure
An introduction to deception techniques that detect attacks in the most unexpected ways, using unconventional tools. The talk showcases extremely simple yet effective mechanisms that remain valuable even when traditional defenses have failed.

What am I showing online?! The basics of OSINT analysis
The topic introduces the basics of OSINT — collecting and analyzing open-source information online. It explores how ordinary online posts can reveal personal data and provides essential tips for digital hygiene to protect privacy.

Hardware and hardware-software systems play a crucial role in modern computing processes, ensuring efficiency and flexibility. The focus is on integrating legacy technologies and combining hardware solutions with software to improve performance and compatibility.

Creation of a Multi-Circuit Intelligent Cyber Defense System
The multi-circuit intelligent cyber defense system provides an assessment of the cyber security of critical infrastructure facilities and offers a wide range of implementation options, covering both public and private structures that ensure the functioning of critical systems. The versatility of the system is due to its ability to adapt to various industries, such as energy, transport, water supply, healthcare, and financial institutions. Such systems combine the social, cybernetic, and physical components of the security of socio-cyber-physical systems.

Statistical data analysis in Excel and Python
We consider some simple games related to probability theory and statistics.
Using Excel, Python, and Jupyter Notebook, we make predictions about their results.
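
A simple game of this kind in a few lines: roll two dice many times and compare the simulated chance of a total of 7 with the exact value 6/36 ≈ 0.1667:

```python
# Monte Carlo estimate of P(sum of two dice == 7).
import random

trials = 100_000
sevens = sum(
    random.randint(1, 6) + random.randint(1, 6) == 7 for _ in range(trials)
)
print(sevens / trials)  # close to 6/36 ~ 0.1667
```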

Geospatial Intelligence and Analytics (GEOSINT)
Geospatial intelligence and analytics (GEOSINT) provides comprehensive analysis of spatiotemporal data to support management and operational decision-making in security, defense, and civil administration. It has a wide range of practical applications, covering both public and private structures whose activities involve monitoring territories, infrastructure, and natural processes. GEOSINT combines the informational, technological, and spatial components of analytics within a single framework, which significantly increases the effectiveness of threat detection, risk forecasting, and the resilience of critically important objects and territories.
