CSIT - 2024

Recent submissions

Now showing 1 - 20 of 20
  • Document
    Method for improving the performance of convolutional neural networks using an accelerator
    (Хмельницький національний університет, 2024) Isaiev, T.; Kysil, T.
    The effectiveness of convolutional neural networks (CNNs) has been demonstrated across various fields, including computer vision, natural language processing, medical imaging, and autonomous systems. However, achieving high performance in CNNs is not only a matter of model design but also of optimizing the training and inference processes. Using accelerators like the Google Coral TPU provides significant improvements in both computational efficiency and overall model performance. This paper focuses on the integration of the Coral TPU to enhance CNN performance by speeding up computations, reducing latency, and enabling real-time deployment. Training deep learning models, particularly CNNs, is computationally intensive. Traditional CPUs or GPUs can take hours or even days to train large networks on complex data. The accelerator offloads these intensive tasks, allowing the host machine to focus on other operations and making training more efficient. This enables researchers to experiment with multiple architectures and hyperparameters within shorter cycles, thereby improving the model's accuracy and robustness. CNNs are widely deployed in edge computing scenarios where real-time predictions are critical, such as in robotics, autonomous vehicles, and smart surveillance systems. Unlike traditional cloud-based solutions, where models are executed remotely and suffer from network delays, the Coral TPU ensures low-latency predictions directly on the device, making it ideal for time-sensitive applications. Another key advantage of using accelerators like the Coral TPU is the ability to efficiently handle optimized and lightweight models. These optimized models are well-suited for the Coral TPU’s architecture, allowing developers to deploy high-performing networks even on resource-constrained devices. The TPU’s ability to handle quantized models with minimal loss in accuracy further enhances the CNN’s practical usability across various domains.
The Coral TPU is designed to minimize power consumption, making it an ideal solution for battery-powered or energy-constrained devices. This energy efficiency ensures that CNNs can run continuously on devices like drones, IoT sensors, or mobile platforms without exhausting their power supply. Additionally, the scalability of the TPU makes it easy to deploy multiple accelerators in parallel, further improving throughput for applications that require processing high volumes of data, such as real-time video analysis. The Coral TPU also facilitates on-device learning, where models can be incrementally updated based on new data without requiring a full retraining session. This feature is particularly useful in dynamic environments, such as autonomous vehicles or security systems, where the model needs to adapt quickly to new conditions. With the TPU handling the computational workload, CNNs can be fine-tuned on the device, ensuring they remain accurate and responsive over time.
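The abstract's point about quantized models losing little accuracy can be made concrete. Below is a minimal, dependency-light sketch of affine int8 quantization of the kind Edge TPU-class accelerators rely on; the tensor values, function names, and range handling are illustrative assumptions, not part of the Coral toolchain:

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Affine-quantize a float tensor to int8; return (q, scale, zero_point)."""
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / 255.0 or 1.0   # fall back to 1.0 for a constant tensor
    zero_point = int(round(-128 - lo / scale))
    q = np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Map int8 codes back to approximate float values."""
    return (q.astype(np.float32) - zero_point) * scale

weights = np.array([-0.5, 0.0, 0.25, 1.0], dtype=np.float32)  # illustrative tensor
q, s, z = quantize_int8(weights)
restored = dequantize(q, s, z)
print(np.max(np.abs(weights - restored)))  # quantization error stays below one step (= scale)
```

Because the round-trip error is bounded by the quantization step, a well-calibrated int8 model typically tracks its float counterpart closely, which is what makes 8-bit deployment on such accelerators practical.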
  • Document
    The concept of AI-based information systems for the analysis of learning foreign words
    (Хмельницький національний університет, 2024) Pavlova, O.; Kozyra, A.
    In the modern world, information systems based on artificial intelligence (AI) are increasingly used to automate learning and improve the educational process. One of the promising areas of AI application is foreign language learning, particularly vocabulary acquisition. By integrating AI components, specifically those utilizing machine learning algorithms to analyze large volumes of data and provide automated recommendations to enhance the learning process, users gain constant access to self-assessment tools and automatic adjustment of cognitive workload. This paper examines the key role and significance of information systems for analyzing foreign language vocabulary acquisition with the help of AI. It investigates the working principles of such systems, their advantages, and various strategies used to enhance the efficiency of language learning, aiming for optimal results in acquiring new linguistic knowledge and improving learning outcomes. Learning new foreign terms is often a challenging task for many students, leading to a loss of motivation or slow progress, highlighting the urgent need for solutions that enhance material retention. Building on AI-based information systems that adapt to individual users, a range of services and platforms for language learning has been developed worldwide. These systems function by analyzing user behavior and success on the basis of specific indicators and metrics, whose numerical values are interpreted to identify patterns and correlations between user behavior and learning outcomes. The advantages of AI-based information systems for language learning are significant, offering an objective, reliable method for assessing learning achievements and eliminating the need for human intervention in many cases. Data collected by these systems serve as a valuable resource for analyzing user productivity, detecting common mistakes, creating effective study plans, and more.
However, it's important to note that AI has not yet reached the level of understanding semantics or the cultural and historical nuances of certain words, complicating the implementation of more comprehensive functionality for evaluating and adjusting the learning process. This requires developers to prepare additional data through proprietary sources or gain useful input from user interactions with the system.
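As a sketch of how such a system might adjust cognitive workload from user success data, here is a minimal Leitner-style review scheduler: the interval doubles after each correct recall and resets after a failure. The class, field names, cap, and policy are illustrative assumptions, not the system described in the paper:

```python
from dataclasses import dataclass

@dataclass
class WordCard:
    word: str
    interval_days: int = 1   # days until the next review
    streak: int = 0          # consecutive correct recalls

def schedule_next(card: WordCard, recalled: bool) -> WordCard:
    """Leitner-style update: double the interval on success, reset on failure."""
    if recalled:
        card.streak += 1
        card.interval_days = min(card.interval_days * 2, 60)  # cap at ~2 months
    else:
        card.streak = 0
        card.interval_days = 1
    return card

card = WordCard("die Nachhaltigkeit")
for outcome in (True, True, False, True):   # simulated recall history
    schedule_next(card, outcome)
print(card.interval_days, card.streak)  # → 2 1
```

Real systems refine this policy with per-user difficulty estimates, but even this simple rule spaces easy words out and keeps hard words in frequent rotation.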
  • Document
    Method of creating custom dataset to train convolutional neural network
    (Хмельницький національний університет, 2024) Isaiev, T.; Kysil, T.
    The task of creating and developing custom datasets for training convolutional neural networks (CNNs) is essential due to the increasing adoption of deep learning across industries. CNNs have become fundamental tools for various applications, including computer vision, natural language processing, medical imaging, and autonomous systems. However, the success of a CNN depends heavily on the quality and relevance of the data it is trained on. The datasets used to train these models must be diverse, representative of the task at hand, and of sufficient quality to capture the underlying patterns that the CNN needs to learn. Thus, building custom datasets that align with the specific objectives of a neural network plays a critical role in enhancing the performance and generalization capability of the trained model. This paper focuses on developing a method and subsystem for generating high-quality custom datasets tailored to CNNs. The aim is to provide a framework that automates and streamlines the processes involved in data collection, preprocessing, augmentation, annotation, and validation. Moreover, the method integrates tools that allow the dataset to evolve over time, incorporating new data to adapt to changing requirements or environments, making the system flexible and scalable. The process of creating a dataset begins with the acquisition of raw data. The data can come from various sources such as images from cameras, videos, sensor feeds, open data repositories, or proprietary datasets. A key consideration during data collection is ensuring that the samples cover the full range of conditions or classes the CNN will encounter in production. For example, in an object recognition task, it is essential to collect images from diverse environments, lighting conditions, and angles to train the model effectively. Ensuring variability in the dataset increases the model's ability to generalize, reducing the risk of poor performance on unseen data. 
Data augmentation is a critical step in building a robust dataset, particularly when the size of the dataset is limited. Augmentation techniques introduce variability into the dataset by artificially modifying the existing samples, thereby simulating a wider range of conditions. This helps the CNN generalize better and prevents overfitting. In essence, it allows the model to experience different perspectives and distortions of the same data, strengthening its adaptability to real-world scenarios. Annotation involves labeling the data samples with the correct class or category information. Depending on the task, annotations may include bounding boxes for object detection, segmentation masks for semantic segmentation, or class labels for classification tasks. The importance of well-annotated data cannot be overstated, as CNNs rely on this labeled information to understand the relationships between input data and the desired output predictions.
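The augmentation step described above can be sketched with plain NumPy; the specific transforms and their parameters (flip, rotation, brightness factor, noise level) are illustrative choices, not the paper's subsystem:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Return simple augmented variants of an H×W×C image array."""
    variants = [
        np.fliplr(image),                                         # horizontal flip
        np.rot90(image),                                          # 90° rotation
        np.clip(image * 1.2, 0, 255),                             # brightness increase
        np.clip(image + rng.normal(0, 8, image.shape), 0, 255),   # Gaussian noise
    ]
    return [v.astype(image.dtype) for v in variants]

img = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)  # synthetic image
batch = augment(img)
print(len(batch), batch[0].shape)  # → 4 (32, 32, 3)
```

In practice each variant inherits the label of its source image, so a dataset of N annotated samples yields 5N training examples here at no additional labeling cost.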
  • Document
    Information system for Earth’s surface temperature forecasting using machine learning technologies
    (Хмельницький національний університет, 2024) Hovorushchenko, T.; Alekseiko, V.; Shvaiko, V.; Ilchyshyna, J.; Kuzmin, A.
    Temperature forecasting is a topical issue in many areas of human life. In particular, climate change directly affects agriculture, energy, infrastructure, health care, logistics, and tourism. Anticipating future changes makes it possible to prepare better for challenges and minimize risks. The paper presents an information system for forecasting the temperature of the Earth’s surface using machine learning technologies. The forecast is produced by a model adapted to the region, trained on historical data to capture its most characteristic patterns. The selection and training of the model were carried out on the basis of an analysis of the characteristics of climatic zones according to the Köppen classification. A comparison of the performance of models for forecasting the average monthly temperatures of the Earth’s surface in different climatic zones was carried out. The analysis of scientific publications confirmed the relevance of the chosen research topic. Modern approaches to forecasting climatic indicators are considered. Methods and approaches to temperature forecasting, their advantages and disadvantages are analyzed. The peculiarities of the application of machine learning methods for temperature forecasting are considered, and the criteria for choosing the most accurate and least energy-consuming methods are determined. The research results made it possible to identify machine learning methods that best adapt to temperature patterns and allow accurate short-term forecasting. An approach to long-term forecasting using recurrent neural networks is proposed. An information system has been developed for forecasting future temperatures depending on the climatic features of the studied territories, based on the proposed methods. A concept for further research on the development and improvement of the developed information system has been formed.
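As an illustration of learning temperature patterns from historical data, here is a minimal autoregressive least-squares forecaster that predicts the next value from the preceding ones; the lag count and the synthetic warming series are assumptions for demonstration, not the paper's model:

```python
import numpy as np

def fit_ar_forecaster(series: np.ndarray, lags: int = 3) -> np.ndarray:
    """Fit ordinary-least-squares weights predicting value t from the previous `lags` values."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    X = np.column_stack([X, np.ones(len(X))])   # bias column
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def forecast_next(series: np.ndarray, w: np.ndarray, lags: int = 3) -> float:
    """One-step-ahead forecast from the last `lags` observations."""
    x = np.append(series[-lags:], 1.0)
    return float(x @ w)

# synthetic monthly temperatures with a linear warming trend (illustrative data)
t = np.arange(48)
temps = 10 + 0.05 * t
w = fit_ar_forecaster(temps)
print(round(forecast_next(temps, w), 2))  # → 12.4, continuing the trend
```

Recurrent neural networks, as proposed in the paper for long-term horizons, generalize this idea by learning nonlinear dependencies over much longer input windows.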
  • Document
    Decision-making support system regarding the optimization process of crop cultivation using remote sensing data
    (Хмельницький національний університет, 2024) Okrushko, D.; Pavlova, O.
  • Document
    Method for interpreting decisions made by deep learning models
    (Хмельницький національний університет, 2024) Slobodzian, V.; Barmak, O.
    The use of artificial intelligence (AI) in medical diagnostics opens new opportunities for analyzing complex medical images and optimizing diagnostic processes. One of the key challenges remains the interpretation of results obtained through AI systems, particularly in medical practice, where ensuring transparency and clarity of decision-making is critically important. This study proposes a method for visualizing and interpreting the results of cardiac disease classification based on MRI image analysis using deep learning models. The primary goal of the research is to explain AI-driven decisions in a convenient and understandable format for physicians, contributing to the reduction of subjectivity in clinical practice. During the research, approaches were developed for visualizing key groups of medical indicators, such as heart volumes, ejection fraction, myocardial wall thickness, and volume-to-mass ratios. The study describes numerical metrics commonly used in medical practice. Fifteen key medical metrics were identified and grouped into corresponding categories for effective representation of essential medical indicators. Various visualization forms were utilized to ensure intuitive data presentation: pie charts to demonstrate ratios, the 17-segment myocardial model for analyzing wall thickness, and numerical indicators for accurately displaying volumes and ejection fraction. This approach allows physicians to quickly assess structural changes in the heart and make informed conclusions. The proposed method aims to enhance transparency and trust in AI by providing comprehensible data representation, reducing the risks of subjective interpretation and cognitive biases. The results indicate that using such visualizations can significantly facilitate clinical decision-making, improve diagnostic accuracy, and standardize approaches to medical data analysis.
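Two of the numerical metrics mentioned above reduce to simple formulas that are easy to present alongside a visualization. The sketch below computes ejection fraction and a volume-to-mass ratio; the input values are illustrative, not data from the study:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """Ejection fraction in %, from end-diastolic and end-systolic volumes (ml)."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

def volume_to_mass_ratio(edv_ml: float, myocardial_mass_g: float) -> float:
    """End-diastolic volume per gram of myocardial mass (ml/g)."""
    return edv_ml / myocardial_mass_g

# illustrative values for a left ventricle
ef = ejection_fraction(edv_ml=120.0, esv_ml=50.0)
print(round(ef, 1))  # → 58.3, within the commonly cited normal range of roughly 50–70 %
```

Displaying such derived values next to the model's classification output gives the physician a familiar clinical anchor for judging whether the AI's decision is plausible.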
  • Document
    Overview of the methods and tools for environmental components monitoring
    (Хмельницький національний університет, 2024) Hovorushchenko, T.; Bachuk, V.; Hnatchuk, Y.; Zasornova, I.; El Bouhissi, H.
    Monitoring of environmental components is an important process for determining the level of pollution and tracking changes in the environment, and plays a key role in ensuring the health and comfort of residents, as well as in preserving the environment. Continuous monitoring of environmental components is key to ensuring human health, protecting nature, reducing the negative impact on the climate and ecosystems, and achieving sustainable development. In order to combat environmental pollution, it is necessary to implement effective measures to limit emissions of harmful substances, use environmentally friendly technologies and green solutions in all sectors of the economy, and raise public awareness of the problem of environmental pollution. The analysis of the reviewed sources revealed a pattern: the information technologies mainly used to monitor environmental components are either Internet of Things (IoT) technologies using modern sensors and data transmission components, or artificial intelligence technologies such as computer vision. Less commonly, the use of robots, UAVs, and digital twins is observed. Based on a critical analysis of methods and tools for environmental components monitoring, there is a need to develop methods and tools that would: perform cheaper and more versatile environmental monitoring than existing analogues, while offering no less accuracy and speed; monitor the state of the environment; identify sources of environmental pollution; warn of environmental disasters; assess the state of natural resources; support environmental decision-making; collect and analyze various environmental indicators in real time; and assess the level of quality and safety of environmental components, allowing an immediate response to quality changes and prompt adoption of the necessary measures. These capabilities will be the focus of the authors' further efforts.
  • Document
    Land surface temperature forecasting in the context of the development of sustainable cities and communities
    (Хмельницький національний університет, 2024) Hovorushchenko, T.; Alekseiko, V.
    The article examines the aspects of land surface temperature forecasting for effective planning and development of sustainable cities and communities. The relevance of the research lies in the need to develop effective approaches to the analysis and forecasting of climate data, for the timely determination of existing problems and ways to solve them, in accordance with civilizational challenges. The trends of changes in the land surface temperature from the middle of the 20th to the beginning of the 21st century were analyzed, using the example of five megacities located in different regions: Tokyo (Japan), Lagos (Nigeria), Berlin (Germany), Singapore (Singapore) and Belo Horizonte (Brazil). The results of the analysis of changes in average monthly and average annual temperatures are presented. The factors affecting the formation of the temperature regime of each of the cities are determined. The role of urbanization as a key factor in the development of the city is described, and the main challenges caused by it are considered. An overview of megacities from the point of view of sustainable development was carried out. Prospects for urban development and measures aimed at reducing the urban heat island effect are considered. The role of modern technologies, in particular machine learning for predicting the land surface temperature, is described. The expediency of land surface temperature forecasting, in order to implement strategies for mitigating negative consequences and achieving the goals of sustainable development, is substantiated. The conducted research confirms the need for a responsible approach to the design and development of sustainable cities and communities. The temperature of the Earth’s surface is one of the key indicators that makes it possible to monitor the main trends of urban life, and it can serve as an indicator of the effectiveness of strategies used to increase comfort and sustainable development.
  • Document
    Method for creating SVM classifier for data analysis on FPGA
    (Хмельницький національний університет, 2024) Lysenko, S.; Shpuliar, Y.
    The paper explores the use of the SVM classifier method for data analysis on FPGAs, which, despite its effectiveness, may face challenges related to limited resources and data processing speed. In this context, there is a need to develop new methods for integrating SVM classifiers with high-performance computing hardware. The increasing demand for speed and energy efficiency requires new approaches to implementing machine learning methods. One of the key tools for data classification and analysis is the Support Vector Machine (SVM), widely used in business, science, medicine, and many other fields. Developing an efficient and optimized method for creating SVM classifiers for FPGAs requires further research and development, as existing methods may be suboptimal in terms of speed and FPGA resource utilization. The article provides an overview of known hardware solutions to this problem proposed in the current scientific literature. Additionally, the effectiveness of combining hardware and software components to achieve significant acceleration of the data analysis process is discussed. The article emphasizes the need for further research and improvement to fully realize the transformative potential of machine learning classification methods. The work resulted in the development of a new, specialized, and optimized FPGA-based hardware accelerator for the Support Vector Machine (SVM) method using convex optimization (CO) on embedded platforms. The proposed embedded architectures are designed to be universal, parameterized, and scalable. This means that these embedded solutions can accommodate datasets of varying sizes and can be implemented on various embedded platforms, including those equipped with the latest FPGAs. They are also capable of handling both linear and nonlinear discrimination across multidimensional datasets.
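The integer-only arithmetic that makes SVM inference attractive on FPGAs can be sketched in software. The fragment below evaluates a linear SVM decision function, sign(w·x + b), entirely in fixed-point integers, as an FPGA multiply-accumulate datapath would; the Q8 format, weights, and bias are illustrative assumptions, not the accelerator proposed in the paper:

```python
import numpy as np

FRAC_BITS = 8                 # Q8 fixed point: real value = integer / 2**8
SCALE = 1 << FRAC_BITS

def to_fixed(x) -> np.ndarray:
    """Convert float values to Q8 fixed-point integers."""
    return np.round(np.asarray(x) * SCALE).astype(np.int32)

def svm_decide_fixed(w_q: np.ndarray, b_q: int, x_q: np.ndarray) -> int:
    """Integer-only linear SVM decision: sign(w·x + b)."""
    acc = int(np.dot(w_q.astype(np.int64), x_q.astype(np.int64)))  # Q16 accumulator
    acc += b_q << FRAC_BITS                                        # align bias to Q16
    return 1 if acc >= 0 else -1

w, b = np.array([0.75, -1.25]), 0.5   # illustrative trained parameters
x = np.array([2.0, 1.0])              # sample to classify
print(svm_decide_fixed(to_fixed(w), int(to_fixed(b)), to_fixed(x)))  # → 1
```

Because only the sign of the accumulator matters, fixed-point rounding rarely flips the decision, which is why such datapaths map well onto FPGA DSP blocks without floating-point units.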
  • Document
    The concept of an information system for forecasting the temperature regime of the Earth’s surface based on machine learning
    (Хмельницький національний університет, 2024) Pavlova, O.; Alekseiko, V.
    The paper presents the concept of an information system for forecasting the temperature regime of the Earth’s surface using machine learning. Forecasting is based on historical data for a specific area. In order to increase the accuracy of forecasting results, an analysis of the features of climate zones was carried out to identify patterns. A comparison of average monthly temperatures of the Earth’s surface in countries, depending on their location in climate zones, was carried out. The analysis of sources and scientific publications confirmed the relevance of the chosen research topic. Historical aspects of forecasting changes in climatic indicators are considered. Modern methods and approaches to temperature forecasting, their advantages and disadvantages are analyzed. An overview of the subject area was conducted, and the regularities of temperature changes according to climate features were determined. A comparison of temperature regimes for countries located in different climate zones was made. For clarity, graphs of temperature changes were plotted and average indicators were calculated for each climate zone. The results of the study confirm the need to adjust the temperature forecast for certain areas, taking into account their location in a specific climate zone. The revealed regularities in the temperature regimes of the countries indicate the need for an individual approach to forecasting and the use of machine learning methods that are best adapted to the dependencies observed in the climate zone. The architecture of the information system for forecasting future temperatures depending on the climatic features of the studied territories is proposed. A concept has been formed for further research to find more accurate and effective approaches to predicting climate parameters and achieving the goals of sustainable development.
  • Document
    Model of the process for ensuring fault tolerance in Internet of Things networks
    (Хмельницький національний університет, 2024) Nicheporuk, A.; Dariychuk, O.; Danchuk, S.
    The Internet of Things is a concept that describes a network of physical objects equipped with embedded technologies, allowing them to collect and exchange data over the Internet. The main idea is to connect various devices and objects around us so that they can collaborate and interact with each other without direct human intervention. However, this carries certain risks: failures in such systems can have serious consequences, including potentially fatal events. Therefore, the reliability of IoT systems becomes critical in many areas, especially where safety is a priority. The problems hindering the resolution of this issue are primarily related to the heterogeneity of the IoT environment and the lack of communication about failures and malfunctions between network elements. As a result, the time to detect errors in such networks is quite long. Thus, the aim of this work is to reduce the time to detect malfunctions in IoT networks by modeling, designing, and implementing a fault tolerance system for such networks. The paper presents a model for providing fault tolerance in an Internet of Things network, describes key concepts, entities and connections, and also defines the main stages and processes involved in providing fault tolerance. This model is the basis of the functioning of the fault tolerance system. The concept of a fault tolerance system that integrates into existing Internet of Things networks is proposed. The concept of fault tolerance agents is introduced; these agents form the basis of the fault tolerance system and communicate with each other to exchange information about the occurrence of a fault. Two local fault tolerance mechanisms are proposed, which define the functionality of the agents.
To verify the effectiveness of error detection, experimental studies were conducted, including two error detection scenarios using two local fault tolerance mechanisms.
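A common local fault-detection mechanism of the kind such agents could implement is heartbeat monitoring: a node is flagged as failed when no heartbeat arrives within a timeout. The class, node IDs, and timeout below are an illustrative sketch, not the paper's mechanisms:

```python
class HeartbeatMonitor:
    """Flags a node as failed when no heartbeat arrives within `timeout` seconds."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_seen: dict[str, float] = {}

    def heartbeat(self, node_id: str, now: float) -> None:
        """Record that `node_id` was alive at time `now`."""
        self.last_seen[node_id] = now

    def failed_nodes(self, now: float) -> list[str]:
        """Return every known node whose last heartbeat is older than the timeout."""
        return [n for n, t in self.last_seen.items() if now - t > self.timeout]

mon = HeartbeatMonitor(timeout=5.0)
mon.heartbeat("sensor-1", now=0.0)
mon.heartbeat("sensor-2", now=0.0)
mon.heartbeat("sensor-1", now=4.0)   # sensor-2 goes silent
print(mon.failed_nodes(now=7.0))     # → ['sensor-2']
```

The detection latency of this scheme is bounded by the timeout, so the time-to-detect objective stated in the abstract translates directly into choosing a timeout short enough for the application, yet long enough to tolerate normal network jitter.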
  • Document
    Optimization of cyber-physical system parameters based on intelligent IoT sensors data
    (Хмельницький національний університет, 2024) Zasornova, I.; Fedula, M.; Rudyi, A.
    The optimization of parameters of cyber-physical systems (CPS) is studied, taking into account the computations, physical processes, and Internet of Things (IoT) components involved. The use of intelligent IoT sensors is crucial for collecting real-time data, which is necessary for enhancing the efficiency, reliability, and performance of CPS. Various methods of CPS parameter optimization are analyzed and categorized into model-based, data-driven, and hybrid approaches. The model-based approaches rely on mathematical models to describe CPS behavior and use optimization algorithms such as linear programming and evolutionary algorithms to predict system responses and optimize parameters. However, model-based approaches are limited when applied to complex systems with uncertain or dynamic behavior. The data-driven approaches are more suitable for complex cyber-physical systems: they utilize machine learning and data analytics techniques to extract patterns from sensor data, which are then used to adjust system parameters. The hybrid approaches combine elements of both model-based and data-driven methods. A method of cyber-physical system parameter optimization based on intelligent IoT sensor data processing is developed using a distributed neural network. The optimization problem is formulated with constraints on the system parameters. The neural network's mathematical model and learning algorithm are proposed. The performed research shows the importance of developing optimization methods for CPS parameters based on intelligent IoT sensor data, considering the evolving nature of IoT technology. Integrating intelligent sensors into CPS offers new opportunities for optimizing system performance but also presents challenges in data management and security that should be addressed in the future.
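The constrained-parameter optimization described above can be sketched with projected gradient descent: step along the negative gradient, then clip the parameters back into their allowed ranges. The quadratic objective, setpoints, and bounds are illustrative assumptions, not the paper's distributed neural network:

```python
import numpy as np

def optimize_params(objective_grad, p0, bounds, lr=0.1, steps=200) -> np.ndarray:
    """Projected gradient descent: descend the gradient, then project onto the box bounds."""
    p = np.array(p0, dtype=float)
    lo, hi = np.array(bounds).T
    for _ in range(steps):
        p -= lr * objective_grad(p)
        p = np.clip(p, lo, hi)   # enforce the parameter constraints
    return p

# illustrative objective: drive two controlled parameters toward sensor-derived setpoints
target = np.array([22.0, 0.6])               # e.g. temperature setpoint, load fraction
grad = lambda p: 2.0 * (p - target)          # gradient of ||p - target||²
p_opt = optimize_params(grad, p0=[30.0, 1.0], bounds=[(15.0, 28.0), (0.0, 0.8)])
print(np.round(p_opt, 2))  # converges to the setpoints, which lie inside the bounds
```

A learned model (such as the paper's neural network) would supply the gradient or a surrogate of it from sensor data; the projection step is what keeps every update feasible with respect to the stated constraints.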
  • Document
    Surveillance cyber-physical system as a part of the Internet of Vehicles
    (Хмельницький національний університет, 2024) Boiko, M.; Yatskiv, V.
    Integration of cyber-physical surveillance systems (CPSS) into the Internet of Vehicles (IoV) paradigm represents a transformative approach to enhancing transportation safety and efficiency. This article discusses the design, implementation, and application of CPSS as part of IoV ecosystems. Leveraging advancements in sensor technologies, communication protocols, and data analytics, CPSS within IoV enables real-time monitoring, analysis, and response to road conditions, incidents, and emergencies. Our research explores the architecture and functional capabilities of CPSS, including sensor deployment, data fusion, anomaly detection, and decision support mechanisms. We investigate the synergistic interaction between CPSS and IoV platforms, facilitating seamless data exchange, collaboration, and compatibility between automotive and infrastructural domains. Additionally, we discuss potential applications of CPSS in traffic management, law enforcement, and emergency response, emphasizing its role in enhancing transportation safety, optimizing resource allocation, and preventing congestion and accidents. Through empirical evaluations and case studies, we demonstrate the effectiveness, scalability, and societal impact of integrating CPSS into IoV ecosystems. This research contributes to the development of intelligent transportation systems and underscores the transformative potential of CPSS within the IoV context. This article explores the potential of Cyber-Physical Systems (CPS) in the realm of the Internet of Vehicles (IoV), particularly within the context of surveillance systems in the IoV network. It proposes an approach to designing CPS, examining existing technical and systemic solutions for their creation. The article delves into the network architecture, the system's time cost model, and the possible positioning of CPS within vehicles.
The aim of this article is to investigate the possibilities of utilizing CPS in the IoV sphere and to initiate a discussion on their implementation and potential benefits for the development of transportation infrastructure and road safety. This work opens up new perspectives for improving transportation systems and creating effective monitoring and control mechanisms, thereby promoting safer and more efficient use of transport.
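The abstract above lists data fusion and anomaly detection among the CPSS capabilities. As an illustration only (the paper's actual detection mechanism is not described here), a rolling z-score check over a stream of sensor readings is one minimal way such anomaly flagging can work:

```python
from collections import deque
import math

def make_anomaly_detector(window=20, z_threshold=3.0):
    """Flag readings whose z-score against a rolling window exceeds a threshold.

    Toy sketch for illustration; window size and threshold are assumed values.
    """
    history = deque(maxlen=window)

    def check(value):
        if len(history) >= 5:  # need a few samples before judging
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(value - mean) / std > z_threshold
        else:
            anomalous = False  # not enough history yet
        history.append(value)
        return anomalous

    return check

detector = make_anomaly_detector()
readings = [50, 51, 49, 50, 52, 50, 51, 120]  # last reading is a sudden spike
flags = [detector(v) for v in readings]       # only the spike is flagged
```

In a real CPSS, checks like this would run per sensor channel, with fused readings and domain-specific thresholds replacing the simple z-score.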
  • Документ
    Video repeater design concept for UAV control
    (Хмельницький національний університет, 2024) Pavlova, O.; Halytskyi, O.
This study examines the use of a video repeater for unmanned aerial vehicle (UAV) control, based on a comparative analysis of the scientific literature, existing methodologies, and available solutions. The comparison identified the external-repeater approach as the most promising, as it offers flexibility in selecting repeaters with superior performance and advanced capabilities. Notably, this approach allows a single repeater to serve multiple UAVs and makes it convenient to modify or upgrade the repeater without altering the UAV itself. Following component selection, an experimental prototype was designed to support empirical investigations, and a frequency transmission scheme was devised for quadcopter control using the repeater. This research advances UAV control systems by introducing an approach to video repeater integration that improves operational efficiency and adaptability across diverse operational settings. By demonstrating the efficacy of video repeater integration, the study lays the groundwork for future innovations in UAV capabilities, particularly in optimizing video transmission for improved situational awareness and mission effectiveness.
  • Документ
    Analysis of artificial intelligence based systems for automated generation of digital content
    (Хмельницький національний університет, 2024) Pavlova, O.; Kuzmin, A.
This paper examines contemporary challenges of integrating the APIs of generative artificial intelligence (AI) models into a unified information system for the automated generation of digital content. In the context of rapid advancements in AI technologies and the increasing demand for diverse and personalized digital content, the integration of API-based generative models emerges as a crucial driver of progress in this field. The research findings underscore the significance of incorporating API-based generative AI models into a unified system, marking a significant step towards automating digital content creation to meet modern market demands. By streamlining content generation workflows and reducing manual intervention, such integration holds promise for enhancing operational efficiency, scalability, and adaptability while fostering creativity and innovation. Furthermore, it creates opportunities to develop personalized and innovative solutions tailored to the needs and preferences of end users, enhancing user experiences and enabling content that resonates more effectively with target audiences across various domains. Our research team is committed to the practical implementation of this concept and to exploring its applicability across diverse domains. By continuing to refine and expand this integration, we aim to unlock new possibilities for automated content generation and to drive further innovation in digital content creation.
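The unified-system idea described above can be sketched briefly. The provider classes below are hypothetical stand-ins, not real AI SDK clients; a production system would wrap actual generative-model APIs behind the same kind of common interface:

```python
from abc import ABC, abstractmethod

class GenerativeProvider(ABC):
    """Common interface a unified content system could expose over different AI APIs."""
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class EchoTextProvider(GenerativeProvider):
    # Hypothetical stand-in for a real text-generation API client.
    def generate(self, prompt: str) -> str:
        return f"[text] {prompt}"

class EchoImageProvider(GenerativeProvider):
    # Hypothetical stand-in for an image-generation client; returns a fake asset tag.
    def generate(self, prompt: str) -> str:
        return f"[image-asset] {prompt}"

class ContentPipeline:
    """Routes generation requests to registered providers by content type."""
    def __init__(self):
        self._providers: dict[str, GenerativeProvider] = {}

    def register(self, content_type: str, provider: GenerativeProvider) -> None:
        self._providers[content_type] = provider

    def create(self, content_type: str, prompt: str) -> str:
        if content_type not in self._providers:
            raise KeyError(f"no provider for {content_type!r}")
        return self._providers[content_type].generate(prompt)

pipeline = ContentPipeline()
pipeline.register("text", EchoTextProvider())
pipeline.register("image", EchoImageProvider())
article = pipeline.create("text", "summary of CSIT 2024")
```

The design choice illustrated here is that swapping or upgrading a provider behind the shared interface leaves the rest of the content-generation workflow untouched.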
  • Документ
    Subsystem for monitoring atmospheric air quality in the cyberphysical system "Smart City"
    (Хмельницький національний університет, 2024) Hovorushchenko, T.; Baranovskyi, V.; Hnatchuk, A.; Ivanov, O.
The task of designing and developing a cyber-physical system "Smart City" is currently relevant for Ukraine. This study is devoted to the development of a method and subsystem for monitoring atmospheric air quality in the cyber-physical system "Smart City". The article develops a method for monitoring atmospheric air quality, which forms the basis for effective monitoring of atmospheric air quality in the cyber-physical system "Smart City" and supports informed decisions on warning residents about hazards, with recommendations for protecting their health. The developed subsystem for monitoring atmospheric air quality in the cyber-physical system "Smart City" collects data from installed sensors of air humidity, air temperature, dust content (including PM2.5 and PM10 particles), background radiation, and air pollution by nitrogen oxides, sulfur compounds, carbon compounds, and the gases CO, CO2, NH3, and NO; transmits the collected data to the data processing server in real time; processes and analyzes the received data in real time using various analytical methods; and visualizes the air quality monitoring results as a city map with n districts displaying all air parameters. The user can select the air parameters of interest in the mobile application of the cyber-physical system.
After these parameters are selected, the visualization of the monitoring results is adapted to the user's needs: the measured value of each selected parameter is displayed on the corresponding district of the city map. If a parameter indicates danger in a district, the mobile application plays a background sound signal and shows a flashing sign on that district's image on the map. Tapping this sign opens a notification describing the indicator that poses the danger, along with recommendations for protecting residents' health in this case.
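The danger-signaling behavior described above amounts to comparing each measured parameter against a limit. A minimal sketch, with assumed threshold values and parameter names (the paper's actual limits are not given here):

```python
# Illustrative thresholds (assumed values, not the paper's); real limits
# would come from national air-quality regulations.
THRESHOLDS = {
    "pm2_5": 25.0,   # µg/m³
    "pm10": 50.0,    # µg/m³
    "co": 9.0,       # ppm
}

def evaluate_district(readings: dict) -> list[str]:
    """Return an alert message for every parameter exceeding its threshold."""
    alerts = []
    for param, limit in THRESHOLDS.items():
        value = readings.get(param)
        if value is not None and value > limit:
            alerts.append(f"{param} = {value} exceeds limit {limit}")
    return alerts

# Only the PM2.5 reading exceeds its (assumed) limit here.
alerts = evaluate_district({"pm2_5": 40.2, "pm10": 30.0, "co": 2.1})
```

In the subsystem described above, a non-empty alert list for a district would trigger the sound signal, the flashing sign on the map, and the health-protection recommendations.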
  • Документ
    Advanced methods for maintaining and managing the life cycle of cloud environments: survey
    (Хмельницький національний університет, 2024) Lysenko, S.; Bondaruk, O.
Resource management is a fundamental concept in cloud computing and virtualization, encompassing the allocation, release, coordination, and monitoring of cloud resources to optimize efficiency. The complexity arises from the virtualized, heterogeneous, and multi-user nature of these resources. Effective governance is challenging due to uncertainty, large-scale infrastructures, and unpredictable user states. This paper presents a comprehensive taxonomy of resource management technologies, offering a detailed analysis of design architecture, virtualization, and cloud deployment models, along with capabilities, objectives, methods, and mechanisms. In a cloud computing environment, deploying application-based resource management techniques necessitates understanding the system architecture and deployment model. This paper explores centralized and distributed resource management system architectures, providing a review of effective resource management techniques for both, accompanied by a comparative analysis. The evolution of cloud computing from a centralized to a distributed paradigm is examined, emphasizing the shift towards distributed cloud architectures to harness the computing power of smart connected devices at the network edge. These architectures address challenges like latency, energy consumption, and security, crucial for IoT-based applications. The literature proposes various methods for distributed resource management, aligning with the distributed nature of these architectures. Resource management in cloud computing involves discovery, provisioning, allocation, and monitoring functions, with sub-functions like mapping and scheduling. Integrated approaches to consolidation and resource management have been explored in numerous studies. This paper summarizes and analyzes existing research on resource management functions, focusing on identification, provisioning, allocation planning, and monitoring, based on their objectives and methods.
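Among the resource management functions the survey covers, allocation (mapping workloads to hosts) is the simplest to illustrate. The first-fit heuristic below is a toy sketch over a single resource dimension, not any method from the survey; production resource managers handle multi-dimensional resources, migration, and SLA constraints:

```python
def first_fit_allocate(vm_demands, host_capacities):
    """Map each VM (by CPU demand) to the first host with spare capacity.

    Illustrative first-fit heuristic only; returns {vm_id: host_id},
    with None for VMs that no host can accommodate.
    """
    remaining = list(host_capacities)  # free capacity left on each host
    placement = {}
    for vm_id, demand in enumerate(vm_demands):
        for host_id, free in enumerate(remaining):
            if demand <= free:
                remaining[host_id] -= demand
                placement[vm_id] = host_id
                break
        else:
            placement[vm_id] = None  # no host can fit this VM
    return placement

# Two hosts with 8 and 6 CPU units; four VMs requesting 4, 3, 5, 2 units.
placement = first_fit_allocate([4, 3, 5, 2], [8, 6])
```

Even this toy version shows why allocation is paired with monitoring in the taxonomy: the quality of a placement depends on up-to-date knowledge of each host's remaining capacity.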
  • Документ
    Cyber-physical system for monitoring the environment for allergens using geolocation data
    (Хмельницький національний університет, 2024) Hovorushchenko, T.; Voevudskyi, Y.; Ivanov, O.; Voichur, O.