Question at hand
Reference:
Chickrin D.E., Egorchev A.A., Briskiy D.V., Zakirov R.I.
Methods for obtaining and processing data from a bundle of downhole modules obtained by vertical seismic profiling in the software for controlling the complex for receiving seismic signals in a well
// Cybernetics and programming.
2018. № 6.
P. 1-10.
DOI: 10.25136/2644-5522.2018.6.28091 URL: https://en.nbpublish.com/library_read_article.php?id=28091
Abstract:
The object of research in this article is a system for receiving seismic signals in a well by the method of vertical seismic profiling. The subject of the research is methods for processing data from a bundle of downhole modules, obtained by vertical seismic profiling, in the software for monitoring and controlling a hardware-methodical complex for receiving seismic signals in a well. The authors consider in detail such aspects of the topic as the complexity and speed of the algorithms for processing seismic data obtained from the borehole and surface modules. The principal scientific method used is cross-correlation in the time and frequency domains. The novelty of the results lies in the conclusion that, for the complex under consideration, correlation in the spectral domain yields a gain in the number of operations over correlation in the time domain. Calculation in the time domain gives a more accurate result, since the Fourier transform on a finite sample introduces distortions even when window functions are used. To obtain a correlogram of the same length, the time-domain method requires registering and processing a larger amount of data than the spectral-domain method.
Keywords:
well modules, multi-wave seismic, geophysical data systems, frequency domain, time domain, vertical seismic profiling, hardware-methodical complex, seismic records, vibration seismic, seismic signals
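The time-domain versus spectral-domain trade-off described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the direct correlation is written out in pure Python, and the spectral route is summarized only by its rough multiply-count estimate (three FFTs plus a bin-wise product); all names and the toy signals are invented for the example.

```python
import math

def xcorr_time(x, y):
    """Direct cross-correlation in the time domain: O(N*M) multiplies."""
    n, m = len(x), len(y)
    out = []
    for lag in range(-(m - 1), n):
        s = 0.0
        for k in range(m):
            i = lag + k
            if 0 <= i < n:
                s += x[i] * y[k]
        out.append(s)
    return out

def ops_estimate(n):
    """Rough multiply counts for correlating two length-n records."""
    direct = n * n                        # time-domain correlation
    spectral = 3 * n * math.log2(n) + n   # three FFTs plus bin-wise product
    return direct, spectral

x = [0.0, 1.0, 0.0, 0.0]   # y delayed by one sample
y = [1.0, 0.0, 0.0, 0.0]
print(xcorr_time(x, y))    # peak at the lag where y aligns with x
print(ops_estimate(4096))  # spectral route wins for long records
```

For long vibroseis records the spectral estimate grows as N log N against N² for the direct sum, which is the gain the abstract refers to; the direct sum, however, involves no windowing and so avoids the spectral leakage the authors mention.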
Data encryption and data protection
Reference:
Revnivykh A.V., Velizhanin A.S.
A method for the automated study of the structure of the disassembled representation of software code with a buffer overflow vulnerability using the matrix approach
// Cybernetics and programming.
2018. № 6.
P. 11-30.
DOI: 10.25136/2644-5522.2018.6.28288 URL: https://en.nbpublish.com/library_read_article.php?id=28288
Abstract:
The subject of the research is optimization algorithms for automated dependency search in disassembled code. The object of the research is dependent code blocks on the Intel x64 architecture and listings obtained by reverse engineering software built with compilers under different settings on Windows and Linux. Purpose of the study: to consider the possibility of using mathematical matrices to build a machine-code map, to review problems that arise in automatic analysis, and to search for the paths of information flows. Research methods: the Visual C++ compiler was used. We consider an architecture in which information can be transferred in the following ways: register-memory, memory-register, register-register. For the analysis, the chosen method forms, for each considered path to the block under study, the list of functions called before the potentially dangerous code block is reached. Methods for implementing the matrix approach are described and developed. Novelty and key findings: the mathematical matrix method can be used to build a machine-code map. However, determining the reachability paths of individual code blocks may require a significant amount of resources. In addition, the machine code may have been processed by packers and obfuscators, which introduces additional complexity. A number of potentially dangerous functions of the standard library of the C/C++ programming language were identified.
Keywords:
Mathematical Matrix Method, Buffer overflow, Disassembling, Code analysis, Vulnerabilities, Information security, Code compilers, Code Packers, Code obfuscators, Functions List
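The matrix idea in the abstract can be sketched in miniature: store the call graph of a disassembled program as a boolean adjacency matrix and compute which functions can reach a potentially dangerous block via a transitive closure. This is an illustrative reconstruction, not the article's algorithm; the function names and the tiny graph are invented, and Warshall's closure stands in for whatever reachability procedure the authors use.

```python
def transitive_closure(adj):
    """Warshall's algorithm on a boolean adjacency matrix, O(n^3)."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return reach

# Nodes: 0 = main, 1 = parse_input, 2 = helper, 3 = a strcpy call site.
funcs = ["main", "parse_input", "helper", "strcpy_site"]
adj = [
    [False, True,  True,  False],  # main -> parse_input, helper
    [False, False, False, True],   # parse_input -> strcpy_site
    [False, False, False, False],  # helper calls nothing dangerous
    [False, False, False, False],
]
reach = transitive_closure(adj)
danger = [funcs[i] for i in range(len(funcs)) if reach[i][3]]
print(danger)  # callers from which the dangerous block is reachable
```

The O(n³) closure also illustrates the abstract's caveat about resource cost: on a real binary with thousands of basic blocks, enumerating reachability paths this way quickly becomes expensive.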
Question at hand
Reference:
Tymchuk A.I.
Textural features in the problem of segmenting aerial photographs based on gray-level co-occurrence matrices
// Cybernetics and programming.
2018. № 6.
P. 31-39.
DOI: 10.25136/2644-5522.2018.6.28395 URL: https://en.nbpublish.com/library_read_article.php?id=28395
Abstract:
Computer image analysis is automatic image processing in the course of which the objects located in the image are detected and classified. One of the most important stages of this analysis is image segmentation, by means of which the initial image is divided into many non-intersecting regions based on a set of characteristics (color, texture, brightness, etc.). The importance of this stage lies in its significant impact on the final result of the analysis. The object of the research is a method of textural image segmentation based on the construction and use of gray-level co-occurrence matrices. The subject of the research is the effect of textural features on the quality of image segmentation. Special attention is paid to the calculation of the textural features and the segmentation evaluation criteria. The research methodology is based on the analysis of textural image segmentation using empirical evaluation criteria and reference segmentations. The main conclusion of the study concerns the choice of the set of textural features that showed the best segmentation results; it was drawn from an analysis of the values of the selected criteria for assessing segmentation quality. The textural segmentation of images and the evaluation of the criteria were performed with a program developed in the C++ programming language. The novelty of the study is the analysis of textural features with respect to the quality of the image segmentation based on them.
Keywords:
Gray Level Co-occurrence Matrix, texture characteristic, textural feature, texture analysis, texture segmentation, segment, texture, image processing, image segmentation evaluation criteria, reference segmentation
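The co-occurrence matrix and textural features named above can be sketched as follows. This is a generic illustration of the GLCM technique, not the article's feature set: the 4×4 test image, the horizontal one-pixel offset, and the two classic Haralick-style features (contrast, energy) are chosen for the example only.

```python
def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for offset (dx, dy)."""
    h, w = len(img), len(img[0])
    m = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[img[y][x]][img[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in m]

def contrast(m):
    """Weights co-occurrences by squared gray-level difference."""
    n = len(m)
    return sum(m[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

def energy(m):
    """Sum of squared matrix entries; high for uniform textures."""
    return sum(v * v for row in m for v in row)

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
m = glcm(img, levels=4)
print(round(contrast(m), 3), round(energy(m), 3))
```

In a segmentation pipeline such features would be computed over a sliding window and the resulting feature vectors clustered into regions; the article compares which feature subsets give the best segmentations against a reference.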
Question at hand
Reference:
Polyanichko M.A.
Using technical indicators to identify insider threats
// Cybernetics and programming.
2018. № 6.
P. 40-47.
DOI: 10.25136/2644-5522.2018.6.27970 URL: https://en.nbpublish.com/library_read_article.php?id=27970
Abstract:
Detecting insider threats and countering them is a complex task faced by information security experts in both the commercial sector and government organizations. Modern organizations depend on information technology and their information assets, which makes the problem of confronting insiders all the more urgent. Insiders can be identified by introducing a complex of both technical and organizational measures. The article proposes using data from the logs of information protection software and other monitoring tools to identify insider threats, and highlights a set of technical indicators that point to suspicious employee actions. The set of indicators proposed in the article can be used to build a system of logical rules or fuzzy inference rules for identifying insiders in an organization. Introducing mechanisms for analyzing the proposed indicators will improve the efficiency of the information security administrator and help prevent incidents related to the realization of insider threats.
Keywords:
information security tools, indicator, insider detection, insider, information security, internal threats, monitoring, insider threats, suspicious actions, staff
Parallel algorithms for numerical analysis
Reference:
Pekunov V.V.
Application of prediction in parallel processing of chains of predicates in regular-logic expressions
// Cybernetics and programming.
2018. № 6.
P. 48-55.
DOI: 10.25136/2644-5522.2018.6.27986 URL: https://en.nbpublish.com/library_read_article.php?id=27986
Abstract:
This paper addresses the problem of choosing the execution mode (sequential or parallel) when processing chains of predicates in regular-logical expressions. A brief description is given of the essence of regular-logical expressions, their known applications (natural-language interfaces, an automatic parallelizer of C programs), and the types and composition of predicate chains. Particular attention is paid to predicting the time spent on processing chains in one mode or the other. Various approaches to such prediction are considered in detail. It is noted that the semi-empirical statistical approach is the most natural in this case. The paper uses the basic relations of the theory of parallel computing, interpolation and extrapolation methods, computational experiment, and elements of statistical processing. A new semi-empirical statistical approach to calculating estimates of the execution time of chains of predicates is proposed. The approach is distinguished by the minimal amount of time measurement, achieved through partial recovery of missing data, and by the use of potentially more accurate linear autoregressive and quadratic models to calculate the estimated execution time in the sequential and parallel modes.
Keywords:
collection of statistical data, prediction, parallel processing, execution mode selection, predicate chains, logical processing, regular-logical expressions, missing data recovery, semi-empirical estimations, computational experiment
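The mode-selection idea can be sketched as follows. This is a deliberately simplified stand-in, not the paper's method: it fits a plain linear model t = a + b·n to measured sequential times and estimates the parallel time as t/p plus a fixed startup overhead, whereas the paper uses linear autoregressive and quadratic models. The worker count, overhead value, and sample timings are invented.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for t = a + b * n."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def choose_mode(chain_len, a, b, workers=4, overhead=5.0):
    """Pick the mode with the smaller predicted cost for this chain."""
    t_seq = a + b * chain_len
    t_par = t_seq / workers + overhead   # parallel split plus startup cost
    return "parallel" if t_par < t_seq else "sequential"

# Measured (chain length, milliseconds) pairs -- invented sample data.
a, b = fit_linear([10, 20, 40, 80], [2.0, 4.0, 8.0, 16.0])
print(choose_mode(5, a, b), choose_mode(200, a, b))
```

Short chains stay sequential because the parallel startup overhead dominates; long chains cross over to parallel, which is exactly the decision the predicted-time comparison is meant to automate.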
Mathematical models and computer simulation experiment
Reference:
Sklyar A.
Time series analysis and identification of processes with diffuse periodicity
// Cybernetics and programming.
2018. № 6.
P. 56-64.
DOI: 10.25136/2644-5522.2018.6.27069 URL: https://en.nbpublish.com/library_read_article.php?id=27069
Abstract:
The subject of research is a method for estimating the noise component in a time series and removing it, and for extracting the trend and fluctuations with different periods; the concepts of T-ε and T-h-ε almost-periods for a finite series are introduced. The analysis is based on the requirement that the function representing the original data be smooth, with derivatives up to the fourth order inclusive, and on the identification of almost-periods using functions of the Alter-Johnson type. The trend in the length of the periods identified in the fluctuation component of the data is treated separately. The algorithm for solving the problem is based on minimizing the deviations of the calculated values from a smooth function, provided that the deviations from the source data correspond to the noise level. To identify the oscillatory component and the trend of the almost-periods, a modified Alter-Johnson function is used. The proposed methodology and algorithms for estimating and eliminating noise make it possible to reasonably determine the noise level in the data, remove the noise component, identify almost-periods in the data in the sense of the definitions introduced in the article, extract the trend and oscillatory components, and, if necessary, identify the trend in the change of the almost-periods.
Keywords:
data decomposition, signal spectrum, periodic functions, almost period, trend, time series, noise filtering, noise, numerical modeling, time series analysis
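One standard way to realize the noise-estimation step the abstract describes can be sketched like this: for a smooth underlying signal, high-order finite differences of the series are dominated by the noise term, so their spread gives a noise-level estimate. The second-difference order and the 6σ² scaling below are a textbook choice for illustration, not necessarily the paper's exact scheme.

```python
import math
import random

def second_diff_noise(y):
    """Estimate noise sigma from second differences of the series.
    For i.i.d. noise on a smooth trend, Var(d2) = 6 * sigma^2."""
    d2 = [y[i - 1] - 2 * y[i] + y[i + 1] for i in range(1, len(y) - 1)]
    mean = sum(d2) / len(d2)
    var = sum((v - mean) ** 2 for v in d2) / len(d2)
    return math.sqrt(var / 6.0)

# Smooth trend plus Gaussian noise of known sigma, for a sanity check.
random.seed(1)
true_sigma = 0.1
y = [math.sin(0.01 * i) + random.gauss(0.0, true_sigma) for i in range(2000)]
print(round(second_diff_noise(y), 2))   # close to the true 0.1
```

The estimate works because the second difference of the slow sinusoid is of order (0.01)², negligible against the noise; with the noise level in hand, the smoothing problem can then be posed as the constrained minimization the abstract describes.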
Mathematical models and computer simulation experiment
Reference:
Goryachev A.V., Novakova N.E.
Network traffic modeling based on the token bucket algorithm
// Cybernetics and programming.
2018. № 6.
P. 65-79.
DOI: 10.25136/2644-5522.2018.6.27778 URL: https://en.nbpublish.com/library_read_article.php?id=27778
Abstract:
The object of research in this article is a system for simulating network traffic and its optimization. The subject of research is the token bucket algorithm and methods for optimizing network traffic. Particular attention is paid to the network parameters requiring special control. We consider the problem of traffic management with the aim of ensuring the quality of network service. Dynamic filter models are proposed based on the token bucket algorithm and a multiplexer that supports network quality control. The task of choosing the optimal strategy for controlling the parameters of traffic filters operating by the token bucket algorithm is considered. The main research methodology is simulation modeling. Metaheuristic optimization algorithms such as the genetic algorithm, the harmony search algorithm, and the lifting algorithm are investigated. As a result of the research, a mathematical model for assessing the effectiveness of a network segment was developed. A simulation and analytical model of network traffic based on the token bucket algorithm has been developed and implemented. The capabilities of several optimization algorithms are analyzed. Simulation experiments were conducted, resulting in the identification of optimal solutions. The presented study can be used to solve problems of improving the quality of network services.
Keywords:
token bucket algorithm, simulation, multiplexor, quality of service, network throughput, network resources, network traffic, optimization, genetic algorithm, harmony search algorithm
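The token bucket algorithm named in the abstract and keywords is standard and can be sketched minimally: tokens accumulate at rate r up to capacity b, and a packet of a given size conforms only if enough tokens are available. The rate, depth, and packet trace below are illustrative, not parameters from the article.

```python
class TokenBucket:
    """Minimal token-bucket traffic filter."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # bucket depth b
        self.tokens = capacity    # start with a full bucket
        self.t = 0.0              # time of the last update

    def allow(self, now, size):
        """Return True if a packet of `size` conforms at time `now`."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.t) * self.rate)
        self.t = now
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

tb = TokenBucket(rate=100.0, capacity=200.0)   # 100 tokens/s, depth 200
trace = [(0.0, 150), (0.1, 150), (1.0, 150), (1.1, 150)]
print([tb.allow(t, s) for t, s in trace])
```

In the article's setting, the filter parameters (rate and depth) are the decision variables that the metaheuristic optimizers tune against the simulated traffic to meet quality-of-service targets.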
Telecommunication systems and computer networks
Reference:
Gibadullin R.F.
Organization of secure data transmission in a sensor network based on AVR microcontrollers
// Cybernetics and programming.
2018. № 6.
P. 80-86.
DOI: 10.25136/2644-5522.2018.6.24048 URL: https://en.nbpublish.com/library_read_article.php?id=24048
Abstract:
The subject of the research is an implementation of the AES encryption algorithm on AVR microcontrollers to provide secure data transmission in a sensor network. A sensor network is a networking technique for implementing a ubiquitous computing environment: a wireless network consisting of many lightweight, low-power sensors. Although a sensor network provides various capabilities, it cannot by itself ensure secure authentication between nodes, which ultimately undermines the reliability of the entire network and leads to many security problems. Therefore, an encryption algorithm suitable for the sensor network is required to implement a reliable sensor network environment. In this paper, the author proposes a solution for a reliable sensor network and analyzes communication efficiency by measuring the performance of the AES encryption algorithm as a function of plaintext size, and the cost of operation per hop according to the network scale.
Keywords:
encryption, low power consumption, data protection, data encoding, AVR Microcontroller, AES algorithm, wireless sensor networks, decryption, transmission speed, network latency
Databases
Reference:
Bodrina N., Sidorov K., Filatova N., Shemaev P.
Software complex for the formation of situationally conditioned patterns of physical signals
// Cybernetics and programming.
2018. № 6.
P. 87-97.
DOI: 10.25136/2644-5522.2018.6.28151 URL: https://en.nbpublish.com/library_read_article.php?id=28151
Abstract:
The subject of research is the task of creating tools for the formation of information resources containing samples of physical signals recorded from a person experiencing an emotional reaction caused by a particular informational stimulus. The results are presented of an analysis of the best-known national databases with examples of emotional reactions in patterns of English and French speech, photographs of people, and samples of cardiograms, galvanic skin responses, heart rate, and other physical signals. The structure is considered of a new hardware-software complex for forming and maintaining an open information resource that integrates examples of recordings of Russian speech with recordings of other physical signals made during emotional reactions of different signs. Field experiments with the hardware and software were conducted. Methods of spectral analysis and nonlinear dynamics were used to form vector patterns of the physical signals. The database was developed using systems analysis methods. The new results include the structure of the software and information support; the features of the methodological support, which allow objectively confirmed changes in a person's emotional state to be registered; the features of the technical support, which supports registration of biomedical signals through five channels (video, audio, electroencephalogram, electrocardiogram, electromyogram); and the structure and features of an open online version of the multimodal emotion base. The creation and periodic updating of the content of the database of patterns of situational responses makes complete information on each experiment available to all interested users, including recordings of speech and physical signals, as well as data on the methodology of the experiments and the observation protocols.
Keywords:
base of emotional reactions, stimulated emotion, electroencephalogram, Russian speech, emotional reaction, software complex, database, attractor, base of situational responses, toolkit for filling the database
Knowledge bases, intelligent systems, expert systems, decision support systems
Reference:
Fedorova N.I., Klimenteva A.Y.
Information support for decision-making in the formation of a strategy for innovative development of the region
// Cybernetics and programming.
2018. № 6.
P. 98-109.
DOI: 10.25136/2644-5522.2018.6.27399 URL: https://en.nbpublish.com/library_read_article.php?id=27399
Abstract:
On the basis of a new methodology for assessing the current state of innovative development of a region, a decision support system (DSS) has been developed for use in devising a region's innovative development strategy. The generalized structure of the decision support system and the description and purpose of its main modules are given. The work of the DSS was tested on the example of the Republic of Bashkortostan. The resulting recommendations are necessary for the state authorities of a constituent entity of the Russian Federation to form an effective plan of measures that takes into account the current state and the existing opportunities for innovative development of the region. The study is based on general scientific methods of cognition (analysis, synthesis, comparison) and on tabular and graphical presentation of empirical and factual information. The theoretical and practical significance of the study stems from the relevance of the studied problems of assessment, forecasting, and planning of innovative development of territories when developing regional development strategies. The practical result of the research is the testing and implementation of the proposed approaches in the development of an information decision support system needed to obtain recommendations on forming an effective action plan for a region's innovative development strategy that takes into account the current state and available capabilities of the territory.
Keywords:
region, innovation, strategy, knowledge base, decision support system, strategic planning, innovative development, efficiency, regional policy, resources
Knowledge bases, intelligent systems, expert systems, decision support systems
Reference:
Katasev A.S.
Neuro-fuzzy model of classification rules generation as an effective approximator of objects with discrete output
// Cybernetics and programming.
2018. № 6.
P. 110-122.
DOI: 10.25136/2644-5522.2018.6.28081 URL: https://en.nbpublish.com/library_read_article.php?id=28081
Abstract:
The subject of this research is evaluating the effectiveness of approximating objects with discrete output on the basis of fuzzy knowledge bases. The object of the research is a neuro-fuzzy model that allows a system of fuzzy production rules (a fuzzy knowledge base) for assessing the state of objects to be formed by training a fuzzy neural network. The author examines in detail the type of fuzzy production rules he proposes, the algorithm of logical inference on the rules, and the developed model of a fuzzy neural network. Particular attention is paid to the need to assess the approximating ability of the model in order to determine the feasibility and effectiveness of its practical use. This assessment was made by analyzing the following characteristics of the model: the convergence of the developed learning algorithm for the fuzzy neural network; the compliance of its operation with the principles of fuzzy approximation; and the consistency of the model's logical inference algorithm with the well-known algorithm for approximating objects with discrete output based on a fuzzy knowledge base. The approximating ability of the neuro-fuzzy model was estimated, and from the results it was concluded that the model is an effective approximator of objects with discrete output. In addition, to test the model, the classifying ability of the fuzzy rules formed was assessed. The accuracy of classification based on the fuzzy rules turned out to be no lower than that of other known classification methods. The practical value of such rules lies in the ability to build decision support systems for assessing the state of objects in various subject areas.
Keywords:
state object evaluation, modeling, approximation, fuzzy production rule, neuro-fuzzy model, fuzzy neural network, decision support, fuzzy logic, neural network, fuzzy knowledge base
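The kind of rule-based classification with discrete output described above can be sketched generically. This is not the article's rule format or inference algorithm: it is a textbook fuzzy-rule classifier with triangular memberships, where each rule fires with the minimum of its antecedent memberships and the class of the strongest rule wins; the membership boundaries and class labels are invented.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Memberships for two normalized inputs -- boundaries invented for the example.
low  = lambda x: tri(x, -0.5, 0.0, 0.5)
high = lambda x: tri(x, 0.5, 1.0, 1.5)

# Rules: (membership for input 1, membership for input 2, class label).
rules = [
    (low,  low,  "normal"),
    (high, low,  "suspect"),
    (high, high, "faulty"),
]

def classify(x1, x2):
    """Discrete output: label of the rule with the maximum firing strength."""
    strengths = [(min(m1(x1), m2(x2)), label) for m1, m2, label in rules]
    return max(strengths)[1]

print(classify(0.1, 0.1), classify(0.9, 0.2), classify(0.9, 0.95))
```

In the article's model the rule base itself is extracted by training a fuzzy neural network rather than written by hand, but the inference step it must reproduce has this min-max shape.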