Computer
Mohammed E. Seno; Ban N. Dhannoon; Omer K. Jasim Mohammad
Abstract
Cloud computing is an evolving and high-demand research field at the forefront of technological advancements. It aims to provide software resources and operates on a service-oriented delivery model. Within the infrastructure-as-a-service (IaaS) framework, the cloud offers end customers access to crucial infrastructure resources, including CPU, bandwidth, and memory. When a cloud system fails to deliver as expected, this is referred to as an event, signifying a deviation from the anticipated service. To meet their service-level agreement (SLA) obligations, cloud service providers (CSPs) must ensure continuous access to fault-tolerant, on-demand resources for their clients, particularly during outages. Consequently, finding the most efficient ways to accomplish tasks in the face of rapidly depleting resources has become an urgent concern, and researchers are actively working to develop optimal strategies tailored to the cloud environment. Machine learning plays a critical role in these efforts, serving as a key component of various cloud computing platforms. This study presents a comprehensive literature review of current research papers that employ machine learning algorithms to propose strategies for optimizing cloud computing environments. Additionally, the survey provides authors with valuable resources by extensively exploring a diverse range of machine learning techniques and their applications in cloud computing. By examining these areas, researchers can enhance their understanding of efficient resource allocation and scheduling, addressing the challenges posed by resource scarcity while meeting SLA obligations.
Computer
Umniah Hameed Jaid; Alia Karim Abdulhassan
Abstract
The voice signal carries a wide range of data about the speaker, including their physical characteristics, feelings, and level of health. Estimating these physical characteristics from speech has several uses in forensics, security, surveillance, marketing, and customer service. The primary goal of this research is to identify the auditory characteristics that aid in estimating a speaker's age. To this end, an ensemble feature selection model is proposed that selects the best features from a baseline acoustic feature vector for age estimation from speech. Using a feature vector that covers various spectral, temporal, and prosodic aspects of speech, ensemble-based automatic feature selection is performed by first calculating feature importance or ranks with individual feature selection methods, then voting on the resulting feature ranks to obtain the subset ranked highest by all feature selection methods. The proposed method is evaluated on the TIMIT dataset and achieved a mean absolute error (MAE) of 5.58 years and 5.12 years for male and female age estimation, respectively.
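The rank-voting step described in the abstract can be sketched as follows. This is a minimal illustration of the general idea, not the paper's implementation: the two scorers used here (absolute Pearson correlation with the target and raw feature variance) are invented stand-ins for the individual feature selection methods, and the Borda-style rank sum is one common way to realize the voting.

```python
import numpy as np

def rank_vote_select(X, y, k=5):
    """Ensemble feature selection by rank voting: each scorer ranks the
    features, and a Borda-style rank sum picks the subset ranked highest
    across all scorers. The scorers below are illustrative assumptions."""
    def corr_score(X, y):
        # |Pearson correlation| of each feature column with the target
        Xc = X - X.mean(axis=0)
        yc = y - y.mean()
        num = Xc.T @ yc
        den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
        return np.abs(num / den)

    def var_score(X, y):
        # simple spread-based score: higher-variance features rank higher
        return X.var(axis=0)

    scores = [corr_score(X, y), var_score(X, y)]
    # turn each score vector into ranks (larger rank = better feature)
    ranks = [np.argsort(np.argsort(s)) for s in scores]
    total = np.sum(ranks, axis=0)          # Borda count across all scorers
    return np.argsort(total)[::-1][:k]     # indices of the top-k features
```

With real acoustic features, each scorer would be one of the individual selection methods the paper aggregates; only features ranked highly by all of them survive the vote.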
Control
Wafeeq Sh. Hanna; Velar H. Elias; Dlawar R. Maruf
Abstract
Load forecasting is a human or computational technique for accurately anticipating electrical load in advance, enhancing the reliable operation and optimal planning control of plant systems so that electrical energy flows without economic or technical limitations. Appropriate estimation of the present and future consumption cost of electrical loads is therefore necessary to predict load demand and generate power close to actual need. Over the last few decades of advancing technology, artificial neural networks (ANNs) have been extensively employed in electrical systems; they are trained using historical data obtained from plant stations. This work is a study of short-term load forecasting (STLF) for power prediction, applied to actual past load data from the Azadi station for February 2022, which were used to train and validate the neural network. The result was evaluated by a mean square percentage error of 32.7 for the forecasting dynamic time-series method, which resolves the data over hours, days, and weeks in advance using a kind of non-linear filtering. Short-term load forecasting was carried out in three main stages: preparing the predicted power load data sets, network training, and forecasting. The neural network used has three layers: an input, a hidden, and an output layer; the number of hidden-layer neurons can be varied to obtain different network performance. Active power generation faces economic and technical challenges, so appropriate evaluation of loads is much needed.
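The three-layer network and the three STLF stages (data preparation, training, forecasting) can be sketched in a few dozen lines. The Azadi station data are not reproduced in the abstract, so the load curve below is a hypothetical daily sinusoid plus noise; the layer sizes, learning rate, and window length are likewise assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stage 1: prepare the data set. A synthetic hourly load curve stands in
# for the station's historical data: a daily sinusoid plus noise.
hours = np.arange(24 * 7)
load = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)

# Supervised pairs: predict the next hour's load from the previous 24 hours.
window = 24
X = np.array([load[i:i + window] for i in range(len(load) - window)])
y = load[window:]
Xn = (X - X.mean()) / X.std()     # normalize for stable training
yn = (y - y.mean()) / y.std()

# Stage 2: train a 3-layer network (input -> hidden -> output).
n_hidden = 8
W1 = rng.normal(0, 0.1, (window, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, 1));      b2 = np.zeros(1)
lr = 0.05
losses = []
for epoch in range(300):
    h = np.tanh(Xn @ W1 + b1)            # hidden activations
    pred = (h @ W2 + b2).ravel()         # forecast (stage 3 at inference)
    err = pred - yn
    losses.append(float(np.mean(err ** 2)))       # MSE on the training set
    g_out = err[:, None] / len(yn)                # output-layer gradient
    g_h = (g_out @ W2.T) * (1 - h ** 2)           # backprop through tanh
    W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * (Xn.T @ g_h);  b1 -= lr * g_h.sum(axis=0)
```

Varying `n_hidden`, as the abstract notes, trades capacity against overfitting on the limited station history.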
Computer
Rashad N. Razak; Hadeel N. Abdullah
Abstract
Multi-Object Detection and Tracking (MODT) is essential in many application fields, yet many enhancements in the speed of detection and tracking are still required to overcome the challenges encountered during implementation. This paper presents a new algorithm system for MODT that improves the execution time so as to be robust in real-time applications. A background subtraction detection algorithm with a Kalman filter is used to track and predict the position and speed parameters of each object. To improve the processing time, some frames are skipped in a way that does not greatly affect detection accuracy; for those frames, the prediction and estimated values obtained from the Kalman filter are used for the tracked object instead. This work uses a single video camera to show how to effectively compute and detect multiple objects concurrently; it is applied to daytime preprocessing in an automated traffic surveillance system. Preliminary testing findings show that the suggested algorithm for this vehicle monitoring system is feasible and effective, and that with a single video camera it can simultaneously watch, detect, and track several vehicles while improving execution time. Simulation results on the built system demonstrate that the proposed system reduced the execution time to approximately 41.5% of that of the standard background subtraction algorithm. Results also indicate that the proposed algorithm's position and speed errors for detected and tracked objects are close to those of the standard background subtraction algorithm.
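The core speed-up idea, running the detector on only some frames and letting the Kalman filter's prediction stand in on the skipped ones, can be sketched for a single object. The constant-velocity model, the noise covariances Q and R, and the two-of-every-three detection schedule below are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

class KalmanCV:
    """Constant-velocity Kalman filter for one tracked object,
    state = [x, y, vx, vy]; only position is measured."""
    def __init__(self, dt=1.0):
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)   # motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)   # measurement model
        self.Q = np.eye(4) * 0.01                  # process noise (assumed)
        self.R = np.eye(2)                         # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z):
        innov = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ innov
        self.P = (np.eye(4) - K @ self.H) @ self.P

# A vehicle moving at (2, 1) px/frame; the detector runs on only two of
# every three frames -- on skipped frames the prediction stands in.
kf = KalmanCV()
for t in range(30):
    kf.predict()
    if t % 3 != 0:                                   # frame with a detection
        kf.update(np.array([2.0 * t, 1.0 * t]))
```

Every third frame here saves the full background-subtraction pass; the filter's covariance grows on skipped frames and shrinks again at the next detection.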
Computer
Sanaa Ali Jabber; Soukaena H. Hashem; Shatha H. Jafer
Abstract
Finding an optimal solution to some problem, such as minimizing or maximizing an objective function, is the goal of Single-Objective Optimization (SOP). Real-world problems, on the other hand, are more complicated and involve a wider range of objectives; several objectives should be maximized in such problems. No single solution can be enhanced in all objectives without deteriorating at least one other goal, which is the definition of Pareto-optimality. Understanding the idea of Multi-Objective Optimization (MOP) is thus necessary to find the optimum solution. Multi-objective evolutionary algorithms (MOEAs) are made to assess many objectives simultaneously and find Pareto-optimal solutions; MOEAs can solve both multi-objective and single-objective optimization problems. This paper introduces a survey study of optimization problem solutions by comparing the techniques, advantages, and disadvantages of SOP and MOP with metaheuristics and evolutionary algorithms. From this study, we conclude that the strength of MOP lies in its handling of more than one objective at once, but it takes longer to process and train and is not suitable for all applications, while SOP is faster and more useful in stock and profit maximization applications. Posterior techniques, applied through the field of metaheuristics, are considered the dominant approach to solving multi-objective problems.
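The Pareto-optimality definition in the abstract translates directly into code. The sketch below (maximization convention, function names invented for illustration) shows the dominance test and the extraction of the non-dominated front that MOEAs maintain.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximizing):
    a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and \
           any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors --
    the solutions that cannot be improved in one objective without
    worsening another."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

An MOEA evolves a population and repeatedly applies exactly this filter (plus diversity pressure) to approximate the true front.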
Computer
Asaad Raheem Kareem; Hasanen S. Abdullah
Abstract
The article provides an overview of two recent developments in technology: Business Intelligence (BI) and Deep Learning (DL). In order to support decision-making processes, BI entails gathering, integrating, and analyzing data from various sources, while DL uses artificial neural networks to learn and generate predictions from complicated datasets. This paper introduces the concepts and principles and highlights recent developments and applications in different domains of research: education, organizations, the stock market, forecasting, real-time decision-making, and security. However, the fundamental problem with the business intelligence approach is that there is no learning involved. Other limitations and challenges include the capacity constraints that affect the data analysis process, the variety of data in results, and the need for a complete presentation of results in the form of dashboards, scorecards, reports, and portals. The choice of approach hinges on the problem's context and requirements and on the nature and characteristics of the data. Although BI and DL are widespread, alternative methods such as machine learning, data mining, and statistical analysis may also be suitable; justifying the selection based on precise needs and goals is crucial. Recurrent neural networks (RNNs), convolutional neural networks (CNNs), long short-term memory (LSTM), gated recurrent units (GRUs), and business intelligence tools are used in the research problem to address these limitations and to explore the potential advantages and difficulties of integrating BI and DL to achieve an advantage in a given sector.
Control
Ayad Q. Abdulkareem; Abdulrahim Th. Humod; Oday A. Ahmed
Abstract
To perform fault tolerance for the Anti-lock Braking System (ABS), this paper proposes a hybrid Fault Detection and Fault Tolerant Control (FD-FTC) scheme for ABS speed sensors. It utilizes a Fault Detection (FD) unit and a Data Construction (DC) unit. The FD unit is based on a kNN classifier model with 99.9% fault detection accuracy and performs three tasks: early fault detection, fault location diagnosis, and excluding faulty signals from further processing. The DC unit, in turn, is based on two separate neural network models, with an MSE of 2.01139e-1 and an R² of 0.999880 for the first model and an MSE of 1.12486e-0 and an R² of 0.999586 for the second. They are employed to provide estimated alternative signals for the ABS speed sensors. These estimated signals perform two tasks: confirming the fault detection declared by the FD model and compensating for the excluded faulty signal to fulfill fault accommodation. Both methods are trained and tested with MATLAB and Simulink. Results demonstrate that the proposed hybrid method can accurately detect and tolerate sensor faults and fulfill its design purpose, especially during emergency braking.
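The kNN classifier at the heart of the FD unit can be sketched in a few lines. The paper's actual features and training data are not given in the abstract, so the two-dimensional "wheel-speed residual" features and the two clusters below are invented for illustration only.

```python
import numpy as np

def knn_classify(X_train, y_train, x, k=3):
    """Minimal kNN: label x by majority vote of its k nearest training
    points under Euclidean distance -- the classifier family the FD unit
    uses for healthy-vs-faulty sensor signals."""
    d = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
    nearest = np.argsort(d)[:k]               # indices of the k closest
    votes = y_train[nearest]
    return int(np.bincount(votes).argmax())   # majority label

# Toy features: [speed residual, rate of change]; label 1 = faulty sensor.
rng = np.random.default_rng(0)
healthy = rng.normal([0.0, 0.0], 0.2, (50, 2))
faulty = rng.normal([2.0, 1.0], 0.2, (50, 2))
X = np.vstack([healthy, faulty])
y = np.array([0] * 50 + [1] * 50)
```

In the FD-FTC scheme, a "faulty" verdict both excludes the sensor's signal and triggers the DC unit's neural-network estimate as a replacement.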
Computer
Amar A. Mahawish; Hassan Jaleel Hassan
Abstract
The performance of the Internet is significantly impacted by network congestion. Because of the internet's current rapid growth, congestion could increase and cause more packets to be dropped. Transmission Control Protocol (TCP) connections provide reliable packet transmission and rely on a Drop Tail (DT) mechanism, which signals congestion only when the queue has become full. The Random Early Detection (RED) algorithm can be deployed as congestion control to eliminate this drawback of the full tail-drop queue. However, the RED algorithm has problems dealing with different numbers of connections because of its fixed parameter tuning. In this study, exhaustive search is used with the RED algorithm to develop the EX-RED algorithm. Based on network performance metrics such as packet drop, delay, and throughput, the developed algorithm adjusts the default RED parameters to find the best values for them. When the number of TCP connections changes, the exhaustive search systematically enumerates all possible states of the parameters. The simulation results showed that EX-RED improved network performance compared with five other algorithms (GRED, RED, ARED, NLRED, and NLGRED) by decreasing delay and dropped packets and increasing packet throughput.
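The two pieces EX-RED combines can be sketched separately: the classic RED drop-probability curve, and the exhaustive enumeration of its parameters. The cost function passed to the tuner below is a placeholder for the network-simulation metrics (delay, drops, throughput); the parameter grids are illustrative, not the paper's.

```python
import itertools

def red_drop_prob(avg, min_th, max_th, max_p):
    """Classic RED drop probability as a function of the average queue
    length: 0 below min_th, a linear ramp up to max_p between the
    thresholds, and 1 (forced drop) at or above max_th."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

def exhaustive_tune(cost, min_ths, max_ths, max_ps):
    """Enumerate every (min_th, max_th, max_p) combination, as EX-RED's
    exhaustive search does, and keep the one with the lowest cost."""
    best = None
    for mn, mx, p in itertools.product(min_ths, max_ths, max_ps):
        if mn >= mx:               # skip invalid threshold pairs
            continue
        c = cost(mn, mx, p)
        if best is None or c < best[0]:
            best = (c, (mn, mx, p))
    return best[1]
```

Re-running the tuner whenever the number of TCP connections changes is what lets EX-RED escape RED's fixed-parameter weakness, at the price of enumerating the whole grid.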
Computer
Mohammed Majid Msallam
Abstract
In recent years, safe lockers have spread in public places to secure valuable belongings. People worry about losing safe-locker keys or about others using the spare key, and so remain concerned about their belongings. To solve this matter, this paper proposes a system that depends on biometrics to secure valuable items and can minimize these concerns. The proposed system consists of two major parts, software and hardware. In the hardware part, a microcontroller with a camera and an electronic lock is used to securely open or close the door of the safe locker. In the software part, the images captured by the camera are prepared by an image processing algorithm, and then a Support Vector Machine (SVM) is trained on the images of a person. The images and information of the person are saved until the belongings are retrieved, after which everything is deleted to prepare for the next user.
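The per-user SVM training step can be sketched with a linear SVM trained by hinge-loss subgradient descent. This is a stand-in for the paper's classifier: the real system trains on processed camera images, whereas the "image features" below are toy 2-D vectors, and the hyperparameters are assumptions.

```python
import numpy as np

def train_linear_svm(X, y, lr=0.01, lam=0.01, epochs=200):
    """Linear SVM via hinge-loss subgradient descent (Pegasos-style);
    y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:          # margin violated: full update
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                               # only shrink (regularize) w
                w -= lr * lam * w
    return w, b

def svm_predict(w, b, x):
    return 1 if x @ w + b >= 0 else -1

# Toy stand-ins for "this user" vs "someone else" image features.
rng = np.random.default_rng(2)
pos = rng.normal([2, 2], 0.3, (40, 2))
neg = rng.normal([-2, -2], 0.3, (40, 2))
X = np.vstack([pos, neg])
y = np.array([1] * 40 + [-1] * 40)
w, b = train_linear_svm(X, y)
```

Deleting the stored model and images after the locker is emptied, as the abstract describes, simply means discarding `w`, `b`, and the captured frames before the next enrollment.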
Computer
Asmaa A Mohammed; Abdul Monem S. Rahma; Hala B. Abdulwahad
Abstract
The rapid development of communication technology has had a considerable impact on financial transactions, leading to the appearance of new types of currency. These digital currencies are intended to streamline processes, decrease time and effort expenditures, and minimize financial losses while doing away with the need for traditional financial intermediaries and central bank regulation. Despite persistent worries and hazards that remain in the minds of people participating in currency trading and stock exchanges, digital currencies have significantly impacted the global financial industry. The fundamental issue with digital currencies is the question of legal and regulatory frameworks. Since digital currencies are decentralized, conventional regulatory frameworks might find it difficult to keep up with the consequences of this rapidly changing technology, which may result in ambiguity and uncertainty regarding the governance and regulation of digital currency. In addition to exploring the ideas of Bitcoin and blockchain technology, this study attempts to give an overview of digital currencies, including their definition, emergence, and development. The research examined the stages of development of digital currency from its inception in 2009 to the present year, 2023, tracing how it has evolved from its early days, marked by the introduction of Bitcoin, to its current state.