Artificial Neural Networks for Renewable Energy Systems and Real-World Applications
About this ebook
Artificial Neural Networks for Renewable Energy Systems and Real-World Applications presents current trends for the solution of complex engineering problems in the application, modeling, analysis, and optimization of different energy systems and manufacturing processes. With growing research on the applications of neural networks in specific industrial settings, this reference provides a single resource offering a broader perspective on ANNs in renewable energy systems and manufacturing processes.
ANN-based methods have attracted the attention of scientists and researchers in different engineering and industrial disciplines, making this book a useful reference for all researchers and engineers interested in artificial networks, renewable energy systems, and manufacturing process analysis.
- Includes illustrative examples on the design and development of ANNs for renewable and manufacturing applications
- Features computer-aided simulations presented as algorithms, pseudocodes and flowcharts
- Covers ANN theory for easy reference in subsequent technology specific sections
Artificial Neural Networks for Renewable Energy Systems and Real-World Applications - Ammar Hamed Elsheikh
Chapter one
Basics of artificial neural networks
Rehab Ali Ibrahim¹, Ammar H. Elsheikh², Mohamed Elasyed Abd Elaziz¹ and Mohammed A.A. Al-qaness³, ¹Department of Mathematics, Faculty of Science, Zagazig University, Zagazig, Egypt, ²Production Engineering and Mechanical Design Department, Tanta University, Tanta, Egypt, ³State Key Laboratory for Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan, P.R. China
Abstract
Artificial neural networks (ANNs) have been reported as useful predictive tools to model complex engineering systems. ANNs mimic the natural behavior of the human brain in handling different problems instead of solving intricate mathematical models. They are used as a black-box with excellent capabilities to learn the nonlinear relation between the inputs and outputs of a certain system. They have also enhanced the generalization capability to handle unseen data after the learning process. In this chapter, a review of the basics of ANNs is presented. In general, ANNs have received increased attention in recent years since they have been applied to numerous real-world applications. They have many advantages, such as simplicity and efficiency. In this chapter, the authors introduce the basic mathematical concepts of the multilayer perceptron, the wavelet neural network, radial basis function, and the Elman neural network.
Keywords
Artificial neural network (ANN); multilayer perceptron neural network; wavelet neural network (WNN)
Contents
1.1 Artificial neural networks
1.2 Types of neural networks
1.2.1 Multilayer perceptron neural network
1.2.2 Wavelet neural networks
1.2.3 Radial basis function
1.2.4 Elman neural network
1.2.5 Statistical performance evaluation criteria
1.3 Conclusion
References
1.1 Artificial neural networks
Artificial neural networks (ANNs) are widely distributed processors made up of basic processing units called neurons [1]. They have a built-in capability for storing experimental knowledge and making it available for use. High-speed information processing, routing capabilities, fault tolerance, adaptiveness, generalization, and robustness are all excellent characteristics of ANNs. These features make ANNs useful tools for modeling, optimizing, and predicting the performance of various engineering systems. As a result, they have been used to solve complex nonlinear engineering problems in a number of real-world applications with acceptable cost and efficient computing time [2–9].
The neuron model used in many ANN models is made up of a series of links called synapses, each with its own weight, as shown in Fig. 1.1. Each input $x_j$ is multiplied by its corresponding weight. Thereafter, all weighted inputs are summed, and an externally applied bias $b_k$ is added to lower or raise the summation's output $v_k$.
Figure 1.1 Nonlinear model of a neuron [1].
Also, the activation function $\varphi(\cdot)$ is applied to this output to limit the amplitude of the output signal $y_k$ to a finite range. This sequence of operations is formulated as follows:
(1.1) $v_k = \sum_{j=1}^{m} w_{kj}\, x_j + b_k, \qquad y_k = \varphi(v_k)$
where k represents a neuron, and j represents a synapse number.
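As a sketch of how Eq. (1.1) maps inputs to a neuron's output, the forward pass can be written in a few lines of Python; the input, weight, and bias values and the sigmoid activation are arbitrary illustrative choices, not values from the text:

```python
import numpy as np

def neuron_output(x, w, b, phi=lambda v: 1.0 / (1.0 + np.exp(-v))):
    """Compute y_k = phi(sum_j w_kj * x_j + b_k) for a single neuron."""
    v = np.dot(w, x) + b  # weighted sum of inputs plus bias (v_k)
    return phi(v)         # activation limits the output to a finite range

# Example: three inputs with arbitrary weights and bias
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.2])
b = 0.3
y = neuron_output(x, w, b)  # a value in (0, 1) for the sigmoid
```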
1.2 Types of neural networks
In this section, we describe four ANN models, including the multilayer perceptron (MLP), wavelet neural network (WNN), radial basis function (RBF), and Elman neural network (ENN).
1.2.1 Multilayer perceptron neural network
The following are the fundamental principles of the MLP neural network. In general, this form of ANN has one input layer, multiple hidden layers, and one output layer [10,11], as shown in Fig. 1.2A. The input neuron is connected to the hidden layer neuron, and the hidden layer neuron is connected to the output neuron, but the neurons of the same layer are not connected. The neurons in the input layer obtain the data and transfer them on to the next layers before reaching the output layer.
Figure 1.2 The structure of (A) MLP [12], (B) wavelet neural networks [13], (C) RBF [14], and (D) Elman networks [15].
This problem can be defined as follows. There are M neurons in the input layer, and the $k_1$-th neuron receives the inputs $x_i$, where the output of the neuron can be obtained by:
(1.2) $y_{k_1}^{1} = f\!\left(\sum_{i=1}^{M} w_{ik_1}^{1}\, x_i + b_{k_1}^{1}\right)$
Thus, the output is utilized as an input to the next hidden layer. The output of the neuron in the hidden layers is calculated as:
(1.3) $y_{k}^{h} = f\!\left(\sum_{i=1}^{N_{h-1}} w_{ik}^{h}\, y_{i}^{h-1} + b_{k}^{h}\right), \qquad h = 2, \ldots, H$
where $f$ denotes the activation function, and $b_{k_1}^{1}$ and $b_{k}^{h}$ are the biases of the first and $h$-th hidden layers. Also, $w_{ik_1}^{1}$ ($i = 1, \ldots, M$) denote the weights between the input and hidden layer neurons. $M$ is the number of inputs, and $N_h$ is the number of neurons in the $h$-th hidden layer, where $H$ represents the number of hidden layers.
An output neuron can be considered as a weighted sum of all outputs of the neurons of the last hidden layer, as follows:
(1.4) $\hat{y}_{l} = \sum_{k=1}^{N_H} w_{kl}\, y_{k}^{H}, \qquad l = 1, \ldots, L$
where $L$ represents the number of neurons in the output layer, and $w_{kl}$ represents the weights between the neurons of the last hidden layer and the neurons in the output layer.
Since the values of the weights affect the final performance, backpropagation (BP) learning [16] is used to find them. The BP method, however, takes a long time to converge because it requires an iterative training phase. Furthermore, BP is prone to getting stuck in local minima.
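The layer-by-layer computation of Eqs. (1.2)–(1.4) can be sketched as follows; the layer sizes, the tanh activation, and the random weight initialization are illustrative assumptions only:

```python
import numpy as np

def mlp_forward(x, weights, biases, f=np.tanh):
    """Forward pass through an MLP: hidden layers apply the activation f,
    the output is a linear weighted sum of the last hidden layer."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = f(W @ a + b)                     # hidden layer: f(weighted sum + bias)
    return weights[-1] @ a + biases[-1]      # linear output layer

rng = np.random.default_rng(0)
# One input layer (3), two hidden layers (5, 4), one output layer (2); sizes are arbitrary
sizes = [3, 5, 4, 2]
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.normal(size=m) for m in sizes[1:]]
y = mlp_forward(np.array([0.1, -0.2, 0.3]), weights, biases)
```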
1.2.2 Wavelet neural networks
WNNs, also known as wavelet networks (WNs), are a combination of wavelet theory and neural networks (NNs) [13]. WNNs inherit the advantages of both the neural network and the wavelet transformation. A WNN is a feed-forward neural network with a hidden layer whose activation functions can be drawn from an orthonormal wavelet family [17]. The wavelet neurons are also called wavelons.
As shown in Fig. 1.2B, the WNN's simplest structure consists of an input layer and an output layer. The hidden layer of the WNN is made up of wavelons, with the wavelet dilation and translation parameters as input coefficients. A wavelon in the hidden layer produces a nonzero output only when the input lies within a small region of the input space. A WNN's output can be interpreted as a weighted linear combination of the wavelet activation functions.
Thus, the output can be formulated as the following equation:
(1.5) $y(x) = \sum_{i=1}^{M} w_i\, \psi\!\left(\frac{x - b_i}{a_i}\right)$
where $b_i$ denotes the translation coefficients, and $a_i$ denotes the dilation coefficients.
Fig. 1.2B portrays the general structure of a WNN with one input and one output. The WNN's hidden layer is made up of $M$ wavelons. The network's output is a weighted sum of the wavelons' outputs [18].
(1.6) $y(x) = \sum_{i=1}^{M} w_i\, \psi\!\left(\frac{x - b_i}{a_i}\right) + \bar{y}$
where the value $\bar{y}$ deals with functions that have a nonzero mean, since the wavelet function $\psi$ has a zero mean. Also, at the highest scale, the scaling function $\varphi$ is considered instead of the wavelet function $\psi$. In the WNN, the coefficients $w_i$, $a_i$, and $b_i$ are learned by a training algorithm.
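A minimal sketch of the WNN output of Eq. (1.6); the Mexican-hat mother wavelet and all coefficient values are illustrative assumptions, not the chapter's choices:

```python
import numpy as np

def mexican_hat(t):
    """A common zero-mean mother wavelet, used here as the wavelon activation."""
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def wnn_output(x, w, a, b, y_bar=0.0):
    """Eq. (1.6): weighted sum of M wavelon outputs psi((x - b_i)/a_i),
    plus y_bar to handle target functions with nonzero mean."""
    return np.sum(w * mexican_hat((x - b) / a)) + y_bar

# Three wavelons with arbitrary weights w, dilations a, and translations b
w = np.array([0.5, -0.3, 0.8])
a = np.array([1.0, 2.0, 0.5])
b = np.array([0.0, 1.0, -1.0])
y = wnn_output(0.25, w, a, b, y_bar=0.1)
```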
1.2.3 Radial basis function
The RBF neural network is one of the most efficient NNs, with three layers: input, hidden, and output. It is similar to MLP, as illustrated in Fig. 1.2C [14,19].
The RBF's input layer receives the input data and simply passes it to the hidden layer. Following that, the hidden layer processes the received data and extracts relevant information before sending it to the output layer, which constructs the output data. However, the most important distinction between the RBF network and the MLP is that the hidden layer uses a radial basis function as its activation, commonly the Gaussian function, which is formulated as [14]:
(1.7) $\phi_j(x_i) = \exp\!\left(-\frac{\|x_i - c_j\|^2}{2\sigma_j^2}\right)$
where $\sigma_j$ denotes the width of the $j$-th neuron, $x_i$ represents the RBF input, and $c_j$ is the center of the $j$-th RBF unit. Also, there are several other functions that can be applied, such as:
(1.8) $\phi_j(x_i) = \left(\|x_i - c_j\|^2 + \sigma_j^2\right)^{1/2}$ (multiquadric function)
(1.9) $\phi_j(x_i) = \left(\|x_i - c_j\|^2 + \sigma_j^2\right)^{-1/2}$ (inverse multiquadric function)
(1.10) $\phi_j(x_i) = \|x_i - c_j\|^2 \ln\|x_i - c_j\|$ (thin-plate spline function)
Furthermore, another difference is that the output layer uses a linear function, represented as the weighted sum of the RBF outputs. This can be represented as:
(1.11) $y_k = \sum_{j=1}^{J} w_{jk}\, \phi_j(x_i), \qquad k = 1, \ldots, K$
where $K$ is the number of outputs, and $w_{jk}$ denotes the weight that connects the $j$-th node of the hidden layer to the $k$-th node of the output layer. Finally, $J$ represents the number of nodes in the hidden layer.
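The two-stage RBF computation, Gaussian hidden units followed by a linear output layer, can be sketched as follows; the centers, widths, and weights are arbitrary illustrative values:

```python
import numpy as np

def rbf_forward(x, centers, widths, W):
    """Gaussian hidden units (Eq. 1.7) followed by a linear output layer
    that forms a weighted sum of the RBF activations (Eq. 1.11)."""
    # phi_j = exp(-||x - c_j||^2 / (2 sigma_j^2)) for each hidden node j
    d2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-d2 / (2.0 * widths**2))
    return W.T @ phi  # y_k = sum_j w_jk * phi_j

# Arbitrary example: 2-D input, 3 hidden RBF units, 2 outputs
centers = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 0.5]])
widths = np.array([1.0, 0.5, 1.5])
W = np.full((3, 2), 0.4)                 # w_jk, hidden node j -> output k
y = rbf_forward(np.array([0.2, -0.1]), centers, widths, W)
```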
1.2.4 Elman neural network
Generally, the ENN is a recurrent neural network (RNN), a type of ANN in which connections between neurons form a directed cycle, which leads to dynamic temporal behavior. RNNs differ from feedforward neural networks in that they handle input sequences using internal memory. This enables them to perform tasks such as unsegmented handwriting recognition and speech recognition.
Elman [20] proposed the ENN, in which the outputs of the hidden layer are fed back to it through a recurrent layer, and the number of recurrent neurons equals the number of hidden neurons. Therefore, ENNs have a high ability to learn and construct temporal and spatial patterns. They also have dynamic characteristics that give a system better capability to deal with time-varying properties.
Each node in the hidden layer is connected to only one recurrent node through a connection with a fixed weight value.
The structure of ENNs is shown in Fig. 1.2D. They have four layers (input, hidden, recurrent, and output). Let the input and output layers consist of $N$ and $M$ neurons, respectively, and let the number of hidden neurons be denoted by $N_h$.
$u(k)$ is the input of the neural network, and $x(k)$ and $x_c(k)$ are the outputs of the hidden and recurrent layers, respectively, as defined in the following equation:
(1.12) $x(k) = f\!\left(w^1 u(k-1) + w^2 x_c(k)\right), \qquad x_c(k) = x(k-1)$
where $w^1$ represents the weights of the connections between the nodes in the input and hidden layers, $w^2$ represents the weights of the connections between the hidden and recurrent layers, and $f$ denotes the transfer function, represented by the sigmoid function (as described in Table 1.1).
Table 1.1
After that, the hidden layer output is calculated, and the output of the neural network y(k) is calculated as:
(1.13) $y(k) = g\!\left(w^3 x(k)\right)$
where $w^3$ denotes the weights between the hidden and output layers, and $g$ represents the transfer function of the output layer.
Backpropagation is employed in ENNs for the weight-updating process. Furthermore, the error of the network is calculated as [15]:
(1.14) $E = \sum_{k} \left(d(k) - y(k)\right)^2$
where $d(k)$ denotes the desired output for the input $u(k)$.
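One time step of the recurrence in Eqs. (1.12) and (1.13) can be sketched as follows; the layer sizes and random weights are illustrative assumptions only:

```python
import numpy as np

def elman_step(u, x_prev, W1, W2, W3,
               f=lambda v: 1.0 / (1.0 + np.exp(-v)), g=lambda v: v):
    """One time step of an Elman network: the recurrent (context) layer
    feeds the previous hidden state back into the hidden layer."""
    x_c = x_prev                 # context layer copies the previous hidden output
    x = f(W1 @ u + W2 @ x_c)     # hidden layer: sigmoid transfer function f
    y = g(W3 @ x)                # output layer: linear transfer function g
    return x, y

rng = np.random.default_rng(1)
N, Nh, M = 2, 4, 1               # input, hidden, and output sizes (arbitrary)
W1 = rng.normal(size=(Nh, N))
W2 = rng.normal(size=(Nh, Nh))
W3 = rng.normal(size=(M, Nh))
x = np.zeros(Nh)                 # initial hidden state
for u in [np.array([0.1, 0.2]), np.array([0.3, -0.1])]:
    x, y = elman_step(u, x, W1, W2, W3)
```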
1.2.5 Statistical performance evaluation criteria
To assess the quality of the prediction of NNs, a set of evaluation metrics is applied. These metrics are the mean square error (MSE), mean absolute error (MAE), mean relative error (MRE), and root mean square error (RMSE). In addition, there are the correlation coefficient ($R$), coefficient of variance (COV), efficiency coefficient (EC), coefficient of determination ($R^2$), coefficient of residual mass (CRM), and the index of the performance of the model. These measures are defined in Table 1.2, in which $N$ is the number of observations, $O_i$ is the observed values, and $P_i$ is the predicted values. $O_{\max}$ and $O_{\min}$ are the maximum and minimum observed values, respectively, and $\bar{O}$ and $\bar{P}$ represent the averages of the observed and predicted values, respectively.
Table 1.2
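A few of the criteria above have standard definitions that can be sketched directly, assuming the conventional formulas for MSE, MAE, RMSE, and $R^2$ (Table 1.2 may define additional variants):

```python
import numpy as np

def evaluation_metrics(observed, predicted):
    """Standard definitions assumed: MSE, MAE, RMSE, and the
    coefficient of determination R^2."""
    o = np.asarray(observed, dtype=float)
    p = np.asarray(predicted, dtype=float)
    err = o - p
    mse = np.mean(err**2)
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(mse)
    # R^2 compares residual variance with the variance around the observed mean
    r2 = 1.0 - np.sum(err**2) / np.sum((o - o.mean())**2)
    return {"MSE": mse, "MAE": mae, "RMSE": rmse, "R2": r2}

m = evaluation_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```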
1.3 Conclusion
In this introductory chapter, different types of ANNs have been introduced. These ANN types include MLP, WNN, RBF, and ENN. The structure and mathematical notation of each ANN have been presented. It can be concluded that each of these ANNs has its own advantages that make it suitable for specific applications. In addition, most criteria that are used to assess the quality of the different ANN models have been presented.
References
1. Haykin SS. Neural Networks and Learning Machines. 3rd ed. Upper Saddle River, NJ: Pearson; 2009.
2. Abd Elaziz M, et al. Utilization of Random Vector Functional Link integrated with Marine Predators Algorithm for tensile behavior prediction of dissimilar friction stir welded aluminum alloy joints. Journal of Materials Research and Technology. 2020;9(5):11370–11381.
3. Babikir HA, et al. Noise prediction of axial piston pump based on different valve materials using a modified artificial neural network model. Alexandria Engineering Journal 2019.
4. Elaziz MA, Elsheikh AH, Sharshir SW. Improved prediction of oscillatory heat transfer coefficient for a thermoacoustic heat exchanger using modified adaptive neuro-fuzzy inference system. International Journal of Refrigeration 2019.
5. El-Said EMS, Abd Elaziz M, Elsheikh AH, et al. Machine learning algorithms for improving the prediction of air injection effect on the thermohydraulic performance of shell and tube heat exchanger. Applied Thermal Engineering. 2021;185:116471.
6. Elsheikh AH, et al. Prediction of laser cutting parameters for polymethylmethacrylate sheets using random vector functional link network integrated with equilibrium optimizer. Journal of Intelligent Manufacturing 2020.
7. Elsheikh AH, et al. A new artificial neural network model integrated with a cat swarm optimization algorithm for predicting the emitted noise during axial piston pump operation. IOP Conference Series: Materials Science and Engineering. 2020;973:012035.
8. Elsheikh AH, et al. Utilization of LSTM neural network for water production forecasting of a stepped solar still with a corrugated absorber plate. Process Safety and Environmental Protection. 2021;148:273–282.
9. Elsheikh AH, et al. Modeling of solar energy systems using artificial neural network: a comprehensive review. Solar Energy. 2019;180:622–639.
10. Atkinson PM, Tatnall A. Introduction neural networks in remote sensing. International Journal of Remote Sensing. 1997;18(4):699–709.
11. Yan H, et al. A multilayer perceptron-based medical decision support system for heart disease diagnosis. Expert Systems with Applications. 2006;30(2):272–281.
12. Ren Y, et al. Random vector functional link network for short-term electricity load demand forecasting. Information Sciences.