Data-Variant Kernel Analysis
By Yuichi Motai
About this ebook
Describes and discusses the variants of kernel analysis methods for data types that have been intensely studied in recent years
This book covers kernel analysis topics ranging from the fundamental theory of kernel functions to its applications. The book surveys the current status, popular trends, and developments in kernel analysis studies. The author discusses multiple kernel learning algorithms and how to choose the appropriate kernels during the learning phase. Data-Variant Kernel Analysis is a new pattern analysis framework for different types of data configurations. The chapters cover data formations of offline, distributed, online, cloud, and longitudinal data, used for kernel analysis to classify and predict future states.
Data-Variant Kernel Analysis:
- Surveys the kernel analysis in the traditionally developed machine learning techniques, such as Neural Networks (NN), Support Vector Machines (SVM), and Principal Component Analysis (PCA)
- Develops group kernel analysis with the distributed databases to compare speed and memory usages
- Explores the possibility of real-time processes by synthesizing offline and online databases
- Applies the assembled databases to compare cloud computing environments
- Examines the prediction of longitudinal data with time-sequential configurations
Data-Variant Kernel Analysis is a detailed reference for graduate students as well as electrical and computer engineers interested in pattern analysis and its application in colon cancer detection.
Data-Variant Kernel Analysis - Yuichi Motai
PREFACE
Kernel methods have been extensively studied in pattern classification and its applications for the past 20 years. The term kernel carries different meanings across areas such as physical science, mathematics, computer science, and even music and business. Within computer science alone, the term is used in several contexts: (i) the central component of most operating systems, (ii) Scheme-like programming languages, and (iii) a function that executes on OpenCL devices. In machine learning and statistics, a kernel is a similarity function used in pattern recognition algorithms. The use of kernel functions for pattern analysis, called kernel analysis (KA), is the central theme of this book. KA uses the kernel trick to replace the explicit feature representation of data with similarities to other data. We will cover KA topics ranging from the fundamental theory of kernel functions to applications. The overall structure starts with a survey in Chapter 1. On the basis of the KA configurations, the remaining chapters consist of Offline KA in Chapter 2, Group KA in Chapter 3, Online KA in Chapter 4, Cloud KA in Chapter 5, and Predictive KA in Chapter 6. Finally, Chapter 7 concludes by summarizing these distinct algorithms.
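The kernel trick can be illustrated in a few lines. The sketch below (in Python rather than the book's MATLAB, with a degree-2 polynomial kernel chosen purely for illustration) verifies that evaluating the kernel on raw inputs equals the inner product of the explicitly mapped features:

```python
import numpy as np

def poly_kernel(x, y, degree=2):
    """Polynomial kernel k(x, y) = (x . y)^d, computed without
    ever forming the feature map phi explicitly."""
    return np.dot(x, y) ** degree

def phi(x):
    """Explicit degree-2 feature map for 2-D input:
    phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2)."""
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# The kernel trick: the inner product in feature space equals
# the kernel evaluated directly on the original inputs.
assert np.isclose(poly_kernel(x, y), np.dot(phi(x), phi(y)))
```

Note that `poly_kernel` touches only the 2-D inputs, while `phi` works in the (here 3-D, in general much larger) feature space; this is the computational saving the kernel trick provides.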
Chapter 1 surveys the current status, popular trends, and developments in KA studies, so that we can view its functionalities and potential in an organized manner:
Utilize KA with different types of data configurations, such as offline, online, and distributed, within a pattern analysis framework.
Adapt KA into the traditionally developed machine learning techniques, such as neural networks (NN), support vector machines (SVM), and principal component analysis (PCA).
Evaluate KA performance among those algorithms.
Chapter 2 covers offline learning algorithms, in which KA does not change its approximation of the target function once the initial training phase has been completed. KA mainly deals with two major issues: (i) how to choose the appropriate kernels for offline learning during the learning phase, and (ii) how to adapt KA into the traditionally developed machine learning techniques such as NN, SVM, and PCA, where the (nonlinear) data space is mapped into a linear space via the kernel trick.
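As a concrete instance of adapting KA into PCA, the sketch below performs the Gram-matrix centering and eigendecomposition at the heart of kernel PCA. The RBF kernel, its bandwidth, and the random data are illustrative assumptions, not the book's prescribed settings:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # 20 offline training samples

# RBF (Gaussian) kernel; the bandwidth here is an assumption
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)

# Double-center the Gram matrix, as kernel PCA requires,
# so the implicit features have zero mean in feature space
n = K.shape[0]
one = np.ones((n, n)) / n
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centered Gram matrix yields the
# principal components in the (implicit) feature space
vals, vecs = np.linalg.eigh(Kc)
vals, vecs = vals[::-1], vecs[:, ::-1]   # sort descending
```

Projections of a sample onto the leading components are then obtained from its kernel values against the training set, never from explicit feature-space coordinates.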
Chapter 3 covers group KA as a data-distributed extension of offline learning algorithms. The data used in Chapter 3 are extended across several databases. Group KA for distributed data is explored to demonstrate big-data analysis, comparing speed and memory usage.
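One way a distributed kernel computation can be organized is sketched below: each database computes its local Gram block and the cross block against the other partition, and the global Gram matrix is assembled block-wise. The linear kernel and random partitions are illustrative assumptions:

```python
import numpy as np

def linear_gram(A, B):
    """Linear-kernel Gram block between two data partitions."""
    return A @ B.T

rng = np.random.default_rng(1)
# Two distributed databases holding disjoint sample sets
X1, X2 = rng.normal(size=(10, 4)), rng.normal(size=(15, 4))

# Each site computes its local block; the cross block needs only
# pairwise kernel values, never a pooled raw data matrix
K11, K22 = linear_gram(X1, X1), linear_gram(X2, X2)
K12 = linear_gram(X1, X2)

# Assemble the global Gram matrix block-wise
K = np.block([[K11, K12], [K12.T, K22]])

# Sanity check: identical to computing on the pooled data at once
X = np.vstack([X1, X2])
assert np.allclose(K, X @ X.T)
```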
Chapter 4 covers online learning algorithms, in which KA allows the feature space to be updated as the training proceeds with more data being fed into the algorithm. The feature space update can be incremental or nonincremental. In an incremental update, the feature space is augmented with new features extracted from the new data, expanding the feature space if necessary. In a nonincremental update, the dimension of the feature space remains constant, as newly computed features may replace some of the existing ones. In this chapter, we also identify the following possibilities of online learning:
Synthesize offline learning and online learning using KA, which suggests other connections and potential impact both on machine learning and on signal processing.
Extend KA with different types of data configurations, from offline to online for pattern analysis framework.
Apply KA to practical learning settings, such as biomedical image data.
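The incremental update described above can be sketched at the Gram-matrix level: when a new sample arrives, the kernel matrix is augmented with one new row and column rather than recomputed from scratch. The RBF kernel and its bandwidth are illustrative assumptions:

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """RBF Gram block between two sample matrices."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(2)
X = rng.normal(size=(8, 3))       # data seen so far
K = rbf(X, X)                     # current Gram matrix

# A new sample arrives: augment K with one new row and column
# instead of recomputing the full matrix
x_new = rng.normal(size=(1, 3))
k_cross = rbf(X, x_new)           # kernel values against old data
k_self = rbf(x_new, x_new)
K_aug = np.block([[K, k_cross], [k_cross.T, k_self]])

X = np.vstack([X, x_new])
assert np.allclose(K_aug, rbf(X, X))   # matches full recomputation
```

The augmentation costs one row of kernel evaluations per new sample, which is what makes online operation practical as data keep arriving.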
Chapter 5 covers the cloud data configuration. The objective of this cloud network setting is to deliver an extension of distributed data. KA from both offline and online learning aspects is carried out in the cloud to give more precise treatment to nonlinear pattern recognition without unnecessary computational complexity. This latest trend in big-data analysis may stimulate the emergence of cloud studies in KA to validate its efficiency using practical data.
Chapter 6 covers longitudinal data used to predict future states with KA. A time-transitional relationship between online learning and prediction techniques is explored, so that KA can be applied to adaptive prediction from online learning. The prediction performance over different time periods is evaluated in comparison to KA alternatives.
Chapter 7 summarizes these distinct data formations used for KA. The data handling issues and potential advantages of data-variant KAs are listed. The supplemental material includes MATLAB® codes in the Appendix.
The book is not chronological; the reader can start from any chapter. The chapters are self-contained yet related to one another, and each was written on the assumption that the reader has not read the others.
ACKNOWLEDGMENTS
This study was supported in part by the School of Engineering at Virginia Commonwealth University and the National Science Foundation.
The author would like to thank his colleagues for the effort and time they spent on this study:
Dr. Hiroyuki Yoshida for providing valuable colon cancer datasets for experimental results.
Dr. Alen Docef for his discussion and comments dealing with joint proposal attempts.
The work reported herein would not have been possible without the help of many of the past and present members of his research group, in particular:
Dr. Awad Mariette, Lahiruka Winter, Dr. Xianhua Jiang, Sindhu Myla, Dr. Dingkun Ma, Eric Henderson, Nahian Alam Siddique, Ryan Meekins, and Jeff Miller.
CHAPTER 1
SURVEY¹
1.1 INTRODUCTION OF KERNEL ANALYSIS
Kernel methods have been widely studied for pattern classification and multidomain association tasks [1–3]. Kernel analysis (KA) enables kernel functions to operate in the feature space without ever computing the coordinates of the data in that space, but rather by simply computing the inner products between the images of all pairs of data in the feature space [4, 5]. This operation is often computationally cheaper than the explicit computation of the coordinates [3, 6, 7]. This approach is called the kernel trick. Kernel functions have been introduced for sequence data, graphs, text, and images, as well as vectors [8–14].
Kernel feature analysis attracts significant attention in the fields of both machine learning and signal processing [10, 15]; hence there is demand to cover this state-of-the-art topic [16]. In this survey, we identify the following popular trends and developments in KA, so that we can view their merits and potential in an organized manner:
Yield nonlinear filters in the input space to open up many possibilities for optimum nonlinear system design.
Adapt KA into the traditionally developed machine learning techniques for nonlinear optimal filter implementations.
Explore kernel selection for distributed databases including solutions of heterogeneous issues.
Constructing composite kernels is an anticipated solution for heterogeneous data problems. A composite kernel adapts to the dataset by adjusting its combination coefficients, thus allowing more flexibility in the kernel choice [3, 8, 17–20].
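A composite kernel of this kind can be sketched as a nonnegative combination of base kernels; since each base Gram matrix is positive semidefinite, the combination remains a valid kernel. The fixed weights below are purely illustrative, whereas in practice they would be adjusted during learning:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(12, 4))

# Two base kernels over the same data
K_lin = X @ X.T                                   # linear kernel
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-0.5 * sq)                         # RBF kernel

# Composite kernel: nonnegative combination with adjustable weights
w = np.array([0.3, 0.7])          # illustrative; learned in practice
K = w[0] * K_lin + w[1] * K_rbf

# A nonnegative combination of PSD Gram matrices is itself PSD,
# so K is still a valid kernel matrix
eigvals = np.linalg.eigvalsh(K)
assert eigvals.min() > -1e-8
```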
The key idea behind the KA method is to allow the feature space to be updated as the training proceeds with more data being fed into the algorithm [15, 21–26]. This feature space update can be incremental or nonincremental. In an incremental update, the feature space is augmented with new features extracted from the new data, expanding the feature space if necessary [21–27]. In a nonincremental update, the dimension of the feature space remains constant, as the newly computed features may replace some of the existing ones [8, 19, 20]. In this survey, we also identify the following possibilities:
A link between offline learning and online learning using KA framework, which suggests other connections and a potential impact on both machine learning and signal processing.
A relationship between online learning and prediction techniques to merge them together for an adaptive prediction from online learning.
An online novelty detection with KA as an extended application of prediction algorithms from online learning. The algorithms listed in this survey are capable of operating with kernels, including support vector machines (SVMs) [12, 28–34], Gaussian processes [35–38], Fisher's linear discriminant analysis (LDA) [19, 39], principal component analysis (PCA) [3, 9–11, 20, 22, 24–27, 40], spectral clustering [41–47], linear adaptive filters