Cross Correlation: Unlocking Patterns in Computer Vision
By Fouad Sabry
About this ebook
What is Cross Correlation
In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.
How you will benefit
(I) Insights and validations about the following topics:
Chapter 1: Cross-correlation
Chapter 2: Autocorrelation
Chapter 3: Covariance matrix
Chapter 4: Estimation of covariance matrices
Chapter 5: Cross-covariance
Chapter 6: Autocovariance
Chapter 7: Variational Bayesian methods
Chapter 8: Normal-gamma distribution
Chapter 9: Expectation-maximization algorithm
Chapter 10: Griffiths inequality
(II) Answers to the public's top questions about cross correlation.
(III) Real-world examples of the use of cross correlation in many fields.
Who this book is for
Professionals, undergraduate and graduate students, enthusiasts, hobbyists, and those who want to go beyond basic knowledge of cross correlation.
Book preview
Cross Correlation - Fouad Sabry
Chapter 1: Cross-correlation
Cross-correlation is used in signal processing to quantify how similar two series are as a function of the displacement of one relative to the other. A sliding dot product (or sliding inner product) is another name for this concept. It is typically employed to search a long signal for a shorter, known feature. It is used in a variety of fields, including neurophysiology, cryptanalysis, averaging, and pattern recognition. Cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there is always a peak at a lag of zero, and its size is the signal energy.
In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors \mathbf {X} and \mathbf {Y} , while the correlations of a random vector \mathbf {X} are the correlations between the entries of \mathbf {X} itself, which form the correlation matrix of \mathbf {X} .
If each of \mathbf {X} and \mathbf {Y} is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of \mathbf {X} are known as autocorrelations of \mathbf {X} , and the cross-correlations of \mathbf {X} with \mathbf {Y} across time are temporal cross-correlations.
In probability and statistics, correlations are by definition standardized so that they take values between −1 and +1.
If X and Y are two independent random variables with probability density functions f and g , respectively, then the probability density of the difference Y-X is formally given by the cross-correlation (in the signal-processing sense) f\star g ; however, this terminology is not used in probability and statistics.
In contrast, the convolution f*g (equivalent to the cross-correlation of f(t) and {\displaystyle g(-t)} ) gives the probability density function of the sum X+Y .
For continuous functions f and g , the cross-correlation is defined as:
{\displaystyle (f\star g)(\tau )\ \triangleq \int _{-\infty }^{\infty }{\overline {f(t)}}g(t+\tau )\,dt}which is the same as
{\displaystyle (f\star g)(\tau )\ \triangleq \int _{-\infty }^{\infty }{\overline {f(t-\tau )}}g(t)\,dt}where {\displaystyle {\overline {f(t)}}} denotes the complex conjugate of f(t) , and \tau is called displacement or lag.
For highly correlated f and g whose cross-correlation is maximal at a particular \tau , a feature in f at time t also occurs later in g at t+\tau , so g can be described as lagging f by \tau .
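As a concrete reading of this definition, the sketch below approximates the integral by a Riemann sum for uniformly sampled signals; the function name xcorr and the sampling step dt are illustrative choices, not notation from the text.

```python
import numpy as np

def xcorr(f, g, dt=1.0):
    """Approximate (f ⋆ g)(tau) = integral of conj(f(t)) g(t + tau) dt
    for equally spaced samples f and g; returns (lags, values)."""
    n = len(f)
    lags = np.arange(-(n - 1), n)                 # displacements in samples
    vals = np.empty(len(lags), dtype=complex)
    for i, lag in enumerate(lags):
        # indices m for which both f[m] and g[m + lag] exist
        m = np.arange(max(0, -lag), min(n, n - lag))
        vals[i] = np.sum(np.conj(f[m]) * g[m + lag]) * dt   # Riemann sum for this lag
    return lags * dt, vals
```

A peak of the magnitude of vals at a positive lag indicates that features of f reappear later in g, matching the lag interpretation above.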
If f and g are both continuous periodic functions of period T , the integration from -\infty to \infty is replaced by integration over any interval {\displaystyle [t_{0},t_{0}+T]} of length T :
{\displaystyle (f\star g)(\tau )\ \triangleq \int _{t_{0}}^{t_{0}+T}{\overline {f(t)}}g(t+\tau )\,dt}which is the same as
{\displaystyle (f\star g)(\tau )\ \triangleq \int _{t_{0}}^{t_{0}+T}{\overline {f(t-\tau )}}g(t)\,dt}The cross-correlation is defined in a similar fashion for discrete functions:
{\displaystyle (f\star g)[n]\ \triangleq \sum _{m=-\infty }^{\infty }{\overline {f[m]}}g[m+n]}which is the same thing as:
{\displaystyle (f\star g)[n]\ \triangleq \sum _{m=-\infty }^{\infty }{\overline {f[m-n]}}g[m]}For finite discrete functions {\displaystyle f,g\in \mathbb {C} ^{N}} , the (circular) cross-correlation is defined as follows (a short numerical sketch appears after the kernel discussion below):
{\displaystyle (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[m]}}g[(m+n)_{{\text{mod}}~N}]}which is the same thing as:
{\displaystyle (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[(m-n)_{{\text{mod}}~N}]}}g[m]}For finite discrete functions {\displaystyle f\in \mathbb {C} ^{N}} , {\displaystyle g\in \mathbb {C} ^{M}} , the kernel cross-correlation is defined as:
{\displaystyle (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[m]}}K_{g}[(m+n)_{{\text{mod}}~N}]}where
{\displaystyle K_{g}=[k(g,T_{0}(g)),k(g,T_{1}(g)),\dots ,k(g,T_{N-1}(g))]}is a vector of kernel functions {\displaystyle k(\cdot ,\cdot )\colon \mathbb {C} ^{M}\times \mathbb {C} ^{M}\to \mathbb {R} } and {\displaystyle T_{i}(\cdot )\colon \mathbb {C} ^{M}\to \mathbb {C} ^{M}} is an affine transform.
Specifically, {\displaystyle T_{i}(\cdot )} can be a circular translation transform, a rotation transform, a scaling transform, and so on.
The kernel cross-correlation extends cross-correlation from linear space to kernel space.
While ordinary cross-correlation is equivariant only to translation, the kernel cross-correlation is equivariant to any affine transform, including translation, rotation, and scaling.
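Returning to the finite circular cross-correlation defined above, here is a minimal sketch (the function name circular_xcorr is an illustrative choice) that evaluates each lag as a dot product against a circularly shifted copy of g:

```python
import numpy as np

def circular_xcorr(f, g):
    """(f ⋆ g)[n] = sum over m of conj(f[m]) * g[(m + n) mod N]."""
    N = len(f)
    # np.roll(g, -n)[m] equals g[(m + n) mod N], so each lag is one dot product.
    return np.array([np.sum(np.conj(f) * np.roll(g, -n)) for n in range(N)])

f = np.array([1, 2, 3, 4], dtype=complex)
g = np.array([4, 3, 2, 1], dtype=complex)
print(circular_xcorr(f, g))   # one value for each circular lag n = 0 .. N-1
```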
As an illustration, consider two real-valued functions f and g differing only by an unknown shift along the x-axis.
One can use the cross-correlation to find how much g must be shifted along the x-axis to make it identical to f .
The formula essentially slides the function g along the x-axis, calculating the integral of the product of the two functions at each position.
When the functions match, the value of (f\star g) is maximized.
This is because when peaks (positive areas) are aligned, they make a large contribution to the integral.
Similarly, when troughs (negative areas) align, they also contribute positively to the integral, because the product of two negative numbers is positive.
With complex-valued functions f and g , taking the conjugate of f ensures that aligned peaks (or aligned troughs) with imaginary components will contribute positively to the integral.
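To make this shift-recovery idea concrete, here is a small sketch (the signal shape, noise level, and names are assumptions for illustration) that locates an unknown displacement at the peak of the cross-correlation; np.correlate(g, f) conjugates f, matching the discrete definition above.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(400)
f = np.exp(-0.5 * ((t - 150) / 12.0) ** 2)     # a Gaussian bump in f ...
true_shift = 60
g = np.roll(f, true_shift) + 0.05 * rng.standard_normal(t.size)  # ... occurs 60 samples later in g

corr = np.correlate(g, f, mode="full")          # (f ⋆ g)[n] for n = -(N-1) .. N-1
lags = np.arange(-(t.size - 1), t.size)
print(lags[np.argmax(corr)])                    # recovered shift; expected to be about 60
```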
In econometrics, lagged cross-correlation is sometimes referred to as cross-autocorrelation.: p. 74
The cross-correlation of functions f(t) and g(t) is equivalent to the convolution (denoted by * ) of {\displaystyle {\overline {f(-t)}}} and g(t) .
That is:
{\displaystyle [f(t)\star g(t)](t)=[{\overline {f(-t)}}*g(t)](t),}and also
{\displaystyle [f(t)\star g(t)](t)=[{\overline {g(t)}}\star {\overline {f(t)}}](-t).}If f is a Hermitian function, then {\displaystyle f\star g=f*g.}
If both f and g are Hermitian, then f\star g=g\star f .
{\displaystyle \left(f\star g\right)\star \left(f\star g\right)=\left(f\star f\right)\star \left(g\star g\right)}.
In a way similar to the convolution theorem, the cross-correlation theorem states that
{\displaystyle {\mathcal {F}}\left\{f\star g\right\}={\overline {{\mathcal {F}}\left\{f\right\}}}\cdot {\mathcal {F}}\left\{g\right\},}where {\mathcal {F}} denotes the Fourier transform, and {\overline {f}} again indicates the complex conjugate of f , since {\displaystyle {\mathcal {F}}\left\{{\overline {f(-t)}}\right\}={\overline {{\mathcal {F}}\left\{f(t)\right\}}}} .
Together with efficient implementations of the Fourier transform, this property is frequently used to speed up numerical cross-correlation computations (see circular cross-correlation).
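The following sketch (array size and names are illustrative) checks the cross-correlation theorem numerically for the circular case: the same result is obtained from the direct sum and from two FFTs, an O(N log N) rather than O(N²) computation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1024
f = rng.standard_normal(N) + 1j * rng.standard_normal(N)
g = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Direct evaluation of the circular definition (f ⋆ g)[n] = sum_m conj(f[m]) g[(m+n) mod N]
direct = np.array([np.sum(np.conj(f) * np.roll(g, -n)) for n in range(N)])

# Frequency-domain evaluation via F{f ⋆ g} = conj(F{f}) · F{g}
via_fft = np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(g))

assert np.allclose(direct, via_fft)
```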
According to the Wiener-Khinchin theorem, the cross-correlation can be calculated from the spectral density.
The cross-correlation of a convolution of f and h with a function g is the convolution of the cross-correlation of g and f with the kernel h :
{\displaystyle g\star \left(f*h\right)=\left(g\star f\right)*h} .
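A quick numerical check of this identity is sketched below, under the assumption that finite arrays with "full"-mode NumPy operations stand in for the infinite-support functions; np.correlate(b, a, "full") computes (a ⋆ b), conjugating a.

```python
import numpy as np

rng = np.random.default_rng(2)
f = rng.standard_normal(6) + 1j * rng.standard_normal(6)
g = rng.standard_normal(5) + 1j * rng.standard_normal(5)
h = rng.standard_normal(4) + 1j * rng.standard_normal(4)

left = np.correlate(np.convolve(f, h), g, mode="full")         # g ⋆ (f * h)
right = np.convolve(np.correlate(f, g, mode="full"), h)        # (g ⋆ f) * h
assert np.allclose(left, right)
```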
For random vectors {\displaystyle \mathbf {X} =(X_{1},\ldots ,X_{m})} and {\displaystyle \mathbf {Y} =(Y_{1},\ldots ,Y_{n})} , consisting of random components with known mean and standard deviation, the cross-correlation matrix of \mathbf {X} and \mathbf {Y} is defined by: p. 337
{\displaystyle \operatorname {R} _{\mathbf {X} \mathbf {Y} }\triangleq \ \operatorname {E} \left[\mathbf {X} \mathbf {Y} \right]}and has dimensions m\times n .
Written component-wise:
{\displaystyle \operatorname {R} _{\mathbf {X} \mathbf {Y} }={\begin{bmatrix}\operatorname {E} [X_{1}Y_{1}]&\operatorname {E} [X_{1}Y_{2}]&\cdots &\operatorname {E} [X_{1}Y_{n}]\\\operatorname {E} [X_{2}Y_{1}]&\operatorname {E} [X_{2}Y_{2}]&\cdots &\operatorname {E} [X_{2}Y_{n}]\\\vdots &\vdots &\ddots &\vdots \\\operatorname {E} [X_{m}Y_{1}]&\operatorname {E} [X_{m}Y_{2}]&\cdots &\operatorname {E} [X_{m}Y_{n}]\end{bmatrix}}}The random vectors \mathbf {X} and \mathbf {Y} need not have the same dimension, and either may be a scalar value.
For example, if {\displaystyle \mathbf {X} =\left(X_{1},X_{2},X_{3}\right)} and {\displaystyle \mathbf {Y} =\left(Y_{1},Y_{2}\right)} are random vectors, then {\displaystyle \operatorname {R} _{\mathbf {X} \mathbf {Y} }} is a {\displaystyle 3\times 2} matrix whose (i,j) -th entry is {\displaystyle \operatorname {E} [X_{i}Y_{j}]} .
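A brief sketch of estimating such a cross-correlation matrix from data (the generating model and variable names are assumptions for illustration): averaging the products X_i Y_j over many paired draws approximates each entry E[X_i Y_j].

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples = 100_000
X = rng.standard_normal((n_samples, 3))                       # draws of X = (X1, X2, X3)
Y = X[:, :2] + 0.1 * rng.standard_normal((n_samples, 2))      # draws of Y = (Y1, Y2), correlated with X

R_XY = X.T @ Y / n_samples   # 3 x 2 sample estimate; entry (i, j) approximates E[X_i Y_j]
print(R_XY.round(2))         # roughly [[1, 0], [0, 1], [0, 0]] for this model
```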
If {\displaystyle \mathbf {Z} =(Z_{1},\ldots ,Z_{m})} and {\displaystyle \mathbf {W} =(W_{1},\ldots ,W_{n})} are complex random vectors, each comprising random variables with known expected value and variance, the cross-correlation matrix of \mathbf {Z} and \mathbf {W} is defined by
{\displaystyle \operatorname {R} _{\mathbf {Z} \mathbf {W} }\triangleq \ \operatorname {E} [\mathbf {Z} \mathbf {W} ^{\rm {H}}]}where {\displaystyle {}^{\rm {H}}} denotes Hermitian transposition.
In statistics and time series analysis, the cross-correlation between two random processes measures the relationship between their values at different points in time, as a function of the interval between those points.
Let {\displaystyle (X_{t},Y_{t})} be a pair of random processes, and t be any point in time ( t may be an integer for a discrete-time process or a real number for a continuous-time process).
Then X_{t} is the value (or realization) produced by a given run of the process at time t .
Suppose that the process has means {\displaystyle \mu _{X}(t)} and {\displaystyle \mu _{Y}(t)} and variances {\displaystyle \sigma _{X}^{2}(t)} and {\displaystyle \sigma _{Y}^{2}(t)} at time t , for each t .
Then the cross-correlation between times t_{1} and t_{2} is defined as: p. 392
{\displaystyle \operatorname {R} _{XY}(t_{1},t_{2})\triangleq \ \operatorname {E} \left[X_{t_{1}}{\overline {Y_{t_{2}}}}\right]}
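As a closing sketch (the delayed-noise model, time indices, and names are assumptions for illustration), the expectation in this definition can be estimated by averaging over many independent realizations of the pair of processes:

```python
import numpy as np

rng = np.random.default_rng(4)
n_runs, T = 5000, 100
X = rng.standard_normal((n_runs, T))        # each row is one realization of X_t (white noise)
Y = np.roll(X, 5, axis=1)                   # Y_t = X_{t-5}: a copy of X delayed by 5 steps

t1, t2 = 50, 55
R = np.mean(X[:, t1] * np.conj(Y[:, t2]))   # sample estimate of R_XY(t1, t2) = E[X_{t1} conj(Y_{t2})]
print(R)   # close to 1, since Y_55 = X_50; for, say, t2 = 54 it would be close to 0
```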