Cross Correlation: Unlocking Patterns in Computer Vision
Ebook · 142 pages · 1 hour


About this ebook

What is Cross Correlation


In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.


How you will benefit


(I) Insights and validations about the following topics:


Chapter 1: Cross-correlation


Chapter 2: Autocorrelation


Chapter 3: Covariance matrix


Chapter 4: Estimation of covariance matrices


Chapter 5: Cross-covariance


Chapter 6: Autocovariance


Chapter 7: Variational Bayesian methods


Chapter 8: Normal-gamma distribution


Chapter 9: Expectation-maximization algorithm


Chapter 10: Griffiths inequality


(II) Answering the public top questions about cross correlation.


(III) Real world examples for the usage of cross correlation in many fields.


Who this book is for


Professionals, undergraduate and graduate students, enthusiasts, hobbyists, and anyone who wants to go beyond basic knowledge of cross correlation.

Language: English
Release date: May 10, 2024


    Book preview

    Cross Correlation - Fouad Sabry

    Chapter 1: Cross-correlation

    Cross-correlation is used in signal processing to quantify the degree to which two series are similar as a function of their relative displacement. A sliding dot product (or sliding inner-product) is another name for this concept. It is typically employed to sift through a lengthy signal in search of a shorter, known feature. It can be used in a variety of fields, including neurophysiology, cryptanalysis, averaging, and pattern recognition. The cross-correlation is analogous to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there is always a peak at a lag of zero, and its size equals the signal energy.

    In statistics and probability, the term cross-correlations refers to the correlations between the entries of two random vectors \mathbf {X} and \mathbf {Y} , while the correlations of a random vector \mathbf {X} are the correlations between the entries of \mathbf {X} itself, which form the correlation matrix of \mathbf {X} .

    If each of \mathbf {X} and \mathbf {Y} is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of \mathbf {X} are known as autocorrelations of \mathbf {X} , and the cross-correlations of \mathbf {X} with \mathbf {Y} across time are temporal cross-correlations.

    In statistics and probability, correlations are by definition standardized, so that they take values between -1 and +1.

    If X and Y are two independent random variables with probability density functions f and g , respectively, then the probability density of the difference Y-X is formally given by the cross-correlation (in the signal-processing sense) f\star g ; however, this terminology is not used in probability and statistics.

    In contrast, the convolution f*g (equivalent to the cross-correlation of f(t) and g(-t) ) gives the probability density function of the sum X+Y .
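    Both statements can be checked numerically. The sketch below (assuming NumPy is available; the grid resolution and the particular Gaussian densities are illustrative choices) discretizes the densities of X ~ N(1, 1) and Y ~ N(0, 1), then recovers the density of Y - X via the cross-correlation (computed as a convolution with the time-reversed f) and the density of X + Y via the convolution:

```python
import numpy as np

# Symmetric grid with an odd number of points, so the middle sample is t = 0.
dx = 0.01
t = np.arange(-8, 8 + dx / 2, dx)                     # 1601 points

f = np.exp(-(t - 1) ** 2 / 2) / np.sqrt(2 * np.pi)    # pdf of X ~ N(1, 1)
g = np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)          # pdf of Y ~ N(0, 1)

# Density of Y - X: the cross-correlation (f ⋆ g)(u) = ∫ f(s) g(s + u) ds,
# computed here as the convolution of the time-reversed f with g.
diff_pdf = np.convolve(f[::-1], g, mode="same") * dx

# Density of X + Y: the convolution (f * g)(u) = ∫ f(s) g(u - s) ds.
sum_pdf = np.convolve(f, g, mode="same") * dx

# Analytically, Y - X ~ N(-1, 2) and X + Y ~ N(1, 2).
expected_diff = np.exp(-(t + 1) ** 2 / 4) / np.sqrt(4 * np.pi)
expected_sum = np.exp(-(t - 1) ** 2 / 4) / np.sqrt(4 * np.pi)
```

    On this grid the numerical curves match the closed-form densities to within the discretization error.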

    For continuous functions f and g , the cross-correlation is defined as:

    (f\star g)(\tau )\ \triangleq \int _{-\infty }^{\infty }{\overline {f(t)}}\,g(t+\tau )\,dt

    which is the same as

    (f\star g)(\tau )\ \triangleq \int _{-\infty }^{\infty }{\overline {f(t-\tau )}}\,g(t)\,dt

    where {\overline {f(t)}} denotes the complex conjugate of f(t) , and \tau is called the displacement or lag.

    For highly correlated f and g which have a maximum cross-correlation at a particular \tau , a feature in f at t also occurs later in g at t+\tau , hence g can be said to lag f by \tau .

    If f and g are both continuous periodic functions of period T , the integration from -\infty to \infty is replaced by integration over any interval [t_{0},t_{0}+T] of length T :

    (f\star g)(\tau )\ \triangleq \int _{t_{0}}^{t_{0}+T}{\overline {f(t)}}\,g(t+\tau )\,dt

    which is the same as

    (f\star g)(\tau )\ \triangleq \int _{t_{0}}^{t_{0}+T}{\overline {f(t-\tau )}}\,g(t)\,dt

    The cross-correlation is defined similarly for discrete functions:

    (f\star g)[n]\ \triangleq \sum _{m=-\infty }^{\infty }{\overline {f[m]}}\,g[m+n]

    which is the same as:

    (f\star g)[n]\ \triangleq \sum _{m=-\infty }^{\infty }{\overline {f[m-n]}}\,g[m]

    For finite discrete functions f,g\in \mathbb {C} ^{N} , the (circular) cross-correlation is defined as:

    (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[m]}}\,g[(m+n)_{{\text{mod}}~N}]

    which is the same as:

    (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[(m-n)_{{\text{mod}}~N}]}}\,g[m]
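    The equivalence of these two circular forms can be verified with a short script (a sketch assuming NumPy; the array length and random complex test vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
f = rng.normal(size=N) + 1j * rng.normal(size=N)
g = rng.normal(size=N) + 1j * rng.normal(size=N)

# First form: (f ⋆ g)[n] = Σ_m conj(f[m]) g[(m+n) mod N]
corr1 = np.array([sum(np.conj(f[m]) * g[(m + n) % N] for m in range(N))
                  for n in range(N)])

# Second form: (f ⋆ g)[n] = Σ_m conj(f[(m-n) mod N]) g[m]
# (obtained from the first by the substitution m → m + n mod N)
corr2 = np.array([sum(np.conj(f[(m - n) % N]) * g[m] for m in range(N))
                  for n in range(N)])
```

    The two arrays agree element-wise up to floating-point rounding.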

    For finite discrete functions f\in \mathbb {C} ^{N} and g\in \mathbb {C} ^{M} , the kernel cross-correlation is defined as:

    (f\star g)[n]\ \triangleq \sum _{m=0}^{N-1}{\overline {f[m]}}\,K_{g}[(m+n)_{{\text{mod}}~N}]

    where

    K_{g}=[k(g,T_{0}(g)),k(g,T_{1}(g)),\dots ,k(g,T_{N-1}(g))]

    is a vector of kernel functions k(\cdot ,\cdot )\colon \mathbb {C} ^{M}\times \mathbb {C} ^{M}\to \mathbb {R} and T_{i}(\cdot )\colon \mathbb {C} ^{M}\to \mathbb {C} ^{M} is an affine transform.

    Specifically, T_{i}(\cdot ) can be a circular translation, a rotation, a scaling, and so on.

    The kernel cross-correlation extends cross-correlation from linear space to kernel space.

    Cross-correlation is equivariant to translation; the kernel cross-correlation is invariant to any affine transform, including translation, rotation, and scaling.

    As an illustration, consider two real-valued functions f and g differing only by an unknown shift along the x-axis.

    One can use the cross-correlation to find how much g must be shifted along the x-axis to make it identical to f .

    The formula essentially slides the function g along the x-axis, computing the integral of the product of the two functions at each position.

    When the functions match, the value of (f\star g) is maximized.

    This is because when the peaks (the positive areas) are aligned, they make a large contribution to the integral.

    Similarly, when the troughs (the negative areas) coincide, they also contribute positively to the integral, since the product of two negative numbers is positive.

    With complex-valued functions f and g , taking the conjugate of f ensures that aligned peaks (or aligned troughs) with imaginary components will contribute positively to the integral.
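    A minimal sketch of this shift-finding idea for discrete real-valued signals (assuming NumPy; the pulse shape and the delay of 3 samples are made up for illustration):

```python
import numpy as np

f = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0])
g = np.roll(f, 3)   # g is f delayed by 3 samples (g lags f)

def sliding_corr(f, g, lag):
    """(f ⋆ g)[lag] = Σ_m f[m] g[m + lag]; f is real, so no conjugate is
    needed, and out-of-range samples are treated as zero."""
    return sum(f[m] * g[m + lag]
               for m in range(len(f)) if 0 <= m + lag < len(g))

# Scan all feasible lags; the peak of the cross-correlation recovers the shift.
lags = range(-(len(g) - 1), len(f))
best = max(lags, key=lambda n: sliding_corr(f, g, n))
# best == 3: g must be shifted back by 3 samples to line up with f
```

    At the true lag the pulses overlap exactly and the sliding dot product peaks; at every other lag the overlap, and hence the sum, is smaller.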

    In econometrics, lagged cross-correlation is sometimes referred to as cross-autocorrelation.

    The cross-correlation of functions f(t) and g(t) is equivalent to the convolution (denoted by * ) of {\overline {f(-t)}} and g(t) .

    That is:

    [f(t)\star g(t)](t)=[{\overline {f(-t)}}*g(t)](t)

    [f(t)\star g(t)](t)=[{\overline {g(t)}}\star {\overline {f(t)}}](-t)

    If f is a Hermitian function, then f\star g=f*g .

    If both f and g are Hermitian, then f\star g=g\star f .

    \left(f\star g\right)\star \left(f\star g\right)=\left(f\star f\right)\star \left(g\star g\right)
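    This identity can be checked numerically. The sketch below (assuming NumPy) uses the circular cross-correlation, computed through its FFT form, so the comparison is exact on finite arrays up to floating-point rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16
f = rng.normal(size=N) + 1j * rng.normal(size=N)
g = rng.normal(size=N) + 1j * rng.normal(size=N)

def circ_corr(a, b):
    # Circular (a ⋆ b)[n] = Σ_m conj(a[m]) b[(m+n) mod N],
    # computed via the DFT identity F{a ⋆ b} = conj(F{a}) · F{b}.
    return np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b))

lhs = circ_corr(circ_corr(f, g), circ_corr(f, g))
rhs = circ_corr(circ_corr(f, f), circ_corr(g, g))
# lhs and rhs agree: (f ⋆ g) ⋆ (f ⋆ g) = (f ⋆ f) ⋆ (g ⋆ g)
```

    In the Fourier domain both sides reduce to |F{f}|² · |F{g}|², which is why the identity holds.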

    In a way similar to the convolution theorem, the cross-correlation theorem states that

    {\mathcal {F}}\left\{f\star g\right\}={\overline {{\mathcal {F}}\left\{f\right\}}}\cdot {\mathcal {F}}\left\{g\right\},

    where {\mathcal {F}} denotes the Fourier transform and {\overline {f}} again indicates the complex conjugate of f , since {\mathcal {F}}\left\{{\overline {f(-t)}}\right\}={\overline {{\mathcal {F}}\left\{f(t)\right\}}} .

    Combined with fast Fourier transform algorithms, this property is often exploited to speed up numerical cross-correlation computations (see circular cross-correlation).
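    For the circular cross-correlation of finite sequences, the theorem can be demonstrated directly (a sketch assuming NumPy; the length N and the random inputs are arbitrary): the O(N²) sum and the O(N log N) FFT route give the same result.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 16
f = rng.normal(size=N) + 1j * rng.normal(size=N)
g = rng.normal(size=N) + 1j * rng.normal(size=N)

# Direct O(N^2) circular cross-correlation from the definition
direct = np.array([sum(np.conj(f[m]) * g[(m + n) % N] for m in range(N))
                   for n in range(N)])

# O(N log N) via the cross-correlation theorem: F{f ⋆ g} = conj(F{f}) · F{g}
fast = np.fft.ifft(np.conj(np.fft.fft(f)) * np.fft.fft(g))
```

    For long signals the FFT route is dramatically faster, which is the practical point of the theorem.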

    According to the Wiener-Khinchin theorem, the cross-correlation can be calculated from the spectral density.

    The cross-correlation of a convolution of f and h with a function g is the convolution of the cross-correlation of g and f with the kernel h :

    g\star \left(f*h\right)=\left(g\star f\right)*h .

    For random vectors \mathbf {X} =(X_{1},\ldots ,X_{m}) and \mathbf {Y} =(Y_{1},\ldots ,Y_{n}) , consisting of random components with known mean and standard deviation, the cross-correlation matrix of \mathbf {X} and \mathbf {Y} is defined by

    \operatorname {R} _{\mathbf {X} \mathbf {Y} }\triangleq \ \operatorname {E} \left[\mathbf {X} \mathbf {Y} ^{\rm {T}}\right]

    and has dimensions m\times n .

    Written component-wise:

    \operatorname {R} _{\mathbf {X} \mathbf {Y} }={\begin{bmatrix}\operatorname {E} [X_{1}Y_{1}]&\operatorname {E} [X_{1}Y_{2}]&\cdots &\operatorname {E} [X_{1}Y_{n}]\\\operatorname {E} [X_{2}Y_{1}]&\operatorname {E} [X_{2}Y_{2}]&\cdots &\operatorname {E} [X_{2}Y_{n}]\\\vdots &\vdots &\ddots &\vdots \\\operatorname {E} [X_{m}Y_{1}]&\operatorname {E} [X_{m}Y_{2}]&\cdots &\operatorname {E} [X_{m}Y_{n}]\end{bmatrix}}

    The random vectors \mathbf {X} and \mathbf {Y} need not have the same dimension, and either may be a scalar value.

    For example, if \mathbf {X} =\left(X_{1},X_{2},X_{3}\right) and \mathbf {Y} =\left(Y_{1},Y_{2}\right) are random vectors, then \operatorname {R} _{\mathbf {X} \mathbf {Y} } is a 3\times 2 matrix whose (i,j) -th entry is \operatorname {E} [X_{i}Y_{j}] .
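    A small Monte Carlo sketch of this 3×2 case (assuming NumPy; the particular joint distribution is made up for illustration) estimates the cross-correlation matrix E[X Yᵀ] from samples:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
X = rng.normal(size=(n, 3))            # n samples of X = (X1, X2, X3), iid N(0, 1)
Y = np.column_stack([
    X[:, 0] + rng.normal(size=n),      # Y1 = X1 + independent noise
    rng.normal(size=n),                # Y2 independent of X
])

R_XY = X.T @ Y / n                     # empirical E[X Y^T], a 3x2 matrix
# R_XY[0, 0] ≈ E[X1 (X1 + noise)] = E[X1^2] = 1; all other entries ≈ 0
```

    With 100,000 samples the Monte Carlo error of each entry is on the order of 0.005, so the estimate sits close to the analytic matrix.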

    If \mathbf {Z} =(Z_{1},\ldots ,Z_{m}) and \mathbf {W} =(W_{1},\ldots ,W_{n}) are complex random vectors, containing random variables with finite mean and variance, the cross-correlation matrix of \mathbf {Z} and \mathbf {W} is defined by

    \operatorname {R} _{\mathbf {Z} \mathbf {W} }\triangleq \ \operatorname {E} [\mathbf {Z} \mathbf {W} ^{\rm {H}}]

    where {}^{\rm {H}} denotes Hermitian transposition.

    In statistics and time series analysis, the cross-correlation between two random processes measures the relationship between their values at different times, as a function of the time interval between them.

    Let (X_{t},Y_{t}) be a pair of random processes, and t be any point in time ( t may be an integer for a discrete-time process or a real number for a continuous-time process).

    Then X_{t} is the value (or realization) produced by a given run of the process at time t .

    Suppose that the processes have means \mu _{X}(t) and \mu _{Y}(t) and variances \sigma _{X}^{2}(t) and \sigma _{Y}^{2}(t) at time t , for each t .

    Then the cross-correlation between times t_{1} and t_{2} is defined as:

    \operatorname {R} _{XY}(t_{1},t_{2})\triangleq \ \operatorname {E} \left[X_{t_{1}}{\overline {Y_{t_{2}}}}\right]
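    As an illustrative sketch (assuming NumPy; the one-step-delay construction is made up), R_XY(t1, t2) can be estimated by averaging X_{t1} · Y_{t2} over many independent runs of the processes:

```python
import numpy as np

rng = np.random.default_rng(7)
runs, T = 50_000, 5
X = rng.normal(size=(runs, T))        # X_t: white noise, one row per run
Y = np.empty_like(X)
Y[:, 1:] = X[:, :-1]                  # Y_t = X_{t-1}: Y is X delayed by one step
Y[:, 0] = rng.normal(size=runs)       # independent start-up value

t1, t2 = 1, 2
R = np.mean(X[:, t1] * Y[:, t2])      # estimate of R_XY(t1, t2) = E[X_{t1} Y_{t2}]
# Y_{t2} = X_{t1} here (real-valued, so the conjugate is trivial),
# so the true value is E[X_{t1}^2] = 1
```

    Averaging over runs rather than over time is what the definition asks for: each term is one realization of the product at the fixed time pair (t1, t2).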
