Riemannian Geometric Statistics in Medical Image Analysis
Edited by Xavier Pennec, Stefan Sommer, and Tom Fletcher
About this ebook
Over the past 15 years, there has been a growing need in the medical image computing community for principled methods to process nonlinear geometric data. Riemannian geometry has emerged as one of the most powerful mathematical and computational frameworks for analyzing such data.
Riemannian Geometric Statistics in Medical Image Analysis is a complete reference on statistics on Riemannian manifolds and more general nonlinear spaces with applications in medical image analysis. It provides an introduction to the core methodology followed by a presentation of state-of-the-art methods.
Beyond medical image computing, the methods described in this book may also apply to other domains such as signal processing, computer vision, and geometric deep learning, where statistics on geometric features appear. As such, the presented core methodology takes its place in the field of geometric statistics, the statistical analysis of data that are elements of nonlinear geometric spaces. The foundational material and the advanced techniques presented in the later parts of the book can be useful in domains outside medical imaging and present important applications of geometric statistics methodology.
Content includes:
- The foundations of Riemannian geometric methods for statistics on manifolds with emphasis on concepts rather than on proofs
- Applications of statistics on manifolds and shape spaces in medical image computing
- Diffeomorphic deformations and their applications
As the methods described apply to domains such as signal processing (radar signal processing and brain computer interaction), computer vision (object and face recognition), and other domains where statistics of geometric features appear, this book is suitable for researchers and graduate students in medical imaging, engineering and computer science.
- A complete reference covering both the foundations and state-of-the-art methods
- Edited and authored by leading researchers in the field
- Contains theory, examples, applications, and algorithms
- Gives an overview of current research challenges and future applications
Riemannian Geometric Statistics in Medical Image Analysis
First edition
Xavier Pennec
Stefan Sommer
Tom Fletcher
Table of Contents
Cover image
Title page
Copyright
Contributors
Introduction
Introduction
Part 1: Foundations of geometric statistics
1: Introduction to differential and Riemannian geometry
Abstract
1.1. Introduction
1.2. Manifolds
1.3. Riemannian manifolds
1.4. Elements of analysis in Riemannian manifolds
1.5. Lie groups and homogeneous manifolds
1.6. Elements of computing on Riemannian manifolds
1.7. Examples
1.8. Additional references
References
2: Statistics on manifolds
Abstract
2.1. Introduction
2.2. The Fréchet mean
2.3. Covariance and principal geodesic analysis
2.4. Regression models
2.5. Probabilistic models
References
3: Manifold-valued image processing with SPD matrices
Abstract
Acknowledgements
3.1. Introduction
3.2. Exponential, logarithm, and square root of SPD matrices
3.3. Affine-invariant metrics
3.4. Basic statistical operations on SPD matrices
3.5. Manifold-valued image processing
3.6. Other metrics on SPD matrices
3.7. Applications in diffusion tensor imaging (DTI)
3.8. Learning brain variability from sulcal lines
References
4: Riemannian geometry on shapes and diffeomorphisms
Abstract
4.1. Introduction
4.2. Shapes and actions
4.3. The diffeomorphism group in shape analysis
4.4. Riemannian metrics on shape spaces
4.5. Shape spaces
4.6. Statistics in LDDMM
4.7. Outer and inner shape metrics
4.8. Further reading
References
5: Beyond Riemannian geometry
Abstract
5.1. Introduction
5.2. Affine connection spaces
5.3. Canonical connections on Lie groups
5.4. Left, right, and biinvariant Riemannian metrics on a Lie group
5.5. Statistics on Lie groups as symmetric spaces
5.6. The stationary velocity fields (SVF) framework for diffeomorphisms
5.7. Parallel transport of SVF deformations
5.8. Historical notes and additional references
References
Part 2: Statistics on manifolds and shape spaces
6: Object shape representation via skeletal models (s-reps) and statistical analysis
Abstract
Acknowledgements
6.1. Introduction to skeletal models
6.2. Computing an s-rep from an image or object boundary
6.3. Skeletal interpolation
6.4. Skeletal fitting
6.5. Correspondence
6.6. Skeletal statistics
6.7. How to compare representations and statistical methods
6.8. Results of classification, hypothesis testing, and probability distribution estimation
6.9. The code and its performance
6.10. Weaknesses of the skeletal approach
References
7: Efficient recursive estimation of the Riemannian barycenter on the hypersphere and the special orthogonal group with applications
Abstract
Acknowledgements
7.1. Introduction
7.2. Riemannian geometry of the hypersphere
7.3. Weak consistency of iFME on the sphere
7.4. Experimental results
7.5. Application to the classification of movement disorders
7.6. Riemannian geometry of the special orthogonal group
7.7. Weak consistency of iFME on SO(n)
7.8. Experimental results
7.9. Conclusions
References
8: Statistics on stratified spaces
Abstract
Acknowledgements
8.1. Introduction to stratified geometry
8.2. Least squares models
8.3. BHV tree space
8.4. The space of unlabeled trees
8.5. Beyond trees
References
9: Bias on estimation in quotient space and correction methods
Abstract
Acknowledgement
9.1. Introduction
9.2. Shapes and quotient spaces
9.3. Template estimation
9.4. Asymptotic bias of template estimation
9.5. Applications to statistics on organ shapes
9.6. Bias correction methods
9.7. Conclusion
References
10: Probabilistic approaches to geometric statistics
Abstract
10.1. Introduction
10.2. Parametric probability distributions on manifolds
10.3. The Brownian motion
10.4. Fiber bundle geometry
10.5. Anisotropic normal distributions
10.6. Statistics with bundles
10.7. Parameter estimation
10.8. Advanced concepts
10.9. Conclusion
10.10. Further reading
References
11: On shape analysis of functional data
Abstract
11.1. Introduction
11.2. Registration problem and elastic approach
11.3. Shape space and geodesic paths
11.4. Statistical summaries and principal modes of shape variability
11.5. Summary and conclusion
Appendix. Mathematical background
References
Part 3: Deformations, diffeomorphisms and their applications
12: Fidelity metrics between curves and surfaces: currents, varifolds, and normal cycles
Abstract
Acknowledgements
12.1. Introduction
12.2. General setting and notations
12.3. Currents
12.4. Varifolds
12.5. Normal cycles
12.6. Computational aspects
12.7. Conclusion
References
13: A discretize–optimize approach for LDDMM registration
Abstract
13.1. Introduction
13.2. Background and related work
13.3. Continuous mathematical models
13.4. Discretization of the energies
13.5. Discretization and solution of PDEs
13.6. Discretization in multiple dimensions
13.7. Multilevel registration and numerical optimization
13.8. Experiments and results
13.9. Discussion and conclusion
References
14: Spatially adaptive metrics for diffeomorphic image matching in LDDMM
Abstract
14.1. Introduction to LDDMM
14.2. Sum of kernels and semidirect product of groups
14.3. Sliding motion constraints
14.4. Left-invariant metrics
14.5. Open directions
References
15: Low-dimensional shape analysis in the space of diffeomorphisms
Abstract
Acknowledgements
15.1. Introduction
15.2. Background
15.3. PPGA of diffeomorphisms
15.4. Inference
15.5. Evaluation
15.6. Results
15.7. Discussion and conclusion
References
16: Diffeomorphic density registration
Abstract
Acknowledgements
16.1. Introduction
16.2. Diffeomorphisms and densities
16.3. Diffeomorphic density registration
16.4. Density registration in the LDDMM-framework
16.5. Optimal information transport
16.6. A gradient flow approach
References
Index
Copyright
Academic Press is an imprint of Elsevier
125 London Wall, London EC2Y 5AS, United Kingdom
525 B Street, Suite 1650, San Diego, CA 92101, United States
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom
Copyright © 2020 Elsevier Ltd. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).
Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.
Library of Congress Cataloging-in-Publication Data
A catalog record for this book is available from the Library of Congress
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library
ISBN: 978-0-12-814725-2
For information on all Academic Press publications visit our website at https://www.elsevier.com/books-and-journals
Publisher: Mara Conner
Acquisition Editor: Tim Pitts
Editorial Project Manager: Leticia M. Lima
Production Project Manager: Kamesh Ramajogi
Designer: Miles Hitchen
Typeset by VTeX
Contributors
Martin Bauer Florida State University, Department of Mathematics, Tallahassee, FL, United States
Rudrasis Chakraborty University of Florida, CISE Department, Gainesville, FL, United States
Benjamin Charlier
IMAG, Univ. Montpellier, CNRS, Montpellier, France
Institut du Cerveau et de la Moëlle Épinière, ARAMIS, Paris, France
Nicolas Charon Johns Hopkins University, Center of Imaging Sciences, Baltimore, MD, United States
Hyo-young Choi UNC, Chapel Hill, NC, United States
James Damon UNC, Chapel Hill, NC, United States
Loic Devilliers Université Côte d'Azur and Inria, Epione team, Sophia Antipolis, France
Aasa Feragen University of Copenhagen, Department of Computer Science, Copenhagen, Denmark
Tom Fletcher University of Virginia, Departments of Electrical & Computer Engineering and Computer Science, Charlottesville, VA, United States
Joan Glaunès MAP5, Université Paris Descartes, Paris, France
Polina Golland Massachusetts Institute of Technology, Computer Science and Artificial Intelligence Lab, Cambridge, MA, United States
Pietro Gori Télécom ParisTech, LTCI, équipe IMAGES, Paris, France
Junpyo Hong UNC, Chapel Hill, NC, United States
Sarang Joshi University of Utah, Department of Bioengineering, Scientific Computing and Imaging Institute, Salt Lake City, UT, United States
Sungkyu Jung Seoul National University, Seoul, Republic of Korea
Zhiyuan Liu UNC, Chapel Hill, NC, United States
Marco Lorenzi Université Côte d'Azur and Inria, Epione team, Sophia Antipolis, France
J.S. Marron UNC, Chapel Hill, NC, United States
Stephen Marsland Victoria University of Wellington, School of Mathematics and Statistics, Wellington, New Zealand
Nina Miolane
Université Côte d'Azur and Inria, Epione team, Sophia Antipolis, France
Stanford University, Department of Statistics, Stanford, CA, United States
Jan Modersitzki
Institute of Mathematics and Image Computing, University of Lübeck, Lübeck, Germany
Fraunhofer MEVIS, Lübeck, Germany
Klas Modin Chalmers University of Technology and the University of Gothenburg, Department of Mathematical Sciences, Göteborg, Sweden
Marc Niethammer
Department of Computer Science, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States
Biomedical Research Imaging Center (BRIC), Chapel Hill, NC, United States
Tom Nye Newcastle University, School of Mathematics, Statistics and Physics, Newcastle upon Tyne, United Kingdom
Beatriz Paniagua UNC, Chapel Hill, NC, United States
Xavier Pennec Université Côte d'Azur and Inria, Epione team, Sophia Antipolis, France
Stephen M. Pizer UNC, Chapel Hill, NC, United States
Thomas Polzin Institute of Mathematics and Image Computing, University of Lübeck, Lübeck, Germany
Laurent Risser Institut de Mathématiques de Toulouse, CNRS, Université de Toulouse, UMR CNRS 5219, Toulouse, France
Pierre Roussillon ENS Cachan, CNRS, Université Paris-Saclay, CMLA, Cachan, France
Jörn Schulz Arctic University of Norway, Tromsø, Norway
Ankur Sharma UNC, Chapel Hill, NC, United States
Stefan Sommer University of Copenhagen, Department of Computer Science, Copenhagen, Denmark
Anuj Srivastava Florida State University, Tallahassee, FL, United States
Liyun Tu UNC, Chapel Hill, NC, United States
Baba C. Vemuri University of Florida, CISE Department, Gainesville, FL, United States
François-Xavier Vialard Laboratoire d'informatique Gaspard Monge, Université Paris-Est Marne-la-Vallée, UMR CNRS 8049, Champs sur Marne, France
Jared Vicory UNC, Chapel Hill, NC, United States
Jiyao Wang UNC, Chapel Hill, NC, United States
William M. Wells III Harvard Medical School, Department of Radiology, Boston, MA, United States
Miaomiao Zhang Washington University in St. Louis, Computer Science and Engineering, St. Louis, MO, United States
Ruiyi Zhang Florida State University, Tallahassee, FL, United States
Introduction
Xavier Pennec, Université Côte d'Azur and Inria, Sophia Antipolis, France
Stefan Sommer, DIKU, University of Copenhagen, Copenhagen, Denmark
Tom Fletcher, University of Virginia, Charlottesville, VA, United States
Introduction
Over the last two decades, there has been a growing need in the medical image computing community for principled methods to process nonlinear geometric data. Typical examples of data in this domain include organ shapes and deformations resulting from segmentation and registration in computational anatomy, and symmetric positive definite matrices in diffusion imaging. In this context, Riemannian geometry has gradually been established as one of the most powerful mathematical and computational paradigms.
This book aims at being an introduction to and a reference on Riemannian geometric statistics and its use in medical image analysis for researchers and graduate students. The book provides both descriptions of the core methodology and presentations of state-of-the-art methods used in the field. We wish to present this combination of foundational material and current research together with examples, applications, and algorithms in a volume that is edited and authored by the leading researchers in the field. In addition, we wish to provide an overview of current research challenges and future applications.
Beyond medical image computing, the methods described in this book may also apply to other domains such as signal processing, computer vision, and geometric deep learning, where statistics on geometric features appear. As such, the presented core methodology takes its place in the field of geometric statistics, the statistical analysis of data that are elements of nonlinear geometric spaces. We hope that both the foundational material and the advanced techniques presented in the later parts of the book can be useful in domains outside medical imaging and present important applications of geometric statistics methodology.
Contents
Part 1 of this edited volume describes the foundations of Riemannian geometric computing methods for statistics on manifolds. This part emphasizes concepts rather than proofs, with the goal of providing graduate students in computer science the mathematical background needed to start in this domain. Chapter 1 presents an introduction to differential, Riemannian, and Lie group geometry, and chapter 2 covers statistics on manifolds. Chapters 3–5 present introductions to the geometry of SPD matrices, shape analysis through the action of the diffeomorphism group, and geometry and statistical analysis beyond the Riemannian setting, when an affine connection, rather than a metric, is available.
Part 2 includes contributions from leading researchers in the field on applications of statistics on manifolds and shape spaces in medical image computing. In chapter 6, Stephen Pizer, Steve Marron, and coauthors describe shape representation via skeletal models and how this allows application of nonlinear statistical methods on shape spaces. Chapter 7 by Rudrasis Chakraborty and Baba Vemuri concerns estimation of the iterative Riemannian barycenter, a candidate for the generalization of the Euclidean mean value on selected manifolds. In chapter 8, Aasa Feragen and Tom Nye discuss statistics on stratified spaces, which generalize manifolds by allowing variation of the topological structure. Estimation of templates in quotient spaces is the topic of chapter 9 by Nina Miolane, Loic Devilliers, and Xavier Pennec. Stefan Sommer discusses parametric statistics on manifolds using stochastic processes in chapter 10. In chapter 11, Ruiyi Zhang and Anuj Srivastava consider shape analysis of functional data using elastic metrics.
Part 3 of the book focuses on diffeomorphic deformations and their applications in shape analysis. Nicolas Charon, Benjamin Charlier, Joan Glaunès, Pierre Roussillon, and Pietro Gori present currents, varifolds, and normal cycles for shape comparison in chapter 12. Numerical aspects of large deformation registration are discussed in chapter 13 by Thomas Polzin, Marc Niethammer, François-Xavier Vialard, and Jan Modersitzki. François-Xavier Vialard and Laurent Risser present spatially varying metrics for large deformation matching in chapter 14. Chapter 15 by Miaomiao Zhang, Polina Golland, William M. Wells, and Tom Fletcher presents a framework for low-dimensional representations of large deformations and its use in shape analysis. Finally, in chapter 16, Martin Bauer, Sarang Joshi, and Klas Modin study density matching in the diffeomorphic setting.
We are extremely grateful for this broad set of excellent contributions to the book by leading researchers in the field, and we hope that the book in its entirety will inspire new developments and research directions in this exciting intersection between applied mathematics and computer science.
The editors
February, 2019
Part 1
Foundations of geometric statistics
Outline
1. Introduction to differential and Riemannian geometry
2. Statistics on manifolds
3. Manifold-valued image processing with SPD matrices
4. Riemannian geometry on shapes and diffeomorphisms
5. Beyond Riemannian geometry
1
Introduction to differential and Riemannian geometry
Stefan Sommer, University of Copenhagen, Department of Computer Science, Copenhagen, Denmark
Tom Fletcher, University of Virginia, Departments of Electrical & Computer Engineering and Computer Science, Charlottesville, VA, United States
Xavier Pennec, Université Côte d'Azur and Inria, Epione team, Sophia Antipolis, France
Abstract
This chapter introduces the basic concepts of differential geometry: Manifolds, charts, curves, their derivatives, and tangent spaces. The addition of a Riemannian metric enables length and angle measurements on tangent spaces giving rise to the notions of curve length, geodesics, and thereby the basic constructs for statistical analysis of manifold-valued data. Lie groups appear when the manifold in addition has smooth group structure, and homogeneous spaces arise as quotients of Lie groups. We discuss invariant metrics on Lie groups and their geodesics.
The goal is to establish the mathematical basis that will allow us to build a simple but consistent statistical computing framework on manifolds. In the later part of the chapter, we describe computational tools, the Exp and Log maps, derived from the Riemannian metric. The implementation of these atomic tools will then constitute the basis for building more complex generic algorithms in the following chapters.
Keywords
Riemannian Geometry; Riemannian Metric; Riemannian Manifold; Tangent Space; Lie Group; Geodesic; Exp and Log maps
1.1 Introduction
When data exhibit nonlinearity, the mathematical description of the data space must often depart from the convenient linear structure of Euclidean vector spaces. Nonlinearity prevents global vector space structure, but we can nevertheless ask which mathematical properties from the Euclidean case can be kept while still preserving the accurate modeling of the data. It turns out that in many cases, local resemblance to a Euclidean vector space is one such property. In other words, up to some approximation, the data space can be linearized in limited regions while forcing a linear model on the entire space would introduce too much distortion.
The concept of local similarity to Euclidean spaces brings us exactly to the setting of manifolds. Topological, differential, and Riemannian manifolds are characterized by the existence of local maps, charts, between the manifold and a Euclidean space. These charts are structure preserving: They are homeomorphisms in the case of topological manifolds, diffeomorphisms in the case of differential manifolds, and, in the case of Riemannian manifolds, they carry local inner products that encode the non-Euclidean geometry.
The following sections describe these foundational concepts and how they lead to notions commonly associated with geometry: curves, length, distances, geodesics, curvature, parallel transport, and volume form. In addition to the differential and Riemannian structure, we describe one extra layer of structure: Lie groups, which are manifolds equipped with a smooth group structure. Lie groups and their quotients are examples of homogeneous spaces. The group structure provides relations between distant points on the group and thereby additional ways of constructing Riemannian metrics and deriving geodesic equations.
Topological, differential, and Riemannian manifolds are often covered by separate graduate courses in mathematics. In this much briefer overview, we describe the general concepts, often sacrificing mathematical rigor to instead provide intuitive reasons for the mathematical definitions. For a more in-depth introduction to geometry, the interested reader may, for example, refer to the sequence of books by John M. Lee on topological, differentiable, and Riemannian manifolds [17,18,16] or to the book on Riemannian geometry by do Carmo [4]. More advanced references include [15], [11], and [24].
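As a concrete preview of the Exp and Log maps mentioned in the abstract, the following sketch implements them on the unit sphere using the standard great-circle formulas. This is an illustrative example only, assuming NumPy; the helper names `sphere_exp` and `sphere_log` are our own, not from the book.

```python
import numpy as np

def sphere_exp(x, v):
    """Exp map on the unit sphere: follow the great circle from x with
    initial velocity v, where v is tangent at x (i.e., <x, v> = 0)."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x.copy()
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def sphere_log(x, y):
    """Log map on the unit sphere: the tangent vector at x pointing
    towards y whose norm is the geodesic (arc-length) distance."""
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)                 # geodesic distance on the sphere
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x                        # component of y orthogonal to x
    return theta * u / np.linalg.norm(u)

x = np.array([0.0, 0.0, 1.0])            # north pole
v = np.array([np.pi / 2, 0.0, 0.0])      # tangent vector at x
y = sphere_exp(x, v)                     # quarter great circle to the equator
assert np.allclose(sphere_log(x, y), v)  # Log inverts Exp (within the cut locus)
```

These two maps are exactly the atomic operations the chapter builds on: Exp shoots geodesics from a point, and Log recovers the initial velocity of the geodesic joining two points.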
1.2 Manifolds
Differentiable manifolds can be defined abstractly using local maps to Euclidean spaces, denoted charts, and collections of charts, denoted atlases. We will discuss this construction shortly; however, we first focus on the case where the manifold is a subset of a larger Euclidean space. This viewpoint is often less abstract and closer to our natural intuition of a surface embedded in our surrounding 3D Euclidean space.
When we consider a surface as a subset of the surrounding Euclidean space, we take the embedded view. On the other hand, when using maps and piecing the global surface together using the compatibility of the overlapping parts, we take the abstract view using charts and atlases.
1.2.1 Embedded submanifolds
A simple example is the unit sphere S² in ℝ³, implicitly defined as the zero level set of the map F : ℝ³ → ℝ, F(x) = ‖x‖² − 1, that is, the set

S² = {x ∈ ℝ³ | F(x) = 0}. (1.1)

We can generalize this way of constructing a manifold to the following definition.
Definition 1.1
Embedded manifold
Let F : ℝⁿ → ℝᵐ be a differentiable map, and let M = F⁻¹(0) ⊂ ℝⁿ. If the Jacobian matrix dF(x) has full rank m at every point x of M, then M is an embedded manifold of dimension d = n − m.
The map F implicitly defines the manifold M as its zero level set (see Fig. 1.1).
Figure 1.1 The sphere S² as the zero level set of F(x) = ‖x‖² − 1 in ℝ³; it is of dimension 3 − 1 = 2.
We will see below that M has a manifold structure as constructed with charts and atlases. In addition, the topological and differentiable structure of M is inherited from the surrounding Euclidean space, letting us denote M as embedded in that space. For now, we will be somewhat relaxed about the details and use the construction as a working definition of what we think of as a manifold.
The map F can be seen as a set of m constraint equations that the points of M must satisfy, and its Jacobian linearizes the constraints around x. Additional examples of commonly occurring manifolds that we will see in this book arise directly from embedded manifolds or as quotients of embedded manifolds.
Example 1.1
The d-dimensional spheres Sᵈ are subsets of ℝᵈ⁺¹. Here we express the unit length equation generalizing (1.1) by

Sᵈ = {x ∈ ℝᵈ⁺¹ | F(x) = ‖x‖² − 1 = 0}. (1.2)
Example 1.2
Orthogonal matrices form the subset O(n) of the n × n matrices defined by the equation

F(A) = AᵀA − Idₙ = 0. (1.3)

We will see in Section 1.5 that orthogonal matrices also carry a Lie group structure.
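Both examples can be checked numerically. The sketch below, assuming NumPy and the defining maps of (1.2) and (1.3), verifies that a point satisfies the constraint and that the Jacobian has full rank; the helper names are our own, not from the book.

```python
import numpy as np

# Sphere S^2 = F^{-1}(0) with F(x) = ||x||^2 - 1, mapping R^3 -> R^1.
def F_sphere(x):
    return np.array([x @ x - 1.0])

def dF_sphere(x):
    return 2.0 * x.reshape(1, 3)            # Jacobian dF(x) = 2 x^T, a 1x3 matrix

x = np.array([0.6, 0.0, 0.8])               # ||x|| = 1, so x lies on the sphere
assert np.allclose(F_sphere(x), 0.0)        # the constraint F(x) = 0 holds
assert np.linalg.matrix_rank(dF_sphere(x)) == 1   # full rank m = 1
# dimension d = n - m = 3 - 1 = 2

# Orthogonal matrices: F(A) = A^T A - Id. The constraint matrix is symmetric,
# so it imposes n(n+1)/2 independent equations on the n^2 entries of A,
# giving dim O(n) = n^2 - n(n+1)/2 = n(n-1)/2.
n = 3
A = np.linalg.qr(np.random.randn(n, n))[0]  # Q factor of a QR decomposition is orthogonal
assert np.allclose(A.T @ A - np.eye(n), 0.0)
assert n * n - n * (n + 1) // 2 == 3        # dim O(3) = 3
```

Counting constraints this way recovers the dimension formula d = n − m of Definition 1.1 in both cases.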
1.2.2 Charts and local Euclideanness
We now describe how charts, local parameterizations of the manifold, can be constructed from the implicit representation above. We will use this to give a more abstract definition of a differentiable manifold.
When navigating the surface of the earth, we seldom use curved representations of the surface but instead rely on charts that give a flat, 2D representation of regions limited in extent. It turns out that this analogy can be extended to embedded manifolds with a rigorous mathematical formulation.
Definition 1.2
A chart on a d-dimensional manifold M is a pair (U, ϕ), where U is an open subset of M and ϕ : U → ℝᵈ is a homeomorphism from U onto an open subset of ℝᵈ.
The definition exactly captures the informal idea of representing a local part of the surface, the open set U, in a flat Euclidean space (see Fig. 1.2).
Figure 1.2 Two charts ϕ and ψ with domains U and V, respectively. The compatibility condition ensures that ϕ and ψ agree on the overlap U ∩ V between U and V in the sense that the composition ψ ∘ ϕ⁻¹ is a differentiable map.
When using charts, we often say that we work in coordinates: we implicitly assume that there is a chart ϕ and represent points of the manifold by their coordinate images under ϕ.
Consider an embedded manifold M = F⁻¹(0) given by a differentiable map F : ℝᵈ⁺ᵐ → ℝᵐ having Jacobian with full rank m. Recall the setting of the implicit function theorem (see, e.g., the references in Section 1.8): split the coordinates of ℝᵈ⁺ᵐ as (x, y) such that x denotes the first d coordinates and y the last m coordinates, and let d_yF denote the last m columns of the Jacobian matrix dF, that is, the derivatives of F taken with respect to variations in y. Assume that d_yF has full rank m at a point (x₀, y₀) of M. The theorem then provides a neighborhood of x₀ and a differentiable map g such that F(x, g(x)) = 0, so that locally M is the graph of g.
The map x ↦ (x, g(x)) has Jacobian of full rank. With this in mind, the map g gives a local parameterization of M, and the projection (x, y) ↦ x restricted to M is a chart.
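The implicit function theorem argument can be made concrete on the circle, where the local graph map is explicit. A minimal sketch, assuming NumPy and the defining map F(x, y) = x² + y² − 1 (the function name `g` mirrors the text; the example is ours, not the book's):

```python
import numpy as np

# Circle S^1 = F^{-1}(0) with F(x, y) = x^2 + y^2 - 1, mapping R^2 -> R^1.
# Near a point with y0 > 0, the partial derivative d_yF = 2*y0 is nonzero,
# so the implicit function theorem yields the local graph y = g(x):
def g(x):
    return np.sqrt(1.0 - x ** 2)              # valid for |x| < 1, upper half-circle

x0 = 0.6                                      # first coordinate of a point on S^1
assert abs(x0 ** 2 + g(x0) ** 2 - 1.0) < 1e-12   # (x0, g(x0)) stays on the circle
# The projection (x, y) -> x is then a chart on this piece of the circle.
# Near (1, 0), where d_yF vanishes, one instead solves for x in terms of y.
```

The need to switch which coordinate is solved for near (1, 0) illustrates why a single chart generally cannot cover the whole manifold, motivating atlases in the next section.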
1.2.3 Abstract manifolds and atlases
We now use the concept of charts to define atlases as collections of charts and from this the abstract notion of a manifold.
Definition 1.3
Atlas
An atlas on a set M is a collection of charts (Uᵢ, ϕᵢ) with values in ℝᵈ such that
- the sets Uᵢ cover M, that is, M = ∪ᵢ Uᵢ,
- for each pair i, j, the transition map ϕⱼ ∘ ϕᵢ⁻¹, defined on ϕᵢ(Uᵢ ∩ Uⱼ), is a differentiable map.
An atlas endows the set M with both a topology and a differentiable structure, turning it into a manifold. In order for this construction to work, we must ensure that there is no ambiguity in the structure we get if the domains of multiple charts cover a given point. The compatibility condition ensures exactly that.
Definition 1.4
Manifold
A set M equipped with an atlas of charts with values in ℝᵈ is a manifold of dimension d.
Remark 1.1
The transition maps ϕⱼ ∘ ϕᵢ⁻¹ may be required to be only r times continuously differentiable for some integer r, giving a Cʳ structure, or infinitely differentiable, in which case the manifold is smooth. If the transition maps are only continuous, M is a topological manifold with no differentiable structure.
Because of the implicit function theorem, embedded submanifolds in the sense of Definition 1.1 have charts and atlases. Embedded submanifolds are therefore particular examples of abstract manifolds. In fact, this goes both ways: The Whitney embedding theorem states that any d-dimensional differentiable manifold can be smoothly embedded into ℝ²ᵈ, so abstract manifolds can conversely be regarded as embedded submanifolds. Weaker versions of the theorem provide only a local embedding and not a global smooth embedding.
Example 1.3
The projective space ℝPᵈ is the space of lines through the origin of ℝᵈ⁺¹ or, equivalently, the quotient of the sphere Sᵈ under the equivalence relation identifying antipodal points x and −x. Depending on the properties of the equivalence relation, the quotient space of a manifold may not be a manifold in general (more details will be given in Chapter 9). In the case of the projective space, we can verify the above abstract manifold definition. The projective space cannot be seen as an embedded manifold directly, but it can be seen as the quotient space of an embedded manifold.
1.2.4 Tangent vectors and tangent space
As the name implies, derivatives lie at the core of differential geometry. The differentiable structure allows taking derivatives of curves in much the same way as the usual derivatives in Euclidean space. However, spaces of tangent vectors to curves behave somewhat differently on manifolds due to the lack of the global reference frame that the Euclidean space coordinate system gives. We here discuss derivatives of curves, tangent vectors, and tangent spaces.
Let γ : [0, T] → M be a curve on a manifold M embedded in ℝ^k, considered as a curve in the embedding space. For each t, the curve derivative is

(1.4) γ̇(t) = (d/dt) γ(t) ∈ ℝ^k.

The vector γ̇(t), denoted the tangent vector to γ at t, describes the infinitesimal motion of the curve, and we can regard γ̇(t) as an element of the tangent space T_x M at x = γ(t). As illustrated on Fig. 1.3, the tangent vectors at x of curves passing through x span a d-dimensional linear subspace of ℝ^k that approximates the manifold to the first order at x. When M is given implicitly as a level set of a differentiable map F, the tangent space is T_x M = ker(dF_x), where ker denotes the kernel (null-space) of the Jacobian matrix of F at the point x.
Figure 1.3 The curve γ maps the interval [0, T ] to the manifold. Using a chart ϕ, we can work in coordinates with the curve ϕ ∘ γ. If the manifold is embedded, then γ can also be regarded as a curve in the embedding space, and its tangent vectors lie in the tangent space, the affine d-dimensional subspace that best approximates the manifold at the point.
On abstract manifolds, the definition of tangent vectors becomes somewhat more intricate. Let γ be a curve on M with γ(t) contained in the domain U of a chart ϕ. By the continuity of γ and openness of U, γ(s) stays in U for s sufficiently close to t, and the coordinate curve ϕ ∘ γ is defined for such s; its derivative at t then exists by definition. However, we would like to be able to define tangent vectors independently of the underlying curve. In addition, we need to ensure that the construction does not depend on the chart ϕ.
Consider a differentiable function f : M → ℝ and a curve γ with γ(t) = x. The composition f ∘ γ is a real function of one variable whose derivative is

(1.5) (f ∘ γ)′(t) = d(f ∘ ϕ⁻¹)(ϕ(x)) (ϕ ∘ γ)′(t)

by the chain rule. The assignment f ↦ (f ∘ γ)′(t) thus maps each differentiable function to a real number. This operation is clearly linear in f in the sense that ((af + bg) ∘ γ)′(t) = a(f ∘ γ)′(t) + b(g ∘ γ)′(t), and it satisfies the Leibniz rule ((fg) ∘ γ)′(t) = f(x)(g ∘ γ)′(t) + g(x)(f ∘ γ)′(t) when fg denotes the pointwise product of f and g. Operators on differentiable functions satisfying these properties are called derivations. It can now be checked that the curve derivative using a chart above defines derivations. By the chain rule we can see that these derivations are independent of the chosen chart.
This definition of tangent vectors as derivations is rather abstract. In practice, it is often most convenient to just remember that there is an abstract definition and otherwise think of tangent vectors as derivatives of curves. In fact, tangent vectors and tangent spaces can also be defined without derivations using only the derivatives of curves. However, in this case, we must define a tangent vector as an equivalence class of curves because multiple curves can result in the same derivative. This construction, although in some sense more intuitive, therefore has its own complexities.
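To make the curve-derivative picture concrete, here is a small numerical sketch of our own on the sphere embedded in ℝ³: the derivative of a curve staying on the sphere is orthogonal to the position vector and hence lies in the tangent space at that point.

```python
import numpy as np

# A curve on the unit sphere S^2 embedded in R^3: the equator,
# gamma(t) = (cos t, sin t, 0).  Its derivative is a tangent vector:
# it is orthogonal to gamma(t), i.e. it lies in T_{gamma(t)} S^2.
def gamma(t):
    return np.array([np.cos(t), np.sin(t), 0.0])

def gamma_dot(t, h=1e-6):
    # Central finite-difference approximation of the curve derivative.
    return (gamma(t + h) - gamma(t - h)) / (2 * h)

t = 0.7
x, v = gamma(t), gamma_dot(t)
# v is tangent to the sphere at x: <x, v> = 0 up to discretization error.
print(abs(np.dot(x, v)))  # close to 0
```

The same check works for any differentiable curve constrained to the sphere, since ‖γ(t)‖² = 1 implies ⟨γ(t), γ̇(t)⟩ = 0.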
In coordinates, the partial derivative operators ∂/∂x^i, i = 1, …, d, along the coordinate axes are themselves derivations, and they form a basis of the tangent space T_x M. Every tangent vector v can thus be written v = v^i ∂/∂x^i for the i-th components v^i of v in this basis.
Remark 1.2
Einstein summation convention
In the expression v = v^i ∂/∂x^i, the summation over i is implicit because the index i appears twice, once as a superscript and once as a subscript. Following this Einstein summation convention, used throughout the chapter, any index repeated in upper and lower position is implicitly summed over its range.
Just as a Euclidean vector space V has a dual space V* of linear functionals, each tangent space T_x M has a dual, the cotangent space T_x*M, whose elements are called covectors. For each x, the application of a covector ξ ∈ T_x*M on a vector v ∈ T_x M is written ξ(v) or ⟨ξ, v⟩. Note that the latter notation with brackets is similar to the notation for inner products used later on.
1.2.5 Differentials and pushforward
The interpretation of tangent vectors as derivations allows taking derivatives of functions. If X is a vector field, then X(x) is for each x a tangent vector and hence a derivation that acts on functions. If instead f : M → N is a differentiable map between manifolds, its differential df maps tangent vectors of M to tangent vectors of N; in coordinates it is given by the Jacobian matrix with entries ∂f^j/∂x^i, f^j being the jth component of f. The differential df is often denoted the pushforward of f because it uses f to map, that is, push, tangent vectors on M forward to tangent vectors on N, and the notation f_* is often used.
As a particular case, consider a real-valued function f : M → ℝ. The target manifold is then ℝ itself, and we can consider df(x) for each x as a covector, so that df becomes a covector field. Though the differential df is also a pushforward, the notation df is most often used because of its interpretation as a covector field.
1.3 Riemannian manifolds
So far, manifolds and their tangent spaces were defined either through the embedding space when considering embedded manifolds, or via charts and atlases with the abstract definition of manifolds. We now start including geometric and metric structures.
(Fig. 1.4). In the embedding case, tangent spaces are affine spaces of the embedding vector space, and the simplest way to specify this mapping is through an affine transformation, hence the name affine connection introduced by Cartan [3]. A connection operator also describes how a vector is transported from a tangent space to a neighboring one along a given curve. Integrating this transport along the curve specifies the parallel transport along this curve. However, there is usually no global parallelism as in Euclidean space. As a matter of fact, transporting the same vector along two different curves arriving at the same point in general leads to different vectors at the endpoint. This is easily seen on the sphere: traveling from the north pole to the equator, then 90 degrees along the equator, and back to the north pole turns any tangent vector by 90 degrees. This defect of global parallelism is the sign of curvature.
Figure 1.4 Tangent vectors along the red (light gray in print version) and blue (dark gray in print version) curves drawn on the manifold belong to different tangent spaces. To define the acceleration as the difference of neighboring tangent vectors, we need to specify a mapping to connect a tangent space at one point to the tangent spaces at infinitesimally close points. In the embedding case, tangent spaces are affine spaces of the embedding vector space, and the simplest way to specify this mapping is through an affine transformation.
Given a connection, we define the equivalent of straight lines in the manifold: geodesics. We should notice that there exist many different choices of connections on a given manifold, which lead to different geodesics. However, geodesics by themselves do not quantify how far away from each other two points are. For that purpose, we need an additional structure: a distance. By restricting to distances that are compatible with the differential structure, we enter into the realm of Riemannian geometry.
1.3.1 Riemannian metric
A Riemannian metric is a smoothly varying family of inner products ⟨·,·⟩_x on the tangent spaces T_x M at points x of the manifold. For each x, the metric is represented in a chart by a symmetric positive definite matrix G(x) with entries g_{ij}(x); see Fig. 1.5. This matrix is called the local representation of the Riemannian metric in the chart at x, and the dot product of two vectors v and w of T_x M is ⟨v, w⟩_x = g_{ij}(x) v^i w^j = v^T G(x) w. The inverse matrix G(x)⁻¹, with entries g^{ij}(x), defines a dual inner product on covectors and is called a cometric.
Figure 1.5 At each point x along the curve γ, the metric g defines an inner product on the tangent space at x. Contrary to the Euclidean case, tangent vectors at different points belong to different tangent spaces: vectors at x can only be compared by g evaluated at x, and vectors at y by g evaluated at y.
1.3.2 Curve length and Riemannian distance
If γ : [0, T] → M is a curve on the manifold, then we can compute at each t the speed ‖γ̇(t)‖_{γ(t)} = √(⟨γ̇(t), γ̇(t)⟩_{γ(t)}). To compute the length of the curve, the norm is integrated along the curve:

L(γ) = ∫₀ᵀ ‖γ̇(t)‖_{γ(t)} dt.
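The length integral can be approximated numerically by summing small chords. A minimal sketch of our own, for a curve on the unit sphere with the metric inherited from ℝ³:

```python
import numpy as np

# Approximate the Riemannian length of a curve on the unit sphere
# (metric inherited from R^3) by discretizing L(gamma) = int ||gamma'|| dt
# as a sum of chord lengths between consecutive sample points.
def curve_length(gamma, T=1.0, n=10000):
    ts = np.linspace(0.0, T, n + 1)
    pts = np.array([gamma(t) for t in ts])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# Quarter of a great circle: its length should be pi/2.
quarter = lambda t: np.array([np.cos(t), np.sin(t), 0.0])
print(curve_length(quarter, T=np.pi / 2))  # ~ 1.5708 = pi/2
```

For a general metric, each chord norm would be replaced by the norm induced by G evaluated along the segment; on an embedded manifold the ambient Euclidean norm suffices, as used here.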
The distance between two points of a connected Riemannian manifold is the minimum length among the curves γ joining these points:
(1.6) dist(x, y) = min { L(γ) : γ(0) = x, γ(T) = y }.
The topology induced by this Riemannian distance is the original topology of the manifold: open balls constitute a basis of open sets.
We saw in Section 1.2 that the tangent spaces of an embedded manifold are linear subspaces of the embedding space; the Euclidean inner product therefore restricts to the tangent space at each point of the manifold. Embedded manifolds thus inherit also their geometric structure in the form of the Riemannian metric from the embedding space.
1.3.3 Geodesics
In Riemannian manifolds, locally length-minimizing curves are called metric geodesics. The next subsection will show that these curves are also autoparallel for a specific connection, so that they are simply called geodesics in general. A curve γ is locally length minimizing if, for all t and sufficiently small s, the restriction of γ to [t, t + s] is the shortest curve between its endpoints. In practice, it is convenient to work with the energy functional E(γ) = ½ ∫ ‖γ̇(t)‖² dt. It turns out that critical points for the energy also optimize the length functional. Moreover, they are parameterized proportionally to their arc length, removing the ambiguity of the parameterization.
We now define the Christoffel symbols from the metric g by
(1.7) Γ^k_{ij} = ½ g^{kl} ( ∂_i g_{jl} + ∂_j g_{il} − ∂_l g_{ij} ),

where ∂_i denotes the partial derivative ∂/∂x^i and g^{kl} are the entries of the inverse metric matrix.
Using the calculus of variations, it can be shown that the geodesics satisfy the second-order differential system
(1.8) ẍ^k(t) + Γ^k_{ij}(x(t)) ẋ^i(t) ẋ^j(t) = 0, k = 1, …, d,

where x(t) denotes the coordinates of the curve in a chart.
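The second-order system (1.8) can be integrated numerically once the Christoffel symbols are known. A minimal explicit-Euler sketch of our own, assuming the standard spherical coordinates (θ, φ) on the unit sphere with metric diag(1, sin²θ); the nonzero Christoffel symbols then are Γ^θ_{φφ} = −sin θ cos θ and Γ^φ_{θφ} = Γ^φ_{φθ} = cos θ / sin θ.

```python
import numpy as np

# Integrate the geodesic equation (1.8) on the unit sphere in (theta, phi)
# coordinates.  Written as x''^k = -Gamma^k_ij x'^i x'^j, this gives
#   theta'' =  sin(theta) cos(theta) phi'^2
#   phi''   = -2 (cos(theta)/sin(theta)) theta' phi'
def geodesic_step(state, dt):
    theta, phi, dtheta, dphi = state
    ddtheta = np.sin(theta) * np.cos(theta) * dphi ** 2
    ddphi = -2.0 * (np.cos(theta) / np.sin(theta)) * dtheta * dphi
    return np.array([theta + dt * dtheta, phi + dt * dphi,
                     dtheta + dt * ddtheta, dphi + dt * ddphi])

# Start on the equator (theta = pi/2) moving along it: the geodesic is the
# equator itself, so theta should stay constant.
state = np.array([np.pi / 2, 0.0, 0.0, 1.0])
for _ in range(1000):
    state = geodesic_step(state, 1e-3)
print(state[0])  # stays ~ pi/2
```

In practice a higher-order integrator (e.g. Runge–Kutta) would be used; the explicit Euler step is only meant to expose the structure of the equation.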
We will see the Christoffel symbols again in coordinate expressions for the connection below.
1.3.4 Levi-Civita connection
The fundamental theorem of Riemannian geometry states that on any Riemannian manifold, there is a unique connection which is compatible with the metric and which has the property of being torsion-free. This connection is called the Levi-Civita connection. For that choice of connection, shortest curves have zero acceleration and are thus geodesics in the sense of being the generalization of straight lines. In the following we only consider the Levi-Civita connection unless explicitly stated otherwise.
The connection allows us to take derivatives of a vector field Y in the direction of another vector field X, written ∇_X Y. This is also denoted the covariant derivative of Y along X. The connection is linear in X and obeys the product rule in Y: ∇_X(fY) = (Xf) Y + f ∇_X Y, with Xf being the derivative of f in the direction of X. With vector fields X and Y written in coordinates as X = X^i ∂/∂x^i and Y = Y^j ∂/∂x^j, we can use this to compute the coordinate expression for derivatives of Y along X:

∇_X Y = ( X^i ∂_i Y^k + Γ^k_{ij} X^i Y^j ) ∂/∂x^k.
Using this, the connection allows us to write the geodesic equation (1.8) as the zero acceleration constraint:

∇_{γ̇(t)} γ̇(t) = 0.
A vector field v(t) ∈ T_{γ(t)}M along a curve γ is parallel if ∇_{γ̇(t)} v(t) = 0 for each t; a single vector v ∈ T_{γ(0)}M is parallel transported if it is extended to such a t-dependent family, which defines linear maps P_{γ,t} : T_{γ(0)}M → T_{γ(t)}M linking tangent spaces. The parallel transport inherits linearity from the connection. It follows from the definition that γ is a geodesic exactly when its own velocity field γ̇ is parallel along γ.
In general, transporting the same vector from a point x to a point y along two curves γ and ϕ gives different results at y, unless the manifold is flat, that is, has zero curvature.
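The 90-degree turn on the sphere described earlier can be checked numerically. On the unit sphere, parallel transport along a great-circle geodesic γ(t) = cos(t) x + sin(t) u has a standard closed form: the component of a vector along the direction u rotates with the geodesic while the orthogonal component is unchanged. The following is an illustrative sketch of our own.

```python
import numpy as np

# Parallel transport on the unit sphere along the geodesic
# gamma(t) = cos(t) x + sin(t) u, where x is the start point and u a unit
# tangent vector at x.  The u-component of w rotates with the geodesic;
# the component orthogonal to the plane span(x, u) is unchanged.
def transport(w, x, u, t):
    a = np.dot(u, w)
    return w + a * ((np.cos(t) - 1.0) * u - np.sin(t) * x)

# Transport a vector around the octant loop from the text:
# north pole -> equator -> 90 degrees along the equator -> back to the pole.
p, e1, e2, e3 = map(np.array, [(0, 0, 1.0), (1, 0, 0.0),
                               (0, 1, 0.0), (0, 0, 1.0)])
w = np.array([1.0, 0.0, 0.0])        # tangent vector at the north pole
w = transport(w, p, e1, np.pi / 2)   # down to (1,0,0)
w = transport(w, e1, e2, np.pi / 2)  # along the equator to (0,1,0)
w = transport(w, e2, e3, np.pi / 2)  # back up to the north pole
print(w)  # (0, 1, 0): the initial vector rotated by 90 degrees
```

The loop encloses one octant of the sphere (area π/2), and the holonomy angle equals the enclosed area, consistent with the sphere having constant curvature 1.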
1.3.5 Completeness
The Riemannian manifold is said to be geodesically complete if the definition domain of all geodesics can be extended to the entire real line, that is, if the manifold has no boundary and no singular point reachable in finite time. For instance, any compact manifold, such as the sphere, is geodesically complete. By the Hopf–Rinow–de Rham theorem, geodesically complete manifolds are complete metric spaces with the induced distance, and there always exists at least one minimizing geodesic between any two points of the manifold, that is, a curve whose length is the distance between the two points.
From now on, we will assume that the manifold is geodesically complete. This assumption is one of the fundamental properties ensuring the well-posedness of algorithms for computing on manifolds.
1.3.6 Exponential and logarithm maps
Let x be a point of the manifold that we consider as a local reference point, and let v be a tangent vector at x. Since the manifold is geodesically complete, the geodesics γ_{(x,v)}(t) starting at x with tangent vector v are defined for each t. This mapping

Exp_x : T_x M → M, v ↦ Exp_x(v) = γ_{(x,v)}(1),
is called the exponential map at point x. Straight lines passing through 0 in the tangent space are transformed into geodesics passing through the point x on the manifold, and distances along these lines are conserved (Fig. 1.6).
Figure 1.6 (Left) Geodesics starting at x are images of straight lines of the tangent space under the exponential map: γ(t) = Exp_x(tv) for sufficiently small t, and the endpoint of the geodesic segment is given by Exp_x(v). (Right) On the sphere, the cut locus of x is its antipodal point, and the injectivity radius is π.
The exponential map is defined on the whole tangent space, but it is generally one-to-one only locally around 0 in the tangent space, corresponding to a local neighborhood of x. Its local inverse, the logarithm map Log_x, assigns to a point y near x the smallest vector v ∈ T_x M such that y = Exp_x(v). In this chart the geodesics going through x are represented by straight lines through the origin. Moreover, the distance with respect to the base point x is preserved:

dist(x, y) = ‖Log_x(y)‖_x.
Thus the exponential chart at x gives a local representation of the manifold in the tangent space at a given point. This is also called a normal coordinate system or normal chart at x. The logarithm map is therefore only locally defined, that is, for points y near x.
The exponential and logarithm maps are commonly referred to as the Exp and Log maps.
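On the unit sphere, the Exp and Log maps have well-known closed forms; the following sketch (our own illustration) implements them for S² embedded in ℝ³, with v a tangent vector at x, that is, ⟨x, v⟩ = 0.

```python
import numpy as np

# Closed-form Exp and Log maps on the unit sphere S^2 in R^3.
def sphere_exp(x, v):
    nv = np.linalg.norm(v)           # geodesic distance travelled
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def sphere_log(x, y):
    # Defined for y outside the cut locus of x, i.e. y != -x.
    p = y - np.dot(x, y) * x         # project y onto the tangent space at x
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    n = np.linalg.norm(p)
    return theta * p / n if n > 1e-12 else np.zeros_like(x)

x = np.array([0.0, 0.0, 1.0])        # base point: north pole
v = np.array([np.pi / 4, 0.0, 0.0])  # tangent vector at x
y = sphere_exp(x, v)
# Log inverts Exp, and ||Log_x(y)|| is the geodesic distance d(x, y).
print(sphere_log(x, y))  # ~ (pi/4, 0, 0)
```

Note the guard for the cut locus: at y = −x the projection p vanishes and the minimizing geodesic, hence Log_x(y), is no longer unique, matching the discussion in the next subsection.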
1.3.7 Cut locus
If we follow a geodesic γ(t) = Exp_x(tv) from t = 0 to infinity, then it is either always minimizing for all t, or it is minimizing up to a finite time t₀ and not minimizing afterwards. In the latter case, the point γ(t₀) is called a cut point, and the corresponding vector t₀v is called a tangential cut point. The set of all cut points of all geodesics starting from x is the cut locus C(x) ⊂ M, and the set of corresponding vectors is the tangential cut locus in T_x M. The exponential map is one-to-one on the domain of T_x M containing 0 and delimited by the tangential cut locus.
Inside this domain, the straight line segment from 0 to Log_x(y) is transformed into the unique minimizing geodesic from x to y:

γ(t) = Exp_x(t Log_x(y)), t ∈ [0, 1].
From a computational point of view, it is often interesting to extend this representation to include the tangential cut locus. However, we have to take care of the multiple representations: Points in the cut locus where several minimizing geodesics meet are represented by several points on the tangential cut locus as the geodesics are starting with different tangent vectors (e.g. antipodal points on the sphere and rotation of π around a given axis for 3D rotations). This multiplicity problem cannot be avoided as the set of such points is dense in the cut locus.
The size of the domain on which the exponential map is a diffeomorphism is quantified by the injectivity radius inj(x) = dist(x, C(x)), the distance from x to its cut locus (taking the infimum over x gives the global injectivity radius of the manifold).
Example 1.4
On the sphere S^d with the metric induced by the embedding in ℝ^{d+1}, the geodesics are the great circles, and the cut locus of a point x is its antipodal point −x. The exponential chart is obtained by rolling the sphere onto its tangent space so that the great circles going through x become straight lines through the origin; see Fig. 1.6.
The picture is similar for the projective space, obtained as the quotient of the sphere where antipodal points are identified.
1.4 Elements of analysis in Riemannian manifolds
We here outline further constructions on manifolds relating to taking derivatives of functions, the intrinsic Riemannian measure, and defining curvature. These notions will be used in the following chapters of this book, for instance, for optimization algorithms.
1.4.1 Gradient and musical isomorphisms
Let f : M → ℝ be a differentiable function. Its differential df is a covector field, and the Riemannian metric can be used to map the covector df(x) to a tangent vector. This mapping corresponds to the transpose operator that is implicitly used in Euclidean spaces to transform derivatives of functions (row vectors) to column vectors. On manifolds, the Riemannian metric must be specified explicitly since the coordinate system used may not be orthonormal everywhere.
The mapping works for any covector ξ and is often denoted the sharp map ♯ : ξ ↦ ξ^♯, with coordinate expression (ξ^♯)^i = g^{ij} ξ_j. It has an inverse in the flat map ♭ : v ↦ v^♭, with (v^♭)_i = g_{ij} v^j. The two maps are denoted musical isomorphisms because they raise or lower the indices of the coordinates.
We can use the sharp map to define the Riemannian gradient as a vector:
grad f = (df)^♯, that is, ⟨grad f, v⟩ = df(v) for every tangent vector v. In coordinates, (grad f)^i = g^{ij} ∂_j f, so that the inverse metric enters the components of the gradient.
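In coordinates, applying the sharp map amounts to multiplying the (column of) partial derivatives by the inverse metric matrix. A small sketch of our own on the sphere in (θ, φ) coordinates, where G = diag(1, sin²θ):

```python
import numpy as np

# Riemannian gradient in coordinates: (grad f)^i = g^{ij} df_j, i.e.
# solve G(x) grad = df.  Here G is the spherical-coordinate metric
# diag(1, sin(theta)^2) on the unit sphere.
def riemannian_gradient(df, theta):
    G = np.diag([1.0, np.sin(theta) ** 2])
    return np.linalg.solve(G, df)

df = np.array([0.5, 0.5])  # the differential of f at the point (a covector)
print(riemannian_gradient(df, np.pi / 4))  # second component scaled by 1/sin^2
```

At θ = π/4 we have sin²θ = 1/2, so the φ-component of the gradient is twice the corresponding partial derivative, illustrating why the metric must be applied explicitly in non-orthonormal coordinates.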
1.4.2 Hessian and Taylor expansion
The covariant derivative of the gradient, the Hessian, arises from the connection ∇:

Hess f (X, Y) = ⟨∇_X grad f, Y⟩ = (∇ df)(X, Y).

Here the two expressions on the right are given using the covariant derivative of the gradient vector field and the action of the connection on the differential form df, respectively. Its expression in a local coordinate system is

(Hess f)_{ij} = ∂_i ∂_j f − Γ^k_{ij} ∂_k f.
Let f̂ = f ∘ Exp_x be the expression of f in a normal coordinate system at x. Its Taylor expansion around the origin in coordinates is

f̂(v) = f̂(0) + df̂(0) v + ½ v^T H v + O(‖v‖³),

where the matrix H corresponds to the Hessian Hess f. Thus the Taylor expansion can be written in any coordinate system:

(1.9) f(Exp_x(v)) = f(x) + df(v) + ½ Hess f(v, v) + O(‖v‖³).
1.4.3 Riemannian measure or volume form
The Riemannian metric induces an infinitesimal volume element on each tangent space, and thus a measure on the manifold that in coordinates has the expression

dM(x) = √(det G(x)) dx,

where G(x) is the local representation of the metric. The measure can be written in any chart, in particular in any exponential chart. If f is an integrable function on the manifold and f̂(v) = f(Exp_x(v)) is its image in the exponential chart at x, then we have

∫_M f(y) dM(y) = ∫ f̂(v) √(det G(v)) dv,

where the integral on the right is over the definition domain of the exponential chart delimited by the tangential cut locus and G(v) is the metric expressed in the normal coordinates.
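The coordinate expression of the measure can be checked numerically. A sketch of our own: on the unit sphere in (θ, φ) coordinates, det G = sin²θ, so the volume element is sin θ dθ dφ, and integrating it recovers the total area 4π.

```python
import numpy as np

# Riemannian measure in coordinates: dM = sqrt(det G) dtheta dphi.
# On the unit sphere with G = diag(1, sin(theta)^2), sqrt(det G) = sin(theta).
n = 2000
edges = np.linspace(0.0, np.pi, n + 1)
mid = 0.5 * (edges[:-1] + edges[1:])       # midpoint rule in theta
dtheta = np.pi / n
density = np.sqrt(np.sin(mid) ** 2)        # sqrt(det G(theta))
# The integrand does not depend on phi, so the phi integral contributes 2*pi.
area = np.sum(density) * dtheta * 2 * np.pi
print(area, 4 * np.pi)  # ~ 12.566 in both cases
```

The same recipe, integrating √det G over a chart domain, computes the Riemannian volume of any coordinate patch.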
1.4.4 Curvature
A flat geometry does not imply a vector space structure: the torus, for instance, can be embedded as a curved surface in ℝ³, or it can be constructed as the quotient ℝ²/ℤ² of the plane by integer translations in a way in which it inherits a flat geometry. In both cases the periodicity of the torus remains, which prevents it from being a vector space.
Curvature is measured by the Riemannian curvature tensor R. It is defined from the covariant derivative by evaluation on vector fields X, Y, Z:

(1.10) R(X, Y) Z = ∇_X ∇_Y Z − ∇_Y ∇_X Z − ∇_{[X,Y]} Z.
Here the Lie bracket [X, Y] = XY − YX denotes the noncommutativity of the fields X and Y. If f is a differentiable function, then the new vector field produced by the bracket is given by its application to f: [X, Y]f = X(Yf) − Y(Xf). The curvature tensor R measures the extent to which parallel transporting a vector around an infinitesimal parallelogram fails to bring it back unchanged; see Fig. 1.7. The curvature tensor when evaluated at X, Y is the linear map Z ↦ R(X, Y)Z given by this difference.
Figure 1.7 (Left) The curvature tensor describes the difference in parallel transport of a vector Z around an infinitesimal parallelogram spanned by the vector fields X and Y (dashed vectors). (Right) The sectional curvature measures the product of principal curvatures in a 2D submanifold given as the geodesic spray of a subspace V . The principal curvatures arise from comparing these geodesics to circles as for the Euclidean notion of curvature of a curve.
The reader should note that two different sign conventions exist for the curvature tensor: definition (1.10) is used in a number of reference books in physics and mathematics [20,16,14,24,11]. Other authors use a minus sign to simplify some of the tensor notations [26,21,4,5,1], and different conventions for the order of the arguments exist as well.