Sensitivity Methods in Control Theory: Proceedings of an International Symposium Held at Dubrovnik, August 31–September 5, 1964
Sensitivity Methods in Control Theory - L. Radanović
Belgrade
Part I
BASIC APPROACHES
Outline
Chapter 1: SENSITIVITY ANALYSIS AND LYAPUNOV STABILITY
Chapter 2: SENSITIVITY ANALYSIS AND INVARIANT IMBEDDING
Chapter 3: STABILITY AND SENSITIVITY OF NONLINEAR SAMPLED DATA SYSTEMS
Chapter 4: SENSITIVITY OPERATORS FOR LINEAR TIME-VARYING SYSTEMS
Chapter 5: OPTIMALITY, INSENSITIVITY, AND GAME THEORY
Chapter 6: THE ROLE OF SENSITIVITY ANALYSIS IN ENGINEERING PROBLEMS
SENSITIVITY ANALYSIS AND LYAPUNOV STABILITY
I. Gumowski, Université de Toulouse, Toulouse, France and Université Laval, Quebec, Canada
Publisher Summary
This chapter focuses on sensitivity analysis and Lyapunov stability. Sensitivity analysis is an extension and development of a rather old idea, which became known in the theory of partial differential equations under the name of a correctly set problem. A correctly set problem is a problem admitting a solution y0 not only for an isolated set of parameters λ0, but also in at least a sufficiently small neighborhood of λ0. Furthermore, in the domain where the problem has a meaning, it is required that solutions y(λ), existing for parameter values λ other than λ0, be qualitatively of the same type as y0 and differ little from y0 when λ differs little from λ0. This chapter also discusses Lyapunov’s theory of stability. It highlights that Lyapunov stability is only meaningful for solutions of ordinary differential equations. However, an extension to solutions of difference equations and mixed differential-difference equations is readily obtainable. For solutions of algebraic equations Lyapunov stability has no direct meaning, but such a meaning can be provided indirectly by imbedding the algebraic equations in a system of differential equations.
Introduction
Sensitivity analysis is an extension and development of a rather old idea, which became known in the theory of partial differential equations under the name of a correctly set problem. A correctly set problem is a problem admitting a solution y0 not only for an isolated set of parameters λ0, but also in at least a sufficiently small neighbourhood of λ0. Furthermore, in the domain where the problem has a meaning, it is required that the solutions y(λ), existing for parameter values λ other than λ0, be qualitatively of the same type as y0 and differ little from y0 when λ differs little from λ0. In other words, a partial differential equation boundary-value problem is said to be correctly set if this problem admits a parametric family of solutions y(λ) in which the reference solution y0 is imbedded. This family must be such that any neighbouring solution y(λ1) approaches y0 as λ1 approaches λ0. Since by tradition the objective of the theory of partial differential equations was limited to the determination of representative solutions y0, the study of correctly set problems remained essentially qualitative.
In sensitivity analysis a quantitative aspect is added by asking how fast the reference solution y0 varies when one or more parameters of the set λ0 are given slightly different values. Since this question remains legitimate for problems not necessarily associated with partial differential equations, the scope of sensitivity analysis appears to be larger than the scope of the theory of correctly set problems. However, the key to sensitivity analysis remains the imbedding of the reference solution y0 in an appropriate parametric family y(λ). Depending on the nature of the imbedding process, the resulting sensitivity coefficients will be valid in the large or only in the small. Since the nature of the original problem conditions the type of imbedding which will turn out to be successful, the limits of validity of a specific parametric family y(λ) shed light on the extent of the domain, in the parameter space, where a solution of a specific qualitative type can exist. The limits of validity of a specific y(λ) thus appear to be related to the singular or bifurcation parameter values of the original problem. A parameter set λ0 is said to be singular if y(λ) undergoes a qualitative change for λ = λ0. The nature of the detectable qualitative change depends, of course, on the nature of the imbedding process symbolized by y(λ).
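The notion of a singular parameter value can be made concrete with a small numerical sketch. The example below (the algebraic problem y² = λ, our choice rather than the author's) shows a family y(λ) undergoing a qualitative change at λ0 = 0, and the blow-up of the sensitivity dy/dλ that signals the limit of validity of the imbedding:

```python
import math

# Illustrative sketch (not from the text): for the algebraic problem
# y**2 = lam, the family y(lam) = sqrt(lam) imbeds the reference
# solution y0 = sqrt(lam0).  The value lam = 0 is singular: the number
# of real solutions changes from two (lam > 0) to one (lam = 0) to
# none (lam < 0).

def real_solution_count(lam):
    """Number of real y satisfying y**2 = lam."""
    if lam > 0:
        return 2
    if lam == 0:
        return 1
    return 0

# A qualitative change occurs as lam crosses the singular value 0.
assert [real_solution_count(lam) for lam in (1.0, 0.0, -1.0)] == [2, 1, 0]

# Near the singular value the branch y(lam) = sqrt(lam) still exists
# for lam > 0, but its sensitivity dy/dlam = 1/(2*sqrt(lam)) blows up,
# marking the limit of validity of this parametric imbedding.
sens = lambda lam: 1.0 / (2.0 * math.sqrt(lam))
assert sens(1e-6) > 100.0
```

The blow-up of the sensitivity coefficient near λ = 0 is exactly the kind of behaviour that delimits the domain, in parameter space, where a solution of a given qualitative type exists.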
Parametric Imbedding
To examine some possible types of y(λ), consider a system of ordinary differential equations, written for convenience in vectorial form. Let

dy/dx = f(x, y, λ, μ)      (1)

be this system, defined in some domain F of the variables (x, y, λ, μ), where λ and μ are parameters, not necessarily distinct. Suppose that f satisfies some general conditions in F, such that equations (1) admit a unique solution in some specified function space G. Let
y = y(x, λ, μ)      (2)

be this solution. Having fixed λ and μ, it is of interest to determine a family of solutions in G which is close in some sense to a particular solution of form (2). If μ = μ0 + Δμ, the desired family of solutions can be written in the form
y(x, λ, μ0 + Δμ) = α0 φ0(Δμ) + α1 φ1(Δμ) + … + αn φn(Δμ) + εn      (3)

where the αi = αi(x, λ, μ0) are the formal coefficients of the approximating set of functions φi(Δμ), i = 0, 1, …, n, and εn is the error of the approximation.

Constructing solutions (3) by some more or less ingenious method appears to be the main objective of sensitivity analysis, the functions αi being the sensitivity coefficients. The domain of validity of expression (3) depends of course on the magnitude of |εn| for the range of μ considered. This domain depends considerably on the appropriate choice of the φi and of n. It is natural to suppose φ0(0) = 1 and φi(0) = 0 for i > 0, because then
y(x, λ, μ0) = α0      (4)

and the first sensitivity coefficient of interest becomes α1.
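For a scalar model problem (our choice, not the author's), the first sensitivity coefficient α1 = ∂y/∂μ at μ = μ0 can be estimated in the small by differencing two neighbouring members of the family y(x, μ):

```python
import math

def rk4(f, y0, x0, x1, n):
    """Classical fourth-order Runge-Kutta integration of dy/dx = f(x, y)."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h * k1 / 2)
        k3 = f(x + h / 2, y + h * k2 / 2)
        k4 = f(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

mu0, dmu = 2.0, 1e-4

def y_at(mu, x_end=1.0):
    # Solution of dy/dx = -mu * y with y(0) = 1, i.e. y = exp(-mu * x).
    return rk4(lambda x, y: -mu * y, 1.0, 0.0, x_end, 200)

# Central difference of two neighbouring solutions of the family
# estimates alpha1 = dy/dmu at mu0.
alpha1 = (y_at(mu0 + dmu) - y_at(mu0 - dmu)) / (2 * dmu)

# Analytically dy/dmu = -x * exp(-mu * x); at x = 1, mu0 = 2:
assert abs(alpha1 - (-1.0 * math.exp(-2.0))) < 1e-6
```

Such a finite-difference estimate is valid only in the small; its accuracy degrades as Δμ grows, which is precisely the question of the domain of validity of expression (3).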
Two methods can be used to construct solutions (3):
(a) Certain operations are carried out on equations (1) in order to deduce equations satisfied by the αi, or at least by their approximations. The form of the φi will thus be fixed by the choice of the operations on equations (1).
(b) A particular form of expression (3) is chosen, and the αi are determined by what is essentially the method of undetermined coefficients.
These two methods are not practically, or even theoretically, equivalent, because it is possible to consider as expansion parameter some parameter of solution (2), corresponding to a combination of λ and μ which does not occur explicitly in equations (1), as for instance the amplitude of a periodic solution.
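Method (a) can be sketched on the same kind of scalar example (again our construction): differentiating dy/dx = −μy with respect to μ yields an equation satisfied by α1 = ∂y/∂μ, which is then integrated alongside the original equation.

```python
import math

# Sketch of method (a) for dy/dx = -mu * y, y(0) = 1 (example ours).
# Differentiating the equation with respect to mu gives an equation
# for alpha1 = dy/dmu:
#     d(alpha1)/dx = -mu * alpha1 - y,   alpha1(0) = 0.
# The pair (y, alpha1) is integrated jointly with forward Euler here;
# a higher-order scheme would normally be preferred.

def sensitivity(mu, x_end=1.0, n=20000):
    """Integrate y' = -mu*y and alpha1' = -mu*alpha1 - y together."""
    h = x_end / n
    y, a = 1.0, 0.0
    for _ in range(n):
        y, a = y + h * (-mu * y), a + h * (-mu * a - y)
    return y, a

y1, a1 = sensitivity(2.0)
# Exact values: y(1) = exp(-2), alpha1(1) = -1 * exp(-2).
assert abs(y1 - math.exp(-2.0)) < 1e-3
assert abs(a1 + math.exp(-2.0)) < 1e-3
```

The operations carried out on the original equation (here, differentiation with respect to μ) fix the form of the φi, as the text notes; in this sketch they correspond to φ1(z) = z.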
The following particular expansions suggest themselves more or less naturally:
(I) If f in equations (1) is regular in μ, then for φi(z) = z^i the error εn → 0 as n → ∞, and expression (3) becomes a Maclaurin series. The domain of validity of expression (3) is given by the radius of convergence R of this Maclaurin series. R can frequently be determined from the properties of f. However, for computational purposes the usefulness of the resulting expansion (3) is mainly determined by the rate of convergence of its first few terms and not by the value of R, because only a small number of these terms can be determined with a reasonable effort.
(II) If f in equations (1) is not regular in μ, but admits one or more continuous derivatives, then φi(z) = z^i can still be used, provided expression (3) is considered as an asymptotic expansion valid for |Δμ| sufficiently small. As in case (I), the computational usefulness of expression (3) is determined by the rate of decrease of its first terms.
(III) If f in equations (1) does not admit a continuous first derivative with respect to μ, but is otherwise well behaved, then expression (3) can be considered as a rearrangement of the result of n successive iterations of the form
(5)
If the successive iterations yi converge, expansion (III) can be considered as a generalization of expansion (I); if they do not converge but max |yi − yi−1| passes through a minimum, it can be considered as a generalization of expansion (II).
(IV) If in the range of interest of μ the solution of equations (1) is of integrable square in μ, then instead of φi(z) = z^i one may choose as the φi(z) any set of orthogonal functions.
(V) If the initial convergence of expression (3) is too slow, or if for some other reason expansions (I)–(IV) are not convenient, a still more general expansion can be used. For example, for solutions which vary rapidly with μ exponentials or Bernstein polynomials may be found useful.
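The contrast between expansions (I) and (IV) can be sketched numerically (the target function below is our choice): a degree-4 Maclaurin polynomial concentrates its accuracy at Δμ = 0, while a degree-4 Chebyshev expansion, one particular orthogonal set, spreads its error over the whole interval |Δμ| ≤ 1.

```python
import math

mu0, deg = 2.0, 4
g = lambda d: math.exp(-(mu0 + d))  # y(mu0 + dmu) for a model problem

# Expansion (I): Maclaurin (Taylor) polynomial of g about dmu = 0.
def taylor(d):
    return math.exp(-mu0) * sum((-d) ** i / math.factorial(i)
                                for i in range(deg + 1))

# Expansion (IV): Chebyshev expansion on [-1, 1] built from deg+1
# Chebyshev nodes.
n = deg + 1
nodes = [math.cos((2 * k + 1) * math.pi / (2 * n)) for k in range(n)]

def T(i, z):
    """Chebyshev polynomial T_i(z) by the three-term recurrence."""
    a, b = 1.0, z
    for _ in range(i):
        a, b = b, 2 * z * b - a
    return a

c = [(2.0 / n) * sum(g(z) * T(i, z) for z in nodes) for i in range(n)]
c[0] /= 2.0  # standard halving of the leading Chebyshev coefficient

def cheb(d):
    return sum(c[i] * T(i, d) for i in range(n))

grid = [j / 50 - 1 for j in range(101)]  # dmu in [-1, 1]
err_taylor = max(abs(taylor(d) - g(d)) for d in grid)
err_cheb = max(abs(cheb(d) - g(d)) for d in grid)

# Same number of coefficients, but the orthogonal expansion is more
# uniformly accurate over the whole interval.
assert err_cheb < err_taylor
assert err_cheb < 1e-3
```

This is one quantitative sense in which an orthogonal expansion can be preferable "in the large", at the price of the structural inconvenience discussed below.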
Most of the work done at present in the field of sensitivity analysis is based directly or indirectly on expansions (I) and (II). In spite of the fact that expansions (III) and (IV) can be used in the analysis of systems which involve parameter discontinuities, whereas expansions (I) and (II) cannot, except in a piece-by-piece manner, their use has been rather infrequent. Malkin has used expansion (III) in an essentially theoretical approach /1/. Expansion (IV) seems to appear only as a byproduct of the Ritz-Galerkin method /2, 3/. Expansions of type (V) were quite rare in the past /4/, but their potential is now beginning to be exploited /5/.
The reason for the preference given to expansions (I) and (II) is easy enough to find. In fact, if the formal expansion (3) is substituted into equations (1), and the latter are rearranged so that the functions φi are made to occur explicitly, simple or multiple products of the φi will generally be encountered. If the resulting system of equations in the dependent variables αi is to be recursive, i.e. if the calculation of each αi is to be possible from the knowledge of the αj, j = 0, 1, …, i − 1, then the φi must satisfy a functional relationship of the form
φi(z) φj(z) = Σm βm φm(z)      (5a)
where the βm are real constants. The simplest case of relationship (5a) is
φi(z) φj(z) = φi+j(z)      (5b)
which is obviously satisfied by the general term of a Maclaurin series
φi(z) = z^i      (6)
A more general solution of equation (5b) is given by the exponential
φi(z) = exp[i ψ(z)]      (6a)
where ψ(z) is an arbitrary real-valued function. Solution (6a) reduces to form (6) when ψ(z) = ln z. As even a very casual examination will show, expansions of type (IV) are basically incompatible with relationship (5a). This fact accounts for the general lack of popularity of orthogonal functions in the theory of nonlinear systems.
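The multiplicative property required of the φi, namely φi(z) φj(z) = φi+j(z), and its exponential solution φi(z) = exp[i ψ(z)] can be spot-checked numerically (the particular ψ below is our arbitrary choice):

```python
import math

# Numerical spot-check (ours) that phi_i(z) = exp(i * psi(z)) satisfies
# phi_i(z) * phi_j(z) = phi_{i+j}(z), and that it reduces to the
# Maclaurin basis z**i when psi(z) = ln z.

def phi(i, z, psi):
    return math.exp(i * psi(z))

psi = lambda z: math.sin(z) + z ** 2  # an arbitrary real-valued psi

for z in (0.3, 1.1, 2.5):
    for i in range(4):
        for j in range(4):
            lhs = phi(i, z, psi) * phi(j, z, psi)
            rhs = phi(i + j, z, psi)
            assert abs(lhs / rhs - 1.0) < 1e-12  # relative check

# With psi = ln, phi_i collapses to z**i, the Maclaurin case.
for z in (0.5, 1.5):
    for i in range(5):
        assert abs(phi(i, z, math.log) - z ** i) < 1e-9
```

By contrast, products of two orthogonal functions are generally not a single member of the same family but a combination of many, which is the incompatibility with the recursive structure noted in the text.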
Some Definitions of Lyapunov’s Theory of Stability
Lyapunov stability was inspired by mechanics and was defined originally for a system of k second-order ordinary differential equations in the unknowns yj(t), j = 1, 2, …, k, in the following manner /6/. Let
yj = fj(t),  j = 1, 2, …, k      (7)

designate a particular motion of the system. To stress the fact that motion (7) is compared with other possible motions of the system, it is called the unperturbed motion, whereas the other possible motions are called perturbed motions. Let t0 be the initial time; then motion (7) satisfies the initial conditions
yj(t0) = fj(t0),  y′j(t0) = f′j(t0),  j = 1, 2, …, k      (8)
The initial conditions satisfied by the perturbed motions are
yj(t0) = fj(t0) + εj,  y′j(t0) = f′j(t0) + ε′j,  j = 1, 2, …, k      (8a)
where the εj, ε′j are some real constants describing the deviations from the unperturbed initial conditions. When these deviations are small, the perturbed and unperturbed motions will be close during some finite time interval (t0, t). However, when t > t0 is made sufficiently large, this closeness may no longer