Definition of the Subject
The term nonsmooth analysis refers to the body of theory which develops differential calculus for functions which are not differentiable in the usual sense, and for sets which are not classical smooth manifolds. There are several different (but related) approaches to doing this. Among the better-known constructs of the theory are the following: generalized gradients and Jacobians, proximal subgradients, subdifferentials, generalized directional (or Dini) derivates, together with various associated tangent and normal cones. Nonsmooth analysis is a subject in itself, within the larger mathematical field of differential (variational) analysis or functional analysis, but it has also played an increasingly important role in several areas of application, notably in optimization, calculus of variations, differential equations, mechanics, and control theory. Among those who have participated in its...
Abbreviations
- Generalized gradients and subgradients:
These terms refer to various set-valued replacements for the usual derivative which are used in developing differential calculus for functions which are not differentiable in the classical sense. The subject itself is known as nonsmooth analysis . One of the best-known theories of this type is that of generalized gradients. Another basic construct is the subgradient, of which there are several variants. The approach also features generalized tangent and normal vectors which apply to sets which are not classical manifolds. The article contains a summary of the essential definitions.
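To fix ideas, for a locally Lipschitz function \( f:\mathbb{R}^n \to \mathbb{R} \), the generalized directional derivative and the generalized gradient may be sketched as follows (a standard formulation; precise hypotheses and the several subgradient variants are developed in the nonsmooth analysis literature):

```latex
f^{\circ}(x;v) \;=\; \limsup_{y \to x,\; t \downarrow 0} \frac{f(y+tv)-f(y)}{t},
\qquad
\partial f(x) \;=\; \bigl\{\, \zeta \in \mathbb{R}^n : \langle \zeta, v \rangle \le f^{\circ}(x;v)
\ \ \forall v \in \mathbb{R}^n \,\bigr\}.
```

For example, with \( f(x) = |x| \) on \( \mathbb{R} \) one obtains \( \partial f(0) = [-1,1] \); wherever \( f \) is continuously differentiable, \( \partial f(x) \) reduces to the singleton \( \{\nabla f(x)\} \).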
- Pontryagin Maximum Principle:
The main theorem on necessary conditions in optimal control was developed in the 1950s by the Russian mathematician L. Pontryagin and his associates. The Maximum Principle unifies and extends to the control setting the classical necessary conditions of Euler and Weierstrass from the calculus of variations, as well as the transversality conditions. There have been numerous extensions since then, as the need to consider new types of problems continues to arise.
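In rough form, for the simplified free-endpoint problem of minimizing \( \ell(x(T)) \) subject to \( \dot x = f(x,u) \), \( u(t) \in U \) (an illustrative special case; sign conventions vary between sources), the Maximum Principle asserts the existence of an adjoint arc \( p \) satisfying:

```latex
\dot p(t) = -\nabla_x H\bigl(x(t), p(t), u(t)\bigr), \qquad
p(T) = -\nabla \ell\bigl(x(T)\bigr),
\qquad
H\bigl(x(t), p(t), u(t)\bigr) = \max_{u \in U} H\bigl(x(t), p(t), u\bigr),
```

where \( H(x,p,u) := \langle p, f(x,u) \rangle \) is the Hamiltonian. The last relation is the maximum condition that gives the theorem its name.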
- Verification functions:
In attempting to prove that a certain control is indeed the solution to a given optimal control problem, one important approach hinges upon exhibiting a function having certain properties implying the optimality of the given control. Such a function is termed a verification function. The approach becomes widely applicable if one allows nonsmooth verification functions.
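A smooth-case sketch of the mechanism, for the model problem of minimizing \( \ell(x(T)) \) over trajectories of \( \dot x = f(x,u) \), \( u \in U \) (the problem data here are illustrative): suppose a function \( \varphi \) satisfies

```latex
\varphi_t(t,x) + \min_{u \in U} \bigl\langle \nabla_x \varphi(t,x),\, f(x,u) \bigr\rangle \;\ge\; 0,
\qquad
\varphi(T,x) \;\le\; \ell(x).
```

Then \( t \mapsto \varphi(t, x(t)) \) is nondecreasing along every trajectory, so \( \varphi(0, x_0) \le \varphi(T, x(T)) \le \ell(x(T)) \): the number \( \varphi(0, x_0) \) is a lower bound on the cost, and any control whose cost attains this bound is thereby proven optimal.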
- Dynamic programming:
A well-known technique in dynamic problems of optimization is to solve (in a discrete context) a backwards recursion for a certain value function related to the problem. This technique, which was developed notably by Bellman, can be applied in particular to optimal control problems. In the continuous setting, the recursion corresponds to the Hamilton–Jacobi equation. This partial differential equation does not generally admit smooth classical solutions. The theory of viscosity solutions uses subgradients to define generalized solutions, and obtains their existence and uniqueness.
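For the model problem of minimizing \( \ell(x(T)) \) subject to \( \dot x = f(x,u) \), \( u \in U \) (illustrative data), the value function \( V \) formally satisfies the Hamilton–Jacobi(–Bellman) equation:

```latex
V_t(t,x) + \min_{u \in U} \bigl\langle \nabla_x V(t,x),\, f(x,u) \bigr\rangle = 0,
\qquad
V(T,x) = \ell(x).
```

A standard one-dimensional example of the inherent nonsmoothness: the eikonal-type problem \( |u'(x)| = 1 \) on \( (-1,1) \) with \( u(\pm 1) = 0 \) admits no \( C^1 \) solution (any such \( u \) would have a critical point), yet \( u(x) = 1 - |x| \) is its unique viscosity solution.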
- Lyapunov function:
In the classical theory of ordinary differential equations, global asymptotic stability is most often verified by exhibiting a Lyapunov function, a function along which trajectories decrease. In that setting, the existence of a smooth Lyapunov function is both necessary and sufficient for stability. The Lyapunov function concept can be extended to control systems, but in that case it turns out that nonsmooth functions are essential. These generalized control Lyapunov functions play an important role in designing optimal or stabilizing feedback.
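In the smooth classical setting, a positive definite \( V \) with \( \langle \nabla V(x), f(x) \rangle < 0 \) for \( x \neq 0 \) certifies asymptotic stability. For a control system \( \dot x = f(x,u) \), one common nonsmooth formulation of the control Lyapunov decrease condition is stated via proximal subgradients (a representative form; precise hypotheses vary across the literature):

```latex
\inf_{u \in U} \bigl\langle \zeta,\, f(x,u) \bigr\rangle \;\le\; -W(x)
\qquad \text{for all } \zeta \in \partial_P V(x),\ x \neq 0,
```

where \( W \) is positive definite and \( \partial_P V(x) \) denotes the proximal subdifferential of \( V \) at \( x \).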
© 2009 Springer-Verlag
Clarke, F. (2009). Nonsmooth Analysis in Systems and Control Theory. In: Meyers, R. (eds) Encyclopedia of Complexity and Systems Science. Springer, New York, NY. https://doi.org/10.1007/978-0-387-30440-3_370