Optimal control is a modern development of the calculus of variations and classical optimization theory. For this reason, this introduction to the theory of optimal control begins by considering the problem of minimizing a function of many variables. It moves from there, via an exposition of the calculus of variations, to the main subject: the optimal control of systems governed by ordinary differential equations. This approach should enable the student to see the essential unity of these three important areas of mathematics, and also allows optimal control and the Pontryagin maximum principle to be placed in their proper context. A good knowledge of analysis, algebra, and methods is assumed. All the theorems are carefully proved, and there are many worked examples and exercises for the student. Although this book is written for the advanced undergraduate mathematician, engineers and scientists with a taste for mathematics will also find it a useful text.