These notes introduce some key concepts in Stochastic Optimal Control, a branch of Control Theory that seeks strategies for controlling systems subject to randomness, whether in the dynamics, the inputs, or the observed outputs. It also covers strategies that use randomness deliberately as a tool for exploration, Monte Carlo Methods being one example.
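To make the idea of controlling a system with random inputs concrete, here is a minimal sketch (not from the notes themselves) of a scalar discrete-time system disturbed by Gaussian noise, stabilized by a simple linear feedback law. The model, the gain, and the cost measure are all illustrative choices, not a prescribed method.

```python
import random

def simulate(gain, steps=200, seed=0):
    """Simulate a scalar system x' = x + u + w, where w is Gaussian
    process noise, under the feedback law u = -gain * x.
    Returns the average quadratic state cost over the trajectory."""
    rng = random.Random(seed)
    x, cost = 5.0, 0.0
    for _ in range(steps):
        u = -gain * x            # linear feedback control
        w = rng.gauss(0.0, 0.5)  # random disturbance
        x = x + u + w            # noisy state transition
        cost += x * x
    return cost / steps

# Feedback drives the state toward zero despite the noise,
# while the uncontrolled system (gain = 0) drifts as a random walk.
controlled = simulate(gain=0.8)
uncontrolled = simulate(gain=0.0)
```

Even this toy example shows the central tension of stochastic control: the disturbance can never be cancelled exactly, so a good controller keeps the *expected* cost small rather than the state exactly at zero.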
I briefly introduce applications in robotic control and financial engineering, but the field's applications are far wider and more varied than those two examples. These notes will be updated as I learn more.