Stochastic Optimal Control

Here are some notes introducing key concepts in Stochastic Optimal Control, a branch of Control Theory concerned with designing strategies for controlling systems subject to random inputs or noisy outputs. It also covers strategies that use randomness itself as a tool for discovery, Monte Carlo methods being one example.
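To make the idea concrete, here is a minimal sketch (not from the notes themselves; all names and parameter values are illustrative) of both themes at once: a scalar linear system driven by Gaussian noise, a simple proportional feedback controller, and a Monte Carlo estimate of the resulting mean squared state.

```python
import random

def simulate(gain, steps=200, trials=500, a=1.0, b=1.0, noise=0.1, seed=0):
    """Monte Carlo estimate of E[x_T^2] for the stochastic system
    x_{t+1} = a*x_t + b*u_t + w_t under feedback u_t = -gain * x_t.
    (Illustrative toy model, not a specific method from the notes.)"""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = 1.0                                         # fixed initial state
        for _ in range(steps):
            u = -gain * x                               # proportional feedback
            x = a * x + b * u + rng.gauss(0.0, noise)   # stochastic dynamics
        total += x * x
    return total / trials

uncontrolled = simulate(gain=0.0)  # a random walk: noise accumulates over time
controlled = simulate(gain=0.5)    # feedback keeps the state near zero
print(controlled < uncontrolled)
```

With no control the state is a random walk whose variance grows with time, while even crude feedback holds it in a small stationary band; the Monte Carlo average over trials is what lets us compare the two strategies numerically.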

I briefly introduce some applications in robotic control and financial engineering, but the range of applications is far wider than those two examples. These notes will be updated as I learn more.
