Slide 14 of 19
Notes:
So if it’s all that good, it can’t be that easy, right? Right. In higher dimensions, optimal decision procedures of this sort are mathematically complex and difficult to compute. My research is concerned with methods of approximating such decision procedures, and one way to do so is to limit the knowledge used to a significant subset. An application of this idea can be seen in a sketch of my recently-developed information-based local optimization algorithm:
Choose an initial center point and search radius, evaluate the center point, and iterate the following:
Evaluate the point within the hypersphere where a minimum is most likely, given the information gained so far. If that point’s value is lower than the center point’s, make it the new center point and set the search radius to the distance between the two points.
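The iteration above might be sketched as follows. The information-based selection of the most promising point is not specified here, so a uniform random draw inside the hypersphere stands in as a placeholder for it; the names `sample_in_ball` and `local_optimize`, and the quadratic test function, are illustrative assumptions rather than part of the algorithm as presented.

```python
import math
import random

def sample_in_ball(center, radius, rng):
    """Draw a uniform random point inside the hypersphere around center.

    Placeholder for the information-based choice of the point where a
    minimum is most likely, which is not specified in the sketch.
    """
    d = len(center)
    direction = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in direction))
    r = radius * rng.random() ** (1.0 / d)  # uniform radial distribution
    return [c + r * x / norm for c, x in zip(center, direction)]

def local_optimize(f, center, radius, iterations=200, seed=0):
    """Iterate the sketched loop: evaluate a candidate in the current
    search hypersphere; on improvement, recenter there and set the
    radius to the distance between the old and new center points."""
    rng = random.Random(seed)
    best = f(center)
    for _ in range(iterations):
        candidate = sample_in_ball(center, radius, rng)
        value = f(candidate)
        if value < best:
            radius = math.dist(candidate, center)  # new search radius
            center, best = candidate, value
    return center, best

# Illustrative objective: a quadratic bowl with its minimum at (1, -2).
f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
center, best = local_optimize(f, center=[0.0, 0.0], radius=2.0)
```

Note that the radius update naturally contracts the search region as improvements get smaller, which is what makes the procedure local.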
As it turns out, this algorithm is very well suited to local optimization of our “flattened” f’ function spaces.