Comments on dlib C++ Library: "A Global Optimization Algorithm Worth Using" (Davis King)

Davis King (2019-06-05):
Yes. This is all described in the docs: http://dlib.net/dlib/global_optimization/global_function_search_abstract.h.html#global_function_search

Matt (2019-06-04):
Gah, I can see now that the seed appears to be fixed. Can it be specified anywhere that might make it easy to add into the Python binding?

Matt (2019-06-04):
Hi again Davis :)
> The first few points are just picked randomly.
Is there any way, using the Python binding, to specify a seed for this random process so that the subsequent search is (presumably) repeatable?

Davis King (2019-03-06):
It means you tell it how long you are willing to wait. It's impossible, in general, to know whether you have found the optimal solution, so you run as long as your problem allows and take the best solution available.

Johannes (2019-03-06):
Isn't that a hyperparameter the user has to set? I thought the method was supposed to be parameter-free as far as the user is concerned. Or is it somehow hard-coded in the algorithm, or does it have some strategy to determine the number of necessary iterations?
Davis King (2019-03-05):
There isn't any stopping criterion. You tell it how many times it is allowed to call the objective function; it runs that many calls and returns the best thing it has found.

Johannes (2019-03-04):
What is the stopping criterion of the algorithm?

Davis King (2019-02-16):
The first few points are just picked randomly.

Johannes (2019-02-16):
In the animated (video) version of the algorithm, how do you obtain the location of the 4th point?

Davis King (2019-02-11):
Awesome :)

Matt (2019-02-11):
I got the job, by the way \o/
For reference, and until I do more investigation and a write-up: this optimizer seems to work very well with XGBoost, optimizing over five hyperparameters. It can usually find an excellent candidate for a global optimum in considerably fewer than 50 function calls.
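The fixed call budget Davis describes is the last argument to dlib's Python `find_min_global`. A minimal sketch, using a standard 2-D test function as a stand-in for an expensive objective (the bounds and budget below are made-up, and the dlib import is guarded since the library may not be installed):

```python
import math

def holder_table(x0, x1):
    # Standard 2-D benchmark with many local minima; stands in here for a
    # real objective such as a model's validation loss.
    return -abs(math.sin(x0) * math.cos(x1)
                * math.exp(abs(1 - math.sqrt(x0 * x0 + x1 * x1) / math.pi)))

try:
    import dlib
    # The fourth argument is the entire "stopping criterion": a cap on how
    # many times the objective may be called.  After 80 evaluations,
    # find_min_global returns the best (x, y) pair it has seen.
    best_x, best_y = dlib.find_min_global(
        holder_table,
        [-10.0, -10.0],   # lower bounds on (x0, x1)
        [10.0, 10.0],     # upper bounds on (x0, x1)
        80)               # objective-call budget
except ImportError:
    pass  # dlib not installed; the sketch still shows the call shape
```

There is nothing else to configure: the budget is the only knob, which is what makes the method otherwise parameter-free from the user's point of view.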
Davis King (2019-01-29):
Sweet :)

Matt (2019-01-29):
Thanks Davis, the MP4 version is working well 🙂

Davis King (2019-01-29):
As for zwep's question, use a closure or lambda function.

Davis King (2019-01-29):
The video is still there when I look at it. It's hosted on dlib.net. There are even two versions, both of which seem fine: http://dlib.net/find_max_global_example.mp4 and http://dlib.net/find_max_global_example.webm

Matt (2019-01-29):
I was going to steal the wonderful video for an interview presentation, but it now appears to be corrupted... downloading didn't work either :(

zwep (2018-12-11):
For now I have set all the additional parameters (put in a dict) as a global variable. It definitely solves my problem, but it doesn't "feel" that great. Could you think of a better solution? (Where "better" here is quite arbitrary.)

zwep (2018-12-11):
Hello, I have some trouble using dlib.find_min_global in my NN model...
I have set up a class that creates a model object and attaches it to 'self'. Then a method is called in which this model object is trained using self.model_obj.fit_generator from Keras. I have made it so that this method takes additional parameters, like the learning rate, which I want to tune using this dlib function. However, there is a 'self' argument in the method by the nature of my class, and I can't seem to find a way to have dlib.find_min_global ignore this first argument in its optimization. Do you have an idea, maybe?
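Davis's closure/lambda suggestion, sketched for a setup like zwep's. `TrainableModel`, its `train` method, and the bounds are hypothetical stand-ins (a real version would call Keras's fit_generator and return a measured validation loss):

```python
class TrainableModel:
    """Hypothetical stand-in for a class whose training method takes self."""
    def train(self, learning_rate, dropout):
        # Pretend validation loss; a real implementation would train the
        # model and return the loss it measures.
        return (learning_rate - 0.01) ** 2 + (dropout - 0.2) ** 2

model = TrainableModel()

# A lambda (or closure) binds `self` to a specific instance, so the
# optimizer only ever sees the hyperparameters it is searching over.
objective = lambda lr, do: model.train(lr, do)

try:
    import dlib
    best_x, best_y = dlib.find_min_global(
        objective,
        [1e-5, 0.0],   # lower bounds: learning rate, dropout
        [1e-1, 0.9],   # upper bounds
        30)            # objective-call budget
except ImportError:
    pass  # without dlib the adapter pattern is still demonstrated
```

The same trick covers the global-dict workaround: any extra fixed arguments can be captured in the closure instead of stored globally.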
Davis King (2018-11-29):
I don't have any numbers to share, but I've used it plenty on problems of 10 dimensions and it works fine. There isn't anything special about 5 vs. 10 dimensions.
prax (2018-11-29):
Hi Davis,
For a global optimization algorithm, this seems extremely easy to use, and it seems to converge fast (from your tests up to 5 dimensions). I have a CFD optimization problem with 10 geometric design variables and am considering applying this method to find the global maximum in efficiency. However, since the CFD runs are quite computationally expensive, I am trying to be sure it would work well in higher dimensions. Do you have some test results in the range of 6-10 dimensions that you can share with us?
Thanks :)

Davis King (2018-08-11):
Sorry for the late reply; for some reason I haven't been getting notifications of new comments on the blog.
Yes, you can do all those more complex use cases by using the global_function_search class directly rather than find_min_global(). See the documentation for global_function_search.
No, there is no built-in tool to visualize the surrogate surfaces; if you want that, it's on you. I hacked something together to make the video for this blog post, but the code is not readily reusable, so I'm not sharing it. It would be easier to write it from scratch than to figure out how to use that bit of hacky code.
The extensions I made to the algorithm make it work with things that are not Lipschitz. There is a whole discussion in the blog post about functions that are discontinuous, and discontinuous functions are not Lipschitz.
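A sketch of the global_function_search route Davis points to, which also touches the seeding and per-iteration logging questions raised elsewhere in this thread. The objective, bounds, prior points, and noise magnitude below are invented placeholders, and the exact binding details should be checked against the dlib documentation linked above:

```python
def objective(x0, x1):
    # Placeholder for an expensive evaluation (e.g., a day-long training run).
    return (x0 - 1.0) ** 2 + (x1 + 2.0) ** 2

try:
    import dlib

    spec = dlib.function_spec([-5.0, -5.0], [5.0, 5.0])  # search box

    # Seed the search with evaluations you already have, instead of
    # starting from purely random points.
    prior = [dlib.function_evaluation([1.5, -1.5], objective(1.5, -1.5)),
             dlib.function_evaluation([0.0, 0.0], objective(0.0, 0.0))]
    search = dlib.global_function_search([spec], [prior], 0.001)

    for i in range(10):
        req = search.get_next_x()   # where the model wants to evaluate next
        y = objective(*req.x)
        req.set(y)                  # report the result back to the search
        # Per-iteration log, so long runs can be inspected and resumed.
        print("iteration", i, "x =", list(req.x), "y =", y)

    best_x, best_y, _ = search.get_best_function_eval()
except ImportError:
    pass  # dlib not installed; shown as an API sketch only
```

Because you drive the evaluation loop yourself, you decide when to stop, what to log, and where to persist results between sessions.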
Unknown (2018-07-13):
Hi Davis! Thank you very much for sharing this.
I am trying to use your algorithm to tune a machine learning program that takes a day to produce one data point (each with a specific set of hyperparameters). I already have a few data points and would like to know whether there is an option in your function to incorporate them to predefine the upper bound U(x), rather than starting from random points.
Also, is there an option to print out or save the parameter values x_i and the result y obtained in each iteration, so that I can decide whether I should iterate more?
My calculations just take so long to run that those features would help a lot!
Thank you.

Fahiz Baba Yara (2018-07-03):
Is there an easy way to visualise the optimisation/surrogate surface? Something similar to what one gets out of skopt's plot_objective and plot_evaluations functions?

Ioannis Athanasiadis (2018-05-30):
Hi Davis!
Thank you for your amazing algorithm!
I have a question about your suggestion to use it for hyperparameter tuning: shouldn't we first know that the neural network we are using is a Lipschitz function with respect to its hyperparameters? There is already work proving that neural nets are Lipschitz functions, but not with their hyperparameters as inputs. At least I cannot find any work that proves so...
Thanks!