re-add support for user-specified starting point #297
Conversation
Milo told me that he also tested this pull request and it worked for him. So I think we are ready to merge.
fjwillemsen left a comment
Nice pull request, tests are passing.
The following should be addressed:
- An exception (or warning) should be raised if x0 is provided with an incompatible strategy (a sketch of such a check follows this list).
- The greedy MLS `first_candidate` code should be looked into, as it will currently never go into the `else` branch.
- Ideally, users should be able to provide multiple starting points. With Differential Evolution now custom, that should be possible.
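A minimal sketch of what such a compatibility check could look like; the strategy names and the shape of the options dictionary here are assumptions for illustration, not the actual Kernel Tuner internals:

```python
# Hypothetical sketch of an x0 compatibility check; strategy and option
# names are assumptions and may not match the actual Kernel Tuner code.
UNSUPPORTED_X0_STRATEGIES = {"brute_force", "random_sample", "bayes_opt"}

def check_x0_supported(strategy_name: str, strategy_options: dict) -> None:
    """Raise an error when x0 is passed to a strategy that ignores it."""
    if strategy_options and "x0" in strategy_options:
        if strategy_name in UNSUPPORTED_X0_STRATEGIES:
            raise ValueError(
                f"strategy '{strategy_name}' does not support the x0 option"
            )
```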
```python
# while searching
while fevals < max_fevals:
    candidate = searchspace.get_random_sample(1)[0]
    if first_candidate:
```
I believe this doesn't do what you intend it to do, because cost_func.get_start_pos() always returns a configuration for x0, even if not provided by the user.
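One possible way to make the `else` branch reachable is to use the starting point only when the user actually supplied one; a sketch under that assumption (the helper name and arguments are hypothetical, only `searchspace.get_random_sample` comes from the snippet above):

```python
# Hypothetical helper: use the user-supplied x0 only for the first candidate,
# otherwise fall back to a random sample. Names are illustrative only.
def next_candidate(first_candidate, user_x0, searchspace):
    """Return x0 once (if the user provided it), then random samples."""
    if first_candidate and user_x0 is not None:
        return user_x0
    return searchspace.get_random_sample(1)[0]
```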
Great catch and suggestions! I've processed them, except for the passing of a population as x0. I have some ideas for larger changes in seeding initial populations, but that is for another time as it would be substantially more work.
…ion, implemented unsupported options check in Bayesian Optimization
fjwillemsen left a comment
I've changed the test to make sure that strategies not supporting x0 actually raise an error, and implemented the unsupported options check in Bayesian Optimization.
The requested changes have been applied and the tests are passing; approved.
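A rough sketch of what such a test could look like; the fixture, strategy names, and exception type are assumptions for illustration rather than the actual test added in this pull request:

```python
# Hypothetical test sketch; not the actual Kernel Tuner test suite.
import pytest
import kernel_tuner

UNSUPPORTED = ["brute_force", "random_sample", "bayes_opt"]

@pytest.mark.parametrize("strategy", UNSUPPORTED)
def test_x0_raises_for_unsupported_strategy(strategy, vector_add_env):
    # vector_add_env is an assumed fixture providing (kernel_name,
    # kernel_string, problem_size, arguments, tune_params) for a test kernel.
    name, source, size, args, tune_params = vector_add_env
    with pytest.raises(ValueError):
        kernel_tuner.tune_kernel(
            name, source, size, args, tune_params,
            strategy=strategy,
            strategy_options={"x0": [tune_params["block_size_x"][0]]},
        )
```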
Awesome! Thanks!



This pull request adds support (again) for specifying an initial guess (x0) to the optimization algorithms. Currently, all strategies support it except brute force, random search, and Bayesian optimization.
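For context, a usage sketch of how a starting point might be passed once this is merged; the option name x0 and the chosen strategy follow the discussion above and are assumptions, not necessarily the final interface:

```python
# Usage sketch: passing a user-specified starting point (x0) to a strategy
# that supports it. Exact option semantics may differ from the merged code.
import numpy as np
import kernel_tuner

kernel_string = """
__global__ void vector_add(float *c, const float *a, const float *b, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}
"""

n = 1_000_000
a = np.random.randn(n).astype(np.float32)
b = np.random.randn(n).astype(np.float32)
c = np.zeros_like(a)

tune_params = {"block_size_x": [32, 64, 128, 256, 512, 1024]}

results, env = kernel_tuner.tune_kernel(
    "vector_add", kernel_string, n, [c, a, b, np.int32(n)], tune_params,
    strategy="greedy_mls",                 # a strategy that accepts x0
    strategy_options={"x0": [256]},        # user-specified starting point
)
```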