Removed some uses of m_model_selection_parameters [WIP] #4440
Conversation
Are these all the occurrences of m_model_selection_parameters? What errors (if any) do you get when you compile this, and then run the tests?
@@ -268,8 +268,6 @@ TEST(GaussianLikelihood,get_first_derivative)
      // Gaussian likelihood with sigma = 0.13
      CGaussianLikelihood* likelihood=new CGaussianLikelihood(0.13);

-     TParameter* param=likelihood->m_model_selection_parameters->get_parameter("log_sigma");
-
      SGVector<float64_t> lp_dhyp=likelihood->get_first_derivative(labels, func,
          param);
I am assuming the compiler complains that param isn't declared here, right?
@@ -92,8 +92,6 @@ template <class T> class SGStringList;
          AnyParameterProperties(description, param_properties); \
          this->m_parameters->add(param, name, description); \
          this->watch_param(name, param, pprop); \
-         if (pprop.get_model_selection()) \
-             this->m_model_selection_parameters->add(param, name, description); \
          if (pprop.get_gradient()) \
              this->m_gradient_parameters->add(param, name, description); \
      }
Did you try to remove m_gradient_parameters yet? Just curious as to how many things break...
No, I didn't remove m_gradient_parameters. Mostly the model selection functionality breaks. The errors I get are mostly complaints from the compiler that m_model_selection_parameters cannot be found.
OK! I think the idea is to now use ...
@@ -23,8 +23,6 @@ void print_message(FILE* target, const char* str)

      void print_modsel_parameters(CSGObject* object)
we can just drop this function
actually, we can just drop the whole cpp example ... to be replaced with a meta example

Actually @gf712, could you try to write a meta example for base_api where you show how to extract model selection parameters at some point? Maybe after the monkey business is done.
Yes, I can add that to #4441
src/shogun/base/SGObject.cpp (outdated)
@@ -657,27 +573,6 @@ void CSGObject::get_parameter_incremental_hash(uint32_t& hash, uint32_t& carry,
          }
      }

-     void CSGObject::build_gradient_parameter_dictionary(CMap<TParameter*, CSGObject*>* dict)
this we need to keep working somehow, as it is used for the GP framework ... so a replacement needs to be written. The signature will probably change as TParameter will be removed and we will use the dispatching code that @gf712 wrote.
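Purely as a hypothetical sketch of such a replacement (every name here is an assumption, not an agreed design): the dictionary could be keyed on parameter names and owning objects instead of TParameter pointers, reusing the property flags already visible in the macro diff above. Recursion into object-valued parameters is omitted for brevity.

    #include <map>
    #include <string>

    // Hypothetical sketch: collect the gradient-enabled parameters of an
    // object into a name -> owning-object dictionary. get_params() is an
    // assumed tag-based accessor, not a confirmed API.
    void build_gradient_parameter_dictionary(
        CSGObject* obj, std::map<std::string, CSGObject*>& dict)
    {
        for (const auto& p : obj->get_params())
            if (p.second->get_properties().get_gradient())
                dict.emplace(p.first, obj);
    }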
Yup, should be pretty straightforward to replace this, might be worth waiting for #4432 to be done
@@ -68,9 +68,6 @@ CParameterCombination* CGridSearchModelSelection::select_model(bool print_state)
              current_combination->print_tree();
          }

-         current_combination->apply_to_modsel_parameter(
this is the big fish to replace
Could you, instead of removing the method call, just keep it and place a runtime error (SG_NOTIMPLEMENTED) inside the method?
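A minimal sketch of that suggestion (the method name comes from the diffs below; the Parameter* argument type is inferred from m_model_selection_parameters, and SG_NOTIMPLEMENTED is shogun's existing not-implemented macro):

    // Keep the method so call sites still compile, but fail loudly at
    // runtime until a TParameter-free replacement is written.
    void CParameterCombination::apply_to_modsel_parameter(
        Parameter* modsel_parameters)
    {
        SG_NOTIMPLEMENTED;
    }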
@@ -700,7 +698,7 @@ CParameterCombination* CParameterCombination::copy_tree() const | |||
|
|||
void CParameterCombination::apply_to_machine(CMachine* machine) const | |||
{ | |||
apply_to_modsel_parameter(machine->m_model_selection_parameters); |
same here. The danger with just removing this line is that we forget to re-add it later when we have a replacement. So rather than changing behaviour of code, let's just put a runtime error
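The same pattern would apply here; a sketch under the same assumptions as above, using the apply_to_machine signature shown in the diff below:

    void CParameterCombination::apply_to_machine(CMachine* machine) const
    {
        // Fail loudly instead of silently skipping the parameter
        // application, so the missing replacement is not forgotten.
        SG_NOTIMPLEMENTED;
    }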
@@ -749,8 +747,6 @@ void CParameterCombination::apply_to_modsel_parameter(
      {
          CParameterCombination* child=(CParameterCombination*)
              m_child_nodes->get_element(i);
-         child->apply_to_modsel_parameter(
and here
@@ -268,8 +268,6 @@ TEST(GaussianLikelihood,get_first_derivative)
      // Gaussian likelihood with sigma = 0.13
      CGaussianLikelihood* likelihood=new CGaussianLikelihood(0.13);

-     TParameter* param=likelihood->m_model_selection_parameters->get_parameter("log_sigma");
Looking at the test, this can probably already be replaced with get
@@ -270,9 +270,6 @@ TEST(StudentsTLikelihood,get_first_derivative)
      // Student's-t likelihood with sigma = 0.17, df = 3
      CStudentsTLikelihood* likelihood=new CStudentsTLikelihood(0.17, 3);

-     TParameter* param1=likelihood->m_model_selection_parameters->get_parameter("log_sigma");
in all of those actually, get should do it
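A sketch of that replacement, assuming the tag-based CSGObject::get<T>() accessor; how the value would then be handed to get_first_derivative, which currently takes a TParameter*, is exactly the open question of this PR:

    // Old lookup through the model-selection parameter list:
    //   TParameter* param =
    //       likelihood->m_model_selection_parameters->get_parameter("log_sigma");

    // Suggested lookup through the tag-based API instead:
    auto log_sigma = likelihood->get<float64_t>("log_sigma");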
Force-pushed from b51e2b1 to 7d50a9b
A very dirty update with mistakes. The next step would be to remove Parameter and ParameterCombination; however, a lot of things will break. I will have to change (and simplify) the hyper-parameter optimisation, which I would prefer to do.
      namespace shogun
      {
I am not sure I agree with this idea of making a class for model selection parameters... we will have to discuss a bit here. But actually, as discussed a few days back in person, let's maybe start with the gradient parameter API refactoring.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
this has already been done
This PR is part of the Parameter.cpp project.

In this PR we remove m_model_selection_parameters from SGObject. In the future, m_parameters and m_gradient_parameters will be removed as well. These store models' (hyper)parameters, which are estimated during training.

The aim is to use the AnyParameter properties to store the value and properties of these parameters. PRs #4412 and #4426 contain the first steps towards this aim.
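As a hedged illustration of that aim, using the AnyParameterProperties accessors already visible in the macro diff above (get_model_selection(), get_gradient()); the get_params() iteration helper is an assumption, not a confirmed API:

    #include <string>
    #include <vector>

    // Hypothetical sketch: filter an object's registered parameters by their
    // AnyParameterProperties flags instead of keeping a separate
    // m_model_selection_parameters list.
    std::vector<std::string> model_selection_parameter_names(CSGObject* obj)
    {
        std::vector<std::string> names;
        for (const auto& p : obj->get_params()) // assumed accessor
            if (p.second->get_properties().get_model_selection())
                names.push_back(p.first);
        return names;
    }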