Home
This small library implements automatic differentiation, which can then be used as a building block for other things, such as neural networks or optimization solvers that need derivatives. The library's primary aim is to be easy to write code with; performance is important too, but it comes second. For now, a sample usage is shown here:
// basic usage
// suppose we want to maximize A = x*y
// subject to the constraint 2x + y = 2400
// we turn the constraint into a penalty on the objective:
//   error = 2400 - 2x - y   (the constraint residual, which should be 0)
//   A = x*y - error^2 * (some big constant, so that violating the constraint hurts the result)
// (assumes the library's ExprWrapper header and <iostream> have been included)
ExprWrapper x(1.f);                                   // arbitrary starting value
ExprWrapper y(1200.f);                                // arbitrary starting value
ExprWrapper error = ExprWrapper(2400.f) - x * 2 - y;  // constraint residual
ExprWrapper A = x*y - error*error*1000;               // penalized objective
for (size_t i = 0; i < 20000; i++)
{
    ResultType curOutput = A.Calc();
    std::cout << curOutput << std::endl;
    std::cout << "constraint error = " << error.Calc() << std::endl;

    ResultType dx = A.GetGradBy(x);   // dA/dx at the current values
    ResultType dy = A.GetGradBy(y);   // dA/dy at the current values

    x.Update(x.Calc() + dx*0.00002f); // gradient ascent: we are maximizing, so +
    y.Update(y.Calc() + dy*0.00002f);
}
std::cout << "x = " << x.Calc() << std::endl;
std::cout << "y = " << y.Calc() << std::endl;
std::cout << "constraint error = " << error.Calc() << std::endl;
The most important operation is A.GetGradBy(x), which computes the derivative dA/dx of the expression A with respect to the variable x.
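As a minimal sketch of just this operation, using only the ExprWrapper API shown in the sample above (Calc and GetGradBy), with the necessary includes assumed:

// differentiate b = a*a at a = 3
ExprWrapper a(3.f);
ExprWrapper b = a * a;
std::cout << b.Calc() << std::endl;        // prints 9
std::cout << b.GetGradBy(a) << std::endl;  // db/da = 2*a = 6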