Lazy or Eager Optimization of Expression
RESOLVED: I HAVE DECIDED TO USE EAGER / TENSOR-LEVEL EXPRESSION OPTIMIZATION
There were some language limitations with eager optimization (not eager evaluation). The author of YAP uses the same lazy approach in his own work, and we follow it here. We mitigate the cons of this approach by designing a better optimization transform, so that optimization does not become a bottleneck.
In this wiki page, I would like to discuss the approach I should take to optimize an expression using YAP transforms. First, let me describe the two ways to optimize an expression:
- Lazy optimization
- Eager optimization
Every `tensor_expression` has a call operator that takes an index `i` of type `std::size_t`. This call operator runs a transform that produces a new tensor expression by extracting the `i`-th value from each terminal of the expression. Consider this example:
```cpp
auto td  = tensor<int>{shape{2,2}, std::vector<int>({1,2,3,4})};
auto td2 = tensor<int>{shape{2,2}, std::vector<int>({4,3,2,1})};
auto expr = td + td2;
boost::yap::print(expr);
```
Prints
```
expr<+>
    term<tensor<int> &>
    term<tensor<int> &>
```
When we invoke the call operator on this expression, it evaluates the value at that index. To do so, it runs the `at_index` transform, which internally returns a new expression:
```cpp
// The at_index transform is called by the call operator of tensor_expression
auto intermediate_expr = boost::yap::transform(expr, transforms::at_index{2}); // picks index 2
boost::yap::print(intermediate_expr);
```
Prints
```
expr<+>
    term<int &>=[3]
    term<int &>=[2]
```
When we evaluate this YAP expression we get 5, the value at index 2 of the resulting tensor.
We can take this `intermediate_expr` and optimize it with our optimization transform, instead of taking `expr` and calling the optimization transform on the whole expression. We call such optimization lazy because it optimizes the expression one index at a time, and only when we try to evaluate that index.
In the eager mode of expression optimization, we take the actual expression and optimize it as a whole when it is requested for evaluation. Consider the same example as above:
```cpp
auto expr = td + td2;
// Something happens
// We haven't optimized it yet.
tensor_type t = expr; // before we evaluate, we optimize
```
On the contrary, a single-index evaluation goes unoptimized in this mode. Consider this:
```cpp
auto res_at_2 = expr(2); // this particular element is evaluated without optimization
```
This is not the case with the lazy expression optimizer, which performs optimization for each element that is requested for evaluation. Consider this example:
```cpp
auto expr = td + td2;
auto res_at_1 = expr(1); // lazy optimization calls the optimizer before evaluation
auto res_at_2 = expr(2); // the optimizer runs again for this element
...
```
**Lazy optimization**

Pros:
- When the user wants to evaluate only certain indices of the expression, we make sure each of them is optimized before evaluation
- The expression optimized in this case has non-tensor terminal nodes, which are usually easier to move or copy than complete tensors
- Can be easily implemented to optimize sparse tensors

Cons:
- For every element evaluated, we make a call to the optimization transform
- Depending upon the complexity of the optimization transform, this can become a performance bottleneck

**Eager optimization**

Pros:
- Only one call to the optimization transform is made
- Highly efficient performance can be achieved if the optimization transform itself evaluates the expression
- Handles expressions involving direct tensors as terminals

Cons:
- Sparse tensors cannot be efficiently evaluated
We would both like to thank our mentor Cem for his constant support and help in achieving our goals. He was always helpful and easy to reach for help or discussion regarding the work. We would also like to thank Google for the Google Summer of Code programme, without which none of this would have been possible. Lastly, we express our gratitude to our parents for providing, directly and indirectly, everything we needed to carry out this work from our homes.