Introduce Symbolic circuit #155
Conversation
Alternative: use a "sum product node" class to store the CP and Tucker placeholder parameters, instead of storing everything in the sum node class?
As just discussed (keeping a note here): better to include it in this PR, which also requires some changes to the
It's better to split sum and product in the symbolic layers. Fusing them into a SumProductLayer with einsum is just a computational trick, not the semantics of the circuit. Also some nomenclature renamings mentioned in the discussion.
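For reference, a minimal sketch of what such a split could look like (class and attribute names here are illustrative assumptions, not the PR's final API):

```python
from abc import ABC
from typing import Set


class SymbolicLayer(ABC):
    """Base class for symbolic layers: records structure only, no computation."""

    def __init__(self, scope: Set[int]) -> None:
        self.scope = scope  # variables this layer is defined over
        self.inputs: Set["SymbolicLayer"] = set()
        self.outputs: Set["SymbolicLayer"] = set()


class SymbolicSumLayer(SymbolicLayer):
    """Symbolic sum layer: mixes its inputs with (placeholder) weights."""

    def __init__(self, scope: Set[int], num_units: int) -> None:
        super().__init__(scope)
        self.num_units = num_units  # number of sum units in this layer


class SymbolicProductLayer(SymbolicLayer):
    """Symbolic product layer: multiplies inputs over disjoint scopes."""

    def __init__(self, scope: Set[int], num_units: int) -> None:
        super().__init__(scope)
        self.num_units = num_units  # number of product units in this layer
```

Under this view, a fused SumProductLayer with einsum would only exist in the tensorized/computational representation, not in the symbolic one.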
Update:
- renamed node to layer, num_input_units to num_units, added a product layer type
- added CP parameter construction
- added the symbolic circuit class, construction from the Region Graph, and a compatibility test
- re-arranged the folder structure to /new/symbolic
For linting, just ignore it for the moment; it will be addressed during a future refactoring.
Regarding SymbolicCircuit:
- from_region_graph should directly become __init__, because the SC can only be constructed from an RG. We don't need add_layer and add_edge publicly.
- Just save a copy of the RG, and the structural properties can be retrieved directly from the RG. No need to repeat code and recalculate.
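Roughly what is being suggested, continuing the sketch above (RegionGraph's interface and the helper names are assumptions):

```python
from typing import Set


class SymbolicCircuit:
    """Symbolic circuit that can only be built from a region graph."""

    def __init__(self, region_graph: "RegionGraph", **circuit_params) -> None:
        # Keep the RG itself instead of re-deriving its structure.
        self.region_graph = region_graph
        self._layers: Set["SymbolicLayer"] = set()
        self._build_from_region_graph(**circuit_params)

    def _build_from_region_graph(self, **circuit_params) -> None:
        # Walk the RG and create one symbolic layer per region/partition node.
        ...

    def _add_edge(self, tail: "SymbolicLayer", head: "SymbolicLayer") -> None:
        # Private: edges are only created while constructing from the RG.
        tail.outputs.add(head)
        head.inputs.add(tail)
```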
Update: removed the smoothness and decomposability tests, which were copies of those retained in the RG.
I still have questions about the motivation behind some of the code, but I think it's OK to merge, and I'll fix those along with the linting.
""" | ||
self._layers: Set[SymbolicLayer] = set() | ||
self._region_graph = region_graph |
Let's mark self.region_graph public.
And where are the properties? I expect something like self.is_smooth = region_graph.is_smooth
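For example, the structural properties could be delegated to the stored RG instead of being recomputed (a sketch; is_smooth and is_decomposable on the RG side are assumed to exist as in the comment above):

```python
class SymbolicCircuit:
    def __init__(self, region_graph: "RegionGraph") -> None:
        self.region_graph = region_graph  # public, as suggested

    @property
    def is_smooth(self) -> bool:
        # Smoothness is a property of the underlying RG; no recomputation here.
        return self.region_graph.is_smooth

    @property
    def is_decomposable(self) -> bool:
        return self.region_graph.is_decomposable
```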
added, will require new/RG
tail.outputs.add(head)
head.inputs.add(tail)
def __from_region_graph(self, **circuit_params) -> None: |
Why use this function instead of merging it into __init__?
Also, do we really need __func instead of _func?
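As background on the naming question (a generic Python illustration, not code from this PR): a double leading underscore triggers name mangling, which is mainly useful to avoid attribute clashes in subclasses, whereas a single underscore is just the "internal use" convention.

```python
class Circuit:
    def _build(self) -> None:
        # Single underscore: a naming convention meaning "internal".
        pass

    def __build(self) -> None:
        # Double underscore: name-mangled to _Circuit__build by the compiler,
        # mainly to avoid accidental overrides in subclasses.
        pass


c = Circuit()
c._build()           # fine
c._Circuit__build()  # the mangled name the method is actually stored under
# c.__build()        # AttributeError outside the class body
```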
legacy issues, fixed
Thanks!
Basically following what we discussed in week 8; specifically, using re-param for the placeholder (uninitialized) parameters.
Also stored all the information (layer_cls and efamily_cls) so the symbolic circuit can be directly converted into tensorized form, with no additional input needed.
CP layers and the sum layer (previously called the mixing layer) are not implemented yet.
"Building symbolic circuit from RG" will be implemented in another file; writing it now.