This feature is not as fully flexible as the […]. Also note that this is only useful if you're doing co-simulation, where latency (running time) depends on the input and you want to get the correct timing information. Otherwise, if you're testing precision/accuracy performance, converting from memory to a text file may result in a loss of precision, so use […]
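For reference, here is a minimal sketch of how testbench data and co-simulation typically fit together. The small stand-in Keras model, random data, and file names are assumptions for illustration, and the keyword arguments should be checked against the hls4ml version in use:

```python
import numpy as np
import hls4ml
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Small stand-in model for illustration; replace with your own network.
model = Sequential([Dense(8, activation='relu', input_shape=(16,)),
                    Dense(4, activation='softmax')])

# Testbench data: one sample per row, saved as .npy (hypothetical file names).
x_test = np.random.rand(100, 16).astype('float32')
np.save('tb_input.npy', x_test)
np.save('tb_output.npy', model.predict(x_test))

config = hls4ml.utils.config_from_keras_model(model, granularity='model')
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir='my_prj',
    input_data_tb='tb_input.npy',    # inputs replayed by the generated C++ testbench
    output_data_tb='tb_output.npy',  # reference outputs to compare against
)

# Co-simulation runs the RTL against the testbench inputs, so the reported
# latency reflects the actual data (relevant when latency is input-dependent).
hls_model.build(csim=True, synth=True, cosim=True)
```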
I am currently working on a modified implementation of the GNN layer GarNet (see here). The model has multiple outputs: one for regression and one for classification.

I am passing generated `input_data_tb`, `output_data_tb` in `.npy` format to the `hls4ml.converters.convert_from_keras_model` function for validation of the HLS model. The NPY format only supports storing a single numpy array per file. As far as I am aware, it is not possible to concatenate numpy arrays of different dimensions into one. Hence, I cannot directly pass multiple inputs/outputs as tb data to `hls4ml` if they have different sizes.

So my question is: how can one structure tb data with multiple inputs/outputs of different sizes such that they are valid in `hls4ml`? And what would the format of the resulting `.dat` file look like?

Thanks in advance.
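One possible way to work around the single-array limitation of `.npy` is to flatten each input per sample and concatenate them into one 2D array before saving. This is only a sketch, under the assumption that the generated testbench consumes one flattened, whitespace-separated row of values per sample, with multiple inputs concatenated in the order the model declares them; the shapes and file names below are made up for illustration, so please verify the behaviour against your hls4ml version:

```python
import numpy as np

# Hypothetical testbench data for a two-input, two-output model such as a
# GarNet-style network (shapes are invented for illustration).
n_samples = 100
x_features = np.random.rand(n_samples, 128, 4).astype('float32')            # e.g. vertex features
x_nvert = np.random.randint(1, 129, size=(n_samples, 1)).astype('float32')  # e.g. vertex count

# Flatten each input per sample and concatenate along the feature axis,
# giving a single 2D array with one row per sample.
tb_inputs = np.concatenate([x_features.reshape(n_samples, -1),
                            x_nvert.reshape(n_samples, -1)], axis=1)
np.save('tb_input_features.npy', tb_inputs)

# Same idea for the two outputs (regression + classification).
y_regression = np.random.rand(n_samples, 1).astype('float32')
y_classification = np.random.rand(n_samples, 3).astype('float32')
tb_outputs = np.concatenate([y_regression, y_classification], axis=1)
np.save('tb_output_predictions.npy', tb_outputs)
```

If that assumption holds, the resulting `.dat` file would simply contain one whitespace-separated row per sample, with the flattened values of all inputs (or outputs) in order.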