
trained model conversion to coreml #32

Open
farazBhatti opened this issue Oct 1, 2019 · 8 comments

@farazBhatti

Hi, is it possible to convert your trained PyTorch or TensorFlow model into Core ML without implementing a custom layer?

@shepnerd
Owner

shepnerd commented Oct 2, 2019

I guess so. The custom modules are only used in training; the generator used for testing is composed of standard TensorFlow modules, like convolutional layers.

@farazBhatti
Author

farazBhatti commented Oct 2, 2019

@shepnerd, do you have a .pb file of this TensorFlow model? Also, I think I would need the .meta file (I might be wrong here) in order to convert the checkpoints into a .pb file.

@farazBhatti
Author

farazBhatti commented Oct 3, 2019

@shepnerd I've tried to generate the .meta file by retraining the places_2 model, but now I get the following error:


[Screenshot from 2019-10-03 15-03-28: error trace]

And I am using the following code to convert checkpoint files into .pb file

[Screenshot from 2019-10-03 15-01-00: conversion code]

Also, if you could guide me on what the output_node_name should be, that would be helpful.
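(Aside on finding the output node name: it is the name of the graph op that produces the final result. One way to find candidates is to scan the graph for ops that no other op consumes. A minimal pure-Python sketch of that idea, using a hypothetical (name, inputs) list in place of a real graph_def — a real TF 1.x graph_def exposes the same information via node.name and node.input:)

```python
# Hypothetical (name, inputs) pairs standing in for graph_def.node entries.
ops = [
    ("input/placeholder", []),
    ("conv/Conv2D", ["input/placeholder"]),
    ("mul_4", ["conv/Conv2D"]),
]

def candidate_output_nodes(ops):
    # An op that no other op lists among its inputs is a candidate output.
    consumed = {inp for _, inputs in ops for inp in inputs}
    return [name for name, _ in ops if name not in consumed]

print(candidate_output_nodes(ops))  # → ['mul_4']
```

In this model's graph that scan would surface mul_4, which is the node used for freezing later in this thread.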

@farazBhatti
Author

Successfully converted the TensorFlow model into Core ML. Thanks @shepnerd

@developeder

@farazBhatti I'm trying to go through this process and convert this model to Core ML. Can you please share how you created the .meta file and how you converted the model to Core ML? Thanks

@farazBhatti
Author

@developeder, you would first have to freeze the TF model.

Freeze the TensorFlow model and save the .pb file

After loading the trained model during testing, add the following lines so that the loaded TF model gets frozen:

output_node_names = ["mul_4"]

# Replace variables with their trained values so the graph is self-contained
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph_def, output_node_names)

# Serialize the frozen graph to a .pb file
with open('output_graph_500.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())
print('Model Saved')

Message here again once you have frozen the model.

@farazBhatti farazBhatti reopened this Sep 9, 2020
@developeder

@farazBhatti Thank you for the quick response!
Now I have the .pb file, what next?

@farazBhatti
Author

Use this code to convert the frozen TF model to Core ML. Note: the input dimensions for my model were 500 × 500; change these values according to yours.

import os
import tensorflow as tf
import tfcoreml

# Frozen TF graph produced in the previous step
frozen_model_file = os.path.abspath("output_graph_500.pb")
# Placeholder names in the TF graph mapped to their input shapes (NHWC)
input_tensor_shapes = {"input/placeholder_1": [1, 500, 500, 1], "input/placeholder": [1, 500, 500, 3]}
# Output Core ML model path
coreml_model_file = 'inpaint_500*500.mlmodel'
output_tensor_names = ['mul_4:0']

def convert():
    # Read the .pb model and check that the graph parses
    with tf.gfile.GFile(frozen_model_file, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    # Import the graph_def into a new graph
    tf.import_graph_def(graph_def, name="")
    # Convert the frozen graph with tfcoreml
    tfcoreml.convert(
        tf_model_path=frozen_model_file,
        mlmodel_path=coreml_model_file,
        input_name_shape_dict=input_tensor_shapes,
        output_feature_names=output_tensor_names)

convert()
print('DONE')
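(A note on the shape dictionary: the keys are the graph's placeholder names, and the shapes follow TensorFlow's NHWC layout — batch, height, width, channels. The single-channel placeholder here is presumably the mask and the 3-channel one the image. A minimal sanity check for that convention, reusing the names from the snippet above — your graph's placeholder names may differ:)

```python
input_tensor_shapes = {
    "input/placeholder_1": [1, 500, 500, 1],  # mask: 1 channel
    "input/placeholder": [1, 500, 500, 3],    # image: 3 channels (RGB)
}

def check_nhwc(shapes):
    # Every entry should be rank-4 [batch, height, width, channels]
    # with a batch size of 1 and no unknown (None) dimensions,
    # since the converter needs fully specified static shapes.
    for name, shape in shapes.items():
        if len(shape) != 4 or shape[0] != 1 or None in shape:
            raise ValueError("unexpected shape for %s: %r" % (name, shape))
    return True

print(check_nhwc(input_tensor_shapes))  # → True
```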
