Is your feature request related to a problem? Please describe.
Triplex [https://huggingface.co/SciPhi/Triplex, https://www.sciphi.ai/blog/triplex] extracts knowledge graph triples at roughly 98% lower cost than gpt-4o, with better performance. I would like to see support for this model merged into GraphRAG.
Describe the solution you'd like
Where gpt-4o is currently used to extract knowledge graph triples, it would be great to have a configurable flag that allows a local version of Triplex, served via Ollama, to be run instead.
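One way such a flag could surface is through GraphRAG's existing `settings.yaml`. The sketch below is a hypothetical fragment, not a working configuration: the `sciphi/triplex` model tag and the exact block layout are assumptions, and it relies on Ollama's OpenAI-compatible endpoint (served at `http://localhost:11434/v1`) to stand in for the OpenAI API.

```yaml
# Hypothetical settings.yaml fragment: point the entity-extraction LLM
# at a locally served Triplex model instead of gpt-4o.
entity_extraction:
  llm:
    type: openai_chat            # Ollama exposes an OpenAI-compatible API
    api_base: http://localhost:11434/v1
    api_key: ollama              # placeholder; Ollama does not check the key
    model: sciphi/triplex        # assumed Ollama model tag for Triplex
```

Because Triplex uses its own prompt format for triple extraction, the extraction prompt would likely need a model-specific template in addition to the endpoint switch shown here.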
Additional context
I contributed to the creation of Triplex and am very interested in contributing to making it accessible in the GraphRAG repository.