-
Looking at the RAG example -- pretty cool stuff. If I'm using proprietary data in the library, does OpenAI end up with any of that data? For example, I might have family recipes in my library that I don't want to share but need to query against. Using the RAG example, will my secret peanut butter pie recipe become part of OpenAI's data, available to the world as the best way to make a pie?
Replies: 2 comments
-
Great question!

For workflows like the RAG example, query results can be included in the prompt sent to the model. How that data is used is subject to the model provider's data privacy policies and may vary depending on what type of relationship you have with that provider. In the case of GPT-4, OpenAI outlines its policies here: https://openai.com/enterprise-privacy

If you are concerned about data privacy, there are smaller open source models on Hugging Face that can be run locally and perform nearly as well. Our HuggingFace integration example can help you get started using one of these models locally with llmware. We've also released some small open source models for building POCs and experimentation that you may find useful at https://huggingface.co/llmware.
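To make the privacy point concrete, here is a minimal sketch of how a RAG workflow splices retrieved library passages into the prompt string that gets sent to the model. This is not the llmware API; the function name and recipe text are invented for illustration:

```python
# Illustrative sketch: names (build_rag_prompt, the recipe text) are
# hypothetical, not part of llmware or any specific library.

def build_rag_prompt(question: str, passages: list) -> str:
    """Combine retrieved library passages with the user's question."""
    context = "\n\n".join(passages)
    return (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + context + "\n\n"
        "Question: " + question + "\nAnswer:"
    )

# The retrieved chunk -- e.g. your pie recipe -- is pasted verbatim into
# the prompt, so whatever model receives this string receives that text.
retrieved = ["Secret peanut butter pie filling: peanut butter and cream cheese."]
prompt = build_rag_prompt("How do I make the pie filling?", retrieved)
print("peanut butter" in prompt)  # the recipe text is inside the prompt
```

Because the retrieved text travels inside the prompt itself, pointing that prompt at a locally run model keeps it on your machine, while sending it to a hosted API shares it under that provider's privacy policy.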
-
Great. Thanks.