Security and Privacy FAQ

DatabookGPT uses AI services with privacy protections that prevent customer data from being shared with other customers or used to train models.

Written by Tobias Varshani

What third-party services (e.g. LLMs) does DatabookGPT use to generate responses?

For the pilot phase, we are using Azure OpenAI to generate responses with OpenAI models such as GPT-4. Data shared with Azure OpenAI is not shared with other Microsoft customers, is not made available to OpenAI and is not used to train OpenAI models. You can read more about data privacy with Azure OpenAI here.

In the future, we intend to give enterprise customers greater flexibility in choosing which LLM DatabookGPT uses.

How do you prevent my data from being shared with other customers?

Databook stores all customer data, including user queries, in logically separated tenants with strong data isolation (including row-level security). Data is only accessible to users within the same customer tenant.
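As an illustrative sketch only (not Databook's actual schema or code), the snippet below shows the general pattern behind row-level tenant isolation in a database such as PostgreSQL: every stored row carries a tenant identifier, and a policy limits each session to rows for its own tenant. The `user_queries` table, `tenant_id` column, and `app.current_tenant` setting are hypothetical names used for the example.

```python
# Illustrative sketch only -- not Databook's actual schema or implementation.
# Demonstrates tenant isolation via PostgreSQL row-level security (RLS).
import psycopg2

SETUP_SQL = """
CREATE TABLE IF NOT EXISTS user_queries (
    id         BIGSERIAL PRIMARY KEY,
    tenant_id  UUID NOT NULL,
    query_text TEXT NOT NULL
);

-- The application must connect as a non-owner role for RLS to apply.
ALTER TABLE user_queries ENABLE ROW LEVEL SECURITY;

-- Each session may only see rows whose tenant_id matches the tenant
-- configured for that session.
CREATE POLICY tenant_isolation ON user_queries
    USING (tenant_id = current_setting('app.current_tenant')::uuid);
"""

def fetch_tenant_queries(dsn: str, tenant_id: str) -> list[tuple]:
    """Return only the rows belonging to the given tenant."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # Scope this session to a single tenant; the RLS policy above
            # then filters every subsequent query automatically.
            cur.execute(
                "SELECT set_config('app.current_tenant', %s, false)",
                (tenant_id,),
            )
            cur.execute("SELECT id, query_text FROM user_queries")
            return cur.fetchall()
```

In this pattern, the tenant identifier comes from the authenticated user's session, so a query issued by one customer's user cannot return rows belonging to another tenant.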

Would the data I share with Databook be used to train OpenAI models such as GPT-4?

No. Your data would not be used to train OpenAI models. Please see this information about data privacy with Azure OpenAI.

Would Databook use data shared during the pilot to train its own models?

We are not currently using data gathered from customers or users during this Early Adopter Phase to train our own models. If we wish to use this data in the future, we will explain our approach to model training and the safeguards in place, and obtain written permission from each customer's security team, before using their data to train a model.
