Azure OpenAI Deployment with Terraform
Why Azure OpenAI Service?
I have been working on a Terraform project to deploy the Azure OpenAI Service. For background, see: What is Azure OpenAI Service? – Azure AI services | Microsoft Learn
So why use your own Azure OpenAI Service instead of the public OpenAI Service?
**In short:** 'What happens in your Azure OpenAI Service, stays within the Azure OpenAI Service boundary.'
When creating an Azure OpenAI instance, you will have a Service boundary.
**Which means:**
Your prompts (inputs) and completions (outputs), your embeddings, and your training data:
- are NOT available to other customers.
- are NOT available to OpenAI.
- are NOT used to improve OpenAI models.
- are NOT used to improve any Microsoft or 3rd-party products or services.
- are NOT used to automatically improve Azure OpenAI models for your use in your resource (the models are stateless, unless you explicitly fine-tune them with your training data).
Your fine-tuned Azure OpenAI models are available exclusively for your use.
The Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft's Azure environment, and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT or the OpenAI API). See: Data, privacy, and security for Azure OpenAI Service – Azure AI services | Microsoft Learn
I came across a great post from Tim Warner explaining why you should care about the Azure OpenAI Service, one of the better posts I have read on the subject: Why You Should Care About the Azure OpenAI Service – Tim's Azure OpenAI Training Blog
Besides creating an OpenAI Service, ChatGPT, a WebApp, Private endpoints, etc., I will also be writing about training an AI and using your own data.
#iac #azure #openai #terraform #landingzone #copilot #chatgpt
Deploying the basics
I will show step-by-step how to first create the basics, and from there, I will extend the code with new modules and better security.
This deployment contains a module for the Azure OpenAI service, containing a deployment for ChatGPT. Everything is described in the README.MD, 90% of which was written by ChatGPT.
Using this code, you can deploy all of this within just a **few minutes**.
Source
Github
The basic public OpenAI with ChatGPT deployment can be found in this repository.
Directory Structure
| Directory | File | Used for | Creates |
|---|---|---|---|
| root | main.tf | Creates a random string for testing. Modules are specified with given variables from production.tfvars and output from other modules. | Further defined in modules |
| root | providers.tf | Providers are responsible for understanding API interactions and exposing resources. | |
| root | variables.tf | Variables needed for the root module as well as other modules. | |
| ./environments | production.tfvars | Defines the input values for the root module. | |
| ./modules/xyz | xyz.tf | Defines the resources for the child module. | |
| ./modules/xyz | outputs.tf | Output values are reusable information about your infrastructure, which can be used on the command line and reused in other Terraform modules or configurations. | |
| ./modules/xyz | variables.tf | Variables needed for the xyz module. | |
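To illustrate how these files fit together, here is a minimal sketch of what the root main.tf could look like; the module path and variable names are assumptions for illustration, not the repository's exact code.

```hcl
# Sketch of a root main.tf (module path and variable names are assumptions).

# Random suffix used in resource names, e.g. rg-a-openai-sbfmtj.
resource "random_string" "prefix" {
  length  = 6
  special = false
  upper   = false
}

# Child module for the OpenAI service; inputs come from production.tfvars,
# plus the random suffix generated above.
module "openai" {
  source      = "./modules/openai"
  name_prefix = var.name_prefix
  location    = var.location
  suffix      = random_string.prefix.result
}
```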
Deploying the code
Deploying the code is as simple as adding your preferences in 'production.tfvars'.
Save the 'production.tfvars' file and go through the 'init', 'plan', and 'apply' steps, or run your pipeline.
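A 'production.tfvars' could look like the snippet below; the variable names are assumptions derived from the naming pattern `rg-<name_prefix>-openai-<random_string.prefix>`, so adjust them to match the variables.tf in the repository.

```hcl
# Example ./environments/production.tfvars (variable names are assumptions).
name_prefix = "a"
location    = "westeurope"
```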
Initiate
`terraform init`
This initiates the backend and modules.
Plan
`terraform plan -var-file="./environments/production.tfvars" -out main.tfplan`
In this step, Terraform determines what needs to change and writes the resulting plan to main.tfplan.
Apply
`terraform apply main.tfplan`
As the command suggests, it applies the changes.
| Created | Info |
|---|---|
| Resource group for the OpenAI Service | `rg-<name_prefix>-openai-<random_string.prefix>` Example: `rg-a-openai-sbfmtj` |
| OpenAI Service | `<name_prefix>-openai-<random_string.prefix>` Example: `a-openai-sbfmtj` |
| ChatGPT Deployment within OpenAI Service | gpt-35-turbo-16k |
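For reference, the resources in the table above can be expressed with the azurerm provider roughly as follows; this is a hedged sketch, not the repository's exact module: resource names, the model version, and the `sku` block shape are assumptions (older azurerm 3.x releases used a `scale` block instead), so check the provider documentation for your version.

```hcl
# Sketch of ./modules/openai/openai.tf (names and versions are assumptions).

resource "azurerm_resource_group" "openai" {
  name     = "rg-${var.name_prefix}-openai-${var.suffix}"
  location = var.location
}

# The Azure OpenAI Service itself (a Cognitive Services account of kind OpenAI).
resource "azurerm_cognitive_account" "openai" {
  name                = "${var.name_prefix}-openai-${var.suffix}"
  location            = azurerm_resource_group.openai.location
  resource_group_name = azurerm_resource_group.openai.name
  kind                = "OpenAI"
  sku_name            = "S0"
}

# The ChatGPT model deployment inside the service.
resource "azurerm_cognitive_deployment" "chatgpt" {
  name                 = "gpt-35-turbo-16k"
  cognitive_account_id = azurerm_cognitive_account.openai.id

  model {
    format  = "OpenAI"
    name    = "gpt-35-turbo-16k"
    version = "0613" # assumption; pick a version available in your region
  }

  sku {
    name = "Standard"
  }
}
```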
Resources
Some great resources for the Azure OpenAI Service:
- Empowering-ai-building-and-deploying-azure-ai-landing-zones
- 8-steps-to-building-an-azure-openai-copilot-for-your-startup
- Revolutionize-your-enterprise-data-with-chatgpt-next-gen-apps-w
Adding a webApp
Upcoming … Working on it!
This deployment contains modules for: the Azure OpenAI service with a ChatGPT deployment, Key Vault, and a WebApp connecting to ChatGPT using the Microsoft identity provider.
Adding Private endpoints and additional security
Upcoming … Working on it!
This deployment contains modules for: the Azure OpenAI service with a ChatGPT deployment, Key Vault, a WebApp connecting to ChatGPT using the Microsoft identity provider, Private endpoints, etc.
Getting started
In previous posts about Infrastructure as Code, I have created step-by-step guides for Terraform and Azure DevOps. Check out the posts about: INFRASTRUCTURE-AS-CODE, TERRAFORM AND AZURE DEVOPS | PART 1, TERRAFORM AND AZURE DEVOPS | PART 2 and TERRAFORM AND AZURE DEVOPS | PART 3.