Azure OpenAI & WebApp Deployment with Terraform
Deploying OpenAI, ChatGPT and WebApp with Terraform in Azure is so easy 😉
I mean, when the code is done and working, it seems so easy. 😉 After the troubleshooting and solving problems that actually don’t need solving. After thinking ‘Am I the only one with this problem?’ and having some friendly discussions with yourself… Well, after all that, it is so easy.
Adding a Webapp
In a previous post I wrote about why you should use the Azure OpenAI Service and about deploying the basics with Terraform. In this post I cover deploying the code with additional modules.
In addition to the basic deployment with Terraform:
- Resource group
- Azure OpenAI Service
- ChatGPT instance

This deployment also contains:
- Webapp with Python stack (pulling source code from GitHub)
- Microsoft Identity Provider
- App registration
- Separate resource group for management
- Key Vault (containing the client secret created with the app registration)
Besides creating an OpenAI Service, ChatGPT, WebApp, private endpoints, etc., I will also be writing about training an AI and using your own data. I will also explain the code in more detail, such as which settings I have chosen and what they do.
#iac #azure #openai #terraform #landingzone #copilot #chatgpt
The Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft’s Azure environment and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API). Data, privacy, and security for Azure OpenAI Service – Azure AI services | Microsoft Learn
Deploying the Terraform Code
This deployment contains module(s) for:
Azure OpenAI service, containing a deployment for ChatGPT, a webapp module with Python stack (pulling source code from GitHub), and a Key Vault module. Everything is described in the README.MD, roughly 90% of which was written by ChatGPT.
Using this code, you can deploy all of this within just a **few minutes**.
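To give an idea of how the pieces fit together, here is a minimal sketch of what the root main.tf could look like. The module names, paths and arguments below are illustrative assumptions, not copied from the repository.

```hcl
# Illustrative root main.tf: a random string for naming/testing plus the module calls.
# Module paths, argument names and the "endpoint" output are assumptions.
resource "random_string" "prefix" {
  length  = 6
  special = false
  upper   = false
}

module "openai" {
  source        = "./modules/openai"
  company       = var.company
  name_prefix   = var.name_prefix
  location      = var.location
  random_prefix = random_string.prefix.result
}

module "keyvault" {
  source        = "./modules/keyvault"
  company       = var.company
  location      = var.location
  random_prefix = random_string.prefix.result
}

module "webapp" {
  source          = "./modules/webapp"
  name_prefix     = var.name_prefix
  location        = var.location
  random_prefix   = random_string.prefix.result
  openai_endpoint = module.openai.endpoint # output from the OpenAI module
}
```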
Source
Github
The Azure-openai-webapp-public code, including the ChatGPT deployment, can be found in this repository.
Directory Structure
Directory | File | Used for | Creates |
---|---|---|---|
root | main.tf | Creates a random string for testing. Modules are specified with the variables given in production.tfvars and with output from other modules. | Further defined in modules |
root | providers.tf | Providers are responsible for understanding API interactions and exposing resources. | |
root | variables.tf | Specifies the variables needed for the root module as well as the other modules. | |
./environments | production.tfvars | Defines the input values for the root module. | |
./modules/xyz | xyz.tf | Defines the resources for the child module. | |
./modules/xyz | outputs.tf | Output values are reusable information about your infrastructure; they can be used on the command line and reused in other Terraform modules or configurations. | |
./modules/xyz | variables.tf | Specifies the variables needed for the xyz module. | |
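For reference, a providers.tf for a setup like this might look roughly as follows. The provider versions are assumptions; the azuread provider is only listed because the app registration is created alongside the webapp.

```hcl
# Illustrative providers.tf; versions are assumptions, pin them to whatever the repository uses.
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
    azuread = {
      source  = "hashicorp/azuread"
      version = "~> 2.0"
    }
    random = {
      source  = "hashicorp/random"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

provider "azuread" {}
```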
Deploying the code
Deploying the code is as simple as adding your preferences to ‘production.tfvars’.
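For example, a production.tfvars could look something like this; the variable names and values are assumptions based on the naming pattern used in this post (<company> and <name_prefix>), so check variables.tf for the real ones.

```hcl
# Illustrative ./environments/production.tfvars; variable names and values are assumptions.
company     = "makeit"
name_prefix = "a"
location    = "westeurope"
```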
Save the ‘production.tfvars’ file and run through the init, plan, and apply steps or run your pipeline.
Initialize
terraform init
This initializes the backend and installs the required providers and modules.
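The repository may simply keep local state, but when you run this from a pipeline you would typically configure a remote backend first, for example an azurerm backend; all values below are placeholders.

```hcl
# Optional: remote state in an Azure storage account (all values are placeholders).
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"
    storage_account_name = "sttfstateexample"
    container_name       = "tfstate"
    key                  = "openai-webapp.tfstate"
  }
}
```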
Plan
terraform plan -var-file="./environments/production.tfvars" -out main.tfplan
In this step Terraform determines what needs to be created, changed, or destroyed, and saves the plan to main.tfplan.
Apply
terraform apply main.tfplan
As the command suggests, it applies the changes.
Created | Info |
---|---|
Resource group for the OpenAI Service | rg-<company>-<name_prefix>-openai-<random_string.prefix> (example: rg-a-openai-sbfmtj) |
OpenAI Service | <name_prefix>-openai-<random_string.prefix> (example: a-openai-sbfmtj) |
ChatGPT Deployment within the OpenAI Service | gpt-35-turbo-16k |
Management resource group for the Azure Key Vault | rg-<company>-mgmt-<random_string.prefix> (example: rg-makeit-mgmt-fzlkvr) |
Key Vault | Contains the client secret created with the app registration |
App Service Plan | webapp-asp-<name_prefix>-openai-<random_string.prefix> (example: webapp-asp-a-openai-fzlkvr) |
WebApp | webapp-<name_prefix>-openai-<random_string.prefix> (example: webapp-a-openai-fzlkvr), with Python stack (pulling source code from GitHub), Microsoft Identity Provider, and App registration |
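To give a feel for what sits behind the table above, the OpenAI Service and the gpt-35-turbo-16k deployment roughly correspond to the resources below. Resource names, SKU and model version are assumptions (azurerm ~> 3.x syntax); check the repository for the actual code.

```hcl
# Rough shape of the OpenAI resources; names, SKU and model version are assumptions.
resource "azurerm_resource_group" "openai" {
  name     = "rg-a-openai-sbfmtj"
  location = "westeurope"
}

resource "azurerm_cognitive_account" "openai" {
  name                  = "a-openai-sbfmtj"
  location              = azurerm_resource_group.openai.location
  resource_group_name   = azurerm_resource_group.openai.name
  kind                  = "OpenAI"
  sku_name              = "S0"
  custom_subdomain_name = "a-openai-sbfmtj"
}

resource "azurerm_cognitive_deployment" "chatgpt" {
  name                 = "gpt-35-turbo-16k"
  cognitive_account_id = azurerm_cognitive_account.openai.id

  model {
    format  = "OpenAI"
    name    = "gpt-35-turbo-16k"
    version = "0613" # assumption; pick a version available in your region
  }

  # azurerm ~> 3.x uses a scale block; newer provider versions use a sku block instead
  scale {
    type = "Standard"
  }
}
```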
Result
Your own ‘private’ ChatGPT app with a Microsoft Identity Provider
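The ‘private’ part comes from the App Service built-in authentication. A rough sketch of the web app with a Python stack and the Microsoft identity provider wired to the app registration could look like this; all names, IDs and endpoints are placeholders (azurerm ~> 3.x syntax).

```hcl
# Rough sketch of the web app; names, IDs and versions are placeholders/assumptions.
resource "azurerm_service_plan" "webapp" {
  name                = "webapp-asp-a-openai-fzlkvr"
  location            = "westeurope"
  resource_group_name = "rg-a-openai-sbfmtj"
  os_type             = "Linux"
  sku_name            = "B1"
}

resource "azurerm_linux_web_app" "chatgpt" {
  name                = "webapp-a-openai-fzlkvr"
  location            = azurerm_service_plan.webapp.location
  resource_group_name = azurerm_service_plan.webapp.resource_group_name
  service_plan_id     = azurerm_service_plan.webapp.id

  site_config {
    application_stack {
      python_version = "3.11"
    }
  }

  # Microsoft identity provider tied to the app registration; the client secret
  # is expected in an app setting (backed by the Key Vault secret).
  auth_settings_v2 {
    auth_enabled           = true
    require_authentication = true
    unauthenticated_action = "RedirectToLoginPage"
    default_provider       = "azureactivedirectory"

    active_directory_v2 {
      client_id                  = "00000000-0000-0000-0000-000000000000" # app registration client ID (placeholder)
      tenant_auth_endpoint       = "https://login.microsoftonline.com/<tenant-id>/v2.0"
      client_secret_setting_name = "MICROSOFT_PROVIDER_AUTHENTICATION_SECRET"
    }

    login {}
  }
}
```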
Resources
Some great resources for the Azure OpenAI services.
I came across a great post from Tim Warner, explaining why you should care about the Azure OpenAI Service. One of the better posts I have read about the Azure OpenAI Service: Why You Should Care About the Azure OpenAI Service – Tim’s Azure OpenAI Training Blog
- Empowering-ai-building-and-deploying-azure-ai-landing-zones
- 8-steps-to-building-an-azure-openai-copilot-for-your-startup
- Revolutionize-your-enterprise-data-with-chatgpt-next-gen-apps-w
Adding Private endpoints and additional security
This deployment contains module(s) for: Azure OpenAI service (containing a deployment for ChatGPT), Key Vault, a WebApp connecting to ChatGPT using the Microsoft identity provider, private endpoints, etc.
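As an illustration, a private endpoint for the OpenAI account could look roughly like this; the subnet reference and names are placeholders, and the repository may structure this differently.

```hcl
# Illustrative private endpoint for the OpenAI account; subnet and names are placeholders.
resource "azurerm_private_endpoint" "openai" {
  name                = "pep-a-openai-sbfmtj"
  location            = "westeurope"
  resource_group_name = "rg-a-openai-sbfmtj"
  subnet_id           = azurerm_subnet.private_endpoints.id # assumed to exist elsewhere

  private_service_connection {
    name                           = "psc-a-openai-sbfmtj"
    private_connection_resource_id = azurerm_cognitive_account.openai.id
    subresource_names              = ["account"]
    is_manual_connection           = false
  }
}
```

Combined with a private DNS zone (privatelink.openai.azure.com), the web app can then reach the OpenAI service over the VNet instead of the public endpoint.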
Getting started
In previous posts about Infrastructure as Code, I have created step-by-step guides for Terraform and Azure DevOps. Check out the posts about: INFRASTRUCTURE-AS-CODE, TERRAFORM AND AZURE DEVOPS | PART 1, TERRAFORM AND AZURE DEVOPS | PART 2 and TERRAFORM AND AZURE DEVOPS | PART 3.