Azure OpenAI, WebApp & Private Endpoint Deployment with Terraform

Another easy 😉 deployment: OpenAI, ChatGPT, WebApp & Private Endpoints with Terraform

I mean, once the code is done and working, it seems so easy. 😉 After the troubleshooting, after solving problems that didn’t really need solving, after thinking ‘Am I the only one with this problem?’ and having some friendly discussions with yourself… well, after all that, it is so easy.

Adding Private endpoints

In a previous post I wrote about why you should use the Azure OpenAI Service and how to deploy the basics with Terraform. In this post I am writing about deploying the code with additional modules.

In addition to the Azure OpenAI & WebApp deployment with Terraform, this deployment also contains a Virtual Network, Private Endpoints powered by the Private Link service, and additional Private DNS Zones.

A private endpoint is a network interface that uses a private IP address from your virtual network. This network interface connects you privately and securely to a service that’s powered by Azure Private Link. By enabling a private endpoint, you’re bringing the service into your virtual network.
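
To give an idea of what that looks like in Terraform, here is a minimal sketch of a private endpoint for the OpenAI service. The resource names and references (for example azurerm_cognitive_account.openai and azurerm_subnet.private_endpoints) are illustrative assumptions; the actual modules in the repository may be structured differently.

```hcl
# Minimal sketch – resource names and references are illustrative.
resource "azurerm_private_dns_zone" "openai" {
  name                = "privatelink.openai.azure.com"
  resource_group_name = azurerm_resource_group.openai.name
}

resource "azurerm_private_endpoint" "openai" {
  name                = "pe-openai"
  location            = azurerm_resource_group.openai.location
  resource_group_name = azurerm_resource_group.openai.name
  subnet_id           = azurerm_subnet.private_endpoints.id

  # Connects the network interface to the OpenAI (Cognitive Services) account
  private_service_connection {
    name                           = "psc-openai"
    private_connection_resource_id = azurerm_cognitive_account.openai.id
    subresource_names              = ["account"]
    is_manual_connection           = false
  }

  # Registers the private IP in the Private DNS Zone so the FQDN resolves privately
  private_dns_zone_group {
    name                 = "default"
    private_dns_zone_ids = [azurerm_private_dns_zone.openai.id]
  }
}
```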

This deployment contains:
- WebApp with a Python stack (pulling source code from GitHub), Microsoft Identity Provider, and App registration
- Separate resource group for management
- Key Vault (containing the client secret created while creating the app registration)
- Virtual network and subnets
- Private Endpoints powered by the Private Link service and additional Private DNS Zones

I will also be writing about training an AI and using your own data, and I will explain the code in more detail, such as which settings I have chosen and what they do.
 
#iac #azure #openai #terraform #landingzone #copilot #chatgpt

The Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft’s Azure environment and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API). Data, privacy, and security for Azure OpenAI Service – Azure AI services | Microsoft Learn

Deploying the Terraform Code

This deployment contains module(s) for:
the Azure OpenAI service (containing a deployment for ChatGPT), a WebApp with a Python stack (pulling source code from GitHub), private endpoints, private DNS zones, and a Key Vault. Everything is described in the README.MD, which is about 90% written by ChatGPT.
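
As an illustration of how the modules hang together in the root main.tf, here is a sketch of the module wiring. The module names, variables, and outputs shown here are assumptions for illustration, not necessarily the exact names used in the repository.

```hcl
# main.tf – illustrative wiring; actual module, variable, and output names may differ.
module "openai" {
  source        = "./modules/openai"
  name_prefix   = var.name_prefix
  location      = var.location
  random_suffix = random_string.prefix.result # random string created in the root module
}

module "webapp" {
  source          = "./modules/webapp"
  name_prefix     = var.name_prefix
  location        = var.location
  openai_endpoint = module.openai.endpoint # output from the OpenAI module
  key_vault_id    = module.keyvault.id     # output from the Key Vault module
}
```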
 
Using this code, you can deploy all this within just a few minutes.

Design

Source

GitHub

The Azure-openai-webapp-public deployment with ChatGPT can be found in this repository.

Directory Structure

| Directory | File | Used for |
| --- | --- | --- |
| root | main.tf | Creates a random string for testing. Modules are specified with the given variables from production.tfvars and outputs from other modules; they are further defined in ./modules. |
| root | providers.tf | Specifies the providers, which are responsible for understanding API interactions and exposing resources. |
| root | variables.tf | Specifies the variables needed for the root module as well as for the other modules. |
| ./environments | production.tfvars | Defines the input values for the root module. |
| ./modules/xyz | xyz.tf | Defines the resources for the child module. |
| ./modules/xyz | outputs.tf | Output values are reusable information about your infrastructure; they can be used on the command line and reused in other Terraform modules or configurations. |
| ./modules/xyz | variables.tf | Specifies the variables needed for the xyz module. |

Directory structure [table]
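
To make the role of outputs.tf in the table above concrete, here is a hypothetical example for the OpenAI module. The attribute names are standard azurerm provider attributes, but the exact outputs defined in the repository may differ.

```hcl
# modules/openai/outputs.tf – illustrative example
output "openai_id" {
  description = "Resource ID of the OpenAI service, consumed by the private endpoint module"
  value       = azurerm_cognitive_account.openai.id
}

output "openai_endpoint" {
  description = "Endpoint URL of the OpenAI service, passed to the web app"
  value       = azurerm_cognitive_account.openai.endpoint
}
```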

Deploying the code

Deploying the code is as simple as adding your preferences in ‘production.tfvars’.
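
As a sketch of what that could look like: the variable names below are assumptions based on the naming convention shown in the resources table further down; check variables.tf in the repository for the actual names.

```hcl
# environments/production.tfvars – illustrative values, variable names are assumptions
company     = "makeit"
name_prefix = "a"
location    = "westeurope"
```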

Save the ‘production.tfvars’ file and run through the init, plan, and apply steps or run your pipeline.

Successful deployment

Initialize

terraform init

This initializes the backend, providers, and modules.
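
If you store the state remotely, the backend that gets initialized here could look like the sketch below. This assumes an azurerm backend with a pre-existing storage account; the repository may simply use the default local backend, and the names shown are placeholders.

```hcl
# Illustrative remote state configuration – names are placeholders
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-tfstate"
    storage_account_name = "sttfstate"
    container_name       = "tfstate"
    key                  = "openai-webapp.tfstate"
  }
}
```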

Plan

terraform plan -var-file="./environments/production.tfvars" -out main.tfplan

In this step Terraform determines what needs to be created, changed, or destroyed, and writes the resulting plan to main.tfplan.

Apply

terraform apply main.tfplan

As the command suggests, it applies the changes.

| Created | Info |
| --- | --- |
| Resource group for the OpenAI Service | rg-<company>-<name_prefix>-openai-<random_string.prefix> (example: rg-a-openai-sbfmtj) |
| OpenAI Service | <name_prefix>-openai-<random_string.prefix> (example: a-openai-sbfmtj) |
| ChatGPT deployment within the OpenAI Service | gpt-35-turbo-16k |
| Management resource group for the Azure Key Vault | rg-<company>-mgmt-<random_string.prefix> (example: rg-makeit-mgmt-fzlkvr) |
| Key Vault | Contains the client secret created while creating the app registration |
| App Service Plan | webapp-asp-<name_prefix>-openai-<random_string.prefix> (example: webapp-asp-a-openai-fzlkvr) |
| WebApp | Python stack (pulling source code from GitHub), Microsoft Identity Provider, and App registration; webapp-<name_prefix>-openai-<random_string.prefix> (example: webapp-a-openai-fzlkvr) |
| VNet | Created for the private endpoint subnet |
| Private Endpoints | Powered by Private Link and Private DNS Zones |

Resources created [table]
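
For reference, the ChatGPT deployment listed above maps to the azurerm_cognitive_deployment resource in Terraform. Below is a minimal sketch; the names, SKU, and model version are assumptions for illustration, and the module in the repository may configure this differently.

```hcl
# Illustrative sketch – names, SKU, and model version are assumptions
resource "azurerm_cognitive_account" "openai" {
  name                          = "a-openai-example"
  location                      = var.location
  resource_group_name           = var.resource_group_name
  kind                          = "OpenAI"
  sku_name                      = "S0"
  custom_subdomain_name         = "a-openai-example" # required when using private endpoints
  public_network_access_enabled = false
}

resource "azurerm_cognitive_deployment" "chatgpt" {
  name                 = "gpt-35-turbo-16k"
  cognitive_account_id = azurerm_cognitive_account.openai.id

  model {
    format  = "OpenAI"
    name    = "gpt-35-turbo-16k"
    version = "0613" # assumed model version
  }

  scale {
    type = "Standard"
  }
}
```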

Result

Your own ‘private’ ChatGPT app with a Microsoft Identity Provider and Private Endpoints


Resources

Some great resources for the Azure OpenAI Service.

I came across a great post from Tim Warner, explaining why you should care about the Azure OpenAI Service. One of the better posts I have read about the Azure OpenAI Service: Why You Should Care About the Azure OpenAI Service – Tim’s Azure OpenAI Training Blog


Training an AI and using your own Data

I will also be writing about training an AI and using your own data. In addition, I will explain the code in more detail, such as which settings I have chosen and what they do.


Getting started

In previous posts about Infrastructure as Code, I have created step-by-step guides for Terraform and Azure DevOps. Check out the posts about: INFRASTRUCTURE-AS-CODE, TERRAFORM AND AZURE DEVOPS | PART 1, TERRAFORM AND AZURE DEVOPS | PART 2 and TERRAFORM AND AZURE DEVOPS | PART 3.
