Microsoft and OpenAI’s famed AGI agreement is dead


April 27, 2026

Learn to build an AI chatbot using OpenAI's API and Azure Functions, demonstrating how to create AI applications independently of vendor partnerships.

Introduction

In the wake of Microsoft and OpenAI's shifting partnership, developers are increasingly looking toward building AI applications that leverage cloud infrastructure and APIs independently. This tutorial will guide you through creating a simple AI-powered chatbot using OpenAI's API and Microsoft Azure, demonstrating how developers can build AI applications without relying on a single partnership. You'll learn how to set up your development environment, connect to OpenAI's API, and deploy a basic chatbot using Azure Functions.

Prerequisites

  • Basic understanding of Python programming
  • Active Azure account with sufficient permissions
  • OpenAI API key (available from https://platform.openai.com/)
  • Python 3.8 or higher installed on your system
  • Basic knowledge of REST APIs and HTTP requests

Step-by-Step Instructions

1. Setting Up Your Development Environment

1.1 Create a Virtual Environment

First, we'll create a virtual environment to isolate our project dependencies:

python -m venv ai_chatbot_env
source ai_chatbot_env/bin/activate  # On Windows: ai_chatbot_env\Scripts\activate

Why: Using a virtual environment ensures that our project dependencies don't interfere with other Python projects on your system.

1.2 Install Required Packages

Next, install the necessary Python packages:

pip install openai python-dotenv azure-functions

Why: The openai package provides access to OpenAI's API, python-dotenv loads the .env file we create in the next step, and azure-functions enables deployment to Azure Functions. (Don't install azure-functions-worker manually; the Functions runtime manages it for you.)

2. Configuring API Keys

2.1 Create Environment Variables

Create a file called .env in your project directory:

OPENAI_API_KEY=your_openai_api_key_here
AZURE_FUNCTION_KEY=your_azure_function_key_here

Why: Storing API keys in environment variables keeps them secure and prevents accidental exposure in version control.
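To make that protection stick, the .env file must never be committed. A quick way to set that up (assuming a Git repository in the project directory):

```shell
# Keep secrets and the local virtual environment out of version control
echo ".env" >> .gitignore
echo "ai_chatbot_env/" >> .gitignore
```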

2.2 Load Environment Variables in Python

Create a config.py file:

import os
from dotenv import load_dotenv

load_dotenv()

OPENAI_API_KEY = os.getenv('OPENAI_API_KEY')
AZURE_FUNCTION_KEY = os.getenv('AZURE_FUNCTION_KEY')

Why: This approach centralizes configuration management and makes it easy to switch between different environments.
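A missing key otherwise surfaces only later as a confusing API error. One way to fail fast is a small helper like the hypothetical require_env below (a sketch, not part of the tutorial's files), which you could use in config.py instead of the bare os.getenv calls:

```python
import os

def require_env(name: str) -> str:
    """Return the named environment variable, or fail fast with a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```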

3. Building the Chatbot Logic

3.1 Create the Main Chatbot Class

Create a chatbot.py file:

from openai import OpenAI
from config import OPENAI_API_KEY

class AIChatbot:
    def __init__(self):
        self.client = OpenAI(api_key=OPENAI_API_KEY)
        self.conversation_history = []

    def get_response(self, user_message):
        # Add user message to conversation history
        self.conversation_history.append({"role": "user", "content": user_message})

        # Call OpenAI's Chat Completions API
        response = self.client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=self.conversation_history,
            max_tokens=150,
            temperature=0.7
        )

        # Extract and store AI response
        ai_response = response.choices[0].message.content.strip()
        self.conversation_history.append({"role": "assistant", "content": ai_response})

        return ai_response

Why: This class encapsulates all chatbot functionality, maintaining conversation history and making API calls to OpenAI's GPT model.
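API calls can fail transiently (rate limits, timeouts), so in practice you may want to wrap get_response in a retry. A minimal, generic sketch (with_retries is an illustrative helper, not part of the OpenAI SDK):

```python
import time

def with_retries(call, max_attempts=3, base_delay=1.0):
    """Retry a zero-argument callable with exponential backoff.

    Transient errors are retried up to max_attempts times; the final
    failure is re-raised so the caller can handle it.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Usage would look like with_retries(lambda: chatbot.get_response("Hello")). A production version would catch only the SDK's specific transient error types rather than every Exception.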

3.2 Add Conversation Management

Add these methods inside the AIChatbot class:

    def clear_history(self):
        self.conversation_history = []

    def get_history(self):
        return self.conversation_history

Why: Managing conversation history allows for more context-aware responses and helps maintain coherent dialogues.
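Because every request sends the full history, long conversations will eventually exceed the model's context window. A simple mitigation is to keep only the most recent messages; trim_history below is a hypothetical standalone helper you could call before each API request:

```python
def trim_history(history, max_messages=20):
    """Return only the most recent messages so requests stay within the
    model's context limit; older turns are dropped from the front."""
    if len(history) <= max_messages:
        return history
    return history[-max_messages:]
```

A more careful version would count tokens rather than messages, and keep any system prompt pinned at the front.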

4. Creating the Azure Function

4.1 Set Up Azure Function Structure

Create a new Azure Function project:

func init ai_chatbot_function --python
func new --name chatbot_endpoint --template "HTTP trigger" --authlevel anonymous

Why: Azure Functions provide a serverless way to deploy your chatbot logic, making it accessible via HTTP requests.

4.2 Implement Function Logic

Edit ai_chatbot_function/chatbot_endpoint/__init__.py:

import logging
import json
import azure.functions as func

from chatbot import AIChatbot

chatbot = AIChatbot()

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    
    try:
        req_body = req.get_json()
        user_message = req_body.get('message')
        
        if not user_message:
            return func.HttpResponse(
                json.dumps({'error': 'No message provided'}),
                status_code=400,
                mimetype='application/json'
            )
        
        response = chatbot.get_response(user_message)
        
        return func.HttpResponse(
            json.dumps({'response': response}),
            status_code=200,
            mimetype='application/json'
        )
    
    except Exception as e:
        logging.error(f'Error processing request: {str(e)}')
        return func.HttpResponse(
            json.dumps({'error': 'Internal server error'}),
            status_code=500,
            mimetype='application/json'
        )

Why: This function creates an HTTP endpoint that accepts user messages and returns AI-generated responses, demonstrating how to deploy AI logic to the cloud.

5. Testing Your Chatbot

5.1 Test Locally

Run your function locally for testing:

func start

Why: Local testing allows you to verify functionality before deployment without incurring cloud costs.

5.2 Send Test Request

Use curl or a tool like Postman to test your endpoint:

curl -X POST http://localhost:7071/api/chatbot_endpoint \
  -H "Content-Type: application/json" \
  -d '{"message": "What is artificial intelligence?"}'

Why: This test verifies that your function properly handles requests and returns expected responses from the AI model.
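The same request can be driven from Python, which is handy for scripted tests. A stdlib-only sketch (the ENDPOINT URL assumes the local func host from step 5.1):

```python
import json
import urllib.request

ENDPOINT = "http://localhost:7071/api/chatbot_endpoint"  # local Functions host

def build_payload(message: str) -> bytes:
    """Encode a chat message as the JSON body the function expects."""
    return json.dumps({"message": message}).encode("utf-8")

def ask(message: str) -> str:
    """POST a message to the running function and return the AI reply."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_payload(message),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```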

6. Deploying to Azure

6.1 Deploy Using Azure CLI

Deploy your function to Azure:

func azure functionapp publish ai-chatbot-app

Why: Deploying to Azure makes your chatbot accessible over the internet and scales automatically with demand.

6.2 Configure Application Settings

In Azure Portal, navigate to your function app and add your OpenAI API key as an application setting:

OPENAI_API_KEY=your_openai_api_key_here

Why: Application settings in Azure Functions are secure and automatically loaded into your function's environment.

Summary

This tutorial demonstrated how to build an AI-powered chatbot using OpenAI's API and Azure Functions, independent of any specific partnership. You learned to create a virtual environment, manage API keys securely, implement chatbot logic, and deploy a serverless function. The approach shown allows developers to leverage AI capabilities without being tied to specific vendor partnerships, which is increasingly important as the AI landscape evolves. The modular structure of this implementation makes it easy to extend with additional features like sentiment analysis, language translation, or integration with other cloud services.

Source: The Verge AI
