Introduction
In this tutorial, you'll learn how to use Mistral's new flagship model, Mistral Medium 3.5, to perform multiple tasks including chat, reasoning, and code generation. This model combines all these capabilities into a single, unified system. We'll walk through setting up the environment and making API calls to demonstrate how to interact with this powerful AI model.
Prerequisites
To follow along with this tutorial, you'll need:
- A computer with internet access
- Python 3.7 or higher installed
- An API key from Mistral (you can get one from their website)
- Basic understanding of Python programming
Step-by-Step Instructions
1. Setting Up Your Environment
1.1 Install Required Python Packages
First, we need to install the necessary Python packages to make API calls to Mistral's model. Open your terminal or command prompt and run:
pip install requests
This installs the requests library, which we'll use to send HTTP requests to the Mistral API.
1.2 Get Your Mistral API Key
Visit the Mistral AI website and sign up for an account. Once you're logged in, navigate to the API section to generate your API key. Keep this key secure and never share it publicly.
2. Creating Your First Mistral Request
2.1 Prepare Your Python Script
Create a new file called mistral_demo.py and open it in your preferred text editor. We'll start by importing the required libraries and setting up your API key:
import requests
# Replace 'YOUR_API_KEY' with your actual Mistral API key
API_KEY = 'YOUR_API_KEY'
API_URL = 'https://api.mistral.ai/v1/chat/completions'
This code sets up the basic structure of our script. We're defining the API key and URL that we'll use to make requests to Mistral's API.
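Hardcoding the key in the script is fine for a quick demo, but a safer pattern is to read it from an environment variable so it never ends up in source control. The variable name MISTRAL_API_KEY below is just a convention chosen for this tutorial, not something the API requires:

```python
import os

# Read the key from the environment; fall back to the placeholder so the
# script still loads (but fails clearly) when the variable is not set.
# MISTRAL_API_KEY is a conventional name chosen for this tutorial.
API_KEY = os.environ.get('MISTRAL_API_KEY', 'YOUR_API_KEY')
API_URL = 'https://api.mistral.ai/v1/chat/completions'
```

With this in place you can export the key once in your shell (for example, export MISTRAL_API_KEY=...) and run the script without editing it.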
2.2 Create a Simple Chat Request
Now, let's add code to send a simple chat message to the Mistral model:
def chat_with_mistral(message):
    headers = {
        'Authorization': f'Bearer {API_KEY}',
        'Content-Type': 'application/json'
    }
    payload = {
        'model': 'mistral-medium-3.5',  # Specify the model
        'messages': [
            {'role': 'user', 'content': message}
        ]
    }
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Test the function
result = chat_with_mistral('Hello, how are you?')
print(result['choices'][0]['message']['content'])
This function sends a single user message to Mistral's chat completions endpoint and returns the parsed JSON response. The model field matters: it tells the API which specific model should process your request, in this case mistral-medium-3.5.
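Indexing straight into result['choices'] will raise a KeyError if the API returns an error payload instead (for example, a bad key or a wrong model name). A small helper makes the parsing defensive. The exact shape of Mistral's error responses is an assumption here, so the fallback simply stringifies whatever came back:

```python
def extract_reply(response_json):
    """Pull the assistant's text out of a chat-completion response.

    Returns a readable error string instead of raising when the API
    returned an error payload rather than a completion.
    """
    if 'choices' in response_json:
        return response_json['choices'][0]['message']['content']
    # Error payloads often carry a 'message' field; this shape is an
    # assumption, so we fall back to the raw payload defensively.
    return 'API error: ' + str(response_json.get('message', response_json))

# Simulated successful response, shaped like the one the tutorial indexes into:
sample = {'choices': [{'message': {'role': 'assistant', 'content': 'Hi there!'}}]}
print(extract_reply(sample))  # Hi there!
```

You could then replace the final print in the script with print(extract_reply(result)) to get a helpful message instead of a traceback when something goes wrong.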
3. Using Mistral for Reasoning Tasks
3.1 Create a Reasoning Example
Let's try a reasoning task. We'll ask Mistral to solve a simple logic problem:
def reasoning_task(question):
    headers = {
        'Authorization': f'Bearer {API_KEY}',
        'Content-Type': 'application/json'
    }
    payload = {
        'model': 'mistral-medium-3.5',
        'messages': [
            {'role': 'user', 'content': question}
        ]
    }
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Test reasoning
reasoning_result = reasoning_task('If all roses are flowers and some flowers are red, are all roses red?')
print(reasoning_result['choices'][0]['message']['content'])
Mistral Medium 3.5 is designed to handle complex reasoning tasks, so it should explain that the premises do not establish that all roses are red: knowing only that some flowers are red says nothing about which flowers those are.
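You can also nudge the model toward step-by-step answers by adding a system message to the messages list before the user question. The wording of the system prompt below is illustrative, not an official Mistral recommendation:

```python
def build_reasoning_messages(question):
    """Build a messages list that asks the model to show its reasoning.

    The system prompt wording is illustrative; any instruction that
    asks for step-by-step justification serves the same purpose.
    """
    return [
        {'role': 'system', 'content': 'Think step by step and justify each conclusion.'},
        {'role': 'user', 'content': question},
    ]

messages = build_reasoning_messages('If all roses are flowers and some flowers are red, are all roses red?')
```

To use it, pass this list as the 'messages' value in the payload instead of the single-element list shown above.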
4. Using Mistral for Code Generation
4.1 Generate Python Code
One of the powerful features of Mistral Medium 3.5 is its ability to generate code. Let's ask it to write a simple Python function:
def generate_code(task):
    headers = {
        'Authorization': f'Bearer {API_KEY}',
        'Content-Type': 'application/json'
    }
    payload = {
        'model': 'mistral-medium-3.5',
        'messages': [
            {'role': 'user', 'content': task}
        ]
    }
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# Generate code
code_result = generate_code('Write a Python function that calculates the area of a circle')
print(code_result['choices'][0]['message']['content'])
This demonstrates how Mistral can understand natural language prompts and translate them into working code. The model is trained to understand programming concepts and can generate code in various languages.
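Model replies usually wrap generated code in markdown code fences, so if you want to save the code to a file you first need to strip them. This is a minimal sketch using a regular expression; it assumes the reply uses standard triple-backtick fences:

```python
import re

FENCE = '`' * 3  # a literal triple backtick, built here for readability

def extract_code_blocks(reply):
    """Return the contents of markdown-fenced blocks in a model reply."""
    pattern = re.compile(FENCE + r'(?:\w+)?\n(.*?)' + FENCE, re.DOTALL)
    return [block.strip() for block in pattern.findall(reply)]

# A simulated reply in the shape models typically produce:
reply = (
    'Here is the function:\n'
    + FENCE + 'python\n'
    + 'def area(r):\n    return 3.14159 * r * r\n'
    + FENCE + '\nHope that helps!'
)
print(extract_code_blocks(reply)[0])
```

Replies without fences come back as an empty list, so check the result before writing it to disk.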
5. Combining All Capabilities
5.1 Create a Multi-Task Example
Let's put all three capabilities together in one example:
def multi_task_demo():
    # Chat
    chat_response = chat_with_mistral('What is the capital of France?')
    print('Chat response:', chat_response['choices'][0]['message']['content'])

    # Reasoning
    reasoning_response = reasoning_task('If a cat has 4 legs and a bird has 2 legs, how many legs do 3 cats and 2 birds have in total?')
    print('Reasoning response:', reasoning_response['choices'][0]['message']['content'])

    # Code
    code_response = generate_code('Create a Python class for a Student with name, age, and grade attributes')
    print('Code response:', code_response['choices'][0]['message']['content'])

# Run the demo
multi_task_demo()
This example shows how Mistral Medium 3.5 can seamlessly switch between chat, reasoning, and code generation tasks, all using the same model and API endpoint.
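Notice that chat_with_mistral, reasoning_task, and generate_code are identical apart from their names. In a real script you would factor the shared request body into one helper, roughly like this (a sketch; the model name is the one used throughout this tutorial):

```python
def build_payload(message, model='mistral-medium-3.5'):
    """Build the request body shared by the chat, reasoning, and code tasks.

    All three tutorial functions send the same payload shape, so a single
    helper avoids repeating it in each one.
    """
    return {
        'model': model,
        'messages': [{'role': 'user', 'content': message}],
    }

payload = build_payload('What is the capital of France?')
```

Each task function then reduces to requests.post(API_URL, headers=headers, json=build_payload(message)).json(), and changing the model name becomes a one-line edit.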
6. Running Your Code
6.1 Execute Your Script
Save your script and run it in the terminal:
python mistral_demo.py
You should see responses from the Mistral model for each task. Each response will be a natural language answer or generated code, demonstrating the unified capabilities of Mistral Medium 3.5.
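If you see a KeyError or a traceback instead, inspect response.status_code before parsing the JSON. As a rough troubleshooting aid, a lookup like this maps a few common HTTP statuses to likely causes; the hints are general HTTP conventions, not official Mistral error documentation:

```python
def status_hint(status_code):
    """Map a few common HTTP statuses to likely causes (illustrative only)."""
    hints = {
        401: 'Check that your API key is correct and active.',
        404: 'Check the endpoint URL and the model name.',
        429: 'You are being rate limited; wait and retry.',
    }
    if status_code == 200:
        return 'OK'
    return hints.get(status_code, f'Unexpected status {status_code}')

print(status_hint(401))
```

For example, calling status_hint(response.status_code) right after requests.post turns a cryptic failure into an actionable message.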
Summary
In this tutorial, we've learned how to interact with Mistral's new flagship model, Mistral Medium 3.5. We've seen how to set up our environment, make API calls, and use the model for chat, reasoning, and code generation tasks. The key takeaway is that Mistral Medium 3.5 unifies these different AI capabilities into a single model, making it easier for developers to integrate multiple AI functions into their applications.
Remember to keep your API key secure and never commit it to public repositories. This tutorial provides a foundation for working with Mistral's powerful unified AI model, and you can expand upon it by exploring more complex prompts and integrating it into larger applications.