Integrating Generative AI with Real-Time Data from APIs - Groq, Python and Go
In the world of conversational AI, the ability to integrate with external systems and retrieve information in real time is a game-changer. Imagine tapping into REST APIs and other external systems to provide accurate, up-to-date responses to user questions.
This is precisely what Groq's function calling feature offers: it lets the model interpret natural language, decide which function to call, and retrieve the necessary information.
In this blog post, we'll explore how to use Groq's function calling feature to build conversational AI models that interact with external systems, and demonstrate its capabilities with a simple example. We'll use an HTTP API written in Go to simulate an external API call which, in a real application, would retrieve data from a database. We'll break the code down into smaller parts, explaining each section in detail.
Architecture of what we'll build
Mock API in Go
Here is the code; just copy and paste it into a main.go file.
package main

import (
	"encoding/json"
	"net/http"
	"strings"
)

// CondoInfo holds the data returned for each condominium.
type CondoInfo struct {
	Price    string `json:"price"`
	Location string `json:"location"`
}

// condos is an in-memory stand-in for a real database.
var condos = map[string]CondoInfo{
	"arton": {
		Price:    "Php 7,000,000",
		Location: "Katipunan, Quezon City",
	},
	"gold residences": {
		Price:    "Php 8,000,000",
		Location: "Near NAIA Terminal 1, Paranaque City",
	},
	"aruga mactan": {
		Price:    "Php 9,000,000",
		Location: "Mactan, Cebu",
	},
}

func getCondoLocation(w http.ResponseWriter, r *http.Request) {
	// Lowercase the query parameter so lookups are case-insensitive.
	condoName := strings.ToLower(r.URL.Query().Get("condo_name"))
	condo, exists := condos[condoName]
	if !exists {
		http.Error(w, "Condo not found", http.StatusNotFound)
		return
	}
	response := map[string]string{"location": condo.Location}
	if err := json.NewEncoder(w).Encode(response); err != nil {
		return
	}
}

func getCondoPrice(w http.ResponseWriter, r *http.Request) {
	condoName := strings.ToLower(r.URL.Query().Get("condo_name"))
	condo, exists := condos[condoName]
	if !exists {
		http.Error(w, "Condo not found", http.StatusNotFound)
		return
	}
	response := map[string]string{"price": condo.Price}
	if err := json.NewEncoder(w).Encode(response); err != nil {
		return
	}
}

func main() {
	http.HandleFunc("/get_condo_location", getCondoLocation)
	http.HandleFunc("/get_condo_price", getCondoPrice)
	if err := http.ListenAndServe(":8080", nil); err != nil {
		return
	}
}
Run the API using the go command. This will start the API on localhost:8080.
go run main.go
The API implements the following endpoints.
get_condo_location endpoint:
curl "http://localhost:8080/get_condo_location?condo_name=Arton"
{"location":"Katipunan, Quezon City"}
get_condo_price endpoint:
curl "http://localhost:8080/get_condo_price?condo_name=Arton"
{"price":"Php 7,000,000"}
Let's start implementing!
The following code is written in Python. It responds to a user's query or prompt and answers based on data retrieved from an API. For this example, we'll use the Go API we created above to get the data that answers the user's questions:
from groq import Groq
import os
import json
import requests

client = Groq(api_key=os.getenv('GROQ_API_KEY'))
MODEL = 'llama3-70b-8192'
Here, we import the necessary libraries: groq, os, json, and requests (which we'll need later for the API calls). We then set up a Groq client using an API key stored in an environment variable. The MODEL variable specifies the language model we'll be using.
Install the groq and requests dependencies using pip install.
pip install groq requests
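The client reads the key from the GROQ_API_KEY environment variable, so set it before running the script (replace the placeholder value with your own key):
export GROQ_API_KEY="your-api-key-here"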
Define Functions for the Tool
Ideally, the responses from these functions should come from your own APIs. For demonstration purposes, we'll use the API we created earlier. Imagine that these functions retrieve the price and location from a REST API. This is powerful because you can take advantage of the model's natural language understanding to decide which function to call. The same functionality is available in OpenAI and works similarly, but for this example we'll harness the power of Groq's fast inference!
def get_condo_price(condo_name):
    """Get the price of a condominium"""
    return condo_price_api_call(condo_name)

def get_condo_location(condo_name):
    """Get the location of a condominium"""
    return condo_location_api_call(condo_name)
Function 1: get_condo_price takes a condominium name as input and returns the price of the condominium.
Function 2: get_condo_location takes a condominium name as input and returns the location of the condominium.
API Calls to be used by the Functions
Next, we'll define the functions that call the external API, which in this case is the Go API we created earlier. Feel free to change these to call your own APIs.
def condo_price_api_call(condo_name):
    url = f'http://localhost:8080/get_condo_price?condo_name={condo_name}'
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        return data.get('price')
    else:
        return None  # Handle error cases here if needed

def condo_location_api_call(condo_name):
    url = f'http://localhost:8080/get_condo_location?condo_name={condo_name}'
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        return data.get('location')
    else:
        return None
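With the Go API running on port 8080, you can sanity-check these helpers on their own before wiring them into the model; the expected values below come straight from the condos map in the Go code:
print(condo_price_api_call('Arton'))     # Php 7,000,000
print(condo_location_api_call('Arton'))  # Katipunan, Quezon City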
Running the Conversation
def run_conversation(user_prompt):
    ...
The run_conversation function takes a user prompt as input and sends it to the Groq model along with the available functions (get_condo_price and get_condo_location).
Step 1: Send the Conversation and Available Functions to the Model
messages = [
    {
        "role": "system",
        "content": "You are a function calling LLM that uses the data extracted from the function and responds to "
                   "the user with the result of the function."
    },
    {
        "role": "user",
        "content": user_prompt,
    }
]
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_condo_price",
            "description": "Get the price of a condo or condominium name",
            "parameters": {
                "type": "object",
                "properties": {
                    "condo_name": {
                        "type": "string",
                        "description": "The name of the condominium or condo",
                    }
                },
                "required": ["condo_name"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_condo_location",
            "description": "Get the location of a condo or condominium name",
            "parameters": {
                "type": "object",
                "properties": {
                    "condo_name": {
                        "type": "string",
                        "description": "The name of the condominium or condo",
                    }
                },
                "required": ["condo_name"],
            },
        },
    }
]
We define the conversation and the available tools to send to the model. Each tool's parameters field follows the JSON Schema format.
Step 2: Get the Model's Response
response = client.chat.completions.create(
    model=MODEL,
    messages=messages,
    tools=tools,
    tool_choice="auto",
    max_tokens=4096
)
We get the model's response to the conversation and the available tools. With tool_choice set to "auto", the model decides whether to answer directly or request a function call.
This is what the response message looks like:
ChatCompletionMessage(
    content=None,
    role='assistant',
    function_call=None,
    tool_calls=[
        ChatCompletionMessageToolCall(
            id='call_2nvf',
            function=Function(
                arguments='{"condo_name":"Arton"}',
                name='get_condo_price'
            ),
            type='function'
        )
    ]
)
Step 3: Check if the Model Wanted to Call a Function
response_message = response.choices[0].message
tool_calls = response_message.tool_calls
if tool_calls:  # truthy when the model requested at least one tool call
    ...
We check if the model wanted to call a function.
Step 4: Call the Function
available_functions = {
    "get_condo_price": get_condo_price,
    "get_condo_location": get_condo_location,
}
messages.append(response_message)  # extend conversation with assistant's reply
for tool_call in tool_calls:  # loop through tool_calls and get the function to call
    function_name = tool_call.function.name
    function_to_call = available_functions[function_name]
    function_args = json.loads(tool_call.function.arguments)
    function_response = function_to_call(
        condo_name=function_args.get("condo_name")
    )
    messages.append(
        {
            "tool_call_id": tool_call.id,
            "role": "tool",
            "name": function_name,
            "content": function_response,
        }
    )  # extend conversation with function response
We call the function and append the function response to the conversation.
Step 5: Get a New Response from the Model
second_response = client.chat.completions.create(
    model=MODEL,
    messages=messages
)  # get a new response from the model where it can see the function response
return second_response.choices[0].message.content
We get a new response from the model where it can see the function response.
Step 6: Testing the Conversation
prompt = "What is the price of Arton?"
print('Price:', run_conversation(prompt))
prompt = "Where is the Arton condo?"
print('Location:', run_conversation(prompt))
We test the conversation by providing two user prompts:
"What is the price of Arton?" and "Where is the Arton condo?"
Step 7: Running the Python app
python main.py
You should see the responses printed in your terminal.
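The exact wording varies from run to run, but the output should look something like this (illustrative, not verbatim):
Price: The price of Arton is Php 7,000,000.
Location: The Arton condo is located in Katipunan, Quezon City.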
Full Source Code
Here's the full source code of the example. Just save it in a main.py file.
from groq import Groq
import os
import json
import requests

client = Groq(api_key=os.getenv('GROQ_API_KEY'))
MODEL = 'llama3-70b-8192'

def get_condo_price(condo_name):
    """Get the price of a condominium"""
    return condo_price_api_call(condo_name)

def condo_price_api_call(condo_name):
    url = f'http://localhost:8080/get_condo_price?condo_name={condo_name}'
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        return data.get('price')
    else:
        return None  # Handle error cases here if needed

def get_condo_location(condo_name):
    """Get the location of a condominium"""
    return condo_location_api_call(condo_name)

def condo_location_api_call(condo_name):
    url = f'http://localhost:8080/get_condo_location?condo_name={condo_name}'
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        return data.get('location')
    else:
        return None

def run_conversation(user_prompt):
    # Step 1: send the conversation and available functions to the model
    messages = [
        {
            "role": "system",
            "content": "You are a function calling LLM that uses the data extracted from the function and responds to "
                       "the user with the result of the function. Do not mention anything about the tool call. "
                       "Just respond with the answer to the user prompt."
        },
        {
            "role": "user",
            "content": user_prompt,
        }
    ]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_condo_price",
                "description": "Get the price of a condo or condominium name",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "condo_name": {
                            "type": "string",
                            "description": "The name of the condominium or condo",
                        }
                    },
                    "required": ["condo_name"],
                },
            },
        },
        {
            "type": "function",
            "function": {
                "name": "get_condo_location",
                "description": "Get the location of a condo or condominium name",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "condo_name": {
                            "type": "string",
                            "description": "The name of the condominium or condo",
                        }
                    },
                    "required": ["condo_name"],
                },
            },
        }
    ]
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        tools=tools,
        tool_choice="auto",
        max_tokens=4096
    )
    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls
    # Step 2: check if the model wanted to call a function
    if tool_calls:
        # Step 3: call the function
        # Note: the JSON response may not always be valid; be sure to handle errors
        available_functions = {
            "get_condo_price": get_condo_price,
            "get_condo_location": get_condo_location,
        }  # the functions the model is allowed to call
        messages.append(response_message)  # extend conversation with assistant's reply
        # Step 4: send the info for each function call and function response to the model
        for tool_call in tool_calls:
            function_name = tool_call.function.name
            function_to_call = available_functions[function_name]
            function_args = json.loads(tool_call.function.arguments)
            function_response = function_to_call(
                condo_name=function_args.get("condo_name")
            )
            messages.append(
                {
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "name": function_name,
                    "content": function_response,  # must be a string; handle None (condo not found) in production
                }
            )  # extend conversation with function response
        second_response = client.chat.completions.create(
            model=MODEL,
            messages=messages
        )  # get a new response from the model where it can see the function response
        return second_response.choices[0].message.content

prompt = "What is the price of Arton?"
print('Price:', run_conversation(prompt))

prompt = "Where is the Arton condo?"
print('Location:', run_conversation(prompt))
Summary
In this blog post, we created an HTTP API in Go to simulate retrieving the data used to answer user questions and queries.
We explored the function calling feature of the Groq API, which allows us to call external functions to retrieve information. We defined two functions, get_condo_price and get_condo_location, and used the run_conversation function to send the conversation and available functions to the model.
We then checked if the model wanted to call a function, called the function, and appended the function response to the conversation. Finally, we tested the conversation with two user prompts. The function calling feature of the Groq APIs provides a powerful way to build conversational AI models that can interact with external systems.
If you're interested in learning more about integrating Go with Generative AI, follow my blog for more tutorials and insights. This is just the start!
I do live coding on Twitch and YouTube. You can follow me if you'd like to ask questions when I go live. I also post on LinkedIn; you can connect with me there as well.
Interested in implementing AI within your company? You can reach out to me.
Not yet sure how I can help? Book a FREE discovery call with me.