The LangGraph ReAct function calling pattern offers a powerful framework to integrate various tools such as search engines, calculators, and APIs with an intelligent language model to create a more interactive and responsive system. This pattern is based on the Reasoning + Acting (ReAct) approach, which allows a language model to not only reason through queries but also take specific actions, such as calling external tools to retrieve data or perform calculations.
Learning objectives
- Understand the ReAct approach: Students will be able to explain the Reasoning + Acting (ReAct) approach and its importance in improving the capabilities of language models.
- Integrate external tools: Participants will gain the skills to integrate various external tools (e.g. APIs, calculators) into language models, facilitating dynamic and interactive responses to user queries.
- Develop graph-based workflows: Students will be able to build and manage graph-based workflows that effectively route user interactions between reasoning and tool invocation.
- Create custom tools: Participants will learn how to define and incorporate custom tools to extend the functionality of language models, enabling solutions tailored to specific user needs.
- Evaluate user experience: Students will evaluate the impact of the LangGraph ReAct function calling pattern on the user experience and understand how real-time data retrieval and intelligent reasoning improve engagement and satisfaction.
This article was published as part of the Data Science Blogathon.
What is the ReAct prompt?
The traditional ReAct prompt sets up the assistant with the following framework:
- Assistant capabilities: The assistant is presented as a powerful and evolving language model that can handle various tasks. The key part here is its ability to generate human-like responses, engage in meaningful discussions, and provide information based on large volumes of text.
- Access to tools: The assistant has access to various tools such as:
- Wikipedia Search: to retrieve data from Wikipedia.
- Web Search: to perform general online searches.
- Calculator: to perform arithmetic operations.
- Weather API: to retrieve weather data.
These tools allow the assistant to expand its capabilities beyond text generation to obtain real-time data and solve mathematical problems.
The ReAct pattern uses a structured format to interact with tools to ensure clarity and efficiency. When the assistant determines that it needs to use a tool, it follows this pattern:
Thought: Do I need to use a tool? Yes
Action: (tool name)
Action Input: (input to the tool)
Observation: (result from the tool)
For example, if the user asks “What's the weather in London?”, the assistant's thought process might be:
Thought: Do I need to use a tool? Yes
Action: weather_api
Action Input: London
Observation: 15°C, cloudy
Once the tool provides the result, the assistant responds with a final answer:
Final Answer: The weather in London is 15°C and cloudy.
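For reference, this loop can be expressed as a plain prompt template. The sketch below is purely illustrative; the template string and placeholder names are assumptions for this article, not part of any specific library:
# A minimal, hypothetical ReAct-style prompt template (for illustration only)
REACT_TEMPLATE = """You are a helpful assistant with access to these tools:
{tool_descriptions}

Use the following format:
Thought: Do I need to use a tool? Yes or No
Action: the tool to use, one of [{tool_names}]
Action Input: the input to the tool
Observation: the result returned by the tool
... (Thought/Action/Action Input/Observation can repeat)
Final Answer: the final answer to the user

Question: {user_question}"""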
Implementing the LangGraph ReAct function calling pattern
Let's implement the LangGraph ReAct function calling pattern by integrating a reasoner node and building a workflow that lets our assistant interact effectively with the tools we define.
Environment configuration
First, we will configure the environment to use the OpenAI model by importing the necessary libraries and initializing the model with an API key:
import os
from google.colab import userdata
# Setting the OpenAI API key
os.environ["OPENAI_API_KEY"] = userdata.get('OPENAI_API_KEY')
from langchain_openai import ChatOpenAI
#Initializing the language model
llm = ChatOpenAI(model="gpt-4o")
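If you are not running in Google Colab, google.colab.userdata will not be available; one common alternative (a minimal sketch, not part of the original article) is to read the key interactively:
# Alternative outside Colab: prompt for the key instead of using google.colab.userdata
import getpass
import os

if not os.environ.get("OPENAI_API_KEY"):
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")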
Tool definitions
Next, we define the arithmetic tools that the assistant can use:
def multiply(a: int, b: int) -> int:
    """Multiply a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b

# This will be a tool
def add(a: int, b: int) -> int:
    """Adds a and b.

    Args:
        a: first int
        b: second int
    """
    return a + b

def divide(a: int, b: int) -> float:
    """Divide a and b.

    Args:
        a: first int
        b: second int
    """
    return a / b
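These are ordinary Python functions; LangChain infers each tool's schema from the type hints and docstrings once they are bound to the model. A quick sanity check before binding:
# The tools behave like normal functions before they are bound to the LLM
print(multiply(6, 7))   # 42
print(add(10, 5))       # 15
print(divide(9, 2))     # 4.5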
In addition to these arithmetic functions, we include a search tool that allows the assistant to retrieve information from the web:
# search tools
from langchain_community.tools import DuckDuckGoSearchRun
search = DuckDuckGoSearchRun()
# Example search query to get Brad Pitt's age
search.invoke("How old is Brad Pitt?")
Output:
Brad Pitt. Photo: Amy Sussman/Getty Images. Brad Pitt is opening up about growing older. The Oscar winner, 60, and George Clooney, 63, spoke with GQ in an interview published on Tuesday, August 13 ... Brad Pitt marked his 60th birthday with a celebration at Mother Wolf in Los Angeles this week. One onlooker says the actor 'looked super happy' at the party, and 'everyone had a smile on their faces.'
Brad Pitt is an American actor born on December 18, 1963, in Shawnee, Oklahoma. He has starred in various films, won an Academy Award, and married Angelina Jolie. Brad Pitt rang in his six-decade milestone in a big way — twice! Pitt celebrated his 60th birthday on Monday, along with friends and his girlfriend, Ines de Ramon, 33, with "low key ...
Brad Pitt's net worth is estimated to be around $400 million. His acting career alone has contributed significantly to this, with Pitt commanding as much as $20 million per film. ... Born on December 18, 1963, Brad Pitt is 61 years old. His zodiac sign is Sagittarius who are known for being adventurous, independent, and passionate—traits ...
Binding tools to the LLM
We then bind the defined tools to the language model:
tools = [add, multiply, divide, search]
llm_with_tools = llm.bind_tools(tools)
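To confirm the binding works, you can ask the model something that requires a tool and inspect the tool_calls field of the returned message (this quick check assumes a valid OpenAI API key is set):
# The bound model returns an AIMessage whose tool_calls list names the tool it wants to run
result = llm_with_tools.invoke("What is 12 multiplied by 7?")
print(result.tool_calls)
# Expected shape (values may vary):
# [{'name': 'multiply', 'args': {'a': 12, 'b': 7}, 'id': '...', 'type': 'tool_call'}]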
Definition of the reasoner
The next step is to implement the reasoner function, which serves as the assistant's decision-making node. This function will use the bound tools to process user input:
from langgraph.graph import MessagesState
from langchain_core.messages import HumanMessage, SystemMessage
# System message
sys_msg = SystemMessage(content="You are a helpful assistant tasked with using search and performing arithmetic on a set of inputs.")
Node implementation
def reasoner(state: MessagesState):
    return {"messages": [llm_with_tools.invoke([sys_msg] + state["messages"])]}
Building the graph workflow
Now that we have our tools and reasoner defined, we can build the graph workflow that routes between reasoning and tool invocation:
from langgraph.graph import START, StateGraph
from langgraph.prebuilt import tools_condition  # routes to the tools node if the last message contains tool calls, otherwise to END
from langgraph.prebuilt import ToolNode
from IPython.display import Image, display
# Graph
builder = StateGraph(MessagesState)
# Add nodes
builder.add_node("reasoner", reasoner)
builder.add_node("tools", ToolNode(tools)) # for the tools
# Add edges
builder.add_edge(START, "reasoner")
builder.add_conditional_edges(
    "reasoner",
    # If the latest message (result) from node reasoner is a tool call -> tools_condition routes to tools
    # If the latest message (result) from node reasoner is not a tool call -> tools_condition routes to END
    tools_condition,
)
builder.add_edge("tools", "reasoner")
react_graph = builder.compile()
# Display the graph
display(Image(react_graph.get_graph(xray=True).draw_mermaid_png()))
Using the workflow
Now we can run user queries through the compiled graph. For example, if a user asks "What is 2 times Brad Pitt's age?", the system will first look up Brad Pitt's age using the DuckDuckGo search tool and then multiply that result by 2.
This is how we invoke the graph for a user query:
Example query: What is 2 times Brad Pitt's age?
messages = [HumanMessage(content="What is 2 times Brad Pitt's age?")]
messages = react_graph.invoke({"messages": messages})

# Displaying the response
for m in messages["messages"]:
    m.pretty_print()
To improve the capabilities of our assistant, we will add a custom tool that retrieves stock prices using the Yahoo Finance library. This will allow the assistant to answer finance-related queries effectively.
Step 1: Install the yfinance package
Before you begin, make sure the yfinance library is installed. This library will allow us to access stock market data.
!pip -q install yfinance
Step 2: Import the necessary libraries
Next, we import the library needed to interact with Yahoo Finance and define the function that gets the stock price based on the ticker symbol:
import yfinance as yf
def get_stock_price(ticker: str) -> float:
    """Gets a stock price from Yahoo Finance.

    Args:
        ticker: ticker str
    """
    stock = yf.Ticker(ticker)
    return stock.info["previousClose"]
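Note that stock.info can occasionally fail or be rate-limited by Yahoo. If that happens, one possible fallback (an assumption for this article, not part of the original code) is to read the most recent close from the price history:
# Hypothetical fallback if stock.info is unavailable: use recent price history instead
def get_last_close(ticker: str) -> float:
    """Returns the most recent closing price for a ticker via price history."""
    history = yf.Ticker(ticker).history(period="5d")
    return float(history["Close"].iloc[-1])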
Step 3: Try the custom tool
To verify that our tool is working correctly, we can make a test call to get the stock price of a specific company. For example, let's get the price of Apple Inc. (AAPL):
get_stock_price("AAPL")
Output
222.5
Step 4: Define the Reasoner function
Next, we need to modify the reasoner function to work with the new graph state that carries the user query. The function reads the query, appends it to the message history, and lets the LLM decide whether to call the stock price tool:
from langchain_core.messages import HumanMessage, SystemMessage
def reasoner(state):
    query = state["query"]
    messages = state["messages"]

    # System message indicating the assistant's capabilities
    sys_msg = SystemMessage(content="You are a helpful assistant tasked with using search, the yahoo finance tool and performing arithmetic on a set of inputs.")
    message = HumanMessage(content=query)
    messages.append(message)

    # Invoke the LLM with the messages
    result = llm_with_tools.invoke([sys_msg] + messages)
    return {"messages": [result]}
Step 5: Update the tools list
Now we need to add the newly created stock price function to our list of tools. This will ensure that our assistant can access this tool when necessary:
# Update the tools list to include the stock price function
tools = [add, multiply, divide, search, get_stock_price]
# Re-initialize the language model with the updated tools
llm = ChatOpenAI(model="gpt-4o")
llm_with_tools = llm.bind_tools(tools)
# Inspect the newly added tool
tools[4]
We will further improve the capabilities of our assistant by implementing a graph-based workflow that handles queries related to both arithmetic and stock prices. This section involves defining the state of our workflow, establishing nodes, and running various queries.
Step 1: Define the chart state
We'll start by defining the state of our graph using TypedDict. This allows us to manage and verify the different elements of our state, including the query, financial data, final answer, and message history.
from typing import Annotated, TypedDict
import operator
from langchain_core.messages import AnyMessage
from langgraph.graph.message import add_messages
class GraphState(TypedDict):
    """State of the graph."""
    query: str
    finance: str
    final_answer: str
    # intermediate_steps: Annotated[list[tuple[AgentAction, str]], operator.add]
    messages: Annotated[list[AnyMessage], operator.add]
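As an example, an initial state for a single query might look like the dictionary below; messages starts empty and grows through operator.add as the reasoner and tool nodes return new messages (the values shown are illustrative):
# Example initial state (illustrative values)
initial_state: GraphState = {
    "query": "What is the stock price of Apple?",
    "finance": "",
    "final_answer": "",
    "messages": [],
}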
Step 2: Create the state graph
Next, we will create an instance of the StateGraph class. This graph will manage the different nodes and transitions depending on the state of the conversation:
from langgraph.graph import START, StateGraph
from langgraph.prebuilt import tools_condition  # routes to the tools node if the last message contains tool calls, otherwise to END
from langgraph.prebuilt import ToolNode
# Graph
workflow = StateGraph(GraphState)
# Add Nodes
workflow.add_node("reasoner", reasoner)
workflow.add_node("tools", ToolNode(tools))
Step 3: Add edges to the graph
We will define how the nodes interact with each other by adding edges to the graph. Specifically, we want to ensure that after the reasoning node processes the input, it calls a tool or terminates the workflow based on the result:
# Add Edges
workflow.add_edge(START, "reasoner")
workflow.add_conditional_edges(
    "reasoner",
    # If the latest message (result) from node reasoner is a tool call -> tools_condition routes to tools
    # If the latest message (result) from node reasoner is not a tool call -> tools_condition routes to END
    tools_condition,
)
workflow.add_edge("tools", "reasoner")
react_graph = workflow.compile()
Step 4: View the graph
We can visualize the constructed graph to understand how our workflow is structured. This is useful for debugging and ensuring logic flows as intended:
# Show
display(Image(react_graph.get_graph(xray=True).draw_mermaid_png()))
Step 5: Run queries
Now that our workflow is configured, we can run several queries to test its functionality. We will provide different types of questions to see how well the assistant can answer.
Query 1: What is 2 times Brad Pitt's age?
response = react_graph.invoke({"query": "What is 2 times Brad Pitt's age?", "messages": ()})
response('messages')(-1).pretty_print()
response = react_graph.invoke({"query": "What is the stock price of Apple?", "messages": ()})
for m in response('messages'):
m.pretty_print()
Query 2: What is Apple's stock price?
response = react_graph.invoke({"query": "What is the stock price of the company that Jensen Huang is CEO of?", "messages": ()})
for m in response('messages'):
m.pretty_print()
Query 4: What will be the Nvidia stock price if it doubles?
response = react_graph.invoke({"query": "What will be the price of nvidia stock if it doubles?", "messages": ()})
for m in response('messages'):
m.pretty_print()
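Besides invoke, compiled LangGraph graphs also expose a stream method, which is handy for watching the reasoner and tool nodes alternate step by step. A brief sketch, assuming the same state shape used above:
# Stream intermediate node outputs instead of waiting for the final state
for step in react_graph.stream({"query": "What is 5 plus 7?", "messages": []}):
    print(step)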
Conclusion
The LangGraph ReAct function calling pattern provides a powerful framework for integrating tools with language models, improving their interactivity and responsiveness. The combination of reasoning and action allows the model to intelligently process queries and execute actions, such as retrieving data in real time or performing calculations. The structured workflow enables efficient tool use, allowing the assistant to handle diverse queries, from arithmetic operations to stock price retrieval. Overall, this pattern significantly improves the capabilities of intelligent assistants and paves the way for more dynamic user interactions.
Key takeaways
- Dynamic interactivity: The pattern integrates external tools with language models, enabling more engaging and responsive user interactions.
- ReAct approach: By combining reasoning and action, the model can intelligently process queries and invoke tools for real-time data and calculations.
- Versatile tool integration: The framework supports several tools, allowing the assistant to handle a wide range of queries, from basic arithmetic to complex data retrieval.
- Personalization: Users can create and embed custom tools, tailoring the assistant's functionality to specific applications and enhancing its capabilities.
The media shown in this article is not the property of Analytics Vidhya and is used at the author's discretion.
Frequently asked questions
Q1. What is the LangGraph ReAct function calling pattern?
Answer. The LangGraph ReAct function calling pattern is a framework that integrates external tools with language models to improve their interactivity and responsiveness. It allows models to process queries and perform actions such as data retrieval and calculations.
Q2. What is the ReAct approach?
Answer. The ReAct approach combines reasoning and action, allowing the language model to reason through user queries and decide when to call external tools for information or calculations, thus producing more accurate and relevant responses.
Q3. Which tools can be integrated with the language model?
Answer. Various tools can be integrated, including search engines (e.g. Wikipedia, web search), arithmetic calculators, real-time data APIs (e.g. weather, stock prices), and more.
Q4. How does the structured tool-calling format work?
Answer. The structured format guides the assistant in determining whether to use a tool based on its reasoning. It involves a series of steps: determining the need for a tool, specifying the action and inputs, and finally observing the result to generate a response.
Q5. Can the pattern handle complex queries?
Answer. Yes, the LangGraph ReAct function calling pattern is designed to handle complex queries by allowing the assistant to combine reasoning and tool invocation. For example, it can retrieve data in real time and perform calculations based on that data.