Continuing from our previous post, we saw that the agent we defined is unable to remember our previous conversation, so its context is incomplete. In this post, we will equip our agent with memory and explore how to keep separate conversation threads. If you haven't read the first part of this series, we encourage you to do so before proceeding.

As we can see in the following example, our agent is unable to remember us immediately after we introduce ourselves.

show_final_message(
    agent_executor.invoke(
        {"messages": [HumanMessage(content="Hi, I'm Ismael Sciarra. Lookup my salary in the employee's table")]}
    )
)

show_final_message(
    agent_executor.invoke(
        {"messages": [HumanMessage(content="Who's my manager?")]}
    )
)
================================== Ai Message ==================================

Ismael Sciarra's salary is 7700.
================================== Ai Message ==================================

To help you find your manager, I need to know your employee ID or name. Could you please provide that information?

Introducing MemorySaver

LangGraph’s MemorySaver is an in-memory checkpointer that enables AI agents to retain context by saving and recalling conversation history, keyed by a thread identifier. This enhances the agent’s ability to provide coherent and contextually relevant responses, ensuring a more seamless and intelligent user interaction.
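To build intuition for what a checkpointer does, here is a minimal sketch in plain Python. This is an illustrative toy, not LangGraph's actual implementation: it just keeps one message history per `thread_id`, which is the essence of how separate conversation threads stay isolated.

```python
# Illustrative toy checkpointer (NOT LangGraph's implementation):
# one saved message history per thread_id.
class ToyCheckpointer:
    def __init__(self):
        self._threads = {}  # thread_id -> list of messages

    def load(self, thread_id):
        # Return the saved history for this thread (empty list if new).
        return list(self._threads.get(thread_id, []))

    def save(self, thread_id, messages):
        self._threads[thread_id] = list(messages)


memory = ToyCheckpointer()

# Turn one on thread-1: load (empty), append, save.
history = memory.load("thread-1")
history.append(("user", "Hi, I'm Ismael Sciarra."))
memory.save("thread-1", history)

# thread-2 is a different conversation, so it starts with no context.
print(memory.load("thread-2"))  # []
```

Because each turn is loaded from and saved back to the store under its `thread_id`, switching identifiers switches the entire remembered context, which is exactly the behavior we will observe with the real agent below.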

Providing the agent with memory is as simple as assigning it a MemorySaver and setting an identifier for our conversation. In this case, we define a conversation called thread-1.

from langgraph.checkpoint.memory import MemorySaver

agent_memory = create_react_agent(llm, tools, checkpointer=MemorySaver())
config = {"configurable": {"thread_id": "thread-1"}}

def print_stream(graph, inputs, config):
    for s in graph.stream(inputs, config, stream_mode="values"):
        message = s["messages"][-1]
        if isinstance(message, tuple):
            print(message)
        else:
            message.pretty_print()

inputs = {"messages": [("user", "Hi, I'm Ismael Sciarra. Lookup my salary in the employee's table")]}
print_stream(agent_memory, inputs, config)
================================ Human Message =================================

Hi, I'm Ismael Sciarra. Lookup my salary in the employee's table
================================== Ai Message ==================================
Tool Calls:
  sql_db_query (call_9HMad4wbaXEUpySZGBDHNWC0)
 Call ID: call_9HMad4wbaXEUpySZGBDHNWC0
  Args:
    query: SELECT salary FROM employees WHERE first_name = 'Ismael' AND last_name = 'Sciarra'
================================= Tool Message =================================
Name: sql_db_query

[(Decimal('7700'),)]
================================== Ai Message ==================================

Ismael Sciarra's salary is $7700.

Now the agent is able to remember us when we ask the next question.

print_stream(agent_memory, {"messages": [("user", "find any employee making more money than me?")]}, config)
================================ Human Message =================================

find any employee making more money than me?
================================== Ai Message ==================================
Tool Calls:
  sql_db_query (call_OVUAKFxDGozd4U1JfamJYfPd)
 Call ID: call_OVUAKFxDGozd4U1JfamJYfPd
  Args:
    query: SELECT * FROM employees WHERE salary > 7700
================================= Tool Message =================================
Name: sql_db_query

[(201, 'Michael', 'Martinez', 'MMARTINE', '1.515.555.0166', datetime.datetime(2014, 2, 17, 0, 0), 'MK_MAN', Decimal('13000'), None, 100, 20), (204, 'Hermann', 'Brown', 'HBROWN', '1.515.555.0169', datetime.datetime(2012, 6, 7, 0, 0), 'PR_REP', Decimal('10000'), None, 101, 70), (205, 'Shelley', 'Higgins', 'SHIGGINS', '1.515.555.0170', datetime.datetime(2012, 6, 7, 0, 0), 'AC_MGR', Decimal('12008'), None, 101, 110), (206, 'William', 'Gietz', 'WGIETZ', '1.515.555.0171', datetime.datetime(2012, 6, 7, 0, 0), 'AC_ACCOUNT', Decimal('8300'), None, 205, 110), 
............
............
............
, (177, 'Jack', 'Livingston', 'JLIVINGS', '44.1632.960032', datetime.datetime(2016, 4, 23, 0, 0), 'SA_REP', Decimal('8400'), Decimal('0.2'), 149, 80)]
================================== Ai Message ==================================

Here are the employees who are making more money than Ismael Sciarra:

1. Michael Martinez - Salary: $13000
2. Hermann Brown - Salary: $10000
3. Shelley Higgins - Salary: $12008
4. William Gietz - Salary: $8300
5. Steven King - Salary: $24000
6. Neena Yang - Salary: $17000
7. Lex Garcia - Salary: $17000
8. Alexander James - Salary: $9000
9. Nancy Gruenberg - Salary: $12008
10. Daniel Faviet - Salary: $9000
... and more.

Note that if we change the conversation identifier, the agent starts from a fresh context and is therefore unable to answer our question.

config = {"configurable": {"thread_id": "thread-2"}} # new thread

print_stream(agent_memory, {"messages": [("user", "Who's my manager?")]}, config)
================================ Human Message =================================

Who's my manager?
================================== Ai Message ==================================

To determine your manager, I need to know your name. Could you please provide your first and last name?

We return to the previous conversation and try again.

config = {"configurable": {"thread_id": "thread-1"}} # back to thread-1

print_stream(agent_memory, {"messages": [("user", "Who's my manager?")]}, config)
================================ Human Message =================================

Who's my manager?
================================== Ai Message ==================================

Your manager is Nancy Gruenberg.

It’s important to note that as we continue to interact with the agent, the context grows with every turn. This directly increases both response time and token cost, so it’s advisable to bound the conversation history in some way, for example by trimming or summarizing older messages.
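As an illustration, one simple strategy is to keep the first message (typically the system prompt) plus only the most recent turns. The `trim_history` helper below is a hypothetical sketch of this idea, not a LangChain API; LangChain does ship a `trim_messages` utility in `langchain_core.messages` that can serve a similar purpose against real message objects.

```python
# Hypothetical helper (not a LangChain API): bound the history passed to
# the model by keeping the first message plus the most recent turns.
def trim_history(messages, max_messages=6):
    if len(messages) <= max_messages:
        return list(messages)
    # Keep the first message (e.g. a system prompt) and the latest tail.
    return [messages[0]] + messages[-(max_messages - 1):]


history = [("system", "You are a helpful SQL agent.")] + [
    ("user", f"question {i}") for i in range(10)
]
trimmed = trim_history(history, max_messages=4)
# The system prompt survives and only the latest turns remain.
```

Here `trimmed` contains the system prompt followed by the three most recent questions, so the prompt sent to the model stays a fixed size no matter how long the conversation runs.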

To wrap up

In conclusion, integrating memory into an AI agent significantly enhances its capability to maintain context throughout interactions. This allows the agent to provide more accurate and relevant responses, fostering a more natural and efficient conversational experience. However, it’s crucial to manage this memory effectively, as prolonged conversations can increase response time and operational costs. By implementing strategies to limit and optimize memory usage, we can ensure that the AI agent remains both effective and efficient.

