LangChain Tools & Memory Explained Deeply

LangChain is a powerful framework that helps you build intelligent applications on top of Large Language Models (LLMs). Two of its most important concepts are Tools and Memory. Tools give the AI the ability to call external systems or functions, while Memory saves conversation or state so you can manage multi-turn interactions.

In this blog we will understand both concepts in depth, along with practical code examples. ⚙️

🔧 What Are LangChain Tools?

Tools are essentially functions or APIs that let the AI model interact with the outside world, for example API calls, calculations, or web scraping.

🧩 Creating a Tool (Python Example)

from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Do numbers ka multiplication karta hai."""
    return a * b

# Check the tool's attributes
print(multiply.name)          # Output: multiply
print(multiply.description)   # Output: Multiplies two numbers.
print(multiply.args)          # Output: {'a': {'title': 'A', 'type': 'integer'}, 'b': {'title': 'B', 'type': 'integer'}}

# Invoke the tool
result = multiply.invoke({"a": 5, "b": 7})
print(result)  # Output: 35

In this example we created a simple multiply tool that multiplies two integers. The @tool decorator converts the function into a tool that an LLM can call.

🧰 Using Toolkits

If you have multiple related tools, you can group them into a toolkit:

# A minimal toolkit: a class that returns its related tools from get_tools()
class MathToolkit:
    def get_tools(self):
        return [multiply]

toolkit = MathToolkit()
tools = toolkit.get_tools()   # [multiply]

This makes it easy to pass the tools to an agent or a chain.

🤖 LangChain Agents and Tool Calling

Agents are the AI components that decide, based on the user's input, which tools to call. In LangChain we can bind tools to an LLM so that the model calls external functions as needed.

🔗 Tool Binding Example

from langchain.chat_models import init_chat_model

# Initialize the LLM (example: Anthropic Claude)
llm = init_chat_model("anthropic:claude-3-7-sonnet-latest", temperature=0)

# Tools list
tools = [multiply]

# Bind the tools to the LLM
llm_with_tools = llm.bind_tools(tools)

# The LLM can now request a tool call based on the input
response = llm_with_tools.invoke("Multiply 6 and 9")
print(response.tool_calls)  # e.g. [{'name': 'multiply', 'args': {'a': 6, 'b': 9}, ...}]

This is how you integrate tools with the LLM so that it can call external functions to answer more complex queries.
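Note that the bound model only produces a tool call; your code still has to execute the tool and send the result back. Below is a minimal sketch of that loop, reusing multiply and llm_with_tools from above (in LangChain, invoking a @tool with a tool_call dict returns a ToolMessage):

from langchain_core.messages import HumanMessage

# 1. The model decides which tool to call
messages = [HumanMessage("Multiply 6 and 9")]
ai_msg = llm_with_tools.invoke(messages)
messages.append(ai_msg)

# 2. Execute each requested tool call and append the result as a ToolMessage
for tool_call in ai_msg.tool_calls:
    messages.append(multiply.invoke(tool_call))

# 3. The model turns the tool result into a natural-language answer
final = llm_with_tools.invoke(messages)
print(final.content)  # e.g. "6 multiplied by 9 is 54."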

🧠 LangChain Memory: Evolution and the Modern Approach

The Old Memory System (v0.0.x)

Older versions of LangChain shipped memory classes such as ConversationBufferMemory, which stored the chat history and were attached directly to a chain. But this simple memory system was limited for multi-user or complex workflows.
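For context, here is a minimal sketch of that legacy style, assuming an older LangChain release where ConversationBufferMemory and ConversationChain are still available (both are deprecated today) and reusing the llm from the tool-binding section:

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Legacy pattern: the memory object is attached to the chain itself
memory = ConversationBufferMemory()
chain = ConversationChain(llm=llm, memory=memory)

chain.invoke({"input": "Hi, my name is Rahul"})
chain.invoke({"input": "What is my name?"})  # the buffer supplies the earlier turn

print(memory.buffer)  # the whole transcript is stored as one growing string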

The New Memory System: LangGraph Persistence

From LangChain v0.3 onward, LangGraph persistence is the recommended way to handle memory.

💾 Simple Memory Example with LangGraph

from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.memory import InMemorySaver

# Simple tool (the docstring becomes the tool's description)
def get_weather(city: str) -> str:
    """Returns the weather for a given city."""
    return f"Weather in {city} is always sunny!"

# Memory saver
checkpointer = InMemorySaver()

# Create the agent with memory
agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],
    prompt="You are a helpful assistant",
    checkpointer=checkpointer
)

# Conversation session id (thread_id)
config = {"configurable": {"thread_id": "session_123"}}

# Invoke the agent; the config tells the checkpointer which conversation to load and save
response = agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in Mumbai?"}]},
    config,
)

print(response["messages"][-1].content)  # the agent's final reply

In this example we created a weather tool, enabled memory on a LangGraph agent, and ran the conversation under a session id, so the conversation state is saved and reused in future queries.
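To see the memory in action, you can send a follow-up question on the same thread_id; the checkpointer restores the earlier turns, so the agent can resolve references like "there". A small illustrative continuation of the example above:

# Same thread_id -> the checkpointer automatically loads the previous messages
followup = agent.invoke(
    {"messages": [{"role": "user", "content": "And how is it there tomorrow?"}]},
    config,
)
print(followup["messages"][-1].content)  # "there" resolves to Mumbai thanks to memory

# A different thread_id starts a fresh conversation with no shared history
other_config = {"configurable": {"thread_id": "session_456"}}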

🧭 Migrating from Legacy Memory to LangGraph Memory

If you are using the older LangChain memory classes (like ConversationBufferMemory), migrating to LangGraph persistence is now the recommended path.
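In practice the migration mostly means replacing the chain-attached memory object with a checkpointer plus a thread_id per conversation. For production you would typically swap InMemorySaver for a durable checkpointer; here is a sketch assuming the optional langgraph-checkpoint-sqlite package is installed:

import sqlite3
from langgraph.checkpoint.sqlite import SqliteSaver
from langgraph.prebuilt import create_react_agent

# Durable checkpointer: conversation state survives process restarts
conn = sqlite3.connect("checkpoints.db", check_same_thread=False)
checkpointer = SqliteSaver(conn)

agent = create_react_agent(
    model="anthropic:claude-3-7-sonnet-latest",
    tools=[get_weather],
    checkpointer=checkpointer,
)

# Per-user history is keyed by thread_id, replacing the old ConversationBufferMemory object
response = agent.invoke(
    {"messages": [{"role": "user", "content": "What is the weather in Delhi?"}]},
    {"configurable": {"thread_id": "user_42"}},
)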

💡 Best Practices for Tools & Memory

Give every tool a clear docstring and type hints, because the LLM relies on the tool's name, description, and args schema to decide when and how to call it. Use a separate thread_id for every user or session so that conversations never mix, and while InMemorySaver is fine for development, prefer a durable checkpointer in production so state survives restarts. Finally, if you are still on the legacy memory classes, plan your migration to LangGraph persistence.

🧩 Summary

Both LangChain Tools and Memory make AI applications more powerful and flexible. Tools let the AI interact with the outside world, while memory lets you manage multi-turn and multi-user conversations.

The new LangGraph persistence system makes your agents production-ready, giving you stateful, resumable, and customizable memory 🔥
