Building Autonomous AI Agents with LangChain.js

LangChain agents can use tools autonomously: the model decides which tool to call and with what input. Here's the start of an agent setup:

```js
import { ChatOpenAI } from "@langchain/openai";
import { AgentExecutor, createOpenAIToolsAgent } from "langchain/agents";
import { DynamicTool } from "@langchain/core/tools";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const tools = [
  new DynamicTool({
    name: "calculator",
    description: "Performs math calculations",
    // Warning: eval() on model-supplied input is an injection risk;
    // prefer a proper expression parser in production.
    func: async (input) => {
      return String(eval(input));
      // …
```
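The `eval()` in the calculator tool will execute any JavaScript the model emits, not just arithmetic. A minimal safer variant (a sketch, not a library API: `safeCalculate` and its whitelist regex are hypothetical) validates the expression before evaluating it:

```javascript
// Hypothetical safer calculator tool body: only permit digits,
// whitespace, and basic arithmetic characters before evaluating.
function safeCalculate(input) {
  if (!/^[\d\s+\-*/().%]+$/.test(input)) {
    throw new Error(`Unsupported expression: ${input}`);
  }
  // Function() still evaluates JS, but the whitelist above rules out
  // identifiers, so no variables or function calls can be injected.
  return String(Function(`"use strict"; return (${input});`)());
}
```

A dedicated expression parser such as mathjs is more robust, but the whitelist keeps this sketch dependency-free.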

Implementing LLM Response Caching with Redis

Caching LLM responses saves money and cuts latency: identical prompts hash to the same key, so repeat requests are served from Redis instead of the API. Here's the start of the setup:

```js
import { createHash } from "crypto";
import Redis from "ioredis";

const redis = new Redis();
const CACHE_TTL = 3600; // 1 hour

// Hash the messages together with the model name, so responses from
// different models never collide under one cache key.
function hashPrompt(messages, model) {
  const content = JSON.stringify({ messages, model });
  return createHash("sha256").update(content).digest("hex");
}

async function cachedChat(messages, options = {}) {
  const { model = // …
```