
You've read the think-pieces about AI agents. You've watched the demos. You've maybe even followed a tutorial that ended with something impressive-looking that you didn't fully understand.
This tutorial is different. By the end, you'll have built an agent that does something genuinely useful: it monitors a set of websites, summarizes new content, and emails you a daily digest. You'll understand every line of code. And you'll have the foundation to build whatever comes next.
Total time: 2-4 hours depending on your setup speed.
Here's what we're building: an AI agent that fetches a set of pages you choose, compares each one against the version it saw last time, summarizes what's new with Claude, and emails you a single digest.
Real output looks like:
Subject: Your Daily Content Digest - January 22, 2026
Hacker News Top Stories
- New JavaScript runtime Bun reaches 1.0 (342 comments)
Summary: Bun is a new JS runtime focused on performance, built-in SQLite,
and test runner. Benchmarks show 3-5x faster startup than Node.
Anthropic Blog
- Claude 3.5 Haiku now available in the API
Summary: New model with improved speed and reduced cost for high-throughput
applications. Context window maintained at 200k tokens.
[2 more items...]
Useful. Real. Not a toy.
# Create project (Node 18+ assumed: it ships a global fetch, so no polyfill is needed)
mkdir content-digest-agent
cd content-digest-agent
npm init -y
npm pkg set type=module
# Install dependencies (dotenv is used at runtime, so it's a regular dependency)
npm install @anthropic-ai/sdk nodemailer cheerio dotenv
npm install -D typescript @types/node @types/nodemailer ts-node
# Minimal tsconfig so ts-node can handle the ESM-style imports used below
cat > tsconfig.json << 'EOF'
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "target": "ES2022",
    "strict": true
  }
}
EOF
# Create env file
cat > .env << 'EOF'
ANTHROPIC_API_KEY=your_anthropic_api_key
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=your@gmail.com
SMTP_PASS=your_app_password
DIGEST_TO=destination@email.com
EOF

For Gmail, you need an App Password, not your regular password. Get one at myaccount.google.com/apppasswords.
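If you'd rather not retype the full run command every time, an npm script helps. A small optional convenience (the script name digest is our own choice; src/agent.ts is built later in this tutorial):
# Optional: add a run script
npm pkg set scripts.digest="ts-node --esm src/agent.ts"
# Then, once src/agent.ts exists:
npm run digest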
The agent needs to fetch web pages. Let's build a clean abstraction:
// src/fetcher.ts
import * as cheerio from "cheerio";
export interface FetchedContent {
url: string;
title: string;
text: string;
links: string[];
fetchedAt: Date;
}
export async function fetchContent(url: string): Promise<FetchedContent> {
const response = await fetch(url, {
headers: {
"User-Agent": "ContentDigestBot/1.0 (educational project)",
},
signal: AbortSignal.timeout(10000), // 10 second timeout
});
if (!response.ok) {
throw new Error(`Failed to fetch ${url}: ${response.status}`);
}
const html = await response.text();
const $ = cheerio.load(html);
// Strip scripts, styles, and page chrome (nav, header, footer) from the extracted text
$("script, style, nav, footer, header").remove();
// Extract main content
const title = $("title").text().trim() || $("h1").first().text().trim() || url;
const text = $("body").text()
.replace(/\s+/g, " ")
.trim()
.slice(0, 5000); // Limit for token budget
// Get important links
const links: string[] = [];
$("a[href]").each((_, el) => {
const href = $(el).attr("href");
if (href && href.startsWith("http")) {
links.push(href);
}
});
return {
url,
title,
text,
links: [...new Set(links)].slice(0, 20), // dedupe, limit
fetchedAt: new Date(),
};
}
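Before wiring the fetcher into the agent, it's worth a quick sanity check. A minimal sketch (the script path and test URL are our own choices):
// scripts/test-fetch.ts, a one-off check, not part of the agent itself
import { fetchContent } from "../src/fetcher.js";

const content = await fetchContent("https://news.ycombinator.com/");
console.log(content.title);
console.log(content.text.slice(0, 200));
console.log(`${content.links.length} links captured`);
Run it with npx ts-node --esm scripts/test-fetch.ts. If a title and a readable slab of text come back, the extraction works.
This is the agent's brain. It takes raw content and extracts meaningful summaries: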
// src/analyzer.ts
import Anthropic from "@anthropic-ai/sdk";
import { FetchedContent } from "./fetcher.js";
export interface ContentAnalysis {
url: string;
title: string;
isNew: boolean;
summary: string;
keyPoints: string[];
importance: "high" | "medium" | "low";
}
const anthropic = new Anthropic();
export async function analyzeContent(
content: FetchedContent,
previousContent?: string
): Promise<ContentAnalysis> {
const systemPrompt = `You are a content analyst. Given web page content, you:
1. Determine if there is genuinely new/updated content (compared to previous version if provided)
2. Summarize what's new in 2-3 sentences
3. Extract 2-4 key points
4. Rate importance as high/medium/low based on novelty and relevance
Respond with valid JSON only, shaped exactly like:
{"isNew": boolean, "summary": string, "keyPoints": string[], "importance": "high" | "medium" | "low"}
No markdown. No explanation outside the JSON.`;
const userMessage = previousContent
? `Current content:\n${content.text}\n\nPrevious content:\n${previousContent}`
: `Content to analyze:\n${content.text}`;
const message = await anthropic.messages.create({
model: "claude-haiku-4-20250514",
max_tokens: 1024,
system: systemPrompt,
messages: [
{
role: "user",
content: userMessage,
},
],
});
const responseText = message.content[0].type === "text" ? message.content[0].text : "{}";
let parsed;
try {
parsed = JSON.parse(responseText);
} catch {
// If parsing fails, provide a safe default
parsed = {
isNew: false,
summary: "Unable to analyze content at this time.",
keyPoints: [],
importance: "low",
};
}
return {
url: content.url,
title: content.title,
isNew: parsed.isNew ?? false,
summary: parsed.summary ?? "",
keyPoints: parsed.keyPoints ?? [],
importance: parsed.importance ?? "low",
};
}
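// For reference, a well-formed model response parsed above looks like this
// (a hypothetical example; the actual wording will vary from run to run):
//   {"isNew": true, "summary": "Bun 1.0 shipped with built-in SQLite...",
//    "keyPoints": ["3-5x faster startup than Node"], "importance": "high"}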
export async function generateDigestEmail(
analyses: ContentAnalysis[]
): Promise<string> {
const newContent = analyses.filter(a => a.isNew && a.importance !== "low");
if (newContent.length === 0) {
return "<p>No significant new content today.</p>";
}
const message = await anthropic.messages.create({
model: "claude-sonnet-4-20250514",
max_tokens: 2048,
messages: [
{
role: "user",
content: `Create a clean, professional email digest from these content summaries.
Format as HTML. Return only the HTML fragment, with no markdown fences. Be concise. Lead with the most important items.
Content analyses:
${JSON.stringify(newContent, null, 2)}`,
},
],
});
return message.content[0].type === "text" ? message.content[0].text : "<p>Error generating digest.</p>";
}

// src/mailer.ts
import nodemailer from "nodemailer";
const transporter = nodemailer.createTransport({
host: process.env.SMTP_HOST,
port: parseInt(process.env.SMTP_PORT || "587"),
secure: false, // port 587 upgrades to TLS via STARTTLS; set true only for port 465
auth: {
user: process.env.SMTP_USER,
pass: process.env.SMTP_PASS,
},
});
export async function sendDigest(htmlContent: string): Promise<void> {
const date = new Date().toLocaleDateString("en-US", {
weekday: "long",
year: "numeric",
month: "long",
day: "numeric",
});
await transporter.sendMail({
from: `Content Digest <${process.env.SMTP_USER}>`,
to: process.env.DIGEST_TO,
subject: `Your Content Digest - ${date}`,
html: `
<!DOCTYPE html>
<html>
<body style="font-family: sans-serif; max-width: 600px; margin: 0 auto; padding: 20px;">
<h1 style="color: #333; border-bottom: 2px solid #eee; padding-bottom: 10px;">
Content Digest
</h1>
<p style="color: #666; font-size: 14px;">${date}</p>
${htmlContent}
<hr style="margin-top: 40px; border: none; border-top: 1px solid #eee;">
<p style="color: #999; font-size: 12px;">Generated by your AI content agent</p>
</body>
</html>
`,
});
}
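SMTP credentials are the most common failure point, so verify them before the first scheduled run. A minimal sketch using nodemailer's verify(), which connects and authenticates without sending mail (the script path is our own choice):
// scripts/verify-smtp.ts, a one-off check
import "dotenv/config";
import nodemailer from "nodemailer";

const transporter = nodemailer.createTransport({
  host: process.env.SMTP_HOST,
  port: parseInt(process.env.SMTP_PORT || "587", 10),
  secure: false,
  auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
});

// verify() opens a connection and logs in, without sending anything
transporter.verify()
  .then(() => console.log("SMTP connection OK"))
  .catch((err) => console.error("SMTP check failed:", err));
This is the main agent loop that ties everything together: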
// src/agent.ts
// Load .env before anything else. In ESM, import statements are hoisted, so a
// plain dotenv.config() call here would execute *after* mailer.ts has already
// read process.env. The "dotenv/config" side-effect import loads it in time.
import "dotenv/config";
import * as fs from "fs/promises";
import * as path from "path";
import { fetchContent } from "./fetcher.js";
import { analyzeContent, generateDigestEmail, type ContentAnalysis } from "./analyzer.js";
import { sendDigest } from "./mailer.js";
// URLs to monitor - customize this list
const MONITOR_URLS = [
"https://news.ycombinator.com/",
"https://www.anthropic.com/news",
"https://openai.com/blog",
// Add your own
];
const CACHE_DIR = "./.cache";
async function loadCache(url: string): Promise<string | undefined> {
// base64url (not base64) keeps "/" out of the generated file name
const cacheFile = path.join(CACHE_DIR, Buffer.from(url).toString("base64url") + ".txt");
try {
return await fs.readFile(cacheFile, "utf-8");
} catch {
return undefined;
}
}
async function saveCache(url: string, content: string): Promise<void> {
await fs.mkdir(CACHE_DIR, { recursive: true });
const cacheFile = path.join(CACHE_DIR, Buffer.from(url).toString("base64url") + ".txt");
await fs.writeFile(cacheFile, content);
}
async function runAgent(): Promise<void> {
console.log(`Running content digest agent for ${MONITOR_URLS.length} URLs...`);
const analyses: ContentAnalysis[] = [];
for (const url of MONITOR_URLS) {
console.log(`Processing: ${url}`);
try {
// Fetch current content
const content = await fetchContent(url);
// Load previous version from cache
const previousContent = await loadCache(url);
// Analyze with AI
const analysis = await analyzeContent(content, previousContent);
analyses.push(analysis);
// Save current version to cache
await saveCache(url, content.text);
console.log(` Status: ${analysis.isNew ? "NEW CONTENT" : "unchanged"} (${analysis.importance})`);
// Respect rate limits: wait 1 second between requests
await new Promise(resolve => setTimeout(resolve, 1000));
} catch (error) {
console.error(` Error processing ${url}:`, error);
}
}
// Generate and send digest
const newItems = analyses.filter(a => a.isNew).length;
console.log(`\nFound ${newItems} sources with new content`);
if (newItems > 0) {
const emailHtml = await generateDigestEmail(analyses);
await sendDigest(emailHtml);
console.log("Digest sent successfully!");
} else {
console.log("No new content. Skipping email.");
}
}
runAgent().catch(console.error);

# Run once
npx ts-node --esm src/agent.ts
# Schedule daily with cron (on Linux/Mac)
# Add to crontab: crontab -e
0 8 * * * cd /path/to/content-digest-agent && npx ts-node --esm src/agent.ts >> /tmp/digest.log 2>&1
# Or run on a schedule with Node
// Replace the final runAgent() call in agent.ts with:
const INTERVAL_HOURS = 24;
setInterval(() => runAgent().catch(console.error), INTERVAL_HOURS * 60 * 60 * 1000);
runAgent().catch(console.error); // Run immediately, then every 24 hours

This is a foundation. Here's what you can extend:
More sources. RSS feeds, Twitter/X API, Reddit API, Slack channels, Notion pages. Each source needs a fetcher that returns FetchedContent; see the RSS sketch after this list.
Better filtering. Add keyword filters, topic categories, or let Claude decide relevance based on your stated interests.
Different outputs. Slack webhook instead of email. Notion database instead of a digest. Push notification to your phone.
Smarter caching. Use a SQLite database instead of flat files. Store historical content for trend analysis.
Tool use. Give the agent MCP tools (see the MCP tutorial) so it can search for related content, look up definitions, or cross-reference what it finds.
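To make the first of those concrete, here is a minimal sketch of an RSS 2.0 adapter (the file name, fetchRss, and the field handling are our own assumptions, not part of the tutorial's code). Because it returns the same FetchedContent shape, the analyzer, cache, and mailer need no changes; the only edit in agent.ts is choosing which fetcher to call for which URL:
// src/rss.ts, a sketch of an RSS 2.0 adapter
import * as cheerio from "cheerio";
import { FetchedContent } from "./fetcher.js";

export async function fetchRss(url: string): Promise<FetchedContent> {
  const response = await fetch(url, {
    headers: { "User-Agent": "ContentDigestBot/1.0 (educational project)" },
    signal: AbortSignal.timeout(10000),
  });
  if (!response.ok) {
    throw new Error(`Failed to fetch ${url}: ${response.status}`);
  }
  // xmlMode makes cheerio parse the feed as XML rather than HTML
  const $ = cheerio.load(await response.text(), { xmlMode: true });

  // Flatten the latest items into one text blob the analyzer already understands
  const items: string[] = [];
  const links: string[] = [];
  $("item").slice(0, 20).each((_, el) => {
    const title = $(el).find("title").first().text().trim();
    const description = $(el).find("description").first().text().trim();
    const link = $(el).find("link").first().text().trim();
    items.push(`${title}: ${description}`);
    if (link.startsWith("http")) links.push(link);
  });

  return {
    url,
    title: $("channel > title").first().text().trim() || url,
    text: items.join("\n").slice(0, 5000), // same token budget as fetchContent
    links,
    fetchedAt: new Date(),
  };
}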
The pattern you've built here is the pattern. Fetch. Analyze. Act. Everything else is customization.
Beyond the specific content digest use case, you've built: a fetcher that turns messy web pages into structured data, an analysis step that turns that data into structured decisions, a cache that gives the agent memory between runs, and a delivery channel that turns decisions into action.
This is the anatomy of a real agent. The AI (Claude) provides intelligence. The code provides structure, reliability, and connection to the world. Together, they do something neither could do alone.
Your next agent will be faster to build. The one after that even faster. The concepts compound.
