Back to Curriculum

Trend Discovery Engines: Automating Cultural Relevance

In 2026, content virality depends on low-latency relevance: if you are 24 hours late to a trend, you are irrelevant. In this lesson, we learn how to architect an autonomous discovery engine that grounds AI-generated scripts in real-time Pakistani trends.

🏗️ The Discovery Pipeline

  1. Google Trends (pytrends): Pulls the top 5 trending keywords in Pakistan every hour.
  2. NewsAPI / GNews: Scans for local headlines related to those keywords.
  3. Reddit/Social Scraper: Extracts sentiment and "localized lingo" from active threads (e.g., r/pakistan).
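Conceptually, the three stages feed one grounding object per trend. Here is a minimal orchestration sketch; the fetcher functions are stubs standing in for the real pytrends, NewsAPI, and Reddit calls, so every function name and sample value below is illustrative:

```python
def fetch_trends():
    # Stand-in for the hourly pytrends pull (top 5 PK keywords).
    return ["PSL 2026", "Karachi weather", "PKR rate", "T20 squad", "Load shedding"]

def fetch_headlines(keyword):
    # Stand-in for a NewsAPI/GNews query scoped to the keyword.
    return [f"Breaking: {keyword} update", f"Analysis: what {keyword} means"]

def fetch_social_signals(keyword):
    # Stand-in for a Reddit scrape (e.g., active r/pakistan threads).
    return {"sentiment": "positive", "lingo": ["scenes on hain"]}

def build_trend_context(limit=5):
    """Join the three pipeline stages into one grounding object per trend."""
    context = []
    for kw in fetch_trends()[:limit]:
        context.append({
            "trend": kw,
            "headlines": fetch_headlines(kw),
            "social": fetch_social_signals(kw),
        })
    return context

if __name__ == "__main__":
    for item in build_trend_context(limit=2):
        print(item["trend"], "->", len(item["headlines"]), "headlines")
```

Swapping the stubs for real API clients leaves `build_trend_context` unchanged, which is the point of keeping the stages decoupled.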

🛠️ Technical Snippet: Pytrends Logic for PK

from pytrends.request import TrendReq

# hl = host language; tz = timezone offset in minutes (Pakistan is UTC+5 -> 300)
pytrends = TrendReq(hl='en-US', tz=300)

# Get the daily trending searches for Pakistan
trending_searches_df = pytrends.trending_searches(pn='pakistan')

# The result is a single-column DataFrame; keep the top 5 keywords
top_trends = trending_searches_df[0].tolist()[:5]

print(f"Active PK Trends: {top_trends}")

🔍 Nuance: Sentiment Weighting

A trend is only useful if it has high "Sentiment Velocity": engagement accumulating quickly relative to the topic's age. Our discovery engine scores trends by upvotes and comments over time to decide whether a topic is worth generating a full video for.
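One simple way to score this is engagement per hour, with comments weighted above upvotes since they signal active discussion. The weights and threshold below are illustrative tuning assumptions, not a fixed standard:

```python
def sentiment_velocity(upvotes, comments, age_hours):
    """Engagement per hour; comments weighted 2x (illustrative weight)."""
    if age_hours <= 0:
        age_hours = 1  # avoid division by zero for brand-new threads
    engagement = upvotes + 2 * comments
    return engagement / age_hours

def worth_generating(upvotes, comments, age_hours, threshold=50.0):
    # threshold is an assumed tuning knob, not an industry constant
    return sentiment_velocity(upvotes, comments, age_hours) >= threshold

# Example: 300 upvotes, 120 comments in 4 hours -> (300 + 240) / 4 = 135.0
print(sentiment_velocity(300, 120, 4))
```

A thread with high totals but slow accumulation scores low here, which is the behavior we want: velocity, not volume.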


⚡ Practice Lab: The Trend Grounder

  1. Identify: Find one trending topic in Pakistan right now.
  2. Command: Ask an AI to "Draft a 15-second Reel script about this topic."
  3. Refine: Add the instruction: "Inject 3 specific details from the latest news headlines about this topic to ensure authenticity."
  4. Analyze: Note how the "News Grounding" makes the script feel alive.

📝 Homework: The Trend JSON

Build a Python script that pulls the top trend from Google Trends (PK) and outputs a JSON object containing: {trend_name, primary_reason, suggested_hook}.
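The expected output shape might look like the following. Only the three keys come from the assignment; every value here is a placeholder you would fill from your own pipeline:

```python
import json

trend_payload = {
    "trend_name": "Example PK Trend",            # top result from pytrends
    "primary_reason": "Spike in news coverage",  # your inference for why it is trending
    "suggested_hook": "Did you see what just happened in Lahore?",  # opening line for the Reel
}

print(json.dumps(trend_payload, indent=2))
```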