
RE: LeoThread 2025-11-05 15-48


Part 3/14:

Cord’s story highlights his knack for problem-solving and innovation. After establishing his ISP, he pivoted to building web crawlers, most notably one called "grub," designed to aggregate news and other internet data. His ambition was to track how news changed over time and, intriguingly, to apply sentiment analysis to day trading. These explorations led to early experiments in distributed web crawling, an idea inspired by SETI@home, in which the computational work of web indexing was spread across volunteers’ machines.