Part 3/14:
Cord’s story highlights his knack for problem-solving and innovation. After establishing his ISP, he pivoted to building web crawlers, notably one called "grub," aimed at aggregating news and other internet data. His ambition was to understand how news changed over time and, intriguingly, to apply sentiment analysis to day trading. These explorations led to early experiments in distributed web crawling, an idea inspired by SETI@home, in which the computational work of web indexing was spread across volunteers’ machines.
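To make the SETI@home analogy concrete, here is a minimal sketch of what a volunteer crawl client could look like: pull a work unit (a batch of URLs) from a central coordinator, fetch each page locally, and report the results back. The coordinator address, endpoints, and payload shapes are purely illustrative assumptions, not grub's actual protocol.

```python
# Hypothetical volunteer crawl client: fetch a work unit from a central
# coordinator, crawl the URLs locally, and send the results back.
# The coordinator URL and its /work and /results endpoints are assumptions
# for illustration only.

import json
import urllib.request

COORDINATOR = "http://coordinator.example.org"  # hypothetical central server


def get_work_unit() -> list[str]:
    """Ask the coordinator for a batch of URLs to crawl."""
    with urllib.request.urlopen(f"{COORDINATOR}/work") as resp:
        return json.load(resp)["urls"]


def crawl(url: str) -> dict:
    """Fetch one page and record what an indexer would care about."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
            return {"url": url, "status": resp.status, "bytes": len(body)}
    except Exception as exc:
        return {"url": url, "error": str(exc)}


def report(results: list[dict]) -> None:
    """Send crawl results back to the coordinator for indexing."""
    data = json.dumps(results).encode()
    req = urllib.request.Request(
        f"{COORDINATOR}/results",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    urls = get_work_unit()
    report([crawl(u) for u in urls])
```

The appeal of this design is that the coordinator only hands out URL batches and collects summaries, while the bandwidth- and CPU-heavy fetching happens on the volunteers' machines.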