From ClawHub to Loom: How I Executed a Seamless Project-Wide Rename in Python
Renaming ClawHub to Loom taught me how to refactor a mature Python codebase without breaking imports, tests, or team momentum.
I built a lightweight, real-time dashboard to monitor GhostGraph's distributed scraping workers using FastAPI, Redis Streams, and server-sent events.
I replaced sequential HTTP fetching with asyncio-powered concurrency—5 requests at a time—and slashed my crawl times by 70%.
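The "5 requests at a time" pattern is typically an `asyncio.Semaphore` bounding how many coroutines are in flight. A minimal sketch, with `asyncio.sleep` standing in for the real HTTP call:

```python
import asyncio

async def fetch(url: str, sem: asyncio.Semaphore) -> str:
    # Only `limit` coroutines can hold the semaphore at once, so at most
    # that many requests are in flight simultaneously.
    async with sem:
        await asyncio.sleep(0.01)  # stand-in for the actual HTTP request
        return f"fetched {url}"

async def crawl(urls: list[str], limit: int = 5) -> list[str]:
    sem = asyncio.Semaphore(limit)
    # gather() launches every task immediately; the semaphore, not gather,
    # enforces the concurrency cap.
    return await asyncio.gather(*(fetch(u, sem) for u in urls))

results = asyncio.run(crawl([f"https://example.com/{i}" for i in range(12)]))
```

Swapping `asyncio.sleep` for an async HTTP client call (e.g. `aiohttp` or `httpx`) keeps the same structure.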
I stopped silent network hangs in my Python crawler by layering signal-based hard timeouts over curl_cffi and adding IP rotation to preserve throughput.
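The value of a SIGALRM-based deadline is that it fires even when a blocking C-level call never returns control to Python, which socket-level timeouts cannot guarantee. A minimal sketch of that layering (Unix-only, main thread only; the wrapper name is illustrative):

```python
import signal

class HardTimeout(Exception):
    pass

def _raise_timeout(signum, frame):
    raise HardTimeout("request exceeded hard deadline")

def call_with_deadline(fn, seconds: float = 30.0):
    # SIGALRM interrupts the main thread even if fn() is stuck inside a
    # C extension (e.g. a hung curl_cffi transfer) that ignores timeouts.
    old = signal.signal(signal.SIGALRM, _raise_timeout)
    signal.setitimer(signal.ITIMER_REAL, seconds)
    try:
        return fn()
    finally:
        # Always cancel the timer and restore the previous handler.
        signal.setitimer(signal.ITIMER_REAL, 0)
        signal.signal(signal.SIGALRM, old)
```

Wrapping each fetch in `call_with_deadline` turns a silent hang into a catchable exception, at which point the crawler can rotate to a fresh IP and retry.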
How atomic Redis operations fixed state corruption during worker shutdowns in my distributed Vultr Crawler.
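One common way to make a shutdown-time state transition atomic in Redis is a Lua script run via `EVAL`: Redis executes the whole script as a single unit, so a worker killed mid-shutdown can never leave a job half-moved. A hypothetical sketch (key names and the job-list layout are illustrative, not from the post):

```python
# Atomically move one in-flight job back to the shared queue. Because Redis
# runs the whole script in one step, no other client can observe the job
# "popped but not yet requeued".
RELEASE_JOB = """
local job = redis.call('LPOP', KEYS[1])
if job then
  redis.call('RPUSH', KEYS[2], job)
end
return job
"""

def release_inflight(redis_client, worker_id: str, queue: str = "jobs:pending"):
    # redis-py's eval(script, numkeys, *keys) signature.
    return redis_client.eval(RELEASE_JOB, 2, f"jobs:inflight:{worker_id}", queue)
```

The same effect is achievable with `MULTI`/`EXEC` pipelines, but a script keeps the read-then-decide logic server-side.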
I replaced ARQ with a lightweight Redis Streams polling worker—cutting 6k+ lines and improving reliability across my scraping fleet.
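A Redis Streams polling worker reduces to a small loop around `XREADGROUP` and `XACK`. A hypothetical sketch assuming redis-py (stream, group, and consumer names are illustrative):

```python
def poll_once(r, stream="scrape:jobs", group="workers",
              consumer="w1", block_ms=5000):
    # XREADGROUP with ">" delivers entries no consumer in the group has
    # seen yet; XACK removes them from the pending-entries list once done.
    entries = r.xreadgroup(group, consumer, {stream: ">"},
                           count=10, block=block_ms)
    handled = []
    for _stream, messages in entries or []:
        for msg_id, fields in messages:
            handled.append(fields)  # hand off to the real scrape handler here
            r.xack(stream, group, msg_id)
    return handled
```

Unacked entries stay in the group's pending list, so a crashed worker's jobs can later be reclaimed with `XAUTOCLAIM`, which is most of the reliability story in far less code than a full task framework.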
How I replaced raw Postgres queries with a type-safe repository pattern in a production scraper—improving testability and long-term maintainability.
How I ripped out a brittle third-party integration and replaced it with a unified, maintainable worker model.
I replaced ARQ with my custom event-driven framework Motia to gain control, clarity, and reliability in my scraping workflows.
I replaced a tangled mess of Python workers with ARQ and Redis, cutting complexity and boosting reliability in my scraping pipeline.
How I used LLMs and ARQ to build a self-adapting, scalable web scraper that survives real-world site changes.
I turned a script-driven scraper into a fully observable web interface using FastAPI and server-rendered templates—no frontend framework needed.