Design a service that, every hour, generates a set of CSV files from a relational database and uploads them to a collection of partner FTP/SFTP servers, so that downstream batch jobs at those partners can consume them.

The service must:
- guarantee that partners never see partial or corrupt files;
- finish the entire hourly cycle within 60 minutes, even when some partners are slow or flaky;
- automatically retry transient failures without human intervention.

You should assume:
- 1–10 GB of new data per hour;
- 10–50 partner endpoints;
- file sizes ranging from a few KB to multi-GB;
- partner scripts that poll their inbound directories every few minutes.

Walk through the high-level architecture, the data flow, the retry and error-handling strategy, and how you would monitor and page on-call engineers when the hourly SLA is missed.
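As a starting point, the two delivery-side requirements — partners never seeing a partial file, and one slow endpoint not blowing the hourly deadline — can be sketched in Python. This is a minimal illustration, not a full design: `atomic_upload` uses the common upload-to-temporary-name-then-rename pattern (polling partner scripts only ever see the final name, which appears atomically on servers where rename is atomic), `with_retries` adds bounded exponential backoff with jitter, and `deliver_all` fans uploads out across partners under a deadline. The `transport` object is a hypothetical stand-in for a real SFTP client (e.g. paramiko's `SFTPClient`, whose `put` and `rename` methods have a compatible shape); all function and parameter names here are assumptions for the sketch.

```python
import concurrent.futures
import random
import time

def with_retries(fn, attempts=4, base_delay=0.5, retryable=(OSError,)):
    """Call fn(); on a retryable error, sleep with exponential backoff
    plus jitter and try again, up to `attempts` total tries."""
    for attempt in range(attempts):
        try:
            return fn()
        except retryable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure to the caller
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.5))

def atomic_upload(transport, local_path, remote_path):
    """Upload under a temporary name, then rename into place, so a partner
    polling the inbound directory never observes a half-written file."""
    tmp_path = remote_path + ".part"
    transport.put(local_path, tmp_path)      # partial data only ever lands in .part
    transport.rename(tmp_path, remote_path)  # final name appears all at once

def deliver_all(transports, local_path, remote_path, deadline_seconds,
                max_workers=8, attempts=4, base_delay=0.5):
    """Fan out uploads so one slow partner cannot serialize the hourly cycle.
    `transports` maps partner id -> transport. Returns the set of partner ids
    that failed or missed the deadline, for alerting/escalation."""
    failed = set()
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {
            pool.submit(with_retries,
                        lambda t=t: atomic_upload(t, local_path, remote_path),
                        attempts, base_delay): partner_id
            for partner_id, t in transports.items()
        }
        done, not_done = concurrent.futures.wait(futures, timeout=deadline_seconds)
        for fut in done:
            if fut.exception() is not None:
                failed.add(futures[fut])         # retries exhausted
        failed.update(futures[f] for f in not_done)  # missed the deadline
    return failed
```

In a discussion of the full design, the set returned by `deliver_all` is what would feed the monitoring path: a non-empty result at the end of the hour is the signal that the SLA was missed for those partners and that on-call should be paged. Note that `ThreadPoolExecutor`'s context-manager exit waits for in-flight uploads, so a production version would also need a way to abandon or hard-cancel transfers that overrun the deadline.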