It was a sudden, massive spike in DDoS (distributed denial of service) attacks against our system. Basically, instead of trying to hack into our system (which hasn't happened), the attackers just hammer us with so many bogus connection requests that our system has trouble responding to regular user/visitor traffic -- like a "flash mob" packing a store with so many people at once that regular customers can't do business there.
Because the attackers used a "botnet" of malware-infected PCs around the world, the bogus connection requests didn't all originate from just one or a few IPs that we could easily block, so it took some analysis to figure out which types of connection were the bogus ones and shut them out. Worse, our usual uptime monitoring tools weren't detecting that our system was down -- because technically it wasn't, just veeerrry sluggish to respond -- so we weren't even aware of the problem until our morning tech support crew started their shift and alerted DevOps personnel.
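For the technically curious, the monitoring gap boils down to probing response *time*, not just reachability: a site that answers eventually still looks "up" to a simple ping-style check. A minimal sketch of the idea, in Python -- the URL, the 5-second threshold, and the function names here are illustrative, not our actual tooling:

```python
import time
import urllib.request

# Assumption for illustration: any response slower than this counts as
# an outage from the user's point of view, even if it eventually succeeds.
SLOW_THRESHOLD_SECS = 5.0

def is_healthy(status, elapsed, threshold=SLOW_THRESHOLD_SECS):
    """A 200 that arrives too slowly is still treated as unhealthy."""
    return status == 200 and elapsed < threshold

def probe(url, timeout=30):
    """Time one request to `url` and return (healthy, elapsed_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read(1024)  # make sure at least some body bytes arrive
            status = resp.status
    except Exception:
        return False, time.monotonic() - start
    elapsed = time.monotonic() - start
    return is_healthy(status, elapsed), elapsed
```

The key difference from a plain up/down check is that last comparison: a slow success is reported as a failure, which is exactly the case our old monitoring missed.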
Dealing with DDoS traffic is unfortunately a daily fact of life for any ecommerce service these days; yesterday's event was simply a much bigger dose of it than usual, and for no apparent reason. We've added new system monitoring tools to catch this sort of thing sooner and more reliably in the future, and we're evaluating options to block DDoS traffic in a more automated and effective manner.
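We haven't settled on a specific mitigation product yet, but the core idea behind most automated blocking is per-source rate limiting: any single IP making an implausible number of requests in a short window gets dropped. A toy sketch of that idea (the limits here are invented, and real DDoS mitigation is considerably more involved, especially against botnets spread across many IPs):

```python
from collections import defaultdict, deque
import time

class RateLimiter:
    """Toy sliding-window limiter: block any IP exceeding
    `max_requests` connection attempts within `window` seconds."""

    def __init__(self, max_requests=100, window=10.0):
        self.max_requests = max_requests
        self.window = window
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        """Record one request from `ip`; return False if it should be dropped."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Forget requests that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: drop this request
        q.append(now)
        return True
```

Against a botnet, where each bot stays under any per-IP limit, this alone isn't enough -- which is why classifying the *type* of connection (as our team did by hand yesterday) matters, and why we're looking at more sophisticated automated options.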