Without seeing more, this seems like it could be solved by not recomputing the entire history just to add on new data. Depends on what kind of math you're doing, however.
Some sort of checkpoint system could likely save significant IO.
What am I missing that requires you to recompute all data every day?
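To make the checkpoint idea concrete, here's a minimal sketch (all names and the data shape are hypothetical, since we don't know the actual pipeline): persist the running aggregates to disk so each day's run only folds in the new records instead of re-reading the full history.

```python
# Hypothetical checkpoint sketch: persist running aggregates so a daily
# job only processes new records, not the entire history.
import json
import os

CHECKPOINT = "daily_totals.json"  # hypothetical checkpoint file

def load_checkpoint():
    """Resume from the last saved state, or start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"last_seen": 0, "totals": {}}

def run_daily(new_records):
    """Fold only today's records into the checkpointed aggregates."""
    state = load_checkpoint()
    for rec in new_records:  # only the new data, not the full history
        key = rec["client"]
        state["totals"][key] = state["totals"].get(key, 0) + rec["amount"]
        state["last_seen"] = max(state["last_seen"], rec["id"])
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)
    return state["totals"]
```

Whether this applies depends on the math: it works for anything that folds incrementally (sums, counts, running matches), but not if each day's result genuinely depends on revisiting every historical record.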
It was receiving huge volumes of data from each financial client and matching it all up to try to find certain things, and matching it against existing historical data. Not ads or online tracking. So the loop was adding this data and recalculating everything. It had to be done sequentially; I can't remember the exact reason, but it was a good one.
I was only there a few months, as they were so dysfunctional I jumped to another job offer I'd received. We were having endless sprint meetings to "plan" this work when all it needed was someone experienced like me refactoring it for a couple of days. There were a lot of junior devs with senior developer titles, as everyone invariably got promoted every year. The funny thing about the sprint cards was that all the tasks I put up a 1 for, the other developers put up a 10 for, and all the ones I put up 10s for, they put up 1s for. That's what happens when you let junior devs have a say: no comprehension of what's hard and what isn't.
Before I went I did point out the multiple pretty obvious O(n²) loops they had in the main calculation loop, the results of which could easily be cached, but I don't know if it just went whoosh over their heads.
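For anyone wondering what that kind of fix looks like: a sketch below, with made-up data shapes since we don't know the real pipeline. The naive version rescans all of history for every record (O(n·m)); building a lookup table once drops it to roughly O(n+m).

```python
# Hypothetical sketch of caching the inner-loop result of an O(n^2) match.
# "trades" and "history" are stand-ins for the kind of data described.

def match_naive(trades, history):
    """O(n*m): rescans the entire history for every single trade."""
    matched = []
    for t in trades:
        for h in history:
            if h["ref"] == t["ref"]:
                matched.append((t, h))
    return matched

def match_cached(trades, history):
    """O(n+m): build the lookup once, then reuse it for every trade."""
    by_ref = {}
    for h in history:
        by_ref.setdefault(h["ref"], []).append(h)
    return [(t, h) for t in trades for h in by_ref.get(t["ref"], [])]
```

Same output, but on millions of records the cached version is the difference between hours and seconds.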
I'm pretty certain that if they'd just let me get on with it instead of holding up sprint cards in meetings, they'd have been down to doing the whole lot in a half-hour run a day, even in the short time I was there.
In my experience the first run of optimizing something like that usually doesn't take long and has huge benefits.