The irony is that Karpathy presents the limit/epsilon definition of the derivative in the first half hour (quite well, IMO, even though he never actually says "epsilon"), which is very much a nuts-and-bolts explanation from calculus.
That said, when most people say "differential equations" they're usually thinking of analytical solutions, which is very much not necessary for practical ML.
I would say the limit/epsilon derivative is exactly the sort of thing the grandparent post is talking about. It's quite intuitive and requires hardly any mathematical foundation beyond basic geometry and algebra. You can understand topics that build on that simple concept without understanding the more formal definitions of the derivative.
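To make the "nudge by epsilon" idea concrete, here's a minimal sketch (not from the lecture itself, just an illustration of the same concept): approximate f'(x) as the change in f when x is bumped by a tiny h.

```python
def numerical_derivative(f, x, h=1e-6):
    """Approximate f'(x) with the limit definition: (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# d/dx of x^2 at x = 3 is exactly 6; the approximation lands very close.
slope = numerical_derivative(lambda x: x * x, 3.0)
print(slope)
```

All you need is algebra: plug in a small h, compute rise over run, and you have the slope a formal course would derive symbolically.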