Humans have faced, and will continue to face, a number of existential threats as they move through time.
The first were disease, famine, and environmental catastrophe.
The current one is dire: our weapons have outpaced our maturity. In other words, we might kill ourselves with nuclear and/or biological weapons before we are smart enough not to.
Assuming we make it through that, the next phase seems to be major asteroids, which hopefully we’d be getting pretty good at detecting and deflecting by that point.
Next is the big one: the sun is going to burn out, and we have to be gone by then.
And then the next star will burn out as well, and so on and so on.
But that’s not the real problem. The real problem is entropy: at some point, all the energy in the universe will have equalized to such a degree as to become unusable.
At that point there’s no solution other than the ultimate solution: leaving this universe.
I don’t know if that’s possible, but string theory and other models seem to offer the possibility.
What I do know is that if we ever create true AI on the scale of film and dream and nightmare, its primary mission will be escaping these various life-ending events, with heat death being the worst of them all.