My five-year-old still struggles with the idea that his older sister will someday want to live on her own (or that he might!), but he seems to have finally come to terms with the fact that the Sun will eventually destroy the Earth. He’s certainly not the only kid I’ve met who has had to grapple with this idea; obviously the Sun is supposed to stay up there in the sky keeping us warm, not tear itself apart before collapsing into a smaller star in five billion years. It’s a lot to take in, but kids aren’t alone in wondering about this scenario. In fact, astronomers have only recently made sense of some aspects of our Sun’s eventual demise, since the models we had for a star’s life cycle didn’t always match what was being observed in space.
Shattered, but sparkling, stars
As a star begins to run out of fuel for its continuous chain of fusion reactions, one of the more dramatic steps of its decline is to become a planetary nebula. This process involves a star ejecting as much as half its mass into space in the form of gas and dust, leaving a smaller core behind. That core, as my third grader was happy to inform me, will then be a white dwarf, making for a significantly chillier solar system. However, the precise nature of the ejected gas and dust is what had astronomers scratching their heads, because the planetary nebulae we see around other stars all seemed to be too uniformly bright.
While our own Sun thankfully has a lot of life left in it, we have found planetary nebulae around other stars in other galaxies. For a period of around ten thousand years, they emit enough light to be detected, although those light levels oddly don’t vary all that much. According to our calculations, smaller stars should become smaller, dimmer planetary nebulae, but that wasn’t what astronomers were observing. Instead, the light from these ejected debris fields was so consistently bright that we could use the measured light levels to estimate how far away a galaxy was. So why weren’t older, smaller stars as dim as we thought they should be?
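The distance trick above is the classic "standard candle" idea: if we know how much light a source actually puts out, how bright it appears tells us how far away it is, via the inverse-square law. Here is a minimal sketch of that arithmetic, using the Sun's own luminosity and the flux we receive from it as made-up check values (the real planetary nebula luminosity function that astronomers use is considerably more involved):

```python
import math

def distance_from_brightness(luminosity_watts, flux_w_per_m2):
    """Inverse-square law: flux = L / (4 * pi * d^2), solved for distance d (meters)."""
    return math.sqrt(luminosity_watts / (4 * math.pi * flux_w_per_m2))

# Sanity check with familiar numbers: a source with the Sun's luminosity
# (~3.8e26 W) seen at the flux Earth receives from the Sun (~1361 W/m^2)
# should come out to about one astronomical unit (~1.5e11 m).
d = distance_from_brightness(3.828e26, 1361.0)
print(f"{d:.3e} m")  # ~1.496e11 m, i.e. about 1 AU
```

Because the bright end of planetary nebula brightness turned out to be so consistent from galaxy to galaxy, plugging measured fluxes into this kind of relation gave usable distance estimates, which is exactly what made the observed uniformity so puzzling.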
Too small to shine
The answer seems to be that, in a way, we were actually overestimating smaller stars’ brightness. It’s not that lower-mass stars emit too much light, but that they barely emit any detectable light at all. The bright lights we were seeing were all coming from planetary nebulae above a minimum mass, which is why they all emitted at least a minimum amount of light. The truly petite planetary nebulae never got bright enough to see, even accounting for newer understandings of how quickly the ejected debris can heat up. In short, a planetary nebula must be above a minimum mass threshold to emit a detectable amount of light.
As it turns out, that minimum mass seems to be just around the size of our own Sun. If our Sun were just a few percent smaller, its transition into a planetary nebula would be a much dimmer affair. For better or for worse, our Sun’s collapse will be a big enough event to be observed from distant galaxies.
My kids asked: Would people be able to move to Jupiter to be safe?
Well, Jupiter probably wouldn’t work as a gas giant, but what about a moon like Europa? That may actually be too close to the Sun as well, although the problem would arise before we ever have a planetary nebula. In around four billion years, our Sun is expected to start expanding into a red giant, vaporizing all the planets up to the Earth in the process. Technically, this also means that the Sun’s surface will cool a bit, but even then Mars and Jupiter would both be too close to have habitable temperatures. The moons of Saturn, Uranus, or Neptune might work, though, assuming whatever life exists a few billion years from now could make that trip but couldn’t manage the longer journey to a younger star altogether.
Source: What will happen when our sun dies? by University of Manchester, Science Daily