
One of the most obvious things about looking at stars in the sky is that they’re not all the same brightness. A handful are so bright that you can easily see them even in a big city’s washed-out sky, while others are so faint that they’re invisible unless you’re stargazing on a moonless night from an essentially light-pollution-free locale (if you can find one). This varying visibility of stars is so obvious you may not have given it much thought.

Astronomers, however, think about it a lot. And astronomers, being scientists, decided they had to quantify it; in other words, throw math at it.

The first person we know of who did this was the Greek polymath Hipparchus, who created a star map noting the brightness of various stars more than two millennia ago. A few centuries later, another Greek astronomer, Ptolemy, attempted to classify stars using a six-tier scale, assigning the brightest stars to the first tier and the faintest ones to the sixth. This was the true origin of the magnitude scale, which astronomers still use today.

Outside astronomy, however, the magnitude scale sees little use—perhaps because it’s confusingly nonlinear! In other words, a magnitude 1.0 star is not six times brighter than a magnitude 6.0 star but rather brighter by a factor of 100. Although this logarithmic scaling (in which each tier is a multiplicative factor fainter than the one preceding it) may be counterintuitive, it is actually quite convenient: a linear scale encompassing the enormous range of stellar brightness would require far too many tiers to be useful.

The magnitude scale’s multiplicative factor is about 2.512. So a magnitude 1.0 star is 2.512 times brighter than a magnitude 2.0 star, and so on. By the time you get to magnitude 6.0, the star is 1 / (2.512 × 2.512 × 2.512 × 2.512 × 2.512), or about one one-hundredth, as bright as a magnitude 1.0 star.
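To make that arithmetic concrete, here’s a short Python sketch of the rule (the function name is mine, not standard astronomy software): each step of one magnitude is a factor of 100 ** (1/5), which is where the 2.512 comes from.

```python
# Each magnitude step is a factor of 100 ** (1/5) ~= 2.512 in brightness,
# so 5 magnitudes is exactly a factor of 100.

def brightness_ratio(mag_fainter: float, mag_brighter: float) -> float:
    """How many times brighter the lower-magnitude star is."""
    return 100 ** ((mag_fainter - mag_brighter) / 5)

# One magnitude step:
print(round(brightness_ratio(2.0, 1.0), 3))  # ~2.512

# A magnitude 1.0 star versus a magnitude 6.0 star (five steps):
print(brightness_ratio(6.0, 1.0))  # 100.0 exactly
```

Defining the step as the fifth root of 100 (rather than exactly 2.512) is what makes five steps come out to precisely a factor of 100.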

An obvious issue with this scale is that a fainter star merits a larger number. But you get used to this pretty quickly after a couple of nights under the stars—or, perhaps more realistically, after your first or second semester of graduate-level astronomy coursework.

Modernizing Ptolemy’s scale, however, required anchoring it to a precise, quantitative definition of how bright any star is at a given magnitude. In the 1850s astronomers chose the bright star Vega (which we met a couple of weeks ago in the Summer Triangle) for this purpose. In this system, by definition, Vega has a magnitude of 0.0. Fainter stars have a positive magnitude, and the handful of brighter stars have a negative magnitude.

The brightest star in the night sky is Sirius, visible to Northern Hemisphere observers in winter skies. It has a magnitude of –1.46. The next brightest is Canopus at –0.72, followed by the multiple stars of Alpha Centauri (which appear as just one to the unaided eye) at –0.3. The sun has an astonishing magnitude of –26.74. If you do the math, you’ll find it’s about 50 billion times brighter than Vega!
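You can check the sun-versus-Vega figure yourself with the same rule. This is a quick Python sketch using the magnitudes quoted above (the function name is my own):

```python
# Brightness ratio from a magnitude difference: every 5 magnitudes
# is a factor of 100, so the ratio is 100 ** (delta_mag / 5).

def brightness_ratio(mag_fainter: float, mag_brighter: float) -> float:
    return 100 ** ((mag_fainter - mag_brighter) / 5)

# Sun (magnitude -26.74) versus Vega (magnitude 0.0, by definition):
print(f"{brightness_ratio(0.0, -26.74):.2e}")  # ~5e10, i.e. about 50 billion
```

A difference of 26.74 magnitudes works out to 100 raised to the 5.348th power, which is roughly 5 × 10¹⁰.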

In general, when viewing from a dark, moonless site, most people with normal eyesight will see stars down to magnitude 6.0. There are roughly 9,000 stars in the sky brighter than “mag 6” (as those in the know slangily say), but not all are visible to the naked eye at the same time: we can only see the stars that are above our horizon. Plus, stars near the sun are invisible (seeing stars in the daytime is, at best, very difficult), so this cuts into the total number that are viewable at any one time. In general, that’s about 2,000.

That may seem far lower than you’d expect! When stargazing from a dark site for the first time, it seems like the sky is brimming with stars by the millions. But humans have a miserable ability to grasp large numbers—especially, perhaps, when we’re already overwhelmed by an awe-inspiring view of the starry vault.

Some people with exceptional eyesight can see objects fainter than mag 6. The gorgeous spiral galaxy M81 has been reliably seen by very keen-eyed observers, and it shines at a feeble magnitude of 6.8, less than half as bright as the faintest star most folks can spot.
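The “less than half as bright” claim follows from the same magnitude rule; here’s a brief Python check (function name mine):

```python
def brightness_ratio(mag_fainter: float, mag_brighter: float) -> float:
    # Every 5 magnitudes is a factor of 100 in brightness.
    return 100 ** ((mag_fainter - mag_brighter) / 5)

# M81 (magnitude 6.8) versus a magnitude 6.0 star:
factor = brightness_ratio(6.8, 6.0)
print(round(factor, 2))      # ~2.09: M81 is fainter by a bit more than 2x
print(round(1 / factor, 2))  # ~0.48: less than half as bright
```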

If you want to figure out your personal perceptual ranking for all this, simply visit a dark site away from city lights and make sure the moon isn’t up. You’ll also need time to let your eyes adapt to the darkness. It can take a few minutes for your pupils to widen to let in as much light as possible. Also, in dark conditions, the protein rhodopsin (sometimes delightfully called “visual purple” because of its color) builds up in the rod cells of your retinas, boosting their sensitivity to light. That process can take up to 20 minutes to reach its full potential—unless, that is, you make the mistake of spoiling it with too-bright light! White light is particularly effective at destroying rhodopsin, whereas red light leaves it relatively intact—which is why observational astronomers (and anyone else regularly laboring in dark conditions) often use red illumination as they work.

You can find a good star map online pretty easily; any search engine will point you to one. There are also several great astronomy apps you can download to identify stars and their brightness using a smartphone—just be sure to enable the “red light” settings to preserve your night vision! With these tools in hand, orient yourself by finding a familiar constellation overhead and settle down for your search. Start at a bright star (say, Vega or Deneb in the summer or Sirius or Rigel in winter), then look around for fainter stars nearby. Check your map and make a note of the faintest star you can see.

How low can you go? Where I used to live in Colorado, I could see mag 5 stars from my driveway after I got dark-adapted, but light pollution from nearby towns made fainter stars invisible. Generally speaking, the farther out you are from civilization, the fainter the stars you can see. Maybe you’re one of the lucky few who can see mag 7 stars. But if not, don’t fret. With binoculars and telescopes, you can see much fainter stars because they gather and concentrate more light into your eyes. With these tools, far fainter objects can be seen, and you’ll find stars everywhere you look in the heavens above.

Here’s wishing you clear skies, cool temperatures, a wonderful night spent under the stars—and a new appreciation of the magnitude of what you’re seeing.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.
