Why were old camping lanterns radioactive?

Traditional camping lanterns used radioactive thorium to produce their bright white light.

Before LEDs, gas lanterns used small fabric mesh bags called mantles, impregnated with thorium nitrate. When heated by the flame, the thorium glowed with an intense white light. The radiation given off was low; the real danger was inhaling radioactive dust if a brittle mantle broke apart. Most modern brands now use safer alternatives such as yttrium.
Nerd Mode
The use of thorium in gas mantles traces back to Austrian chemist Carl Auer von Welsbach, who patented the gas mantle in 1885 and by 1891 had refined it to a mesh of 99% thorium dioxide and 1% cerium dioxide, the composition that produced the most efficient incandescent light. This specific recipe is known as the Welsbach mixture, and it revolutionized lighting before electricity became widespread.

Thorium is a naturally occurring radioactive element that emits alpha particles. When the gas flame heats the mantle to high temperatures, the thorium converts heat into visible light through a process called candoluminescence. This lets the lantern produce a brilliant white glow significantly brighter than the yellow flame of the burning fuel alone.

Safety concerns about thorium became more prominent in the late 20th century. While the external radiation dose from a single mantle is negligible, the alpha radiation becomes a health hazard if thorium is ingested or inhaled. That often happened when old, brittle mantles crumbled into fine dust during replacement or transport.

In response to these risks, major manufacturers like Coleman began phasing out thorium in the 1990s, replacing it with non-radioactive yttrium or zirconium compounds. Although these alternatives are slightly less efficient at producing light, they eliminate the radioactive risks associated with manufacturing and disposal.
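For the truly nerd-mode reader, the decay itself is easy to write down. As a sketch, assuming the isotope in question is thorium-232 (essentially all of naturally occurring thorium), it alpha-decays to radium-228 by ejecting a helium nucleus:

$$
{}^{232}_{90}\mathrm{Th} \;\longrightarrow\; {}^{228}_{88}\mathrm{Ra} \;+\; {}^{4}_{2}\mathrm{He} \quad (\text{alpha particle})
$$

With a half-life of roughly 1.4 × 10¹⁰ years, a single mantle decays extremely slowly, which is why the external dose is tiny even though alpha emitters are hazardous once inside the body.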
Verified Fact FP-0008456 · Feb 20, 2026
