Inventions affect our everyday lives. Most of them are the result of long and laborious research that begins with curiosity and, in the best-case scenarios, ends with a groundbreaking innovation. In the course of this process, it frequently happens that, purely by chance, exciting discoveries are made that have little to do with the original research objectives. An interesting example of this happened to the Austrian scientist Friedrich Reinitzer when he was experimenting with carrots at the end of the 19th century: he discovered liquid crystals, without which today’s display technologies could never have been realized. If we believe their discoverers, Teflon, X-rays, and even microwave ovens owe their existence to pure coincidence. What all these inventors have in common, however, is the curiosity that motivated them to explore the unknown. And the courage to trust in pure chance.
Here are Eight Coincidental Discoveries That Affect Our Everyday Lives
Liquid Crystals
A stroke of luck paves the way to sharper, thinner and more touch-sensitive displays – and a technology that continues to go from strength to strength. The reasons for this lie in the molecular magic hidden away behind the thin transparent panels of monitor screens: liquid crystals enable us to view bright and brilliant displays of text and pictures on the screens of smartphones, tablets and flat-screen TVs and monitors. But who would have thought that Friedrich Reinitzer, a botanist and chemist, discovered the “apparently living crystals” as long ago as 1888, while conducting research into the substances contained in carrots?
At first, hardly anybody paid any attention to what the Austrian scientist had discovered. “An idiosyncrasy” was how the professional world at the time described the newly discovered substances. They take liquid form at certain temperatures, yet simultaneously display typical properties of solid crystals; what’s more, their translucency varies according to their orientation in an electric field. Ultimately, the German physicist Otto Lehmann coined the name “liquid-crystalline” for this entirely new state of matter.
Crystals as Electro-Optical Elements in Display Panels
Only decades after their discovery did scientists finally begin to investigate the particular properties of liquid crystals and employ them as electro-optical elements in monitor screens. In 1968, the US engineer George Heilmeier presented the first liquid crystal display to the world. In the nineteen-seventies, when scientists at Merck KGaA, Darmstadt, Germany successfully developed more stable liquid crystals with faster response times and the ability to display much sharper images, the triumphal march of liquid crystal displays (LCDs) progressed in leaps and bounds. Today, Merck KGaA, Darmstadt, Germany is one of the world’s leading suppliers of liquid crystals – and the trend towards ever larger, ultra-high-resolution monitor screens, panels and 3D TVs shows absolutely no signs of slowing down.


Teflon: How Useless Crumbs Revolutionized Modern Cooking
If your fried egg or pancakes slide effortlessly out of the skillet and on to your plate, that could well be due to a rather special material: Teflon. That this heat-resistant, non-stick polytetrafluoroethylene (PTFE) coating was a spin-off from the US space program has since proved to be merely a rumor. The true story is that it was discovered in 1938 by the US chemist Roy Plunkett during his search for a low-risk, non-toxic and non-volatile alternative to ammonia and sulfur dioxide as a coolant for refrigerators.
While experimenting with the gas tetrafluoroethylene (TFE), Plunkett found some colorless crumbs in a storage cylinder that awakened his curiosity. He determined that the gas had polymerized to PTFE, aka Teflon, in cold storage. But whatever he tried, the crumbs just lay there and showed absolutely no propensity to react with anything at all, so he finally wrote them off as utterly useless. It was the early 1940s before scientists found a use for the exceptionally non-reactive material: during the construction of the first atomic bomb, it was used to protect the vessels in which uranium hexafluoride was stored against corrosion.
A Non-Reactive All-Rounder With an Unexpected Future
This, according to the story, is how the worldwide kitchen revolution with Teflon began: the French chemist and angler Marc Grégoire coated his fishing line with PTFE to make it easier to untangle. His wife, Colette, thought it was a good idea, too – for the kitchen. In 1954, she and Georgette Wamant were granted a patent for coating pots and pans with Teflon. Even today, the hardwearing, non-stick universal coating can be found doing an excellent job in a multitude of applications and areas – in the kitchen, in hospitals, for particularly long-lasting medical implants, and, in the building industry, as a weather-resistant coating for facades.
Microwaves: A Melted Chocolate Bar Points the Way
Or how a small oven brings food to the boil in only a couple of seconds: microwaves make the water molecules in rice, sauces, or meat dance about. This produces heat – and your favorite deep-frozen dinner from the freezer is as hot as you like it in minutes. This kitchen revolution also came about purely by chance. And it was a stroke of luck for engineer Percy Spencer, whose curiosity led him to file around 300 patents in the course of his life.
His Successful Idea Began When He Put His Hand in His Pocket
In the 1940s, the US engineer was experimenting with magnetrons, oscillators that generated the high-power output needed for radar equipment in US military aircraft in the Second World War. He began to wonder why a chocolate bar he had in his pants pocket melted so quickly while he worked. That generators produce heat while they are running was nothing particularly new. Nevertheless, Spencer was the first to recognize that this heat could be used economically for heating food. The idea for the microwave oven was born. In 1947, the engineer presented his first model. Almost six feet tall and weighing in at around 750 pounds, it looked more like a cupboard than anything else.
The first dish prepared in a microwave oven was popcorn. The second was an egg – which went off with a bang while the scientist looked on. Before the microwave oven began its illustrious career as an indispensable feature of kitchens around the globe, it was primarily manufactured for use in passenger planes. Today, more than 70 percent of all German households have a microwave oven in the kitchen.
Nuclear Fission: From an Atomic Breakthrough to a Global Threat
The atomic age began in 1938 with a coincidental discovery: the proof that an atom could be split. The German chemist Otto Hahn, his assistant Fritz Strassmann and the physicist Lise Meitner had no idea how successful their experiments would be as they searched for a new element. They bombarded the radioactive element uranium with neutrons – and were suddenly confronted with a mystery. Instead of a new element, they found the much lighter elements barium and krypton among the products of the reaction.
The conclusion the three pioneers came to was spectacular: they had succeeded in splitting the atom. The trio also observed something else: each fission released around 200 megaelectronvolts (MeV) of energy, along with neutrons that triggered a chain reaction and multiplied the number of fissions. The amount of energy released was absolutely gigantic: atom for atom, it exceeded the energy produced by burning coal millions of times over.
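A rough back-of-envelope check shows why that “millions of times over” claim holds up. This is a hedged sketch: the ~200 MeV per fission comes from the text, while the ~4 eV released per carbon atom burned is an assumed, typical textbook value.

```python
# Compare the energy of one uranium fission with burning one carbon atom.
EV_TO_JOULE = 1.602e-19  # one electronvolt in joules

fission_energy_j = 200e6 * EV_TO_JOULE  # ~200 MeV per fission (from the text)
coal_energy_j = 4.0 * EV_TO_JOULE       # ~4 eV per carbon atom burned (assumed)

ratio = fission_energy_j / coal_energy_j
print(f"One fission releases ~{ratio:.0e} times the energy of burning one carbon atom")
# ratio works out to 5e7 -- tens of millions, matching the claim above
```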
Nuclear Energy – the Bright Future Soon Becomes a Dark Threat to Life on Our Planet
In the USA, the world’s first experimental reactor to utilize this new source of energy, the Chicago Pile-1 constructed by Enrico Fermi, went critical in 1942. Nuclear fission also soon awakened the interest of the military: 1945 saw the first successful detonation of a nuclear weapon by the United States Army at a testing site in New Mexico. Shortly afterwards, US atom bombs devastated the Japanese cities of Hiroshima and Nagasaki with an enormous toll of human lives. The meltdown at the nuclear power station in the Ukrainian city of Chernobyl in 1986 and the disaster at the Japanese Fukushima plant in 2011 opened the eyes of the world to the dangers of nuclear power generation. In the aftermath of Fukushima, Germany decided to opt out of nuclear power and gradually phase out all its atomic power stations.
X-Rays: The Mystery of the Strange Blue Light
Or how to look through things and see the invisible: following the groundbreaking discovery of this form of radiation by Wilhelm Conrad Röntgen, the term ‘X-ray eyes’ soon became a familiar catchphrase – and Röntgen’s pioneering discovery changed the face of medicine more or less overnight. The German physicist discovered the strange blue light in 1895, purely by coincidence: at the time, he was investigating the recently discovered cathode rays in his laboratory. One of his experiments produced a blue light that made a number of fluorescent crystals that happened to be in the room begin to glow. They even continued to glow after Röntgen completely covered the source of this mysterious light with a piece of black card.
Röntgen concluded that these rays were able to pass through solid materials – and wondered whether they could possibly also pass through the human body. When his wife laid her hand on a sensitized photographic plate, the mysterious rays, to which he gave the name ‘X-rays’, passed straight through the flesh and captured an image of the bones beneath. This X-ray picture of a hand with a wedding ring astounded the world. The chance discovery brought Röntgen the Nobel Prize in Physics, revolutionized diagnostic medicine, and paved the way to other important new findings, for instance the discovery and investigation of radioactivity.
Looking inside the DNA Helix
The first X-ray pictures predominantly showed the bones of the human body. Today, modern equipment even makes it possible to take a closer look at the body’s various organs. What’s more, computed tomography now enables the world of medicine to create three-dimensional X-ray pictures of the internal structures of the human body. X-rays continue to provide groundbreaking insights in areas from research science to cancer therapy – and have opened up new areas of investigation such as the analysis of the DNA helix.
Photography: The First Photograph Thanks Its Existence to a Thunderstorm
Painting pictures with light – a dream that inspired experimenters and inventors for centuries. As long ago as the 4th century BC, the Greek savant Aristotle was aware of the principle of the camera obscura – the “dark room” that was the forerunner of today’s cameras. Leonardo da Vinci, the Italian genius, described the pinhole camera as the equivalent of the human eye. But, throughout all these centuries, one burning question remained unanswered: how could this image painted by light be conserved for longer periods of time?
The first to come up with an answer was the French researcher and inventor Nicéphore Niépce, who managed to capture a lasting image in the 19th century – after an exposure time of eight hours. In 1837, the giant step forward towards the snapshots we know today was made by another Frenchman, Louis Jacques Mandé Daguerre – once again, purely due to an unexpected stroke of luck: Daguerre, a painter by profession, was caught outdoors in a thunderstorm while exposing a copper plate sensitized to light with silver iodide. The storm forced him to stop the exposure sooner than planned, and on returning to his studio, he stowed the exposed plate away in the cupboard where he kept his chemicals.
On The Way to Digital Photography
Next day, he could hardly believe his eyes: the scene captured on the plate was still visible! Daguerre discovered that vapors from spilled mercury in the cupboard must have reacted with the silver iodide to fix the image on the plate. The Daguerreotype process, named after its inventor, made it possible to expose photographs within only a few minutes, and stands as a milestone in the history of photography. Around the middle of the 19th century, the British scientist, inventor and pioneer of photography William Henry Fox Talbot brought his own negative-positive process onto the market; it soon overtook and replaced the Daguerreotype process and remained the standard for photography until it was relegated to a niche existence by the advent of digital photography.
The Archimedean Principle: Buoyancy in the Bathtub
Or why does a ship made from thousands of tons of steel stay afloat? Because the water it displaces creates the buoyancy that stops it sinking. In scientific terms: the water displaced by the hull weighs as much as the entire vessel, so the buoyant force balances its weight. And that’s why it floats on the surface. This is the Archimedean Principle in simple words – a principle that lets us calculate whether bodies will sink, be suspended, or float in gases or fluids.
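The sink-or-float rule can be sketched in a few lines of code. This is only an illustrative sketch: the densities and the hull volume below are rounded, assumed values.

```python
# Archimedes' float test: a body floats if its average density
# (mass divided by total enclosed volume) is below the fluid's density.
WATER_DENSITY = 1000.0  # fresh water, kg/m^3 (rounded textbook value)

def floats(mass_kg: float, volume_m3: float,
           fluid_density: float = WATER_DENSITY) -> bool:
    """True if the body's average density is less than the fluid's."""
    return mass_kg / volume_m3 < fluid_density

# One cubic meter of solid steel (~7850 kg) sinks ...
print(floats(7850.0, 1.0))    # False
# ... but the same steel shaped into a hull enclosing 20 m^3 floats:
# its average density drops to ~393 kg/m^3, well below that of water.
print(floats(7850.0, 20.0))   # True
```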
This principle also owes its existence to a chance meeting of curiosity and coincidence. More than 2,000 years ago, the King of Syracuse commissioned Archimedes, the Greek master of many sciences, to answer the following question: was the crown the king had ordered really made of pure gold through and through? As the sage took a bath in a tub filled to the rim with water, his brainwave put an end to the king’s worries: the water spilled over the rim, and the idea for the principle of buoyancy came to Archimedes in a flash. As the legend goes, he sprang from his bath and dashed naked through the streets of Syracuse shouting “Eureka!” – “I’ve found it!”
A Milestone of Physics Reveals a Dishonest Goldsmith
The king’s challenge was quickly resolved, without any damage to his precious crown: one after the other, Archimedes sank the king’s headwear and a bar of gold of the same weight in a vessel filled to the brim with water. Each time, he caught and measured the amount of water that overflowed. The result: the suspect crown displaced more water than the bar of gold – which revealed that its volume was greater, despite its identical weight. Archimedes rightly concluded that the crown was made of material with a lower density (less mass per unit volume) – and that some less precious metal had been fraudulently alloyed with the gold of the crown. He then put pen to papyrus to write down what still stands today as a milestone in the history of science.
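Archimedes’ comparison can be replayed with numbers. The masses and displaced volumes below are invented for illustration; only gold’s density of roughly 19.3 g/cm³ is a standard value.

```python
# Equal-mass crown and gold bar; the overflow volumes give the densities.
mass_g = 1000.0  # both objects weigh the same (assumed 1 kg)

gold_bar_displaced_cm3 = 51.8  # 1000 g / 19.3 g/cm^3 for pure gold
crown_displaced_cm3 = 64.0     # the suspect crown overflows more water (assumed)

density_bar = mass_g / gold_bar_displaced_cm3    # ~19.3 g/cm^3: pure gold
density_crown = mass_g / crown_displaced_cm3     # ~15.6 g/cm^3: an alloy

# A lower density at equal mass betrays the admixed, less precious metal.
print(density_crown < density_bar)  # True: the goldsmith cheated
```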
The Law of Gravity: The Apple That Fell from the Tree of Knowledge
Even today, the mystery remains as to what gravity really is and where it comes from. Isaac Newton, a student of mathematics, was painfully reminded of the existence of a universal force when, as the story goes, an apple fell on his head one day in 1665. Newton’s mind began to work overtime. Just why do all objects fall towards the center of the Earth? How far away from the planet can the force of attraction still be felt? Could this be an explanation for why the Moon remains fixed in its orbit around the Earth? And why the other planets have a stable orbit around the Sun?
The Sun Exerts a Force of Attraction on the Planets – Just like the Earth Does on the Apple
Later, after becoming a professor, the exceptional British scientist was the first to describe in mathematical terms how the force of gravity acts on the Earth and throughout the universe. In his work “Philosophiæ Naturalis Principia Mathematica” (The Mathematical Principles of Natural Philosophy), published in 1687, he provided the first comprehensive explanation of the force of gravity on the planet Earth, why the Moon circles the Earth, and why the planets maintain their orbits around the Sun. He postulated that, without the attractive force of the Sun, the planets would fly out of the solar system in a straight line. Yet the burning star at the center of our solar system exerts a pull on its planets – in the same way as the Earth’s attractive force makes the apple fall down, and not up.
The appearance of Newton’s “Three Laws of Motion” and his “Law of Gravity” made it possible to calculate the orbits of planets and the speed at which they circle the Sun. Thanks to Newton’s work, phenomena such as tidal forces and deviations in the distance of the Moon from the Earth can also be plausibly explained. Newton’s Laws went on to become the fundamental mainstays of classical physics.
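As a small illustration of what Newton’s laws make calculable, here is a sketch that derives the Earth’s orbital speed by balancing the Sun’s gravity against the centripetal force for an idealized circular orbit (G·M·m/r² = m·v²/r, hence v = √(G·M/r)). The constants are standard values; the real orbit is slightly elliptical.

```python
import math

# Newton's law of gravitation applied to a circular orbit:
# gravity supplies the centripetal force, so v = sqrt(G * M / r).
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # mass of the Sun, kg
R_EARTH_ORBIT = 1.496e11 # mean Earth-Sun distance, m

v = math.sqrt(G * M_SUN / R_EARTH_ORBIT)
print(f"Earth's orbital speed: {v / 1000:.1f} km/s")  # ~29.8 km/s
```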