Why we do not know whether the Big Bang was the beginning of the universe

Warm-up: Laws of nature have their limit

Newton’s law of motion (F=ma) is OK as long as you don’t go “too fast”. What does “too fast” mean here? It means close to the speed of light. So as long as you want to compute what happens to your car, that plane or this train, no problem, Newton is good. High-speed trains in Spain run at best at 0.000028% of the speed of light. So yes, that’s slow enough.
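To see just how slow that is, here is a quick back-of-the-envelope check (the ~310 km/h top speed I use for Spanish high-speed trains is my own rounded assumption):

```python
# Back-of-the-envelope: a high-speed train's velocity as a fraction of c.
# The ~310 km/h top speed assumed here is an approximation.
c = 299_792_458.0        # speed of light in vacuum, m/s
v_train = 310 / 3.6      # 310 km/h converted to m/s
fraction = v_train / c
print(f"{100 * fraction:.6f}% of c")  # ~0.000029% of c
```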

But when you play with things approaching the speed of light, Newton progressively gets it wrong. In the equation “F=ma”, Special Relativity multiplies the mass “m” by a factor which is 1 for small velocities, but progressively goes to infinity as you approach c (“c” is the symbol for the speed of light). No wonder it changes the math.
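That factor is the Lorentz factor, γ = 1/√(1 − v²/c²). A small sketch shows how it stays essentially at 1 for everyday speeds and blows up as v approaches c:

```python
import math

c = 299_792_458.0  # speed of light in vacuum, m/s

def gamma(v):
    """Lorentz factor: ~1 at everyday speeds, diverges as v -> c."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

for frac in (1e-6, 0.5, 0.9, 0.99, 0.9999):
    print(f"v = {frac} c  ->  gamma = {gamma(frac * c):.6f}")
```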

We know this thanks to Einstein’s theory of Special Relativity, and to the zillions of experiments which have proved it right since 1905. In plasma physics for example, my field of research, there’s no way you can understand today’s intense laser-plasma experiments without Special Relativity.


Newton’s law of gravity also has its allergies: gravitational fields that are “too strong”. It wants them weak. Here, “weak” means you’re much farther from the central mass than its “Schwarzschild radius”. Important detail: this Schwarzschild radius is proportional to the central mass. It grows with that mass.

This radius is usually so small that it fits way inside the mass itself. For example, the Schwarzschild radius of the Sun is only 3 km. But if you’re close enough to the Sun, like Mercury, you can detect tiny, tiny, deviations from Newton’s law of gravitation. Einstein’s “General Relativity” (GR) solved this.
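For the curious, the Schwarzschild radius is r_s = 2GM/c², and the Sun’s 3 km can be checked in a few lines (constants rounded):

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458.0    # speed of light, m/s
M_sun = 1.989e30     # mass of the Sun, kg

r_s = 2 * G * M_sun / c**2   # Schwarzschild radius, r_s = 2GM/c^2
print(f"Schwarzschild radius of the Sun: {r_s / 1000:.1f} km")  # ~3 km
```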

The bottom line? Every physical theory we know has its limits. Newton doesn’t like too fast a motion, or too close an approach to the Sun. Maxwell’s equations also have their limits, etc. And GR, does it have limits? Yes. A little thought experiment shows it. Here it goes.


General Relativity has its limits too

Take a mass M with the electric charge of a proton. Put an electron around it. Quantum Mechanics (QM) tells us the electron will settle at a distance from the mass M equal to the so-called “Bohr radius”. Then it can jump between energy levels and all that, but the Bohr radius is the typical distance at which it will orbit the central mass M.

The Bohr radius does not depend on the central mass M. Only on its electric charge.

Now, increase the central mass. A lot. If M is an everyday proton, its Schwarzschild radius is so incredibly smaller than the Bohr radius that no one cares about gravity. But since the Schwarzschild radius grows with M and the Bohr radius does not, the Schwarzschild radius must eventually reach the Bohr radius, for a large enough M (see figure below).

Clearly, squeezing such a mass inside its Bohr or Schwarzschild radius implies an immense density. But in principle, it’s inescapable: as you keep increasing the mass, both radii will eventually merge.
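We can even estimate the mass at which the two radii meet: setting 2GM/c² = a0 gives M = a0 c²/(2G). A rough sketch, with rounded constants:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458.0    # speed of light, m/s
a0 = 5.29177e-11     # Bohr radius, m (independent of the central mass)

# r_s = 2GM/c^2 grows with M while a0 does not, so they cross when
# 2*G*M/c^2 = a0, i.e. M = a0 * c^2 / (2G).
M_cross = a0 * c**2 / (2 * G)
print(f"crossover mass: {M_cross:.2e} kg")  # a few 10^16 kg
```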


Rs and a0

For a large enough central mass M, the Schwarzschild radius must catch up with the Bohr radius. The left figure is absolutely not to scale.


Now, for such a huge mass, what do QM and GR say? The Bohr radius was computed forgetting about gravity. And the Schwarzschild radius was computed forgetting about QM. But now that both radii are equal, we can no longer be so forgetful. How should we then modify QM and GR to describe the motion of the electron?

No one knows.

GR is no longer valid when QM effects must be accounted for, and vice versa. Like Newton’s law of motion is no longer valid when you approach the speed of light. Like Newton’s law of gravity is no longer valid when you’re too close to the Sun.

But while Einstein found how to extend Newton when you go too fast, or get too close to your sun, we still don’t know how to extend GR when QM has a word to say.

In case you’re familiar with the double-slit experiment in QM, here’s another thought experiment showing the same thing: GR has its limit, which we could write “GR+QM=?”.

Now we can talk about the Big Bang.


What does it have to do with the Big Bang?

I won’t elaborate too much on the Big Bang here. Let me just remind you that the idea was born with the observation of the expansion of the universe. That some have tried to interpret these observations otherwise, without success. And that such observations are perfectly in line with GR.

Since the universe is expanding today, we just have to rewind the movie to find that any 2 points of it must have been closer to each other in the past. That’s precisely what GR tells us. In addition, GR provides the mathematical description of how this distance changed with time (see the famous “FLRW metric”).

It just happens that at time t=0, the distance between any 2 points goes to 0. This is called the “Big Bang singularity”. So the question arises: is it physical? In other words, can we trust GR that it really happened, like we trust GR when our GPS tells us we’re here or there?

No. The reason for this is simple. As we approach t=0, the universe squeezes matter into an ever-decreasing volume. So the density goes up, together with the temperature. Not only do they go up, but they mathematically go up to infinity at the singularity. Now, if the density goes to infinity, sooner or later you must reach a point where our thought experiment described above applies. You must reach a point where GR and QM have to work together. Where GR can no longer ignore QM effects. That is, where GR fails.

There’s no way around it. The FLRW metric fails before it reaches t=0. We cannot trust it down to t=0. GR alone cannot tell whether the Big Bang singularity really happened, or not. It goes blind before that.


Any way out?

To resolve the singularity, that is, to know what really happened instead of these mathematical infinities at t=0, we need to marry GR with QM. Many bright people have been working on this for decades (Einstein included), so far without success, probably in part because it is extremely difficult to make experiments or observations that could discriminate between the various options.

The good old days when you could test GR with Mercury’s orbit and QM in your kitchen are gone. It took billions of dollars to test the Standard Model and to find the Higgs boson at CERN, and still, the accelerator they used for that is way too small to test any proposal of GR/QM unification.

So there are candidates out there, like String Theory or Loop Quantum Gravity, all of them untested, be it through observation or experiment. Interestingly, both String Theory and Loop Quantum Gravity could give a “Big Bounce”. Other kinds of “bounce cosmology” have also been proposed. Stephen Hawking also had a proposal, this one with a real beginning. But again, nothing certain, for none of this has been successfully tested the way GR or QM have.

One last word about the “BGV theorem”, frequently cited in this context. It is classical, which means it does not account for QM. Just GR here. Its conclusions are therefore very useful, but we know they do not describe the real world (it’s always useful to know the behavior of a model, even if you know it does not describe the real world). That’s what Sean Carroll tries to explain to William Lane Craig around minute 58 of this debate. He even shows the picture below, where Alan Guth, the “G” of BGV, says he thinks the universe may be past-eternal but that, basically, we don’t know.


We may further close this case with this text from Avi Loeb, who teaches cosmology at Harvard, and Paul Steinhardt, who does the same at Princeton. Toward the end, we find this sentence,

“Although most cosmologists assume a bang, there is currently no evidence—zero—to say whether the event that occurred 13.7 billion years ago was a bang or a bounce”

Finally, those who worry about entropy can rest assured that Alan Guth, Avi Loeb, Neil Turok or Paul Steinhardt, to name a few, also know about it and, for example, read this.

The current scientific answer to the question “did the universe have a beginning?” is therefore simple, and for simple reasons.

Current science simply doesn’t know.



No, decay rates don’t change easily

It is often said that the decay rates used in radiometric dating may have varied over time. Let’s look into that.

Warm up

The nucleus of an atom is determined solely by the number of protons and neutrons it contains. Around the nucleus gyrate some electrons, equal in number to the number of protons in the nucleus. For quantum mechanical reasons, these electrons are arranged in shells, pretty much like an onion (or Shrek).


(Very) schematic representation of an atom. There may be more than 3 shells, and more than one electron per shell. The total number of electrons is equal to the number of protons in the nucleus. During a disintegration, the ejected particle (black point) first meets the electrons of the inner shell.

Some nuclei, like our good old carbon-12 (6 protons, 6 neutrons) are stable. Put a billion in a box. Go for a walk and then take a look: the billion is still there.

Others, like the famous carbon-14 (6 protons, but 8 neutrons), are unstable. Put a billion in a box. Go for a walk and then take a look: there is no longer one billion. Some have changed into nitrogen-14 (7 protons, 7 neutrons). If your walk lasted about 5,730 years, you will find that half of your C14 nuclei have changed to nitrogen. This duration, 5,730 years, is the half-life of C14.
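The bookkeeping of such a walk follows the exponential decay law N(t) = N0 · (1/2)^(t/T), where T is the half-life. A quick sketch:

```python
HALF_LIFE_C14 = 5730  # years

def remaining(n0, t_years, half_life=HALF_LIFE_C14):
    """Number of nuclei left after t_years of radioactive decay."""
    return n0 * 0.5 ** (t_years / half_life)

print(remaining(1_000_000_000, 5730))   # one half-life: 500 million left
print(remaining(1_000_000_000, 11460))  # two half-lives: 250 million left
```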

Radioactivity is therefore a completely natural phenomenon (don’t tell Greenpeace), changing one nucleus into another. Out of the 3,000 or so known nuclei, about 250 are stable. The others are unstable, that is, radioactive.

When a radioactive nucleus decays, it ejects something. The most common decay modes are:

  • Ejection of two neutrons and two protons. This is “alpha” decay.
  • Ejection of an electron. This is “beta minus” decay.
  • Ejection of a positron (the antiparticle of the electron). This is “beta plus” decay.

The decay rate sets the pace at which the process occurs. The half-life of a radioactive atom is the time it takes for half of a sample to be transmuted. It is the fruit of the laws of physics. There are therefore 2 possibilities for half-lives to change:

  1. The laws themselves change, or
  2. Within the limits of these laws, something alters the half-lives.

Have the laws of nuclear physics changed?

Let me here recycle this article on the speed of light. There are literally millions of pieces of observational evidence that the laws of physics, nuclear physics included, have been the same for billions of years. Let me emphasize the observations of decay events millions of light-years away, whose rates are the ones we observe here and now (this is in the article).

Let me also remind you that changing the laws of nature implies losing the fine-tuning argument, as well as energy conservation (this is also in the article).

The laws of nuclear physics have not changed in the last billion years. This comes from observation. Let’s move on to the second possibility.

Can decay rates change… even if the laws of physics don’t?

Yes, they can. Let’s see that.

Radioactivity is a purely quantum phenomenon. Half-lives depend on the ejection probability of the particle leaving the nucleus. On its way out, this particle first meets the electrons of the inner shell of the atom. The ones closest to the nucleus.

We can imagine that if we remove these internal electrons by completely ionizing the atom (that is, removing all of its electrons), the ejected particle runs into a modified environment as it leaves the nucleus, which could alter the probability it had of getting out in the first place. Quite like it may be easier, or more difficult, to leave a room depending on whether the next room is crowded or not. And indeed, this can happen.

The record in this respect belongs to Rhenium-187 (75 protons, 112 neutrons). It undergoes beta-minus decay and turns into Osmium-187. Its half-life is 42 billion years. In 1987, theorists computed the half-life of the same atom, but stripped of its 75 electrons. They found… 14 years!! The experiment, very difficult to perform, was made in 1996 and found 33 years. An excellent theory/experiment agreement, considering the variation of the half-life at stake (tens of billions of years -> tens of years) and the challenges presented by this kind of experiment.
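The size of that change is worth pausing on. A two-line check of the ratio between the two measured half-lives:

```python
half_life_neutral = 42e9   # years, neutral Re-187
half_life_bare = 33        # years, fully ionized Re-187 (1996 measurement)

factor = half_life_neutral / half_life_bare
print(f"half-life divided by ~{factor:.1e}")  # a factor of about a billion
```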

For the curious mind, this dramatic change is explained, in part only, by the fact that the electron ejected during the beta-minus decay is more easily released from the nucleus if it does not meet other electrons on its way out (electrons repel each other). The rest is a matter of quantum mechanics.

A good starting point to learn more on decay rates variations is the Wikipedia section on the subject or this article (…or Chapter 5 of my book).

Can we then trust radioactive dating?

If, then, the laws of nature allow for large variations of half-lives, how can we consider them constant when dating?

Simply because the conditions required to change half-lives are extreme. At first glance, one could think, “Removing all the electrons? It doesn’t seem very complicated”. In fact, it is. To deprive hydrogen atoms of their unique electron, for example, a temperature of some 10,000 kelvins is required. To do the same with the 6 electrons of carbon, about 400,000 degrees. And for Rhenium-187, it will take more than 65 million degrees to remove its 75 electrons and divide its half-life by about 1 billion [1]. And these are lower temperature limits.

We could draw a parallel with tree dating through ring counting. Trees can burn, right? So how can ring counting be a reliable dating method? Simply because if I have a tree trunk before me, it means it did not burn.

So if the bone I want to date with C14 is in my hands, it means it has never been heated to more than 400,000 degrees. The electrons of the inner layers have been quiet, together with the half-life.


Let’s conclude by commenting on some alleged variations of decay rates with the Earth-Sun distance. Two comments:

  • The claimed variation is only 0.1%. Such a variation would translate into the dating, adding a 0.1% uncertainty to the final result. Considering that C14 already has error bars of the order of 10%, we see that plus or minus 0.1% will not change much. It’s as if you were told that the length of a bridge might have changed by one millimeter, while you can only measure it to within 1 meter anyway. You would politely acknowledge, while thinking that you will worry about it the day you have millimeter precision.
  • When people make an important observation like this one, others try to reproduce it. If 1 team, then 2, then 3 independently confirm the result, no doubt something is happening. But if other teams fail to reproduce the observation, there is a problem.
    In this case, those who tried to repeat the measurements did not detect any change of the decay rates. The reader can check it by having a look at the articles citing the original work.

A seemingly more established oddity (yet seldom, if ever, mentioned in radiometric dating discussions) is the so-called GSI anomaly. It has to do with unexpected variations of quite exotic decay rates of highly ionized atoms (so, exotic decays of exotic atoms). Several options have been proposed to explain the observations, but to date, none of them is considered to settle the issue. Indeed, people would be happy if known physics could not explain the thing, for it would mean we have, at last, an experiment breaking the “Standard Model”.

At any rate, such enigmas imply by no means that we don’t understand mundane decay rates like C14, in the same way that our current inability to explain high temperature superconductivity does not mean we don’t understand how the processor of my laptop works.

Summary: Observations tell the laws of nature have been the same for billions of years. These laws do allow decay rates to vary, but in such extreme conditions that if the object I want to date had been through them, it would be a pile of ashes.

We can count on the temporal constancy of half-lives for radiometric dating.


[1] These temperatures are what it takes to ionize a macroscopic number of atoms, that is, the typical amount of atoms you find in a real-life object (10^23, Avogadro’s number). In the Re187 experiment, scientists observed far fewer atoms than that (only 10^8), and used other techniques to ionize them.

No, the speed of light (in vacuum) hasn’t changed lately

You live 200 kilometers away from me. You just arrived at my place, having driven at an average of 100 km/h. So you left 2 hours ago. Therefore, you existed 2 hours ago. Simple.

There are many stars farther than 6,000 light-years. Light travels… at the speed of light. So their light left at least 6,000 years ago. Therefore, the universe existed at least 6,000 years ago. Simple.
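The reasoning is the same division in both cases:

```python
def travel_time(distance, speed):
    """time = distance / speed, for cars and starlight alike."""
    return distance / speed

print(travel_time(200, 100))  # the car: 2.0 hours
# For starlight, use light-years and a speed of 1 light-year per year:
print(travel_time(6000, 1))   # at least 6,000 years in transit
```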

6,000 light-years, in cosmological terms, is really next door. If the observable universe were as big as France from north to south (about 1,000 km), 6,000 light-years would literally be the tip of my nose. 6 centimeters. How can we explain that I can see farther than the tip of my nose, if light appeared only 6,000 years ago? This is, for young earth creationists, the “starlight problem”.
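The scale comparison can be checked in a couple of lines (the ~93 billion light-year diameter of the observable universe is the usual rounded figure):

```python
universe_ly = 93e9      # observable universe diameter, light-years (approx.)
france_m = 1_000_000    # France north to south, ~1,000 km, in meters

scaled = 6000 / universe_ly * france_m  # 6,000 ly at the scale of France
print(f"{scaled * 100:.1f} cm")  # about 6 cm: the tip of my nose
```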

A solution often proposed is that light did not always travel at the same speed: c, the letter that stands for the speed of light, would have changed in the past. That’s what I want to talk about here.


When possible and/or needed, I link the (free) arXiv version of the (usually behind a paywall) peer-reviewed papers.

What do observations say?

The speed of light is not a minor parameter that can be changed without touching the rest of physics. It is, for example, the only parameter involved in Maxwell’s equations, which describe the propagation of light. It is also one of the 3 parameters that appear in Einstein’s equations governing gravitation (General Relativity). Changing c means changing the laws of nature.

The constancy of the laws of nature in time is not a dogma. Indeed, as physicist Paul Dirac wrote,

It is usually assumed that the laws of nature have always been the same as they are now. There is no justification for this. The laws may be changing, and in particular quantities which are considered to be constants of nature may be varying with cosmological time. Such variations would completely upset the model makers.

Paul Dirac, On Methods in Theoretical Physics, June 1968, Trieste

It is precisely because many are well aware of this that the search for clues pointing to a variation of fundamental constants, c included, has been the subject of many studies.

Recently, in 2010, a 145-page article reviewed the state of the art in this matter. Here are some of the main points which indicate that no variation of c has been detected, including on cosmological timescales (that is, billion-year timescales):

  • Each atomic nucleus, each atom, each molecule emits light on a series of wavelengths of its own. This is called its “spectrum”. A bit like its barcode. And whether we observe the tip of my nose (6,000 light-years), the end of my desk (100,000) or even farther, we see the same barcodes. The same nuclei, the same atoms, the same molecules, governed by the same laws which depend, among others, on c.
    The number of observations is mind blowing. The SIMBAD database, for example, contains spectral measurements for more than 9 million celestial objects in our galaxy (less than 100,000 light-years around – the end of my desk).
    The NASA/IPAC Extragalactic Database is a database of objects outside our galaxy, that is, more than 100,000 light-years away. It contains more than 200 million entries.
    The Sloan Digital Sky Survey is another extragalactic database, with about 3 million spectra.
  • The laws of nuclear physics also depend on the speed of light. They determine how nuclei split (as in our nuclear power plants), merge (as in the center of the sun, or in laboratories), or disintegrate when radioactive. Nuclear astrophysics is a growing discipline that studies astrophysical nuclear events.
    Just an example. When a “Type 1a supernova” explodes, we observe the radioactive decay of Nickel to Cobalt, then of Cobalt to Iron. Observations allow us to measure the half-life of these elements on site. It is the same as what we observe on Earth [1]. A search for this type of supernova on this database returns more than 10,000 entries.
  • The equations of General Relativity also depend on c. When two very dense stars turn around each other, they lose energy because they emit gravitational waves. As a result, the period of rotation decreases by an amount that depends on the speed of light.
    This kind of duo is sometimes called a “binary pulsar”. A census conducted in 2008 numbered 160 of them. The most famous one is the first discovered, in 1975. Located 21,000 light-years away, it has been under scrutiny ever since. You will find below 40 years of comparison between the predictions of General Relativity concerning the period of rotation (the continuous line) and the observations (the black points). The horizontal line at the top of the graph is the prediction of Newtonian gravitation (nothing changes).

Weisberg & Huang, ApJ, 2016

So, what do observations say? We’re not talking about a few observations here and there, suggesting that perhaps the speed of light did not vary too much in the past. We’re talking about literally millions of observations showing the same thing.

Let’s finish this section with a rather remarkable and very emblematic observation.

Mass bends light. The heavier it is, the more it bends. That’s how one can observe gravitational lensing in a situation like this,


Now, imagine a star explodes. A supernova. Suppose also that something heavy (a few galaxies, for example) is between me and the star. The supernova sends me light both by the blue path and the red path. And as light is bent, I see double (the red and blue mirages).

If you look closely, you will see that the blue path is shorter than the red. So I should see the blue glow before the red. If, at the time I detect the blue mirage, I have enough information on the whole system, I can predict the moment the red image should appear.

That’s exactly what happened in 2014-2015 with the Refsdal supernova. In 2014, astronomers observed a supernova 10 billion light-years away. Its light had been curved by a cluster of galaxies, so that four mirages were visible. The study of the observations allowed astronomers to predict that another mirage would appear a year later, around November 2015. A year later, the forecast mirage appeared. Needless to say, the speed of light plays a leading role in the timing: a variation of just 1%, along a path of only 1 million light-years, would result in a nearly 10,000-year shift on arrival. Being able to pin down the appearance of the 5th mirage with a few weeks’ precision does not leave much room for speed-of-light variation.
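The “nearly 10,000 years” figure is easy to verify, working in light-years and years so that the nominal speed is 1:

```python
d_ly = 1_000_000          # path length, light-years
t_at_c = d_ly / 1.00      # travel time in years at speed c (1 ly/yr)
t_at_1pc = d_ly / 1.01    # travel time if c were 1% larger along the way

shift = t_at_c - t_at_1pc
print(f"arrival shift: {shift:.0f} years")  # ~9,901 years
```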

The question is thus unambiguously resolved by observations: the speed of light has not changed for more than ten billion years, at least [2]. Perhaps one day a tiny variation will be detected [3], “tiny”, that is, tinier than the error margins of current measurements. At any rate, a multiplication of c by 10, 100 or 1,000, necessary for the light emitted 60,000, 600,000 or 6,000,000 light-years away to come to us in less than 6,000 years, is completely excluded by observations.

Extra mile

At this point, those who have had enough can leave. Others can take a break and/or keep reading. Even if the deal is done, I would like to address some additional conceptual problems implied by a variation of c. Of course, if observations were telling us that c has changed, we would have to face these issues. But this is not the case, while it seems to me that the proponents of a variable c are usually not aware of them.

What about the fine tuning?

The fine tuning of the laws of nature is a popular argument for the existence of God. And it definitely implies that the speed of light has not changed. One could indeed list the many reasons why c could not have varied much, simply following the rationale of the fine-tuning argument.

Again, if observations had shown that c had changed, the fine-tuning argument would just have to go back where it came from. But this is not the case, and it is therefore completely incoherent to hold the fine tuning in one hand while playing with the speed of light with the other.

And yet, they spin (the skaters)

Here on earth, experience shows that the laws of nature do not change when we change places, times, or directions.

  • Will your hair dryer behave differently at home and at a friend’s house? No. Nothing happens when you change places.
  • Will your hair dryer behave differently when you stand in front of the bathroom mirror, or when you turn your back on it? No. Nothing happens when you change direction.
  • Will your hair dryer behave differently in the morning and in the afternoon? No. Nothing happens when you change the moment.

It all seems obvious, and yet it has amazing consequences. For example, imagine that tomorrow the gravitational constant G goes up. I put a weight on a ladder today. It costs me a certain amount of energy, proportional to G. Tomorrow, when G is bigger, I take it down… recovering more energy than I spent to lift it. Conclusion: if G changes, energy is not conserved. Now, a few questions:

  • Question 1: If something other than G changes, like c for example, is energy still conserved? No: it is not conserved either. Emmy Noether actually proved in 1918 that energy is conserved if and only if the laws of nature do not change over time.
  • Question 2: The fact that the laws of physics do not change over time gives the conservation of energy. OK. And the fact that these laws do not change when we change places, or when we change direction, also gives the conservation of something? Yes. According to Noether’s theorem, the independence of the place gives the conservation of momentum. And the independence of the orientation gives the conservation of the angular momentum (which makes skaters spin faster when they close their arms, hence the title of the paragraph).
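The ladder thought experiment above can be put in numbers. A sketch, assuming a hypothetical 10% overnight jump in G:

```python
m, h = 10.0, 2.0             # a 10 kg weight, a 2 m ladder
M_earth, R_earth = 5.972e24, 6.371e6  # Earth's mass (kg) and radius (m)
G_today = 6.674e-11
G_tomorrow = 1.1 * G_today   # hypothetical 10% jump in G overnight

def lift_energy(G):
    # Near the surface, E = m*g*h with g = G*M_earth/R_earth^2, so E ~ G.
    return m * (G * M_earth / R_earth**2) * h

E_spent = lift_energy(G_today)         # energy to lift the weight today
E_recovered = lift_energy(G_tomorrow)  # energy recovered lowering it tomorrow
print(f"free energy: {E_recovered - E_spent:.1f} J")  # conservation broken
```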

What does this have to do with our problem? If light goes faster when it comes to Earth than when it leaves it, it implies Maxwell’s equations are not invariant under a change of orientation. If light went faster in the past, it implies the same equations are not invariant in time. And if these equations can do whatever they want beyond the tip of my nose (6,000 light-years), while here on Earth they keep quiet, it means they are not invariant either when we change places. And we could say exactly the same of the laws of General Relativity, or of nuclear physics.

So if the speed of light changes, then energy is not conserved, nor is momentum, and skaters do not spin faster when they tighten their arms. In addition, the duos energy/time, momentum/position and angular momentum/orientation are found in quantum mechanics as conjugate quantities via the Heisenberg uncertainty principle. It is not a mere coincidence.

Bottom line: Many consequences should encourage some to think twice before naively proclaiming, “well, the speed of light must have changed after all”.


Let’s conclude by commenting on some experiments often mentioned in relation to our topic.

  • In her Harvard lab, Lene Hau has fun putting photons in bizarre substances they take forever to come through. Journalistic version: “Lene Hau slows down the light! Alert! Einstein was wrong! The speed of light changes!”.
    Well, no.
    It is “just” that the medium involved absorbs photons, re-emits them, re-absorbs them, re-re-emits them, re-re-absorbs them, etc., so that the poor guys cannot move forward. Quite like a photon emitted at the center of the Sun can take hundreds of thousands of years to get out. But Maxwell’s equations are the same at Harvard as in your living room. That’s why Lene Hau’s smartphone keeps working in her lab.
  • Same thing with this experiment, in which researchers have “slowed down light”. Here again, Maxwell’s equations are the same in their laboratory as at home. It is “just” that by skillfully tailoring several lasers, they have succeeded in slowing down the speed at which the beams carry energy. This is called the “group velocity”. But no, c has not changed [4]. And their smartphones work fine in their lab.
  • Other experiments got superluminal velocities, that is, light going faster than good old physics says… in a medium. It’s always in a medium. But the c of Maxwell’s equations hasn’t changed.

The speed of light in vacuum remained the same in all these labs. More on these experiments here.

So observations are clear: light in vacuum has been going the same speed for ten billion years, at least. And the direction of propagation does not change anything.
May the young earth creationists admit it as soon as possible.


[1] More in The World Is Not Six Thousand Years Old So What?

[2] We can actually go back to a few milliseconds after the Big Bang, as known physics successfully describes the observational consequences of primordial nucleosynthesis. Physics before that is unknown territory, in which theories with a varying speed of light can flourish, and which observations cannot refute or confirm so far.

[3] Some claim to have detected tiny variations of the “fine structure constant”, in which the speed of light enters, of the order of 0.001%, 10 billion years ago. But the observations are much debated.

[4] Which is why this article was published in “Scientific Reports” (the least demanding of the “Nature” journals), and why no one will get a Nobel Prize out of it.

There is no debate

In the 1980’s, the academic Robert Faurisson acquired a certain notoriety by denying the existence of the gas chambers. The French historian Paul Veyne, world expert in ancient history, wrote these lines about him,

If [Faurisson’s] legend is to be believed, after penning obscure lucubrations on the subject of Rimbaud and Lautréamont, he achieved some notoriety in 1980 by maintaining that Auschwitz never existed. He was roundly castigated. I protest that the poor man was close to his truth. He was close, as a matter of fact, to a type of crank that historians who study the past two centuries sometimes encounter: anticlericals who deny the historicity of Christ (which irritates me, atheist that I am) and addled brains who deny the existence of Socrates, Joan of Arc, Shakespeare, or Molière, get excited about Atlantis, or discover monuments erected by extraterrestrials on Easter Island.

Paul Veyne is categorical: there is no debate among historians about the existence of Jesus, as there is no debate about that of Socrates, Joan of Arc, Shakespeare or Molière. Does this mean absolutely no one doubts it? That the number of educated people who deny the existence of Jesus is exactly zero? Of course not. There are some. And Paul Veyne calls them “illuminated”. One current example is Richard Carrier, PhD in ancient history from Columbia University. I hope he did not read Veyne.

The absence of an academic debate on an issue does not mean that no one disagrees. Regarding the historicity of Jesus, for example, we see that even an atheist like Paul Veyne calls “illuminated” those who deny it. A consensus never reaches exactly 100%.

Another example: Suppose I decide to support the flat earth thesis. Suppose in addition that the professor and friend who occupies the office next to mine, does the same. Could flat earth supporters claim the academy is divided on the subject? Would it be legitimate to claim that scientists disagree? Of course not. It would simply happen that two weirdos have invented a debate that does not exist.


Where am I heading? I often talk to people who don’t really know what the consensus of the scientific community is on the age of the universe. They read on young earth creationist sites that some experts, people with PhDs, university professors, say that the world is 6,000 years old, and they are troubled, which I understand very well. I fully understand how they can have the following questions:

  • Do physicists in major universities say that the world may be 6,000 years old?
  • Does the question arise in scientific conferences?
  • Are there articles defending a 6,000-year-old universe in journals like Nature or Science?

The answers to these 3 questions are: no, no and no. No physicist at the aforementioned universities will tell you that the world may be 6,000 years old. The question never arises at conferences. As for the number of articles defending the thesis of a 6,000-year-old universe in the journals Nature, Science, etc., this number is: 0.

By clicking on the names of the universities, you will reach the websites of the physics departments of these institutions. I invite you to check for yourself: no faculty, no research group, no one, is dedicated to examining the possibility of a 6,000-year-old universe.

As has been said, a consensus never reaches exactly 100%. It is therefore perfectly possible to find a PhD in astrophysics (like Jason Lisle), or a university professor (like Andy McIntosh), who is “young earth”. But they remain a tiny, tiny minority. And without wishing to offend, the overwhelming majority takes the idea of a young universe even less seriously than historians take the “illuminated” who deny the existence of Jesus or Shakespeare.

Scientists have a lot of questions, but “is the universe 6,000 years old?” is really not one of them. And physics and astronomy are not the only fields of knowledge involved. Many other disciplines such as history, geology, archeology, glaciology and dendrochronology concluded the same thing a long time ago.

There is no debate.