Scientific modeling is a powerful tool. Using the laws of physics, and computers to make things much more convenient, you can predict how nature will behave and reconstruct how it has behaved in the past. Recently, astronomers simulated the universe starting from 12 million years after the Big Bang and let it run to the present day, 13.7 billion years later. The result is what I would call beautiful: a great match between observation and theory. Here is a video by the journal Nature explaining this:
As explained in the video, the simulated properties of the universe match the properties of the real universe quite closely, so we know that the laws of physics we have developed are in the ballpark of correct. It is also explained that the match is not perfect, and that is because, even though we got a lot of it right, our knowledge of the universe is not perfect either. That doesn't mean science is wrong, it means it is incomplete, and we have to do more detective work to figure out the rules of nature.
Here is another great match between data and models, this time between real-world yearly average temperatures and two models crunched by computers using the laws of physics. One model includes human CO2 forcing, and the other includes only natural factors. As you can see, the real-world data matches the model with human forcing, and that model also "predicts" past temperatures very closely (shout out to Bad Astronomy's great post on climate change):
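As an aside, the "which model fits the data better" comparison in that graph can be sketched in a few lines of Python. Everything below is invented purely for illustration (these are not the actual climate numbers); the idea is just to pick the model with the smaller root-mean-square error against observations:

```python
# Toy sketch: compare observations against two competing model runs
# by root-mean-square error. All numbers here are made up.

def rmse(observed, modeled):
    """Root-mean-square error between two equal-length series."""
    n = len(observed)
    return (sum((o - m) ** 2 for o, m in zip(observed, modeled)) / n) ** 0.5

# Hypothetical yearly temperature anomalies (degrees C)
observed       = [0.10, 0.20, 0.15, 0.30, 0.45, 0.50]
natural_only   = [0.10, 0.10, 0.05, 0.10, 0.05, 0.00]  # natural factors only
with_human_co2 = [0.10, 0.20, 0.20, 0.30, 0.40, 0.50]  # human CO2 forcing added

best = min([("natural only", natural_only), ("with human CO2", with_human_co2)],
           key=lambda pair: rmse(observed, pair[1]))
print("Better-fitting model:", best[0])
```

With these made-up series, the run that includes human forcing tracks the "observations" far more closely, which is exactly the visual takeaway from the real graph.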
Another point worth noting: all scientific models and measurements carry uncertainties, but the data stays within those error bounds most of the time, so it is a good fit.
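That "within the error bounds" idea can be made concrete with a short sketch: count how many observed points fall inside the model's stated uncertainty band. Again, the numbers below are invented for illustration, not real measurements:

```python
# Minimal sketch: what fraction of observations lie within the model's
# uncertainty band? All data here is hypothetical.

def fraction_within_errors(observed, modeled, sigma, k=2.0):
    """Fraction of observations within k*sigma of the model prediction."""
    inside = sum(1 for o, m, s in zip(observed, modeled, sigma)
                 if abs(o - m) <= k * s)
    return inside / len(observed)

observed = [0.12, 0.25, 0.31, 0.50]
modeled  = [0.10, 0.20, 0.30, 0.45]
sigma    = [0.05, 0.05, 0.05, 0.05]  # hypothetical 1-sigma model uncertainty

frac = fraction_within_errors(observed, modeled, sigma)
print(f"{frac:.0%} of points lie within 2 sigma")
```

If nearly all the points land inside the band, the model is consistent with the data given its stated uncertainty, which is what "a good fit" means here.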
One of my favorites is the modeling of ENSO (the El Niño-Southern Oscillation) without putting it in by hand. Meaning, you model the Earth with its ocean and atmosphere, plug in the laws of physics and various conditions, and ENSO will naturally emerge in the simulation even though we don't know its exact mechanism. That is the predictive power of science.