There is a mantra constantly recited by the new science-worshipping atheists, which comes in different versions while essentially saying the same thing. Apparently, the evidence provided by science so far suggests that the existence of God is unlikely. Hence, the sound position to hold on God’s existence, until the evidence suggests otherwise, is the negative one, or at the very least a neutral one. It is as if science were the only acceptable methodology for acquiring knowledge. Furthermore, the burden of proof is apparently on the believer. It is the believer who must provide the proof that God exists, because it is the believer who proposed His existence in the first place. Until that proof is provided, we must again hold the negative position, or at least stay neutral. This, according to atheism, is the rational approach.
Before tackling the evidence issue, we should first examine the proposition that the burden of proof is on the one making the claim. This can justifiably be considered the rational position to hold under usual circumstances. But is it necessary under all circumstances? John Stuart Mill wrote an interesting essay in 1869 entitled “The Subjection of Women”, in which he addressed the rights of women in the society of his time. Before tackling the matter, he acknowledged that proposing equal rights for both sexes was unorthodox given the values upheld by just about everyone in his society, including intellectuals. Even women themselves were indoctrinated in such a way that they viewed their lesser status and lack of rights as the natural order. So before getting into how his society had become that way and why it should change, he brought up the issue of the burden of proof. In circumstances in which the great majority hold a belief of any kind, it is the dissenting voice that must bring forth its case and prove it. In other words, rather than the accused being innocent until proven guilty, they become guilty until proven innocent. So rather than throwing up his hands and declaring that what his society was doing was incorrect, Mill wrote a lengthy treatise to establish that the proper thing for a society is to have women enjoy the same rights as men.
Where this comes in with atheists is straightforward. The proposition that the believer must provide evidence for God, and that until then we should remain either atheists or agnostics, is problematic when, since the beginning of recorded history, humans have engaged in some form of religious ritual and acknowledged the existence of God. This is not an appeal to antiquity or to popularity. It simply points to a fact of life. It does not establish the truth of the claim that God exists, nor does it point to the fallaciousness of the claim that God does not exist. All this fact does is place the burden of proof on the atheist rather than the believer, since the atheist is the one dissenting from the great majority. The atheist must undertake the task of proving the negative, which, contrary to what many atheists seem to think, is not an impossibility.
When it comes to the issue of evidence, the word seems to be used in an unrestricted fashion by atheists to support their final conclusions. If anything, this reflects either a simplistic mind or an ulterior motive driving atheists. The broad term “evidence” is defined as a body of information that points towards the validity of a claim. However, because there are several types of evidence, we cannot simply make the broad request that anyone provide “evidence”. Depending on the matter at hand, we must define the type of evidence we need in order to be convinced. At the very least, we should qualify whether we want direct evidence, or can be persuaded by indirect evidence if it is strong enough.
But let us take this a step further back than defining “evidence” and identifying its types. If we want to restrict ourselves to pure logic and avoid all fallacies, the question we should address first is: can any evidence, irrespective of type, be sufficient to prove that God exists? It would seem that from the atheist perspective the answer is a simple “No”. Regardless of what type of evidence they are provided with, atheists will always have some response to it, which can be frustrating to believers as they list all the reasons why belief in God is actually the more rational position to hold. The problem is not with the evidence in an empirical sense. The problem is with the rationalization process that comes after being presented with the evidence, which leads us into a discussion about the nature of knowledge that we can address elsewhere.
Hubris is the state of the science-worshipping atheists. In a sense, it can be understood why they have become this way. Being able to observe the deep universe, examine the cosmos, and send all kinds of technology into outer space, while at the same time being able to manipulate genomes, alter bacteria, play around with viruses, and control diseases, are all very impressive feats. Mankind has surpassed its wildest historical imaginings of what it could accomplish. But this does not necessarily mean that our intellectual capacity or ability to reason has also progressed. In fact, it can be argued that we have become more confused as a generation, and many are deluded into thinking that they are rational. The vast majority of books put out and read by science-worshipping atheists are sophistical in their reasoning and embarrassing in their conclusions.
The interesting question for the science-worshipping atheist to answer is whether it is the belief in God, or the consequences of believing in God, that is the problem. In other words, is it a matter of acknowledging God’s existence, or a matter of acknowledging what that existence means for one’s life? It would be interesting to see whether, for an atheist, it is more about an attachment to a hedonistic lifestyle in which caprices and desires are fulfilled without restriction, combined with a hubristic vision of their intellectual capacities and human abilities, than it is about the actual question of God’s existence. It is reported that in his last recorded interviews, Jean-Paul Sartre said he had reflected upon his reasons for atheism and found them to be childish. Whether this is true or not is debatable. But it does reflect a type of recognition of the human condition when it comes to our desires. We become like children when we want something and there is no barrier stopping us. And when the parent puts up a barrier, we get upset. This continues until we can finally move out of the house so we can be “free”. That is what Sartre was talking about. Atheism to him was just a delusion of freedom that dissipated when he was finally intellectually honest with himself.
It seems that science-worshipping atheists are using science and what they call “reason” as veils to cover their dissatisfaction with being told to bow to something greater than they can comprehend. More interestingly, they also seem to be using these veils to cover up their discontent with being told “do this and don’t do that”. This is not to say that some of their other grievances with “religion” might not be justified. But given how taken they are with sophistical reasoning, they readily confuse the actions, statements, and even beliefs of individuals with what their respective religions actually teach. Moreover, what these science-worshipping atheists fail to recognize is that in their rejection of God, they have taken their own egos as gods and become autodeists. Now they are organizing to form their own brand of religion, full of moral theory and even practices, to which they have begun calling people so they can be “saved”. How ironic.
I used to be religious and believe in God, but then I came to university, took biology, and learned about evolution and the real truth behind how we came about. I mean, when you think about it, evolution is a fact that everyone must embrace, abandoning this idea that some “Creator” brought us into being. After all, look at how similar our genome is to those of our primate relatives, especially the finding that human chromosome 2 arose from the fusion of two separate chromosomes in our common ancestor with the chimpanzee, which retains those chromosomes separately. And don’t get me started on the bonobo. These are but a few lines of evidence which, when you combine them with others, mean you would either be blind or a fool not to accept evolution as fact. It is a fact that we are nothing more than apes. As Christopher Hitchens would vehemently assert: “I’m an ape and so are you.” It must be accepted that we are a product of evolution, and those old fairy-tales you read in your Bible or Quran must be recognized for what they are: fairy-tales.
We live in an age where more and more people suffer from what is known in Arabic as jahl murakkab, or “compounded ignorance”, to which I will add the word syndrome at the end. A patient suffering from Compounded Ignorance Syndrome (CIS) will typically present with several symptoms, most of them in their intellectual capacities, including fallacious logic, poor vocabulary and grammar, a lack of background reading, and frustration when engaging in conversation with anyone opposing their view. Above all, the diagnostic feature of CIS is the patient’s ignorance of their own ignorance. CIS is contracted through an initial state of arrogance, followed by typing into the search field of Google or YouTube and similar websites and watching short videos about various topics. In addition, the media serves as a major source of “factual” information. This results in learning incomplete parts of a subject, giving the ecstatic and deceptive feeling that one now knows something, which is then built upon with further incomplete knowledge, eventually compounding such partial knowledge into CIS. All of this comes in addition to a shortened attention span that makes it increasingly difficult to sit down and read a full book, or to engage in a lengthy conversation about any subject.
This idea of evolution as fact is quite interesting. In Science, there is a general consensus that a good theory should not be abandoned until a better one replaces it. This is quite a valid position to take, especially since a theory is not called a theory unless multiple lines of evidence support it. But here is the problem with this approach: while the theory can be valid, fanatical overconfidence in it can interfere with the development of new and equally good, if not better, alternatives to it. Furthermore, as Einstein once said:
“It is the theory which decides what we can observe”
In other words, once you have a particular expectation based on the theory you hold, you will invariably interpret any set of data you get in a way that fits that theory. Soon enough, as a scientist, you will no longer be seeking to understand how anything actually works. Instead, you will explain how it works based on that theory, even if the theory is completely false. What you observe will invariably be determined and obscured by your conviction in the theory you embrace.
Interestingly enough, people like Richard Dawkins and his admirers speak of how our impressionable brains are deceived by so-called “myths” of old times; how we merely project our preconceived notions onto the world, seeing patterns and purpose based on them; and how this gives rise to our faith and belief in a greater meaning, all of which are “hogwash” conclusions built on falsities. Well, based on how scientists work in their fields and how theories influence their conclusions, as mentioned above, that can similarly be called “hogwash”, for reasons I’ll explain shortly.
As far as is currently known in Science, there are four bases naturally used in DNA. Along with amino acids and some other compounds, these are the most important building blocks of life on Earth. Without them, you cannot survive and cannot pass on your genetic information to your offspring. This is what is required to sustain your life as an individual, and also to pass on your code to sustain the human species. Obviously I haven’t listed everything required for this function; I want to keep it simple. If you took those four bases of DNA and had a computer program randomly generate a sequence, then did it again for a second random sequence and a third, and compared the three against each other, you would undoubtedly see a degree of similarity. This is simply a result of being limited to four bases.
Here is the question that poses itself: is the degree of similarity a result of the three sequences sharing a common ancestor sequence, or is it a result of using common building blocks?
What if you generated the first sequence on Monday, the second on Wednesday, and the third on Friday, and it just so happened that the Wednesday sequence was more similar to the Monday sequence than the Friday sequence was? From that you could easily construct a nice tree showing a timeline, connect the sequences as relatives, and claim that the second and third sequences share the first as their common ancestor, with the second sequence being older than the third in its line of “evolution”. All this time spent reasoning, constructing diagrams, and linking by similarity and age, never realizing that the sequences are not related at all, and that the computer simply generated three random sequences from the same four bases.
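The thought experiment above can actually be run. The sketch below (the function names and the day labels are mine, purely for illustration) generates three unrelated random sequences from the four bases and measures how often they match position by position; with a four-letter alphabet, unrelated sequences still agree at roughly one position in four by chance alone.

```python
import random

def random_dna(length, rng):
    """Build a random DNA sequence from the four bases A, C, G, T."""
    return "".join(rng.choice("ACGT") for _ in range(length))

def identity(a, b):
    """Fraction of positions at which two equal-length sequences match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

rng = random.Random(42)  # fixed seed so the run is reproducible
monday = random_dna(10_000, rng)
wednesday = random_dna(10_000, rng)
friday = random_dna(10_000, rng)

# The three sequences share no ancestry whatsoever, yet each pair
# matches at about 25% of positions simply because only four
# symbols are available at every position.
print(f"Mon vs Wed: {identity(monday, wednesday):.3f}")
print(f"Mon vs Fri: {identity(monday, friday):.3f}")
print(f"Wed vs Fri: {identity(wednesday, friday):.3f}")
```

All three pairwise identities come out near 0.25, which is the chance baseline for an alphabet of four symbols.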
Did you know that your DNA is roughly 40% similar to that of a lettuce? The human genome is around 85% similar to that of a mouse and 94% to that of a chimpanzee. Can you really affirm that the only possible explanation is a common ancestor? Has this explanation gripped some minds so thoroughly that they can no longer see any other?
I think the deception of progress, and the confusion between what Science is and what Technology is, have led many to falsely believe that Science is the activity of arriving at ultimate Truths:
You must accept that through Science we have realized many truths about our nature and place in the animal kingdom. As scientists we have to be humble and recognize that there are plenty of things we do not know and in time we will know the great mystery behind all of it.
This rhetoric is quite common among Science fanatics. Pull up some clips of Richard Dawkins or Sam Harris or Christopher Hitchens on YouTube, and you’ll find them uttering something along those lines. They like to emphasize how humble they are in their search for the Truth and that anyone else not adopting their view is the arrogant one.
Here is a tip: if you recognize that you are humble, that’s a major sign that you are most definitely not.
Unfortunately, these statements are then taken up by patients suffering from CIS, who repeat them like parrots all over the place in conversations with others in an attempt to sound intelligent. Here is a question for a CIS patient: do you understand the difference between being a realist and an instrumentalist in Science? Because you are coming off as a realist, and since you are afflicted with a case of CIS, there is a very good chance you didn’t know the difference between the two, which means you didn’t really think about it. Yet here you are, preaching with such fervor as if you had.
A quick read into recent history will reveal that despite our illusion of progress, we are regressing. The assertion that although there is much we do not know at the moment, through Science we will come to know it, is not new. David Hilbert, an astounding mathematician in his own right, expressed exactly this conviction in the first half of the 20th century, and set out a program to place all of mathematics on complete and provably consistent foundations. Another equally brilliant mathematician, Kurt Gödel, formulated a proof that undermined Hilbert’s program: the incompleteness theorems. In layman’s terms, they state that any consistent formal system rich enough to express arithmetic will contain statements that are true but cannot be proven within that system, and that such a system cannot even prove its own consistency from the inside. The shortcoming is inherent to the system itself. Take something as simple as 1+1=2. Whitehead and Russell famously spent hundreds of pages of their Principia Mathematica laying the logical groundwork before they could derive it, and Gödel later showed that no such system can vouch for itself without stepping outside of it. The way out is to step out of that system and use a different one. In this case, I could hand you one apple, then another, and ask how many apples you now have, which would be two. But to use the principles of a system to validate that very system is circular, which is fallacious from a logical standpoint. You have to come up with a different system.
Now that brings up a problem. What if the fundamental assumptions that are taken as true, but cannot be proved, happened to be wrong? What if what you thought were sound conclusions happened to be built on a base of faulty assumptions? How can you know for sure that what you are finding out with Science is the Truth? Is it by repeated observation of phenomena and then drawing conclusions from them? Let’s examine a common observation then: does the sun rise in the east and set in the west? Countless observations since the first human being recorded it indicate it to be so. But what about when Earth’s magnetic polarity flips and a compass needle points the opposite way? If we took our directions from the compass, we would then have to say that the sun rises in the west and sets in the east.
You know what’s interesting? The sun doesn’t rise and set. The Earth rotates, which gives the impression that the sun is rising and setting; after many complicated calculations we have realized that the sun is effectively stationary relative to the planets, and that it is the planets’ rotations about their own axes and revolutions around the sun that give rise to these phenomena.
Do you feel hot or cold? I can complain about how hot it is in the room while at the same time you complain about how cold it is. So which is it?… Who’s right?
Is Science objective?… Really?
We live in an age that, despite its apparent progress due to technological advances and increased access to various fields of knowledge, has left many ill-equipped when it comes to basic skills in logic and coherent lines of reasoning. While it might come naturally to some people, most will need to be educated in critical thinking in order to identify fallacious claims in any proposed idea. This is necessary so as not to be convinced by inherently problematic claims. Nowadays, the growing following for a movement that puts Science forward as a system to replace all others, because it is “rational” and “objective”, is evidence of a growing ignorance of how Science operates at its most rudimentary level. There are implicit assumptions in Science that even scientists are either not aware of, because they were never explicitly taught them, or no longer pay attention to, because they are simply part of the reality of Science they already know and choose to live with.
Let me see if I can explain what I mean by the following. A basic concept in logic is that which deals with deductive vs. inductive reasoning. An example of a deductive line of reasoning is:
Cordova is a city in Spain
Spain is part of Europe
Therefore, Cordova is in Europe
The first two lines are called the premises of the argument, and the third is the conclusion. In deductive reasoning, the premises necessarily entail the conclusion, and if the premises are true, then the conclusion must be true. Inductive reasoning on the other hand is a little different. For example:
The first 5 eggs in the box are rotten
All eggs in the box have the same expiration date
Therefore, the sixth egg must be rotten
The difference with inductive reasoning is that even if the premises are true, they do not mean that the conclusion is necessarily true. In other words, the conclusion was induced based on prior trials. It is conceivable that the sixth egg is not in fact rotten.
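The contrast can be made concrete in a short sketch. The egg box below is hypothetical, deliberately constructed so that the inductive premise holds while the induced conclusion fails, whereas the deductive conclusion is guaranteed by the structure of its premises:

```python
# Deduction: if Cordova is among the Spanish cities, and every Spanish
# city is in Europe, then Cordova is in Europe -- guaranteed by structure.
spanish_cities = {"Cordova", "Madrid", "Seville"}
european_cities = spanish_cities | {"Paris", "Rome"}  # Spain is part of Europe
cordova_in_europe = "Cordova" in european_cities

# Induction: a hypothetical box in which the first five eggs are rotten
# but the sixth is fine. The premise of the induction holds, yet the
# induced conclusion about the sixth egg is wrong.
box = ["rotten"] * 5 + ["fresh"]
first_five_rotten = all(egg == "rotten" for egg in box[:5])
sixth_rotten = box[5] == "rotten"

print(cordova_in_europe)    # True  -> deduction cannot fail here
print(first_five_rotten)    # True  -> the inductive premise holds
print(sixth_rotten)         # False -> the inductive conclusion still fails
```

The deductive conclusion cannot come out false while its premises are true; the inductive one can, which is precisely the difference at stake.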
The implication of the difference between the two lines of reasoning is that one can use the word prove ONLY with deductive reasoning. On the other hand, one CANNOT prove using inductive reasoning. Rather, it’s best to use the phrases “there is good evidence for such and such”, or “these findings suggest such and such”.
Inductive reasoning is the most common form of reasoning, and everyone uses it on a daily basis. It is based on having previous experience of something working a certain way every time in the past, and extending that into the future. For example, when one turns a car’s steering wheel anticlockwise, it is assumed that the car will turn left, because it has done so every time the wheel was turned anticlockwise in the past. After some reflection, it becomes easy to see why inductive reasoning predominates in the real world. If we were to rely on deductive reasoning for everything, life would not go on. Conversely, if we did not trust inductive reasoning, despite the fact that it CANNOT prove the subsequent consequence of a habitual action, life would also not work. After all, no sane person would jump from the top of a tall building, because they accept the inductive conclusion that they would fall and possibly die.
What’s the point of this lesson in logic? Well, in Science, most of what is done is based on inductive reasoning. However, if the claim is that Science is a completely rational activity, then surely its foundations must be built on a rational basis. The question that presents itself is:
How rational is inductive reasoning?
Examining it has puzzled philosophers of Science, because the conclusion they reached is that inductive reasoning is really a reliance on faith. No one can prove that just because something worked a certain way in the past, it must necessarily work the same way in the future. What makes it worse is that the observations and data collected from any experiment are, as Thomas Kuhn puts it, “theory-laden”. Furthermore, there is not a single theory in Science that enjoys unanimous confirmation by the data. Depending on the theory being examined, the contradictory data can range from very minimal to a body large enough to require re-examining the theory.
To examine this idea of Science being based on faith, it is appropriate to point to David Hume (1711–1776), who posed what is known as “Hume’s Problem”. Hume suggested that our trust in inductive reasoning stems from our presupposition of what he called the “Uniformity of Nature” (UN). This means we assume that, all things being equal, our future observations will be similar to our current and previous observations. In other words, because every time the wheel has been turned anticlockwise the car has turned left, we expect the same to be observed in the future. But what if we can think of an alternative reality? What if we can conceive of a universe where UN is not the norm? The mind can conceive of a place where the laws of gravity do not hold all the time but act randomly, or where objects such as billiard balls stick together when they collide. Since we can conceive of such a place, it follows that UN cannot be proven in a strict sense the way, for example, Pythagoras’ theorem can be.
Some might say that although we can’t prove the truth of UN, it has held up until now, and so we have good grounds to assume that it’s true. The problem with this is that this very statement in trying to give support for UN assumes UN. In logic this is called “begging the question”, or “circular reasoning”. It’s using the very thing one is trying to prove as a proof, which doesn’t hold up.
So what was Hume’s point in all of this? He basically said that when it comes to inductive reasoning, there is no rational justification whatsoever for trusting it; our trust in it rests on blind faith. And that is how the greatest majority of Science is conducted. The troubling prospect here for people like Dawkins and Harris is that if this is kept at the forefront of their minds, they can NEVER say the word “Truth” about anything in Science, simply because if one scientist gives a particular interpretation of certain scientific findings, another can give a completely different, and often opposing, account that is equally valid from a scientific perspective. In other words, it is all relative.
What is the reality of the Science that Dawkins, Harris, and the like are trying to push down everyone’s throat? It is a so-called rational inquiry, subjective in nature yet promoted as objective, resting on irrationality and blind faith, which its followers and advocates use to taunt and rebuke religious people whom they call “irrational”. The only thing that comes to mind in response is an Arabic proverb:
“شر البلية ما يضحك”
“The worst of calamities are those which make you laugh”
You know what’s interesting? Philosophy comes from the Greek philosophia, which means “love of wisdom”. The more I go through this, the more I find myself realizing why atheists like Dawkins don’t like philosophy and find it useless – they’re the most illogical people on the planet!