Top 10 signs of evolution in modern man

Throughout history, as natural selection played its part in the development of modern man, many once-useful functions and parts of the human body became unnecessary. What is most fascinating is that many of these parts still remain in some form, so we can see the progress of evolution. This list covers the ten most significant evolutionary changes that have taken place, leaving signs behind them.

Here’s just one example:

Goose Bumps
Humans get goose bumps when they are cold, frightened, angry, or in awe. Many other creatures get goose bumps for the same reasons: this is why a cat's or dog's hair stands on end, and why a porcupine's quills rise. In the cold, the rising hair traps air between the hairs and the skin, creating insulation and warmth. In response to fear, goose bumps make an animal appear larger, hopefully scaring away the enemy. Humans no longer benefit from goose bumps; they are simply left over from our past, when we were not clothed and needed to scare off our own natural enemies. Natural selection removed the thick hair but left behind the mechanism for controlling it.

The complete list

via

18 thoughts on “Top 10 signs of evolution in modern man”

  1. Wow, I love the second one in the via link. Once upon a time. Yep, that's evolution: a complete fairy tale. And the second law of thermodynamics is completely opposed to evolution. No, I am not coming from a Christian perspective, just a common-sense one.

  2. Ralph, the second law of thermodynamics IS compatible with evolution. Whoever told you that does not know what he is talking about.

  3. The 2nd law could actually help prove it. From entropy we get information theory, which gives us error correction and error distance, which might give us information on how DNA changes over time.

  4. But Al and Isiah, it must have an increase in information, which is an impossibility. Raw energy cannot generate the specified complex information in living things. Undirected energy just speeds up destruction. Just standing out in the sun won't make you more complex; the human body lacks the mechanisms to harness raw solar energy. If you stood in the sun too long, you would get skin cancer, because the sun's undirected energy causes mutations. (Mutations are copying errors in the genes that nearly always lose information.) Similarly, undirected energy flowing through an alleged primordial soup will break down the complex molecules of life faster than they are formed.

    It's like trying to run a car by pouring petrol on it and setting it alight. No, a car will run only if the energy in petrol is harnessed via the pistons, crankshaft, etc. A bull in a china shop is also raw energy. But if the bull were harnessed to a generator, and the electricity directed to a pottery-producing machine, then its energy could be used to make things.

  5. Ralph:
    Entropy and information content are directly proportional. Mathematicians define the amount of information a system receives by the degree of surprise of the input. A simple example: if I tell you that it is snowing where I live right now (I live in Buffalo), you didn't get a lot of information, since this is not a low-probability event (an unsurprising one). However, if I tell you that Buffalo was just destroyed by a race of atomic supermen, you receive much more information, since this is highly unlikely.

    Now we can establish the link between probability and information gain very easily. Let's suppose that we can list every possible input a system can receive. Say it's a computer that can receive one letter of text at a time, so there are only 26 possible inputs (ignoring punctuation to keep it simple). The letter E occurs in English approximately four times as often as the letter C, so receiving a C is more surprising and carries more information than receiving an E (though not four times as much: information grows with the logarithm of the improbability, not in direct proportion). If we normalize the values so that the sum of their probabilities equals 1, we can create a probability distribution over them. A guy did this here:
    http://www.askoxford.com/asktheexperts/faq/aboutwords/frequency

    Now with our probability distribution we can build a simple information-gain chart: the information carried by each letter is the negative logarithm of its probability, so the rarer the letter, the more bits it carries.

    Unifying entropy and information theory is a little trickier. Entropy is defined by the degree of disorder in a system: the greater the disorder, the higher the entropy. The problem comes from how to define entropy. I could give you the chemist's version, but sadly it won't help you for any other system, so I will give you the general version: a system gains entropy if the amount of data it takes to represent it grows. For example, if I open Paintbrush and draw a black circle one unit across, it takes very little information to represent: to transmit it, all I have to do is write a simple script that generates a black circle of unit length centered on a specific coordinate. However, if I take the same drawing and move a small number of black pixels to the outside of the circle, leaving whitespace in their place, the problem becomes much harder: now I am forced to account for nearly every pixel on the page. You can try this yourself: make the two files, zip both of them, and see which takes more kilobytes. The second one is going to be a great deal bigger, even though it has the same number of black and white pixels.

    Now, as we increase the amount of entropy in the system, the information content increases as well. Thus probability, entropy, and information theory are unified.

    This of course leads to your objection: "Raw energy cannot generate the specified complex information." Which is a fair point to raise. Entropy is a very creepy idea. There really doesn't seem to be any entropic force in the universe, yet it acts like a state vector in many ways, which is odd. But understand that, in the end, entropy is information: the more disordered the system, the more information it contains. Undirected energy applied to a chemical system increases the rate of chemical reactions (including ones that are not possible at lower energy levels), which increases the entropy, which raises the information content.

    A problem a lot of creationists typically have with evolution is the idea that life seems to be an anti-entropy force. The trouble with this idea is that it doesn't account for other seemingly anti-entropy devices, such as an air conditioner. Life is able to seemingly beat entropy by applying work, not much different from any cooling system. So where does the work energy come from? The sun, for the most part. The sun is a massive generator of energy, and life on earth uses that energy to create local pockets of lower entropy than the surrounding environment. Entropy can be moved from place to place; it just takes work.

    Since this is too long already, I won't go on about error coding and error distance, unless someone wants to hear about that.

    Peace on earth
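The surprise-equals-information argument in comment 5 can be checked numerically. A minimal Python sketch, assuming Shannon's definition of self-information (negative log base 2 of the probability) and round illustrative frequencies for E and C (real frequency tables differ slightly):

```python
import math

# Assumed round relative frequencies in English text, for illustration only.
freq = {"E": 0.12, "C": 0.03}

def self_information(p):
    """Shannon self-information in bits: rarer inputs carry more bits."""
    return -math.log2(p)

bits_e = self_information(freq["E"])  # ~3.06 bits
bits_c = self_information(freq["C"])  # ~5.06 bits

# C is 4x rarer than E here, so it carries log2(4) = 2 extra bits,
# not 4x the information, as the comment itself notes.
print(bits_e, bits_c, bits_c - bits_e)
```

Summing `p * self_information(p)` over a full letter distribution gives the Shannon entropy of the source, which is the bridge to thermodynamic entropy the comment describes.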
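The zip-the-two-drawings experiment in comment 5 can also be run in a few lines. A minimal Python sketch, assuming a flattened black-and-white "image" stored one byte per pixel and `zlib` standing in for a zip compressor (both the image size and the pixel counts are made-up illustrative values):

```python
import random
import zlib

# A 100x100 "image" as raw bytes: 0 = white pixel, 1 = black pixel.
# Ordered version: all 500 black pixels grouped together (low disorder).
ordered = bytes([1] * 500 + [0] * 9500)

# Disordered version: the same pixels scattered at random (high disorder).
pixels = list(ordered)
random.seed(42)  # fixed seed so the result is repeatable
random.shuffle(pixels)
shuffled = bytes(pixels)

# Identical pixel counts, but the disordered layout compresses far worse,
# i.e. it takes more data to represent: higher entropy, more information.
print(len(zlib.compress(ordered)), len(zlib.compress(shuffled)))
```

The ordered file shrinks to a few dozen bytes while the shuffled one stays far larger, which is exactly the effect the comment asks the reader to observe.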
