The IQ was calculated as chronological age divided by mental age, which was then multiplied by 100, with average scores being 100.
Options: True / False
The correct answer and explanation are as follows:
Correct Answer: False
The statement given is incorrect. The correct formula for calculating IQ in the original method is:
IQ = (Mental Age / Chronological Age) × 100
This method was developed by French psychologist Alfred Binet and later adapted by Lewis Terman at Stanford University, leading to the creation of the Stanford-Binet Intelligence Scale. The purpose was to determine a child’s intellectual development compared to others of the same age. For instance, if a 10-year-old child has a mental age of 12, their IQ would be:
(12 / 10) × 100 = 120
This method worked well for children but was not practical for adults. Mental age does not increase indefinitely as people age, so using chronological age in the denominator for adults often gave misleadingly low scores. For example, a 30-year-old with a mental age of 30 would have an IQ of 100, but if their mental age stayed constant at 30 over time, their IQ would decline artificially as they aged.
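The ratio formula and its adult-scoring problem can be illustrated with a short sketch (the function name is illustrative, not from any psychometric library):

```python
def ratio_iq(mental_age, chronological_age):
    """Original ratio IQ: (mental age / chronological age) x 100."""
    return mental_age / chronological_age * 100

# The child example from above: mental age 12 at chronological age 10
print(ratio_iq(12, 10))  # → 120.0

# The adult problem: if mental age plateaus at 30 while chronological age
# keeps growing, the ratio IQ declines artificially with age
for age in (30, 40, 60):
    print(age, ratio_iq(30, age))  # 100.0, then 75.0, then 50.0
```

The loop makes the limitation concrete: nothing about the person's ability changed, yet the computed score falls simply because the denominator grows.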
Because of this limitation, modern IQ tests do not use the original mental age formula. Instead, they use standardized scoring based on a normal distribution, where the average IQ score is set at 100 and the standard deviation is usually 15. This method compares a person’s performance to that of a representative sample in their age group.
In modern tests like the WAIS (Wechsler Adult Intelligence Scale) or WISC (Wechsler Intelligence Scale for Children), IQ scores reflect how far someone’s test results deviate from the average performance of their age peers, not a simple ratio of mental to chronological age.
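The deviation-based scoring described above amounts to z-scoring a raw result against age-group norms and rescaling to a mean of 100 and standard deviation of 15. A minimal sketch, with hypothetical norm values (real tests use carefully standardized norm tables):

```python
def deviation_iq(raw_score, norm_mean, norm_sd):
    """Convert a raw test score to a deviation IQ by z-scoring against
    the norm group for the test-taker's age, then rescaling so the
    average is 100 with a standard deviation of 15."""
    z = (raw_score - norm_mean) / norm_sd
    return 100 + 15 * z

# Hypothetical norms: this age group averages 50 raw points with SD 10
print(deviation_iq(50, 50, 10))  # average performance → 100.0
print(deviation_iq(60, 50, 10))  # one SD above average → 115.0
```

Because the score is always computed relative to same-age peers, the adult-scoring problem of the ratio method disappears.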
So, while the quoted formula resembles the structure of an IQ calculation, the statement is false because it reverses the numerator and denominator, and in any case it does not describe how IQ is measured today.