At Lectica (the nonprofit that owns me), we create formative developmental assessments that are scored for developmental level (Lectical Level) with CLAS—our computerized developmental scoring system. CLAS relies upon a large curated developmental dictionary—the Lectical Dictionary—in which meaning chunks, which we call items, are assigned to developmental phases to create density profiles. (I’ll explain these later.) Density profiles are used, along with a proprietary algorithm, to score assessment responses. The basic approach was developed during my postdoctoral research at UC Berkeley and published in 2004.
Over the years, CLAS, my colleagues, and I have scored over 50,000 assessments, interviews, and other texts. A great deal of our work has involved research on development in childhood and adolescence. During the course of this research, we have noticed that many children attending public schools appear to be trying to learn material that their minds aren’t ready for—in the sense that they do not have the foundational knowledge and skills required to understand and make use of what they are learning. In other words, they don’t have a mental network that’s ready to receive and integrate the ideas presented in their lessons.
One of the indicators of this problem, which I have written about in the past, is a lack of Clarity in students’ assessment responses. We evaluate Clarity on four dimensions: logical coherence, clarity of argumentation, framing, and persuasiveness. I have shown elsewhere that students whose Clarity scores are low develop more slowly than students whose Clarity scores are high. At the time of that research, my colleagues and I suspected that the low Clarity scores were a consequence of requiring children to learn concepts that their minds weren’t ready for, but we did not yet have the tools that would allow us to take a deeper look.
Today, this is no longer the case. The Lectical Dictionary and density profiles we developed to determine the Lectical Level of assessment responses make it possible to examine patterns in these responses with a metaphorical microscope. We can now do things like comparing the density profiles of students who are developing more slowly to those of students who are developing more quickly, which makes it possible to address questions like:
- How do the density profiles of students who are developing at different rates differ?
- If there are differences, when do they first appear?
- Do students with unusual density profiles also have low clarity scores?
- Do these differences seem to be related to the complexity level of the material students are learning in school?
In this article, I begin the process of addressing questions like these by looking more closely at density profiles created from the interview responses of eighteen 8-year-olds.
First, a little background…
Learning as a constructive process
According to Piagetian theory, we learn by building upon what we already know (or can do). During the last century, hundreds of scholars influenced by Piaget repeatedly showed that mental development is an active constructive process in which skills and ideas become increasingly complex, abstract, and integrated over time. Research has also demonstrated considerable variation in the developmental trajectories of individuals, as the figure below helps to demonstrate. People of the same age or educational level can be many years apart in their developmental attainment. (You can ignore the extreme outliers. We still have some data cleaning to do.)
You may notice that we have used grade instead of age in the graph above. This is because, in adulthood, age is weakly correlated with Lectical Level. As we age, education, life experience, love of learning, and learning skills explain an increasing amount of the variance in developmental level.
If we believe that adult human beings need well-developed minds and accept that mental development is an active constructive process in which skills and ideas become increasingly complex, abstract, and integrated over time, then we should design education to support this constructive process. Moreover, if age is a poor proxy for developmental level, we should either abandon age segregation or learn how to customize learning to meet the needs of each child.
Yet, despite ample evidence of children’s developmental diversity, most schools group children by age and expect these children to learn the same material at the same time.
The good news is that there are likely to be some students in every classroom whose minds are prepared to make good use of the material being taught (typically about 20%, according to Dr. Kurt Fischer). My colleagues and I would say that these students are in the Goldilocks Zone, in which their current skills and ideas provide a strong platform for the material being taught. Unfortunately, most of the remaining students—the bulk of tomorrow’s citizens—will experience a disruption in their mental development. Because their mental platforms aren’t ready for the new material, they will be forced to memorize vocabulary and procedures rather than integrating the new knowledge in a way that supports ongoing mental development. Others—our most gifted students—will disengage from material that fails to challenge them adequately. Tragically, this also disrupts mental development.
There are other ways in which age-based education disrupts mental development; it also has consequences for socialization (a lack of peer relationships for outlier students) and psychological health (a chronic sense of failure, chronic boredom, alienation, a lack of the “feeling of necessity” required to detect understanding, learning trauma, and the lack of an earned sense of competence), but I will not be exploring these issues here.
About density profiles
As mentioned above, CLAS—our computerized developmental scoring system—creates density profiles and uses them to score performances.
CLAS makes use of the Lectical Dictionary—Lectica’s curated developmental dictionary of meanings. The Lectical Dictionary is composed of hundreds of thousands of units of meaning called “Lectical Items” — words or phrases that carry meaning. Each Lectical Item is assigned to a Lectical Phase (1/4 of a Lectical Level), based on a combination of empirical evidence and the judgment of our analysts. The goal is to assign each Lectical Item to the lowest phase at which the simplest meaning it carries is likely to be useful.
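The “lowest useful phase” rule lends itself to a one-line sketch. This is illustrative only: in practice, the assignment combines empirical evidence with analyst judgment, and the candidate phases shown are invented.

```python
def assign_phase(candidate_phases):
    """Assign a Lectical Item to the lowest of its candidate phases.

    Phase labels sort correctly as plain strings because the level
    number is zero-padded: "05d" < "06a" < ... < "11d".
    """
    return min(candidate_phases)

# An item whose simplest useful meaning appears at 09c is assigned to
# 09c, even if richer meanings of the same item appear at 09d and 10a.
assert assign_phase(["09d", "09c", "10a"]) == "09c"
```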
For example, an examination of Lectical Items related to evidence reveals an easily observed progression in the meaning of the term:
- Phase 09b: something that I know is true
- Phase 09c: good information that comes from something people have seen or proved (same as a fact)
- Phase 09d: more or less proven facts that you can use to persuade others
- Phase 10a: information that comes from good research and can be used to support arguments
- Phase 10b: information that comes from different kinds of research and sources and needs to be evaluated before you use it to support arguments
Words that relate to evidence—like true, information, prove, proven, facts, research, sources, and evaluate—enter the lexicon during different developmental phases.
Lectical Dictionary entries begin with first speech at phase 05d and cover 05d to 11d (05d, 06a, 06b, 06c, etc.). A particular density profile yields a particular Lectical Score.
The density profile CLAS uses to determine a Lectical Score is based on the distribution of Lectical Items in an individual’s assessment or interview responses. To create the density profile, we determine the number of unique Lectical Items in a performance, then divide the number of unique Lectical Items in each phase by this total.
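To make the calculation concrete, here is a minimal Python sketch of the procedure just described. The phase labels follow the 05d–11d scheme from the previous paragraph; the item-to-phase assignments are invented stand-ins, since the real Lectical Dictionary and the CLAS scoring algorithm are proprietary.

```python
from collections import Counter

def phase_labels():
    """Return the ordered phase labels 05d, 06a, 06b, ..., 11d.

    The dictionary covers first speech (05d) through 11d, with four
    phases (a-d) per level from level 6 onward.
    """
    labels = ["05d"]
    for level in range(6, 12):
        labels += [f"{level:02d}{sub}" for sub in "abcd"]
    return labels

# Toy item-to-phase assignments, invented for illustration; the real
# Lectical Dictionary contains hundreds of thousands of curated entries.
DICTIONARY = {
    "fact": "09c",
    "prove": "09c",
    "persuade": "09d",
    "research": "10a",
    "evaluate": "10b",
}

def density_profile(items_found):
    """Density profile of one performance, as described above:

    count the unique Lectical Items in the performance, then divide
    the number of unique items in each phase by that total.
    """
    unique = set(items_found) & DICTIONARY.keys()
    total = len(unique)
    if total == 0:
        return {phase: 0.0 for phase in phase_labels()}
    counts = Counter(DICTIONARY[item] for item in unique)
    return {phase: counts.get(phase, 0) / total for phase in phase_labels()}

profile = density_profile(["fact", "prove", "fact", "research"])
# Two of the three unique items fall in phase 09c, one in 10a;
# the densities across all phases sum to 1.
```

Note that repeated items count only once: the profile is built from unique items, so parroting the same term many times does not inflate its phase’s density.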
Interestingly, the distribution of Lectical Items from different phases within a given performance can be thought of as evidence of the historical pattern of an individual’s development. In the density profiles shown below, we can surmise that most of the 06c Lectical Items were acquired before the 06d Lectical Items, most of the 06d Lectical Items were acquired before the 07a Lectical Items, and so on.
The figure below shows a density profile for an 8-year-old student. When we consistently see patterns like this one, we feel fairly confident that an individual’s mental development is on track. The individual appears to have acquired new meanings gradually, as they would if they were constructing these meanings by building upon earlier meanings.
The second density profile, below, is not what we would expect to see if the student was acquiring new meanings gradually. This student’s profile is wonky. The student seems to have acquired 08d and 09a meanings in the absence of 08c meanings. In other words, the 08d and 09a densities suggest that new words are being added to this student’s vocabulary, but they are unlikely to be connected to meanings the student has previously constructed. When we consistently see patterns like this one, we can be fairly confident that a student is memorizing vocabulary rather than building new meanings.
It is also possible for students to parrot language from the dilemmas or questions they are responding to. When conducting research, this is something we always need to take into consideration.
Learning outside the Goldilocks Zone
What exactly happens to children when their minds aren’t prepared for the material being taught in school? This is a big, complicated question. I won’t be able to fully answer it here, but I’d like to take a step toward an answer by examining the performances of a group of 8-year-olds who have been introduced to material that is well beyond their Goldilocks Zones.
All of the children included in this analysis attended public schools in Massachusetts and participated in a study of science reasoning conducted in 2013. All were in the third grade.
All of the students in the participating schools had just completed a unit about the water cycle. They were interviewed using Piaget’s critical method, which requires the interviewer to probe responses for explanations in a way that uncovers the respondent’s thinking process.
For this analysis, we scored the interview responses of 18 students, randomly selected from those that had been conducted for the original study. (The sample size was limited in order to make the individual density profiles easier to view. I’m not sure this worked. 😉)
All of the interviews were scored with CLAS.
The following chart shows the density distributions for children’s responses. I’ve stretched the graph vertically to make it easier to examine the individual density profiles of the students. (If we stretched the individual examples above, they would look like the charts below.) As you can see, densities decrease rapidly from 06c to 07c. You will also notice that there is not a great deal of variation in the densities up to about 07b. Between 07b and 07d, the profiles of most students seem to curve in a bit of a cluster, although the distributions of 5 or 6 students look more random. All of the densities above 07c look more random than those below 07c. Above 07d, however, something dramatic happens to most of the profiles. Notice the spurts at 08a, 08c, 08d, and 09a, and the number of densities that drop to 0 between 07d and 09a. Indeed, almost all of the density profiles in this group display the same pattern.
Interestingly, all four of the spurts represent vocabulary that was taught in the unit and included in the assessment prompts. If we omitted this vocabulary, few of the students would have used any Lectical Items assigned to phases above 08c.
The average Lectical Score for 8th graders in our full sample (which is currently composed primarily of children attending inner-city public schools) is around 835 (the same average as the sample in the figure).
Despite the fact that the average Lectical Score is in the lower half of level 8, most students in our sample use vocabulary with meanings that are unavailable until higher levels, suggesting that they have learned something from the water cycle unit. They may even have memorized terminology and definitions well enough to pass a correctness-focused test about the water cycle. Yet, in September of the following year, the 4th-grade teachers of these students complained that none of them seemed to remember a single thing about the water cycle when they arrived in the 4th grade.
The wonky density profiles and teacher reports aren’t the only indicators that the ideas being taught in the water cycle unit were out of range for the students in our sample. A brief look at how these students were using some of the vocabulary associated with the unit (particle, evaporate, evaporation, vapor, condensation, and condense) provides additional support. (In parentheses, I’ve indicated the Lectical Phase each of the following terms is assigned to in the Lectical Dictionary.)
Particle (08d term)
- Student 6: little pieces of something
- Student 1: I think the sun gets too hot and the little particles of air turn into a cloud
- Student 4: I don’t really know what particles mean.
- Student 5: You would see ten particles turning into water, I guess.
- Student 2: The smallest particles could go to ice. Since they’re smaller it might be harder for them to get into that snow place.
Evaporate (09a term)
- Student 11: turns into air, goes into the air, goes up if it’s hot.
- Student 1: the sun causes hotness and the water gets dry and dry and then it evaporates and turns into air.
- Student 2: Because if all of the water is in really small parts, and when it evaporates, it will be in the air.
- Student 6: it evaporated down instead of up. Because from the hot water, like the sun is hot, so it evaporated up from the hot water.
Evaporation (09b term)
- Student 11: a kind of stuff, water moving in air, makes something move through air.
- Student 1: evaporation causes the hot water to rise up which would melt the ice to drop in little droplets like rain.
- Student 5: The heat is the evaporation from the heat, going to the clouds.
Vapor (09b term)
- Student 11: Something invisible
- Student 9: If you go through a cloud, it’s vapor but you don’t really feel it, there’s not as much. It’s mostly just vapor, it’s not cloud.
- Student 10: What does vapor mean?
Condensation (09b term)
- Student 8: something that causes things to get thicker or closer together, droplets of water.
- Student 7: Because they were like the ice would melt and it would form into the other water and the other water would bring up or something like that.
- Student 5: The water’s heat is making sweat drops come up from the inside.
Condense (09b term)
- Student 4: the water goes together and then becomes a fluffy thing in the cloud and then after that, the cloud condenses into the water and they make a whiter cloud.
- Student 3: Well it’s sort of like condensed milk. It’s sort of … it’s like … it’s almost like pudding, almost.
These responses indicate that the students do not understand the terms they are discussing. Some students try to map the new concept onto something they know: “Well it’s sort of like condensed milk” is clearly an attempt to explain condensation. Others admit defeat: “What does vapor mean?” Still others try to repeat back a textbook description: “Because if all of the water is in really small parts, and when it evaporates, it will be in the air.” However, none of these efforts suggests understanding.
From a Piagetian perspective, the wonky density profiles, teacher reports, and lack of understanding evident in students’ responses are not surprising. It is unlikely that these children learned much about the water cycle, because they weren’t ready to integrate the new ideas presented in the unit. They had not yet created the platform for these ideas. At best, their learning time had been wasted.
A reasonable person might argue that a little wasted learning time does not seem like an important disruption of learning. The unit was only 2 weeks long. Not that much learning time was wasted. But what if most of the units being taught in this school were also beyond students’ capabilities? Would this constitute a meaningful disruption of learning? And what if this went on for another 9 years? What kind of impact would all of this wasted learning time have on mental development? What about its psychological effects?
Neither these questions nor those I posed at the beginning of this article can be answered based on the evidence presented here. I’ll need to dig deeper into the data and look at it from different angles in order to answer any of these questions convincingly. I see several more articles on the horizon…
There is one more thing in the density distributions above that bothers me. It involves the difference between the similarity of children’s density distributions prior to 07c and their divergence above 07c. Densities prior to 07c are clearly not identical; it definitely does not look like these children have all developed at the same rate, but the slopes are similar. In fact, up to 07c, changes in the densities from phase to phase pretty much follow patterns that fit our developmental model. Here’s the thing: The typical developmental score of children entering kindergarten is in the 07c–08a range. This is the precise range in which the density patterns of students begin to look a lot bumpier. Could this be evidence that schooling begins to disrupt development from day 1?
Over the next weeks and months, I’ll be taking a longitudinal look at the density profiles and responses of children who were tested multiple times over several years to examine the question: “What is the relation between the shape of density profiles and mental development over time?”