In the last blog post I outlined several problems with our education system. How do we change the education system to lay the foundation for the improved performance we desperately need? It boils down to two basic changes in how we manage our education empire: retool our education leaders to be competent change leaders, and retool the way legislators and the bureaucracy they create specify the education process. I am convinced that with great leadership from a “retooled” leadership cadre, much improvement can be made. Greatly improved leadership would allow the education system to take the initial improvement steps that can be done without other facilitating changes, and it would lay the foundation to take advantage of the changes I propose in how the legislators and education bureaucracy manage our education process. Removing these two roadblocks to improved performance would let us leapfrog the global competition, effectively preparing our kids to compete well in the “real world.”
We differ from most of our competitor nations in that they tend to have education “ministries” in the central government that call the shots on curricula, standards, etc., without much outside interference. Our strongest competitors are deeply committed to improving performance at a rapid pace because they realize that educating their children well is a competitive factor in the global arena. They decide what to do and implement it quickly. We talk problems to death, so that ideas are often obsolete before our extremely ponderous process gets around to doing anything. The current “triple jeopardy” approach to managing and funding our education endeavors creates a straitjacket that greatly limits the freedom of educators to address their own unique situations and to prioritize their efforts accordingly. The Federal government, the State government and the local School Boards all have to get their two cents into the equation. And oh, the time it takes and the convoluted approaches that result. The laws and the regulations written by the bureaucracy are a danger to the future of our kids. I find it ridiculous that on the one hand we say to educators, for example, “You must fix the achievement gap problem,” and then specify a process so tightly that they have no freedom to make the changes required to meet that demand. It is like telling a man wearing leg irons that he must win the Olympic hundred-meter dash or be penalized severely. Of course, in the “soft management” environment of our education system the penalty for the educators is akin to 50 lashes with a wet noodle. The penalty paid by the kids who are not educated to their potential, however, is a huge due bill that is a blight on our society.
The hubris of legislators (advised by “experts” from the education power groups, all with vested interests to protect) believing they can specify a top-down process that will work effectively and efficiently in all situations is ludicrous. This cobbled-together, top-down approach hasn’t accomplished anything but preserving the status quo, with glacial changes, for many decades. The idea that you can specify and control the education process beforehand, adequately providing for all of the myriad impediments that arise and force a detour from the original plan, is idiotic. William Oncken Jr., the famous management writer and trainer, observed that you can only control performance when and where it is happening. That is, you need competent leaders who are free to act to achieve the desired result. Yet whenever pressure builds to “solve the education problem,” the players all revert to the same failed process that virtually guarantees no real progress will be made: they form task groups, take testimony, hold meetings around the state to get public input, and then the legislature specifies more tight processes that won’t work in the real world. Einstein’s definition of insanity, doing the same thing over and over while expecting a different result, comes to mind.
If we thought about it objectively, we would see that the laws need to specify results, not process. It is the results that matter, not the processes by which they were obtained, as long as those processes are ethical and legal. It is very likely that any number of processes can work to meet the goals. The advantage of specifying results is that you unleash organizations to use creativity to develop new processes, some of which will be far superior to those dictated in the past by the legislators and their bureaucratic minions. Thus, if the laws specified the desired results, coupled with appropriate performance incentives, the system would be freed to make very large improvements quickly. Most organizations will gravitate quickly to the best processes developed by the “creative winners” because the penalties for not performing well will demand it. This is a much cleaner approach from an organizational hygiene perspective as well. People freed to really perform well, instead of being stuck in a “robot, do this, do that” rut, will have much higher morale and more fun doing their jobs. This transformation of education workplaces from “sociological zoos” to productive and fun places to work and succeed is vital to the mission of educating our kids to their potential.
This type of improved process development is impossible within the time-honored “we’ll set up a blue-ribbon panel to study the problem and recommend a solution” approach that has been tried unsuccessfully over many decades. Competition among practitioners in the real world will develop superior methods to achieve the desired results much more quickly. This is our big opportunity to finally develop the methods to start beating the competition because we could tap into the vast creative resource that is currently ignored and suppressed. While our global competitors have a focused top down process that is more effective than our current diffuse top down process, they will not be able to compete with an education system that is free to develop positive alternatives in thousands of different organizations across the land where competent change leaders manage the teams required to make the improvements we desire.
Therefore, it is imperative that we demand that legislators move to a results orientation in the education laws they pass, instead of the process approach they have been using ubiquitously. And because retooling education leaders, instilling the skills and providing the coaching needed to be truly effective change leaders, is a prerequisite to leading that development of positive alternatives, it must be done immediately as well. This two-pronged approach could improve our education performance very quickly, to the great benefit of the kids and our society.
Copyright ©Paul Richardson 2008
Monday, December 29, 2008
Saturday, December 20, 2008
The Education Blizzard
If you are a typical busy person, you are bombarded with a blizzard of data on how our schools, and especially the ones in your own neighborhood, are performing. Occasionally you will see information on how our students do compared to their global peers, how well they are prepared for college, how our state compares to other states on the national achievement tests (NAEP), how certain curricula are “the answer” because they are “research-based,” and on and on. This blizzard of data makes it difficult to see the reality of our education performance unless you have the time and disposition to boil the mountain of data down to useful information. I have been digging in great depth for over 5 years and believe I have a good “feel” for the reality of our education situation. Here are some truths I have uncovered:
• Education insiders virtually always put a positive slant on any information they provide on their performance and a negative slant on anything they use to justify less than stellar performance (their mantra is “we confess it is everyone else’s fault”). The “reports” from the educators contain errors of both commission and omission. That is, positive data are shown with a magnifying glass and negative data are ignored or suppressed.
• The education schools started with a bias emphasizing process or method and show very little concern for preparing their graduates to have a competent grasp of the subject matter to be taught. The subject-matter courses within education schools are so weak that they have little or no value. This has been going on for decades and won’t change as long as teacher certification is predicated on ed school training as the main requirement.
• School districts espouse fancy mission statements and long laundry lists of what they call goals but aren’t, because they set no timelines or tight metrics to measure success or failure. These are often posted prominently in board meeting venues and on district websites but are ignored when it comes to acting to bring them about. For example, one local district with lots of “excellent” rated schools in the State Accountability Reports says, “The mission of . . . School District . . . , dedicated to national and international standards of excellence, is to educate every student through a comprehensive and academically challenging curriculum taught in a safe and nurturing environment. We challenge students to pursue dreams, succeed with integrity, and contribute meaningfully to a diverse society.” Being the best in Colorado is not good enough. Does being “among the best of the poor” in international terms fulfill the stated mission? Hardly.
• Colorado CSAP standards are very weak compared to the achievement-test standards of many other states. Thus, we use a very short yardstick to measure performance. If you download The Proficiency Illusion from the Fordham Institute and Assessing the Role of K-12 Academic Standards in States: Workshop Summary from the National Academies Press Online, you will find information on how poorly Colorado ranks in the rigor of its state achievement testing.
• Nationally, our kids, as mentioned in the previous blog post, Walking in Place, do poorly versus their global competitors. Couple that with CSAP’s low standards and you have a huge gap between what Colorado kids need from their education and what they are being provided.
• The State Accountability Reports give Excellent ratings to many schools. This is a “graded on the curve” approach which only says they are in the top tier of schools in Colorado. This says nothing about how they do compared to the “real” global standard our kids face today because that report would not be pleasant to behold and would generate pressure to improve greatly.
• The “research-based” assertion has to be treated with suspicion. As I mentioned in an earlier blog, the What Works Clearinghouse at the US Dept of Education reports all sorts of problems with education research. The two main categories in its findings are studies slanted to the benefit of the research sponsor (the provider of the book, curricula, etc.) and studies poorly done from a statistical-rigor point of view. But the biggest problem I see (and one I have seen no one else talk about) is that most education research starts with a poorly conceived research question, which is the foundation upon which any research rests. For example, the constructivist math curricula I wrote about earlier do pass the test of the research question asked: “Do constructivist math curricula for elementary-level students train students to be able to do simple math operations (with the help of a calculator) such as addition, subtraction, multiplication and division?” The answer is yes. However, the research question that wasn’t asked but should have been is: “Do the constructivist math curricula train students in the simple math operations while providing a strong foundation for the further study of higher levels of math, starting with algebra?” In other words, is the new curriculum an improvement over the one that has been optimized over hundreds of years to provide a seamless transition to higher and higher levels of math study? The answer there is a resounding NO! One thing the constructivist math curricula do, though, is mask the lack of math subject knowledge present in too many teachers, which makes them a popular choice among the “education professional experts” because they are less demanding of them. Thus, my assertion is that the worst slant in education research lies in the slanting of the research question itself.
Okay, we have some problems that, if not addressed, mean our kids will not be able to compete well in the global economy. If it were up to me, there are several things I would change immediately. However, it isn’t up to me; it is up to us. I recommend as a first step that we demand Colorado immediately set the CSAP standards to the level of the state doing the best on the NAEP tests. We don’t need another time-wasting task force to determine what to do. Of course, the whining from the education establishment would be very loud. However, refusing to act for the benefit of the kids because it might inconvenience some educators is not acceptable to me, and it shouldn’t be to you.
Many other steps should be taken but the above is a good first step and by itself would cause lots of other problems to be faced objectively for the first time.
Copyright ©Paul Richardson 2008
Saturday, December 13, 2008
Walking in Place
Two reports on our K-12 education performance have come out in the last few days. The first was the report on the latest round of TIMSS testing. The Trends in International Math and Science Study assesses the performance of students in 4th and 8th grades, providing an international comparison that allows us to see how American children are doing against their peers in 35 other participating countries. TIMSS classifies students into four categories: advanced, high, intermediate, and low. On the surface the results look good: for 4th grade, 10% of our kids scored advanced, which is twice the median rate; for 8th grade, 6 percent of our kids were advanced; and our scores were significantly higher than in the last test cycle. Sounds pretty good, right? The other international test of significance is the PISA, which covers the OECD countries, our biggest trading partners and also our biggest competitors. TIMSS includes other countries at both the high end and the low end. On the high side it includes Chinese Taipei, Singapore and Hong Kong SAR, but it also includes less developed countries like Jordan, Romania, Morocco, and South Africa, and only about a dozen of the 30 OECD countries.
What difference do we see between the tests? Our 8th graders scored 508 on TIMSS math versus the average of 500. On the PISA, though, our 15-year-olds were 24 points below the OECD average math score. Thus, the overall competition in the OECD countries (those that matter most if we are to compete effectively) is tougher. A couple more stats round out the picture: the 90th-percentile score in 23 of the 30 countries is higher than ours, and only 1.3% of U.S. students reached the highest proficiency level on the 2006 PISA math test (the last time it was given). That was half the OECD average, in the same range as Greece, Mexico, Portugal and Turkey.
Mark Schneider of the American Institutes for Research computed effect sizes for the education performance of different entities. Effect size as he uses it is the “standardized difference” between the means of the distributions of different groups, expressed in standard-deviation terms. Thus, when he reports an effect size of 1.1 for U.S. versus Hong Kong for 4th grade math, he means that the mean of Hong Kong 4th grade math scores is 1.1 standard deviations above the mean of the U.S. students. That is a huge difference, meaning that 86.4% of Hong Kong students score above the mean U.S. score. Other 4th grade comparisons he gives are an effect size of 0.8 for Massachusetts vs. Mississippi based on the NAEP, and an effect size of 1.5 for U.S. public schools with the lowest levels of poverty vs. those with the highest levels, based on TIMSS data. This means that in the lowest-poverty schools 93.3% of kids would score above the mean score for those in the highest-poverty schools. He gives other examples for 4th and 8th grade students, but you get the picture. We have a huge amount of improvement to make to prepare our kids to compete with their global peers.
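The percentages above follow directly from the effect size, assuming both groups’ scores are roughly normal with similar spreads: the fraction of the higher-scoring group that falls above the other group’s mean is simply the standard normal CDF evaluated at the effect size. A quick sketch in Python (the function name is my own, for illustration):

```python
from statistics import NormalDist

def fraction_above_other_mean(effect_size: float) -> float:
    """Fraction of the higher-scoring group expected to score above
    the other group's mean, assuming both score distributions are
    normal with equal standard deviations (the usual effect-size model)."""
    return NormalDist().cdf(effect_size)

# Hong Kong vs. U.S., 4th grade math (effect size 1.1)
print(f"{fraction_above_other_mean(1.1):.1%}")  # 86.4%
# Lowest- vs. highest-poverty U.S. schools (effect size 1.5)
print(f"{fraction_above_other_mean(1.5):.1%}")  # 93.3%
```

Both outputs match the figures Schneider reports, which is a good sanity check on the equal-variance assumption behind them.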
The second big report out this week was the college remediation report from The Colorado Commission on Higher Education. The title of the press release, “College remediation rates stuck at 30 percent,” summarizes well the report’s finding that remediation levels are not improving. The report gives lots of data from both 2-year and 4-year colleges: at two-year schools, 53 percent of students had to take at least one remedial course, while the overall rate was about 21 percent at four-year schools. The biggest remediation need is in math, although reading and writing have significant levels as well. The report also tabulates remediation rates by where students went to high school. The lowest rate was 5.6 percent at Jefferson County’s D’Evelyn High School; the highest was 80.8 percent at Denver’s West High. Also, remember that the report probably doesn’t capture the full extent of the remediation problem. The document notes that “the data do not include recent graduates who enrolled in an out-of-state college, delayed entry into higher education for at least one year after completing high school, were not assessed [for remedial needs],” or for whom data were missing.
For more data on your own school district and high schools, you can download the full report, “2008 LEGISLATIVE REPORT ON REMEDIAL EDUCATION December 11, 2008.” The best-performing high schools in the Pikes Peak Region appear to be in the 15% range for remediation, while the rest range significantly higher.
Should we be satisfied with either the international testing results or the remediation results? NO! It is difficult to sort through the School Accountability Report “excellent” ratings for some local schools and realize that the criteria are weak compared to the “real world” our kids face when they get out of school. But it is clear that until the public demands better, the huge inertia in the education establishment will keep any improvements at a walking pace, which might barely maintain our poor place in the global education comparisons. We will need to make our pace of change much faster if we hope to gain on the competition. The current ostrich mentality of “maybe if we ignore it, the problem will go away” isn’t working.