Lexia® LETRS® Efficacy Research, 7/20/2023
https://www.lexialearning.com/resources/research/lexia-letrs-efficacy-research

LETRS: Weasel or Energy Efficient Light Bulb?
Efficacy = the power to produce an effect.

One of the impacts of the Read Act here in Minnesota is that many teachers will be forced to go to one of three state-approved re-education programs for professional development (see Figure 1).

Figure 1. State-approved "professional development" for teachers.
Approved Minnesota READ Act-Funded Professional Development Programs
1. CAREIALL: Advancing Language and Literacy – Center for Applied Research and Educational Improvement (CAREI, University of Minnesota)
2. OL&LA: Online Language and Literacy Academy – Consortium on Reaching Excellence in Education (CORE)
3. LETRS: Language Essentials for Teachers of Reading and Spelling (Lexia); LETRS, LETRS for Administrators, and LETRS for Early Childhood Educators

Evidence-Based Professional Development

Now I absolutely agree that continued professional development for teachers is essential for the health of our schools. It is not possible to create a finished teaching product in three semesters of any teacher preparation program. Teachers must receive continuous, legitimate, high-quality education in order to continue their evolutionary journey toward being and becoming a master teacher. If you're not evolving, you are devolving.

And since part of the Read Act calls for "evidence-based" instruction, I am absolutely certain that we can all be highly confident and absolutely sure that the three professional development programs identified by the Minnesota Department of Education and listed in Figure 1 are indeed evidence-based. No? Of course they are. Because if they weren't, that would mean the whole Read Act was a mirage, wrapped in an illusion, and swaddled in a boondoggle.
It would also mean that Representative Heather Edelson, the sponsor of this bill, was a gullible bunny with lips stained red from drinking all the profit-based Kool-Aid.

So, what does it mean to be "evidence-based"? According to the Read Act, evidence-based means "the instruction or item described is based on reliable, trustworthy, and valid evidence and has demonstrated a record of success in increasing students' reading competency in the areas of phonological and phonemic awareness, phonics, vocabulary development, reading fluency, and reading comprehension" (The Read Act).

Since LETRS is one of these state-approved professional development programs, and because this is listed on the Minnesota Department of Education website, we can all be absolutely certain that it's "based on reliable, trustworthy, and valid evidence and has demonstrated a record of success in increasing students' reading competency in the areas of phonological and phonemic awareness, phonics, vocabulary development, reading fluency, and reading comprehension."

Science of Reading

The Read Act mandates that all reading instruction in Minnesota be based on the science of reading. This sounds like a very good thing. We know that science is a good thing, and using science in reading instruction is a good thing. But what exactly is meant by the "science of reading"?

The Reading League defines the science of reading as "a vast, interdisciplinary body of scientifically-based research about reading and issues related to reading and writing. This research has been conducted over the last five decades across the world, and it is derived from thousands of studies conducted in multiple languages.
The science of reading has culminated in a preponderance of evidence to inform how proficient reading and writing develop; why some have difficulty; and how we can most effectively assess and teach and, therefore, improve student outcomes through prevention of and intervention for reading difficulties" (The Reading League).

Just listen to the important words here: interdisciplinary, scientifically-based research, thousands of studies, preponderance of evidence, improve student outcomes, prevention of reading difficulties. Who could possibly argue with these important-sounding words? But it's still a bit unclear. What exactly is meant by the science of reading?

For a better understanding, let's turn to Dr. Timothy Shanahan. In a recent article he published in Reading Research Quarterly, the good Dr. Shanahan explained that the SoR, as commonly understood today, refers to the exclusive use of strategies that have been shown to be effective using controlled experimental research conducted in actual classroom learning environments. Accordingly, this is the only type of research that should be used to design reading programs and make reading policy.

This is very important. It enables us to most assuredly, most indubitably, without question, know that the three Professional Development Re-Education Programs mandated by the Minnesota Department of Education have all been "shown to be effective using controlled experimental research and conducted in actual learning environments." Of course! We would expect nothing less from the Minnesota Department of Education.

LETRS

This podcast examines one of these: Language Essentials for Teachers of Reading and Spelling (Lexia), or LETRS. I wanted to find the "reliable, trustworthy, and valid evidence" that "has demonstrated" that LETRS had "a record of success in increasing students' reading competency in the areas of phonological and phonemic awareness, phonics, vocabulary development, reading fluency, and reading comprehension."
I was eager to start reading all the research showing that LETRS professional development had a demonstrated record of success in increasing students' reading competency. Specifically, I was looking for three things:

1. A vast, interdisciplinary body of scientifically-based research linking LETRS to improved teaching performance.
2. A vast, interdisciplinary body of scientifically-based research linking LETRS to improved student reading outcomes.
3. A vast, interdisciplinary body of scientifically-based research providing evidence that LETRS was more effective than other types of professional development in improving teacher performance or student reading outcomes.

I knew that I would find the research conducted over the last five decades across the world, derived from thousands of studies in multiple languages, to support the proposition that LETRS enabled teachers to teach better and readers to read better, and that LETRS was better than something else. But, alas and alack, when I searched the database at Minnesota State University, nothing came up. Hmmm. Hmmm.

Thank You, Lexia Learning!!

However, after a short Google search I did come across the Lexia® website (www.lexialearning.com). Lexia publishes LETRS. According to the website, "For 40 years, Lexia® has led the science of reading revolution helping educators create real literacy change." The site says that Lexia Learning contains "Science of Reading K-8 Solutions." Who doesn't like solutions? And since it had pictures of happy smiling teachers and happy smiling children, I just knew I could trust this to be an accurate and reliable source of information.

On this website, I found what was called Lexia® LETRS® Efficacy Research. It was published on 7/20/2023 (https://www.lexialearning.com/resources/research/lexia-letrs-efficacy-research). Efficacy means the power to produce an effect. Wow! This is exactly what I was looking for.
Research to show the power to produce an effect, which in this case would be improved teaching and improved student outcomes. Thank you, Lexia Learning. I knew that I could depend on you.

LETRS Efficacy Research

The introduction to the Efficacy Research told me that "LETRS teaches the skills needed to master the fundamentals of reading instruction" (p. 1), and that "Educators who complete LETRS gain the deep knowledge needed to be literacy and language experts in the science of reading." Excellent! Now we're cooking with beans.

But then I ran across a confusing passage. It said, "Qualitative research and non-causal quantitative research can offer important and unique insights into the nuances of educator experiences and the factors that shape their use and perceptions of LETRS." Well, yes. It is important that we embrace the full spectrum of research methodologies in coming to understand reading instruction. But this is not what the SoR allows. The science of reading is understood to be the exclusive use of strategies that have been shown to be effective using controlled experimental research conducted in actual classroom learning environments. This is the only type of research that should be used to design reading programs and make reading policy. But there it was on page 1: "qualitative and non-causal quantitative research." How was it possible to find causes using non-causal quantitative research? And is it possible that Lexia Learning and LETRS would be held to a much lesser scientific standard than the reading teachers in Minnesota?

Key Findings

The primary purpose of LETRS is to "improve teacher knowledge and instructional practice" (p. 3). The report says that "the weight of empirical evidence suggests it can improve teacher knowledge and instruction when used as intended" (p. 3). Empirical means data collected by observation or experience. You can measure it. And in this case, there was a weight of it. A weight of empirical evidence.
That must mean there's a whole bunch of evidence that can be measured.

Because I have nothing else going on in my life, I read the LETRS Efficacy Research and examined each of the 18 research studies published to date that Lexia said "constitutes the evidence base for LETRS" (p. 1). Below are the five key findings with my comments below each.

#1. Improved teacher knowledge and practice. Teachers who completed LETRS training demonstrated higher levels of knowledge and improved instructional practice across a variety of objective and self-rated measures.

Analysis: This is not accurate. Instead, teachers who completed LETRS training demonstrated higher levels of LETRS knowledge, as we would expect. If you teach something, students are generally going to score higher on measures of that something. However, there was nothing in any of the research to suggest that LETRS knowledge was linked to improved teaching performance. Also, in most cases teachers completed a survey indicating their perceptions and beliefs about things. Self-rated measures are hardly empirical evidence.

#2. LETRS often implemented with other interventions. Schools, districts, and states that implement LETRS often do so alongside other large-scale initiatives. Educators variously perceive these initiatives as helping or hindering LETRS implementation.

Analysis: This key finding doesn't tell us anything. LETRS is often implemented alongside other large-scale initiatives. Okay. So what? Educators have perceptions of these initiatives. Okay. Again, so what? There was nothing in the research that linked LETRS with teacher practice or student outcomes. Also, teachers perceived the initiatives as either helping or hindering LETRS implementation, which tells us nothing about the efficacy of LETRS. And the only choices they have are that it helps, hinders, or does nothing. Perceptions and implementations. So what?
Also, I reviewed all the studies. This was NOT a major takeaway from any of them.

#3. Implementation linked to improved outcomes. Positive teacher outcomes were most likely to be observed in studies that reported moderate to high levels of implementation.

Analysis: This key finding is blatantly false. There were no teacher outcomes reported in these studies, only teachers' perceptions of their knowledge or knowledge related to LETRS. There were no outcomes reported related to teacher practice or student reading achievement. Perceptions are not outcomes.

#4. Educators perceive LETRS learning to be essential. Studies that address educator perceptions of LETRS suggest that educators view their learning as playing a positive, if not essential, role in improving student reading.

Analysis: Only the studies that addressed educators' perceptions of LETRS. A perception is not a measurable outcome. It is a perception. There is no hard data to tie LETRS to teacher performance or student outcomes. Also, this is a self-selected population. Only educators who took the LETRS course found it to be helpful. Unless it was mandated, they wouldn't be taking the course unless they thought it might be helpful. The conclusion here is a stretch.

#5. LETRS demonstrates remarkable adaptability. LETRS has been implemented in a variety of contexts, ranging from single schools to state-wide multicomponent literacy initiatives. While careful implementation planning is always warranted, challenging contexts may call for support from Lexia's Customer Success Management.

Analysis: There was nothing in any of the 18 studies related to "remarkable adaptability." Also, saying that some contexts may call for support from Lexia's Customer Success Management is hardly a ringing endorsement for LETRS. It certainly says nothing about its efficacy.

I'll be looking specifically at the research in my next podcast. But at this point I'm asking myself if the people who wrote this report are weasels.
Or energy efficient light bulbs. A weasel is defined as a deceitful or treacherous person. An energy efficient light bulb is one that is not too bright.