We talk all about Education.


Return on Instruction (ROI) Revisited

The pursuit to improve never ends, nor should it. With all of the disruption we see as a result of the 4th Industrial Revolution, changes to how we educate kids have to be considered. The result has been districts, schools, and educators investing a great deal in an array of ideas, strategies, and solutions with the goal of improving learning for all kids. Obviously, this makes sense, and I am all for it. However, caution must be exercised when there is an urge to purchase the next “silver bullet” or embrace ideas that sound great on the surface but have little to show in terms of evidence of improvement at scale. Results, both qualitative and quantitative, matter, and that is something everyone should be mindful of.

Over the years, I have not shied away from discussing the need to align ideas and strategies to research as well as evidence that shows, in some way, that there is an improvement in student outcomes. Seems fair and reasonable, right? You would think so, as the teacher and principal in me knows full well that results matter, especially when dealing with increased mandates, initiative overload, limited time, and a lack of money. Herein lies why the allure of the “next best thing” is so compelling and why everyone is so quick to jump on the bandwagon. Just because something sounds or looks good doesn’t mean that it is, plain and simple. This applies to what is seen on social media, in marketing assets, and at conferences. Being a critical consumer is more important than ever. I’d even go as far as to say that it is our duty, something I elaborate on at length in Digital Leadership.


However, it is essential to go beyond just the consumption aspect outlined above and be just as critical during the implementation phase. A sound strategic plan focuses not only on where you want to go and how you will get there, but also on a set of measures for success and a determination of how things went. A few years back I tackled this through strictly a technology lens and brought forward the concept of a Return on Instruction (ROI), borrowing from the term “return on investment” that is synonymous with virtually every profession. In retrospect, this was shortsighted and did not encompass the many competing and complementary elements that are pursued simultaneously. Below is my evolved take:
 "When investing in technology, programs, professional development, and innovative ideas, there needs to be a Return on Instruction (ROI) that results in evidence of improved student learning outcomes." 
If you take a look at my original post, you will see that evidence comes in many forms, not just data. The bottom line is this: why make an investment to improve teaching, learning, and leadership and have nothing to show for it? That would prove to be quite frustrating, to say the least. To be clear, I am not talking about fluffy ideas or opinions, but substantive changes to practice that lead to real improvement. So how can you determine an ROI? Some guiding questions that might help are below:

  • How have instructional design and pedagogy changed?
  • How has the scaffolding of both questions and tasks changed?
  • How have student work and products changed?
  • How has assessment changed?
  • How has feedback changed?
  • How has the use of data changed?
  • How has the learning culture changed?
  • How has leadership changed?
  • How has meeting the needs of students who need specialized supports changed?
  • How has professional learning changed? 

The questions above will be answered differently, as each district, school, and educator, as well as the respective culture, is unique. The key is to think broadly about financial and time investments to determine if, in fact, they are paying off. Both are important. Another aspect to consider is realism. In the end, results matter.

Are You a Critical Consumer?

Digital literacy is more important now than it has ever been. The exponential evolution of the Internet and social media tools has allowed for the quick sharing of knowledge, ideas, images, videos, and opinions. The result has been a double-edged sword. In one respect, everyone with a smartphone has instant access to information at any time and from anywhere. I for one love the fact that I can get up-to-the-minute news, sports scores, and weather in the palm of my hand. However, there is a downside that is beginning to plague society. We have seen an influx of misinformation, claims of “fake” news, inaccurate facts, distortion of the truth, broad claims, doctored results, and opinions without much substance behind them. Now more than ever we must not only teach our kids to be critical consumers of digital content, but we must also model the same.



The education space is not immune to some of the prevalent issues and challenges described above. This is not to say that amazing ideas and strategies aren’t being shared. In fact, I for one benefitted greatly as a principal when I learned about something shared on social media and then either implemented or adapted it in a way that bolstered the transformation efforts at my school. Case in point: as we explored moving toward Bring Your Own Device (BYOD) in 2010, I was able to glean powerful insights and evidence of efficacy from the Forsyth County School District in Georgia. The content they shared included policies, procedures, pedagogical techniques, and professional development, but more importantly, tangible improvement results. Motivated and inspired, I then began to seek out research and more examples of successful implementation that aligned with our goals while addressing specific challenges.

Going BYOD sounded like a great idea based on what I had either read or seen online. However, not everything I consumed addressed the realities we faced as a school. Some of it was too “fluffy” or not practical. When reading a blog post or article, it is important to look beyond what sounds good in theory but might not lead to improvement in practice. Going beyond surface-level opinions and ideas is really at the heart of critical consumption. Since many of my queries went out through Twitter at the time, that is how I received the majority of the information I consumed. As more and more tools and pathways have emerged to allow educators to share, it is incumbent upon all of us to take a more in-depth look so that something isn’t done just for the sake of doing it or because it sounds really good.
“Just because something sounds good on Twitter or looks good on Pinterest doesn’t mean it is an effective practice.”
The quote above has really helped ground my approach to what I consume and then ultimately use to improve professional practice. It also extends well beyond social media to articles, books, keynotes, workshops, and presentations. We must acknowledge that alongside all of the great ideas and strategies there is an equal amount that just isn’t very good, regardless of the hype surrounding them. By not good I mean that there will be difficulties in either implementing at scale or showing, not just talking about, better results. To assist in taking a critical lens to what we see or hear, consider the following questions:
  • Why is this idea or strategy good for my classroom, school, district, organization or professional growth?
  • How will it positively impact learners beyond just engagement?
  • Does it align to peer-reviewed research?
  • Is it realistic given cultural, budgetary, demographic, socioeconomic, and facility challenges?
  • What qualitative and quantitative measures can be used as evidence to validate whether or not it is effective at improving outcomes? 
  • How can it be sustained and scaled?

Sharing will not and should not stop. Becoming a connected educator changed my entire trajectory thanks to what I was, and continue to be, able to glean from my Personal Learning Network (PLN), in addition to the array of other means of getting information discussed in this post. It is up to you to be a critical consumer and separate quality from what in theory seems like a great idea, but in practice won’t get the results that learners and educators are seeking. Sounding good just doesn’t cut it when the bold new world demands more from our learners.


Measuring Impact with the Digital Practice Assessment (DPA)

Note: This post is directly related to my work at the International Center for Leadership in Education.

Efficacy has been on my mind a great deal as of late, and as a result, it has been reflected in my writing. When I think back to the successful digital transformation and implementation of innovative practices at my former school when I was a principal, the key driver for us was the ability to show, not just talk about, evidence of improvement. By combining both quantitative and qualitative measures, we were able to articulate the why, how, and what, as well as the detailed process that went into each respective change effort. The “secret sauce” in all of this was the strategic use of digital tools to proactively share the details of our efforts and the resulting impact.


Image credit: http://www.assafh.org/

During my tenure as a principal, I was always in search of tools and processes to help measure the impact of the changes we were implementing. Unfortunately, nothing existed. As I work with schools and districts on a weekly basis, I am often asked how they can determine the impact and effectiveness of the many innovative initiatives they have in place: practices such as BYOD, 1:1, blended learning, personalized learning, classroom and school redesign, branding, makerspaces, professional learning, and more. This need served as a call to action of sorts and catalyzed my current work. As a Senior Fellow with the International Center for Leadership in Education (ICLE), I have worked with a fantastic team to develop services and tools to help districts, schools, and organizations across the world transform teaching, learning, and leadership. One of these tools is the Digital Practice Assessment (DPA).

The DPA creates the context for our work with leaders and teachers, providing authentic baseline data to support personalized professional learning. It begins by examining the strategies in place at each school or district that support student learning with technology in the areas of rigor, relevance, relationships, engagement, and overall culture. The process then moves to understanding the current leadership practices in place to successfully implement technology and innovative practices, aligned to the 7 Pillars of Digital Leadership & Learning (Student Learning, Learning Spaces & Environment, Professional Growth, Communication, Public Relations, Branding, and Opportunity). 


Through this proven model, our consultants can help schools and districts identify opportunities to begin their transformation or take their digital and innovation goals to the next level, leveraging the knowledge, experience, and practice of ICLE’s thought leadership. The DPA process consists of a combination of a self-reflection questionnaire rubric, on-site observations, and online inventories for data and evidence collection. We then leverage evidence-based rubrics to observe leadership and instructional practices while collecting artifacts that provide evidence of effective digital learning and innovative professional practice. Once the data is collected and analyzed, a detailed summary report outlining areas of success, focus opportunities, and recommended next steps guides the professional learning partnership with ICLE, supporting the development of a strategic professional learning and implementation plan.

Below is a summary of the DPA process:

Step 1: The Pillars of Digital Leadership Questionnaire is completed by the district or school. This 18-question rubric asks school leaders to reflect on their perceptions of where their school falls on a continuum from not yet started to well developed. During this reflective process, school leadership teams are expected to collect and document aligned evidence for each item. This information is completed and archived in the Professional Learning Portal (PLP), a free digital platform developed by ICLE to support schools in data collection, reflection, and goal setting as they grow and improve. The baseline evidence shared is in the context of digital leadership and learning (including examples of data, lesson plans, unit plans, student work, PLC minutes, rigorous digital performance tasks, walk-through forms, assessments, sample observations/evaluations, portfolios, PD plans, social media accounts, pictures, videos, press releases, media coverage, partnerships, etc.).

Step 2: On-site observations and interviews are conducted by consultants to validate the perceptions and evidence collected for the seven Pillars of Digital Leadership Questionnaire, along with targeted classroom observations of student learning aligned to rigor, relevance, and engagement. Additional data is collected and archived in the PLP during classroom observations. The idea is to engage school leaders in dialogue about their culture, student learning, and digital integration, no matter where they are with their digital transformation.

Step 3: The data and evidence are tightly aligned to ICLE’s research-based rubrics to provide a detailed view of where a district or school is with its digital transformation. The data and artifacts are analyzed, leading to a summary report that details the current state of practice at each school or in the district.

Step 4: The DPA report is shared and discussed with the school leadership team. In partnership with ICLE, observations about the evidence collected are shared and discussed. During the strategic planning process, discussions focus on areas of strength and improvements to develop a tailored and personalized implementation plan.

Step 5: Ongoing professional learning is implemented, and progress is monitored and documented through the online Pillars of Digital Leadership Questionnaire to determine the efficacy of the digital transformation.

The DPA process has been created to support districts and schools looking for ways to measure and articulate the impact of technology and innovation on practice. While data is valuable, the DPA moves beyond it as the only metric for success by taking a critical lens to the array of strategies and practices that combine to create a thriving learning culture.

The DPA doesn’t just look at technology and innovation. It also provides insight into all elements of school culture and student learning. In addition to being informed by a broad body of research and driven by evidence, the DPA process is also aligned to the following:


We don’t know where we are and how effective change is until steps are taken to look critically at practice. We hope that through the DPA process we can help you develop, refine, measure, and then share amazing examples that illustrate how efficacy has been attained.  

If you are looking for a method of determining where you are and where you want your district or school to be in the digital age, please contact Matt Thouin at ICLE (MThouin@leadered.com).  He can arrange for an interactive and detailed look at the DPA rubrics and process as well as the PLP platform from the convenience of your home or office.  We look forward to supporting you on your journey toward systemwide digital transformation. If you have any questions for me, please leave them in the comments below.

Copyright © by International Center for Leadership in Education, a division of Houghton Mifflin Harcourt. All rights reserved. 

Efficacy in Digital Learning

As a principal, the buck stopped with me. I was reminded of this by numerous superintendents during my tenure as a school leader. However, when we began moving forward with our digital transformation, one particular superintendent asked me point-blank what evidence I had that actually supported our claims that new equated to better. This not only stopped me in my tracks, but that moment in time provided the grounding that my school and I really needed. For change to really be embraced by all stakeholders, it is critical that we don’t just tell and claim that improvement is occurring, but that we also show it.

Accountability matters and is a reality in our work. We are accountable first and foremost to our learners. As a supporter of the purposeful use of technology and innovative practices, I had to illustrate how effective these strategies were at improving learning. Statements and claims didn’t cut it, and this was more than fair. It was at this time that the term efficacy kept finding its way into the conversation and my head. In the real world of education, efficacy matters, and it is important that this is part of the larger conversation when it comes to digital learning. It is a word that, in my opinion, has to be a part of our daily vocabulary and practice. Simply put, efficacy is the degree to which desired outcomes and goals are achieved. Applying this concept to digital learning can go a long way toward solidifying the use of technology as an established practice, not just a frill or add-on.

The journey to efficacy begins and ends with the intended goal in mind and a strong pedagogical foundation. Adding technology or new ideas without this in place will more than likely not result in achieving efficacy. The Rigor Relevance Framework gives schools and educators a check-and-balance system by providing a common language for all, creating a culture around a common vision, and establishing a critical lens through which to examine curriculum, instruction, and assessment. It represents a means to support innovative learning and digital practice, as detailed in the description of Quad D learning:
Students have the competence to think in complex ways and to apply their knowledge and skills they have acquired. Even when confronted with perplexing unknowns, students are able to use extensive knowledge and skill to create solutions and take action that further develops their skills and knowledge.
Aligning digital to Quad D not only makes sense but also melds with a great deal of the conversation in digital and non-digital spaces as to why and how learning should change.  A framework like this emphasizes the importance of a strong pedagogical foundation while helping to move practice from isolated pockets of excellence to systemic elements that are scaled throughout the learning culture.  It also provides the means to evaluate and reflect in order to improve. 


Rigor Relevance Framework

Once an overall vision for digital learning is firmly in place, you can begin to work on the structures and supports to ensure success. This brings me back to efficacy. The why is great, but the how and what have to be fleshed out. Determining whether technology or innovative practices, in general, are effective matters. Below I will highlight five key areas (essential questions, research, practicality, evidence/accountability, reflection) that can put your classroom, school, district, or organization on a path to digital efficacy.

Essential Questions

Questions provide context for where we want to go, how we’ll get there, and whether or not success is achieved.  Having more questions than answers is a natural part of the initial change process. Over time, however, concrete answers can illustrate that efficacy in digital learning has been achieved in some form or another.  Consider how you might respond to the questions below:

  • What evidence do we have to demonstrate the impact of technology on school culture?
  • How are we making learning relevant for our students?
  • How do we implement and support rigorous and relevant learning tasks that help students become Future Ready?
  • What is required to create spaces that model real-world environments and learning opportunities? 
  • What observable evidence can be used to measure the effect technology is having on student learning and achievement?
  • How can targeted feedback be provided to our teachers and students, so that technology can enhance learning?

Research

Research is prevalent in education for a reason. It provides us all with a baseline as to what has been found to really work when it comes to student learning. Now, there is good research and bad. I get that. It is up to us as educators to sift through and then align the best and most practical studies out there to support the need to transform learning in the digital age. We can look to the past in order to inform current practice. For example, so many of us are proponents of student ownership, project-based learning, and collaborative learning. Not only does digital support and enhance all of these, but research from Dewey, Vygotsky, Piaget, Papert, Bloom, and many others provides validation. See the image below. For more on authorship learning click HERE.




One of the main reasons Tom Murray and I wrote Learning Transformed was to provide a sound research base that supports digital learning and the embrace of innovative practices. The research of Linda Darling-Hammond found that technology can have the most impact on our at-risk learners when it is used to support interactive learning, to explore and create rather than “drill and kill,” and when it constitutes the right blend of teachers and technology. This is just one of over 100 studies we highlight. Then there is John Hattie’s comprehensive analysis of effect size, a listing of the most effective instructional strategies for improving student learning outcomes, all of which can be applied to digital learning. If efficacy is the goal, embracing a scholarly mindset to inform and influence our work, not drive it, is critical.
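For readers less familiar with the statistic, an effect size is a standardized mean difference, which allows very different studies to be compared on one scale. A minimal sketch of the standard calculation (not Hattie’s full synthesis methodology) is:

\[ d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{SD_{\text{pooled}}} \]

Hattie generally points to an effect size of roughly d = 0.40 as the “hinge point” worth aiming for, a useful benchmark when weighing any strategy, digital or not, against your own evidence.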

Practicality

All of what we do should align to the demands, and at times constraints, of the job.  This includes preparing students for success on standardized tests. If it’s not practical, the drive to implement new ideas and practices wanes or never materializes.  The creation of rigorous digital performance tasks that are aligned to standards and the scope and sequence found in the curriculum is just good practice. All good performance tasks include some form of assessment, either formative or summative, that provides the learner and educator with valuable information on standard and outcome attainment.  Again, this is just part of the job. 

The Rigor Relevance Framework assists in creating performance tasks that engage learners in critical thinking and problem solving while applying what they have learned in meaningful ways. There is also a natural alignment with incorporating student agency. This is exactly what so many of us are championing. My colleague and good friend, Weston Kieschnick, has created a template that combines research and the practical aspects of performance task creation. Check it out HERE. You can use the template to go through the process of developing a rigorous digital performance task, or just use it to inform your thinking as you design your own.

Evidence and Accountability

As many of you know, I do not shy away from openly discussing how important this area is. Just go back to my opening paragraph in this post for a refresher. Evidence and accountability are a part of every profession, and quite frankly we need more of both in education, not only to show efficacy in our work but also to scale needed change. Not everything has to be, or can be, measured. However, focusing on a Return on Instruction allows everyone to incorporate multiple measures, both qualitative and quantitative, to determine if improvement is in fact occurring.

Reflection

When all is said and done, the most important thing we can do is constantly reflect on our practice. In terms of efficacy in digital learning, consider these reflective questions from your particular lens:

  • Did my students learn? 
  • How do I know if my students learned? 
  • How do others know if my students learned? 
  • What can be done to improve? 
  • What point of view have I not considered?

Amazing things are happening in education, whether through digital learning or the implementation of innovative ideas. We must always push ourselves to be better and strive for continuous improvement. The more we all push each other on the topic of efficacy, the closer we come to achieving the collective goals we have for education, learning, and leadership.


Be the Example

“The world is changed by your example, not by your opinion.” – Paulo Coelho

Change is hard. I have been writing about this fact for years now. It becomes even harder when we are not modeling the expectations that we set for others. This was the case for me early on during my days as a principal. When it came to technology and innovation, I was great at telling others what they should be doing. After getting on Twitter in 2009, I realized that we had to be better for our kids. As such, I did what I was trained to do and what I thought was the most logical course of action to get buy-in from my staff. I drafted memos and emails that provided guidance and examples. I spent a great deal of time writing numerous detailed memos on everything from technology tools to improve assessment to developing a Personal Learning Network (PLN) and embracing innovative ideas. Then I waited.

The wait for any sort of change was never-ending. I probably would still be waiting if I hadn’t grabbed a teacher I had a good relationship with and asked him why no one was embracing all these new ideas and strategies I was pushing out. He was pretty blunt, and to this day I am indebted to him. Basically, he told me that no one was integrating technology or implementing innovative ideas because I wasn’t doing any of it myself. His words and simple advice provided a great lesson in leadership.



Asking others to do what we are not doing, or have not done, ourselves doesn’t lead to meaningful change. Research supports this claim. James Kouzes and Barry Posner have researched the topic of leadership for over 30 years, looking at thousands of leaders in a wide range of industries throughout the world. Below are some key takeaways in relation to being the example:

Eloquent speeches about common values are not nearly enough. Exemplary leaders know that it’s their behavior that earns them respect. The real test is whether they do what they say; whether their words and deeds are consistent. Leaders set an example and build commitment through simple, daily acts that create progress and build momentum.
The personal-best projects we studied were distinguished by the fact that all of them required relentless effort, steadfastness, competence, and attention to detail.  It wasn’t the grand gesture that had the most lasting impact. Instead it was the power of spending time with someone, of working side-by-side with colleagues, of telling stories that made values come alive, of being highly visible during times of uncertainty, of handling critical incidents with grace and discipline, and of asking questions to get people to focus on values and priorities.

Leadership is not about telling people what to do, but instead taking them where they need to be. Setting an example through your own practice illustrates to others that change is a shared endeavor. It is about the collective, where a title, position, or power doesn’t give someone a pass. When all is said and done, leadership is about action, not talk and opinion (or memos and emails in my example). Setting an example and modeling are the first step. The next is a combination of support, accountability, and evidence that leads to efficacy. When everyone sees how the change(s) actually improve teaching, learning, and leadership, the path to sustainability begins.

In my case, I began to learn how to use certain technology tools, after which I made myself available to train my staff after school. I made my learning through a PLN visible and used the newly acquired knowledge and skills during training sessions, faculty meetings, observation post-conferences, and evaluations. The practice of modeling expectations actually strengthened the emails encouraging my staff to improve their practice. Over time, change took hold, and evidence of improvement bolstered our resolve to keep pushing the envelope. Together we were then able to show efficacy aligned to technology use and innovative ideas.

Take time to reflect on whether or not your words are supported by appropriate actions. Change must be a collaborative process if it is to be successful. Showing others not just that you are willing to learn, but how changes to practice actually improve teaching, learning, and leadership, can and will have a lasting impact. Evidence matters, and when it is aligned with the example you set, no goal is out of reach. In the end it’s not about what is said, but what is done. Be an example that empowers others to change.


Own What You See

As a former science teacher, I was always a fan of the scientific method. It was a great process for students to actually do science in order to learn, by designing an experiment to deeply explore observations and to develop and answer questions. The process itself was guided by inquiry, problem solving, and reflection. I fondly remember developing and testing out numerous hypotheses in the many science courses I took in high school and college. This type of learning was messy, unpredictable, and challenging, but it was also fun. I think I refuted more hypotheses than I validated, but the learning experience kept driving me to pursue eventual degrees and a teaching certificate in the sciences.

Even though my science teaching days are long behind me, the scientific method has always stuck with me, as there are direct applications to leadership. Leaders must constantly make observations and own what they see. In the context of education, leaders must challenge the status quo if observations lead to the conclusion that a business-as-usual model is prevalent. What is seen, or not seen, can be a powerful tool to develop critical questions that can drive needed change or improvement.


This is extremely important regarding instruction. As a leader, do you really know or have a good handle on what is happening in your classrooms daily? Does your school or district work better for kids or adults? How do you know if technology and innovative practices are actually improving learner outcomes? Owning what you see requires improving observation and evaluation practices. The first step is to get into classrooms more, not only to make observations but also to begin collecting evidence that either validates or refutes the claims of improvement that are now heard more and more. Getting into classrooms both formally and informally can provide a much-needed critical lens to support professional practice while also building powerful relationships in the process.

Owning what you see doesn’t just have to come from being physically present to make observations. Developing strategies to ensure a return on instruction through the collection of standards-aligned artifacts (lesson plans, projects, student work) and portfolios can clearly illustrate whether changes to professional practice are occurring or not.  Making observations and looking at evidence (or lack thereof) can lead to more questions that can drive change. This is a good start, but ultimately owning what you see requires action that results in improved outcomes. The more we can quantify this through multiple measures the better our chances are of initiating sustainable change that improves learning for all.

When you look around your building(s) or classrooms what do you see?
