GADDIS EDUCATION CONSULTING

Thoughts, ideas, and lessons learned.

The Power of 15 Minutes

9/20/2025

When I reflect on my time leading education teams, one lesson stands out: the power of simply taking time to listen. On small teams, I made it a priority to get to know people as individuals—their families, their strengths, and even their quirks. Together, we built trust, solved problems, gave feedback openly, and had fun along the way.

But when I became Chief Technology Officer for the Wake County Public School System—suddenly responsible for 125 staff members plus contractors—I knew I needed a way to recreate that same sense of connection at scale. That’s when I discovered the “Power of 15 Minutes.”

Fifteen minutes may not sound like much, but it’s enough time to make someone feel seen, heard, and valued. Here’s how I structured the process:

The Steps
  1. Invite every team member to schedule 15 minutes with you.
  2. Share four guiding questions in advance so they can reflect before the conversation.
  3. Let them lead the discussion in whatever direction matters most to them.
  4. Take notes—not to judge, but to remember and honor what they shared.
  5. Look for patterns across all conversations, then bring those insights back to the group.
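
For leaders who keep digital notes, step 5 can be as simple as tagging each conversation with a few themes and tallying which ones keep coming up. The short sketch below (in Python) is purely illustrative; the theme labels and note structure are hypothetical placeholders, not data from my actual conversations:

from collections import Counter

# Hypothetical example: the themes a leader might jot down after each
# 15-minute conversation. These labels are made up for illustration.
conversation_themes = [
    ["professional growth", "recognition"],
    ["workload", "recognition"],
    ["workload", "cross-team communication"],
    ["professional growth", "workload"],
]

# Tally how often each theme surfaces across all conversations.
theme_counts = Counter(theme for themes in conversation_themes for theme in themes)

# Surface the most common patterns to bring back to the whole team.
for theme, count in theme_counts.most_common(3):
    print(f"{theme}: came up in {count} conversations")

Whatever tool you use, the point is the same: look across all of the conversations, not just the memorable ones, before deciding what to bring back to the group.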

The Questions
  • Tell me a little about yourself.
  • What do you enjoy the most about your job?
  • What about your work would you change (outside of an increase in pay)?
  • What does support from me look like?

These simple conversations provided more than just surface-level connections. They revealed untapped strengths, identified barriers holding people back, and gave me direct insight into what kind of leadership the team needed.

Most importantly, when I brought the findings back to the team, they saw that their voices mattered. They understood that leadership wasn’t just about big strategies—it was about listening, learning, and acting together.

In the fast pace of education, we often feel like we don’t have time. But the truth is, 15 minutes can change everything.

The Path to Digital Wellness: A Shared Responsibility in K-12 Schools

6/30/2025

This article by Marlo was originally posted by RTM K12 here: https://rtmk12.com/the-path-to-digital-wellness-a-shared-responsibility-in-k-12-schools/
 
In today’s world, digital wellness is more than just managing screen time—it’s about cultivating a balanced, responsible, and safe digital life. For students, staff, and parents alike, digital wellness encompasses behavioral health, online safety, digital citizenship, and healthy screen habits. But who is responsible for ensuring digital wellness in K-12 schools? 


The answer is simple: everyone.

What is Digital Wellness?
Digital wellness refers to the intentional use of technology in a way that supports mental, emotional, and social well-being while minimizing risks. 
This includes:
  • Behavioral health
  • Online safety
  • Digital citizenship
  • Healthy screen habits
Each of these areas requires the active participation of students, parents, and schools to create a well-rounded, informed, and safe digital culture.

Behavioral Health: Understanding the Inside Out
The way we interact in digital spaces isn’t just about technology—it’s about emotions, relationships, and decision-making. Social media posts can shape self-esteem, gaming can influence behavior, and online conversations can impact mental health. Navigating the digital world requires more than just technical skills; it requires emotional intelligence.

One powerful approach to fostering digital well-being is the CASEL Social-Emotional Learning Framework, which helps students build resilience and develop healthy digital habits. By focusing on five key competencies, students, educators, and families can better understand and manage the impact of technology on their emotional and social lives:

Self-Awareness
Recognizing how digital interactions influence emotions and behavior. For example, understanding how endless scrolling on social media can affect self-esteem or how excessive gaming may contribute to frustration or mood swings.

Self-Management
Learning to regulate emotions and screen time, ensuring a balanced and healthy relationship with technology. This means knowing when to take breaks, set limits, and engage in offline activities.

Responsible Decision-Making
Making intentional, ethical choices about how to interact online. Instead of reacting impulsively or engaging in negative digital behaviors, students can learn to choose positivity, kindness, and integrity.

Relationship Skills
Developing positive online interactions by practicing respect, effective communication, and conflict resolution. These skills help prevent cyberbullying, build supportive digital communities, and foster meaningful connections.

Social Awareness
Empathizing with others and understanding the diverse perspectives found in digital spaces. This includes recognizing how words and actions affect others and learning to engage in conversations that are inclusive and respectful.

When students develop these competencies, they become more mindful digital citizens – capable of managing their emotions, making thoughtful online choices, and engaging with others in meaningful ways. Educators can create supportive learning environments, helping students reflect on their digital behaviors, while parents can identify when digital habits may be causing emotional distress and step in with guidance and support.

Ultimately, digital wellness is about balance, awareness, and intention. By integrating these principles into our approach to technology, we can help students develop healthy, responsible, and emotionally intelligent online habits that serve them well beyond the screen.

Online Safety: Preparation Over Panic
The digital world is full of opportunities—but also risks. Cyber threats, scams, and privacy breaches are an ever-growing concern, yet many families and schools struggle to address them effectively. A survey in 2021 revealed that 16% of parents never or very rarely discuss online safety issues with their children, leaving kids to navigate these challenges largely on their own.

At the same time, children are spending more time online than ever before. Recent data shows that over half of teenagers (50.4%) spend four or more hours online every day, excluding schoolwork. With so much of their social and personal lives tied to digital spaces, the disconnect between screen time and proactive safety education presents a serious gap—one that can lead to real consequences.

So how do we bridge this divide? The solution isn’t fear—it’s education, open conversations, and intentional guidance at every level.
For students, this means learning how to identify risks, protect personal information, and practice safe online habits in everyday interactions. Whether it’s recognizing phishing scams, understanding privacy settings, or knowing when to disengage from harmful content, students need the tools to make smart digital decisions.

Parents, too, have a key role to play—not by blocking all access, but by modeling safe behaviors, setting clear boundaries, and creating an open dialogue about online experiences. When kids feel comfortable discussing their digital lives with trusted adults, they’re far more likely to seek guidance when challenges arise.

Schools must also take an active stance in online safety education. Cybersecurity awareness, digital literacy, and ethical technology use should be integrated into learning environments. Schools can provide students and families with resources, set expectations for responsible online behavior, and lead by example in how technology is used in classrooms.

Rather than scaring students into compliance, we must empower them to think critically, question what they see online, and navigate digital spaces with confidence. 

Digital Citizenship: More Than Just Following Rules
As the lines between digital life and real life continue to blur, how we behave online is just as important as how we interact in person. The digital world offers incredible opportunities for learning, connection, and creativity—but without guidance, it can also present challenges, from misinformation to cyberbullying. That’s why teaching responsible digital engagement isn’t optional—it’s essential.

For students, this means recognizing that every online action leaves a footprint. Whether it’s a social media post, a comment on a video, or an interaction in a group chat, their digital presence shapes how they are perceived and can have long-term consequences. Learning to respect others, think critically, and stand up against harmful behaviors like cyberbullying empowers students to be active, positive contributors in online spaces.

Parents play a crucial role in fostering ethical online behavior. Conversations about digital dilemmas—such as privacy, misinformation, and online interactions—help children build the skills they need to make responsible choices. Encouraging kids to think before they post, engage in respectful discussions, and recognize potential risks helps them navigate the online world with confidence and integrity.

Schools must also take the lead in integrating digital citizenship into the curriculum. From classroom discussions on media literacy and online ethics to developing technology agreements that set clear expectations for responsible use, schools can equip students with the tools they need to make informed choices in the digital space.

The digital world is not a separate reality—it’s an extension of how we learn, communicate, and interact. When students, parents, and educators work together, we create a culture where technology is used for good, empowering young people to be responsible, ethical, and courageous digital citizens.

Screen Time: It’s About Quality Over Quantity
In today’s digital world, screen time isn’t just about how many hours we spend on devices—it’s about how we use that time. Not all screen time is created equal, and the key to digital wellness is intentionality in how we engage with technology.

For students, this means learning to balance online learning, entertainment, and real-world experiences. It’s about recognizing the difference between meaningful engagement—such as collaborating on a school project or exploring creative digital tools—and passive scrolling that provides little value. Developing habits that prioritize active over passive screen time is essential for long-term well-being.

Parents, too, play a critical role in shaping these habits. Creating tech-free zones at home, setting reasonable screen time boundaries, and ensuring that digital content is enriching rather than distracting can help children develop a healthier relationship with technology. More than just limiting screen use, parents can model mindful consumption—demonstrating when and how to unplug for family time, conversation, and rest.

Schools have the opportunity to lead by example, incorporating thoughtful screen time policies that encourage active learning and engagement while ensuring that students also have time for movement and face-to-face interactions. Whether through structured technology use in the classroom or strategies that promote device-free moments, educators can help students see technology as a tool rather than a default.

The goal isn’t to eliminate screen time but to redefine it—shifting the focus from hours counted to experiences gained. When students, parents, and educators work together, we can create a digital landscape where technology enhances learning, supports creativity, and respects the need for balance.

Building a Culture of Digital Wellness Together
Digital wellness isn’t just an individual responsibility—it’s a shared commitment among students, parents, and schools. Every interaction, every lesson, and every conversation shapes how young people navigate their digital world.

For students, this means developing self-awareness and making informed choices about how they engage online. It’s about recognizing the impact of their digital actions, practicing safe behaviors, and using technology as a tool for learning and connection rather than distraction.

Parents play a crucial role in this journey, not just by setting rules but by actively participating in their child’s digital experiences. By modeling healthy tech habits, discussing online challenges openly, and providing guidance, parents can create a supportive environment where children feel empowered rather than policed.

Schools, as hubs of learning and growth, have the responsibility to weave digital wellness into their educational framework. By integrating digital literacy, fostering responsible online interactions, and supporting both students and staff in navigating the digital landscape, schools can ensure that technology enhances learning rather than hinders it.

When students, parents, and educators work together, digital wellness becomes more than just a concept—it becomes a culture. A culture where technology is used intentionally, safety is prioritized, and well-being is at the forefront. It’s not about restricting access or creating fear; it’s about equipping our children with the skills, knowledge, and confidence to thrive in a digital world.

The question isn’t just how we manage digital wellness—it’s who we choose to be in the digital spaces we create. And together, we can build a future where technology supports, rather than detracts from, our well-being.

Want to Learn More?
Explore these expert resources on digital wellness, online safety, digital citizenship, and screen time:

Social-Emotional Learning & Digital Wellness
  • Teachers’ Essential Guide to Social and Emotional Learning in Digital Life (Common Sense Media)
  • Social and Emotional Learning in Digital Life (Common Sense Media Presentation)
  • Too Much of a Good Thing: The Impact of Technology on Teens’ Mental Wellness (Clarity Child Guidance Center)

Online Safety Resources
  • Online Safety for Students: Where to Begin (Learning.com)
  • Keeping Children Safe Online (NSPCC)
  • Staying Sharp as Parents and Talking to Children about Online Safety (Innocent Lives Foundation)

Digital Citizenship Resources
  • Digital for Good: Raising Kids to Thrive in an Online World
  • Common Sense Media Digital Citizenship Curriculum

Screen Time & Media Balance
  • Screen Time and Children: How to Guide Your Child (Mayo Clinic)
  • Is Too Much Screen Time Bad for Kids? It’s Complicated (UCSF)
  • Create Your Own Family Media Plan (AAP)
 
Note: Generative AI was also used in the editing of this document. 

Navigating Generative AI in K–12 School Systems: A Guide for Executive Leadership

6/1/2025

This article was written by Marlo for Advanced Learning Partnerships. The original article can be found here: https://alplearn.com/story/leading-genai-in-k12/

Generative AI is a transformative force currently reshaping the landscape of K–12 education. As stewards of a rapidly changing system, district leaders must bring more than curiosity to this moment; it calls for a clear-eyed, strategic response. The rapid acceleration of AI, and its reach into every aspect of teaching, learning, and leadership, demands deliberate leadership: a commitment to vision-setting, ethical modeling, and system-wide learning.

The question is no longer whether AI will impact education—but how we will shape its role in service of students, equity, and learning. As guardians of long-term vision and student outcomes, district administrators must lead the conversation around how AI fits into their community’s values, organizational systems, and educational goals. They must lead intentionally, ethically, and with collective purpose. To effectively prepare teachers and students for the future, K–12 leaders must demonstrate vision, guide effective change management, prioritize upskilling, and model responsible AI usage.


Leading with Vision
Educational leaders should first develop a clear vision explaining the importance and relevance of generative AI tools and policies within their district context. Leaders should resist the temptation to let that vision be driven by the capabilities of, or trends surrounding, particular AI tools. Instead, the vision should be rooted in the overarching mission and vision of the school district. As with earlier digital tools, focusing solely on the tools themselves can lead to misaligned and ineffective educational initiatives. Prioritizing the district's mission and vision ensures generative AI strategically supports broader educational goals, ultimately benefiting students.

District leaders should be individually and collectively discussing the following:
  • What is my district’s “why” for exploring generative AI?
  • How does AI relate to our core values as a community?
  • How can AI improve learning outcomes, increase access, and/or close opportunity gaps for students?
  • What does responsible innovation look like in our community?
  • How can we bring all voices to the table—teachers, students, families, and community partners—to co-create that vision?


Embracing Change Management
Implementing generative AI isn’t a technology project—it is the responsibility of the school community. It impacts curriculum, operations, assessment, communication, and data governance. And while it can be easier to relegate AI to a technology department, it will never be adopted in purposeful, safe, and responsible ways without the engagement of the whole system. That’s why change management is essential.

District and school leaders must prioritize the following actions as part of a change management process:
  • Create a shared sense of urgency without resorting to fear. What is your “compelling why” for integrating generative AI in your district and/or school?
  • Build a guiding coalition that includes instructional and technical voices. As you do, consider whose voices are missing and be intentional about including them.
  • Plan for iterative rollout, not a single “launch moment.” Many districts are staging how generative AI is implemented, often beginning with leadership, then staff, and finally students.
  • Address resistance with empathy, transparency, and support. People fear what they do not know. The more information you can provide the school community, the better. As Brené Brown says, “Clear is kind.”

Throughout an intentional AI adoption process, district leaders should continue to individually and collectively reflect and discuss the following:
  • Who's pushing back, and why? Is it fear, lack of training, security worries, or just not seeing the point? How can we address these specific concerns?
  • What are people worried about – job loss, ethical issues, how it works, or its accuracy in education? How can we build trust and explain things better to ease their concerns? 
  • Who's excited about new technology, especially generative AI? What do they do, and how can we get them involved? How can we create spaces for them to share their positive experiences and get others on board? How can we recognize and celebrate these early adopters?

Frameworks like Kotter’s 8-Step Change Model or the Knoster Model for Managing Complex Change can help leaders structure the rollout in a way that reduces friction and builds trust.
Elevating Digital and AI Literacy Across the System
One of the greatest risks in AI adoption isn’t the technology itself—it’s the lack of preparedness among staff and students to use it critically and ethically.

A primary danger lies not in the technology itself, but in the potential for its misuse or uncritical acceptance due to insufficient preparation. Unfortunately, many districts and schools continue to view digital literacy as supplementary rather than foundational to a student’s education. That view stems in part from the long-standing assumption that students are “digital natives.”

In 2001, Marc Prensky coined the terms “digital natives” and “digital immigrants.” Digital natives are people who have grown up with access to technology and have a natural understanding of it, whereas digital immigrants were born before most of that technology was developed and generally face challenges when adapting to digital environments. Why is this problematic? The term “digital natives” has come to imply that students come out of the womb knowing how to use technology, which allows K-12 educators to abdicate their responsibility for teaching appropriate and responsible use.


This marginalization of digital literacy has created a precarious environment for the introduction of AI. If core competencies such as media literacy (the ability to analyze and evaluate information from various sources), data privacy (understanding and managing personal information in a digital world), and critical thinking (the capacity for reasoned judgment) are not already well-established within a school community, the development of robust AI literacy will be severely hampered.
AI literacy, which encompasses understanding AI's capabilities and limitations, recognizing its ethical implications, and using it effectively and responsibly, cannot thrive on a weak foundation of general digital competency. It requires a prior understanding of how information is created, disseminated, and consumed in digital spaces, as well as a well-developed sense of ethical conduct in technological interactions. Without a strong groundwork in these areas, the potential benefits of AI in education risk being overshadowed by its misuse, misunderstanding, or even the exacerbation of existing digital inequalities.

District leaders need to reflect individually and discuss collectively the following:
  • What assumptions are we making about students’ digital skills? How do those assumptions show up in our policies or instructional practices?
  • How well are our students currently taught to evaluate information, recognize bias, and question sources online? Who is doing this work?
  • What structures do we have in place to teach and reinforce digital privacy, data security, and ethical online behavior? How are we ensuring these lessons are developmentally appropriate and consistently reinforced?


Consider adopting or aligning to frameworks such as:
  • ISTE
    • The ISTE Standards (for students, educators, education leaders and coaches)
    • Computational Thinking Competencies
  • AI4K12’s Five Big Ideas in AI
  • Digital Promise AI Literacy
  • UNESCO’s AI Competency Frameworks (for students and teachers)

This work must be scaffolded, equitable, and ongoing—not a one-and-done PD. Educators need time, space, and support to explore how AI tools affect teaching, learning, and operations. To truly prepare schools for an AI-infused future, we must invest in a long-term strategy that includes differentiated supports, collaborative learning environments, and time for experimentation. Equity means ensuring that all staff—regardless of role or background—can engage meaningfully in this work. AI literacy is not a destination—it’s a continuous journey of growth, reflection, and adaptation.

ALP has partnered with several districts across North America to design AI competencies for teachers and students aligned to a national or international framework as well as the districts’ prioritized learning model.

Modeling the Way: Ethical, Transparent, and Thoughtful Use
Leaders can’t ask their teams to explore AI if they’re not using it themselves—or worse, using it in ways that undermine trust.

Effective and ethical modeling of AI tools and platforms looks like:
  • Using generative AI for real work (e.g., drafting memos, brainstorming policy language) and naming when you do. Here’s an example: the County of Sonoma’s Policy 9-6, Information Technology Artificial Intelligence (AI) Policy, in which the use of AI is listed in the acknowledgements.
  • Practicing transparency: disclosing when content is AI-generated, fact-checking results, and citing sources. Note my own AI citation at the end of this piece.
  • Setting boundaries: defining what AI should not be used for (e.g., finalizing student evaluations or writing IEPs). The Georgia Professional Standards Commission recently provided their Ethical Considerations in the Appropriate Use of AI for Educators. This is a great example of effective communication of boundaries and definitions.
  • Surfacing ethical dilemmas and inviting discussion, rather than making AI use invisible. In the professional learning sessions I lead, I often share a set of ethical dilemmas and have the group discuss possible solutions. Facing History & Ourselves provides a sample lesson for middle and high school students around ethics.

District leaders should reflect on the following:
  • In what ways have I used generative AI tools in my own work? 
  • Have I shared those examples transparently with my team?
  • How am I investing in my own learning about generative AI in order to lead?

Modeling is more powerful than any PD session. It sets the tone for culture and expectations. When leaders and educators actively demonstrate thoughtful, responsible use of AI, they create a ripple effect across the school community. Modeling builds trust, normalizes experimentation, and reinforces shared values in real time. It moves AI literacy from theory to practice—showing, not just telling, what intentional and ethical implementation looks like. This kind of leadership is essential to fostering a culture where continuous learning and innovation are both expected and supported.


From Experimentation to Empowerment
The generative AI journey in K–12 is not about chasing the latest tools—it’s about cultivating the leadership mindset and organizational readiness to leverage innovation responsibly.

Leaders don’t need all the answers. But they do need to:
  • Ask the right questions
  • Center the needs of students and staff
  • Build a culture of curiosity, vulnerability, care, and co-creation

As we move forward, let’s shift from fear of the unknown to purposeful implementation—because the goal isn’t just AI adoption. It’s an educational transformation.

Statement on AI use:
To support the development of this blog post, I collaborated with ChatGPT by OpenAI and Gemini by Google. I used these tools to help organize and refine my ideas—particularly around structuring the outline, clarifying complex points, and generating reflection points for leadership. The content reflects my voice, experience, and perspective as an educational leader, with AI serving as a creative and editorial partner in the writing process.

References

Brown, B. (2024, February 22). Clear is kind. Unclear is unkind. Brené Brown. https://brenebrown.com/articles/2018/10/15/clear-is-kind-unclear-is-unkind/. Accessed 19 May 2025.
Prensky, M., & Heppell, S. (2010). Teaching digital natives: Partnering for real learning. Corwin, a Sage Company.

    Author

    Marlo Gaddis is the CEO of Gaddis Education Consulting.

