Byron McClure

Should School Psychologists Disclose the Use of AI?

Updated: May 2

Introduction

If your child were undergoing the Individualized Education Program (IEP) process, would you want to know if AI tools were employed?

Good question, right?


Integrating artificial intelligence (AI) into school psychology introduces new ethical questions.


One significant issue is whether psychologists should disclose the use of AI in their assessment reports. While there is no formal precedent or guidance yet, the National Association of School Psychologists (NASP) has convened a task force and is currently crafting guidance on using AI in the field.


The Ethical Imperative for Transparency

Transparency is becoming standard in healthcare, law, and other fields, where professionals are increasingly expected to disclose AI involvement. The School Psych AI (SPAI) Ethics Council suggests that school psychology must follow suit and begin having these discussions; that is why the SPAI Ethics Council was created. The primary aim of the Ethics Council is to ensure that AI technology supports students in a way that is fair, safe, and respectful of their privacy. Importantly, the integration of AI tools into educational settings must be done thoughtfully and responsibly.


Deliberations of the School Psych AI Ethics Council

The Council's first task was to develop ethical guidelines for AI use.


Setting the standard of practice, the Council landed on six guiding principles for the ethical use of AI in school psychology, education, and solution development. The Council released these principles at the NASP convention in February 2024.


The Council meets quarterly and discusses emerging ethical considerations and how they may impact school psychologists and educators.


Our recent quarterly meeting focused on whether AI use should be disclosed in reports. 


Principles and Considerations That Guided Our Discussion

1. Transparency and Informed Consent:

  • The Council recommends that the use of AI should be disclosed as part of the informed consent process prior to assessments. This approach respects the autonomy of parents and students, allowing them to make informed decisions about participation in AI-assisted evaluations.


2. Accountability and Professional Judgment:

  • While AI can enhance efficiency, the responsibility for any conclusions and recommendations rests with the psychologist. Disclosing AI use emphasizes that human judgment remains paramount, reinforcing the psychologist's accountability.


3. Fairness and Equity:

  • Ensuring that AI tools do not perpetuate biases is crucial. Transparency about AI use allows stakeholders to understand how technology is applied fairly and responsibly, fostering trust and accountability.


Recommendations

The Council believes that disclosing AI use is beneficial, primarily because it aligns with our guiding principle of transparency and reinforces accountability. However, that disclosure does not necessarily need to appear within the psychological report itself, where it could distract from the core findings. Instead, the Council believes the appropriate time to disclose AI use is during the informed consent process, which provides transparency upfront without compromising the report's focus.


Conclusion

To disclose or not to disclose? That is the question.

As the field of school psychology increasingly incorporates AI, the discussion about whether to disclose the use of AI in reports remains active. We do not have a definitive answer yet; instead, we continue to explore the implications and gather insights. This ensures our approach remains thoughtful and centered on the best interests of students and their families. We are committed to refining our practices as this area develops, based on new understanding and the field's evolving needs.


Our current thinking is that including AI disclosure during the informed consent process enhances transparency and trust without shifting focus away from the core assessment objectives. To be clear, the Council convened and discussed the intricacies and nuances of this issue, and we know that more questions remain.


Additionally, we reflected on how written reports are already information-dense and, as a result, do not always get read with the care and attention they deserve. For this reason, we do not support adding information about AI within the reports themselves. Instead, ideally, the use of AI would be disclosed during the informed consent process. This approach provides parents and guardians with the necessary information upfront, allowing them to make informed decisions about their child's evaluation.


By communicating the use of AI clearly at the beginning of the process, we uphold ethical standards and reinforce our ethical guidelines of transparency and accountability, ensuring that the primary content of the report focuses squarely on the student’s needs.

Informed consent is a necessary part of this process, but it is only one piece.


We are committed to implementing best practices regarding transparency in AI and developing novel processes in collaboration with industry experts in school psychology, education, law, technology, security, and other relevant industries.


Our Council strongly supports maintaining transparency and accountability through informed consent. We strictly adhere to the highest data security and privacy standards.


We proudly set the standard of practice when it comes to using AI in the field of school psychology. Our focused and thorough discussions demonstrate our commitment to upholding the highest ethical standards in our field.


Contributors: Terence Jackson, Chief Security Advisor, Microsoft; Dan Florell, Ph.D., NCSP – Eastern Kentucky University; Dr. Soye Zaid-Muhammad, Author of The Power of A Thousand Heroes (The PATH); Micah Reid, Machine Learning Engineer; Rachele Teson, National Certified School Psychologist (NCSP); Juan Felipe Gomez, PhD candidate, Harvard University, Calmon Lab; Dr. Byron McClure, NCSP, Founder, School Psych AI


1 comment


Don Sherwood
Jun 13

I don't necessarily disagree. However, if we need to disclose our use of AI as part of the informed consent process, where do we draw the line regarding the disclosure of other assessment tools? For instance, if we consider AI to be an assessment software tool, should we also disclose our use of other software tools like the XBASS or any other interpretive assessment software? Additionally, if we seek consultation from AGI (when it comes), should we also be disclosing our consultation with, say, another school psychologist or relevant field expert?
