AI in Psychology and Counseling
Like it or not, AI has already permeated our lives and practices. Artificial intelligence in psychology, counseling, school psychology, and our children’s phones demands that we understand this digital revolution so that we can protect and serve our population. What follows is a review of current uses and concerns, which are changing by the day. This field is moving fast, and we need to keep working to stay aware of how these changes affect our practices and the people we serve.
https://positivepsychology.com/ai-therapy
Artificial intelligence entered psychology as early as 1966, when the early chatbot ELIZA convinced some users they were conversing with a real therapist (Weizenbaum, 1976; Mullins, 2005).
https://www.apa.org/monitor/2023/07/psychology-embracing-ai
Great overview; one of my favorite introductory articles about AI in psychology.
https://www.reuters.com/technology/us-congress-consider-two-new-bills-artificial-intelligence-2023-06-08/ “One would require the U.S. government to be transparent when using AI to interact with people and another would establish an office to determine if the United States is remaining competitive in the latest technologies.”
https://www.reuters.com/technology/stalled-eu-ai-act-talks-set-resume-2023-12-08
“The agreement bans cognitive behavioural manipulation, the untargeted scraping of facial images from the internet or CCTV footage, social scoring and biometric categorisation systems to infer political, religious, philosophical beliefs, sexual orientation and race.”
Artificial Intelligence in Psychology Applications
As learning tool
Scoring protocols, generating reports, reviewing insurance claims
In lieu of a live therapist
As a therapy prosthetic (the “centaur” model): extension, elaboration, intervention brainstorming, quality control for choosing research-based interventions, and note-taking automation; has been used effectively to treat sleep problems and to manage chronic pain
Problems: can be unpredictable and unsafe; raises privacy concerns; may or may not remember previous sessions with a patient; can spread misinformation.
“Rogue chatbots” have professed their love to users, discriminated against users based on gender, age, and disability, and sexually harassed minors.
Chatbots have also been used to spread malware, obtain private or sensitive information, create deepfakes, and groom and abuse victims.
Treating AI as a tool can also lead people to begin treating other people like tools.
AI character lineup
Siri
Alexa
ChatGPT (GPT-3): https://chat.openai.com/g/g-CYjbmE387-social-story-weaver
DALL-E 2
Bing AI
Perplexity
Cursor
Microsoft Copilot
Google Gemini
Claude
Wysa: CBT for anxiety and chronic pain; can be used as a supplement to give reminders and present exercises such as cognitive restructuring; does not collect personal information
Eleos: listens to sessions, takes notes, and highlights themes and risks for the practitioner to review
Lyssn: evaluates providers on adherence to evidence-based protocols
Drift: marketing and sales
Replika: “Join the millions who have already met their AI soulmate.” Claims over 10 million users
SupportNinja: customer service
Future directions
Safety and oversight
Privacy
Report generator and protocol scorer for standardized behavior rating forms utilized in psychological and educational testing
Training for new clinicians
Better understanding of human cognition, language, personality
Better mass data research results and ability to predict human behavior
American Counseling Association says
Informed consent
Protect privacy and protect from harm
Not for diagnosis or crisis response
If used for mental health, consult with a licensed professional
Artificial Intelligence in School Psychology
Sophia “saves 3 hours in report writing” https://www.schoolpsych.ai/
NASP six guiding principles in using AI in school psychology:
https://www.schoolpsych.ai/_files/ugd/cbd354_0ff04eb2133b49bda73dbd2205fe242a.pdf
California Association of School Psychologists: Practitioners need to understand how this data is collected, stored, and used to ensure that student privacy is protected (CASP, 2020). At this time, AI platforms are not considered FERPA/HIPAA compliant in the exchange of information. SP/LEPs should avoid using personally identifiable information, including names, IDs, addresses, etc., on AI platforms. AI systems may also track student data in order to create personalized experiences (Akgun & Greenhow, 2021). This creates a risk of students unintentionally disclosing their own sensitive or personally identifiable information. Consent and assent for the use of AI systems should be considered, and students should not be required to share unwanted data.
https://www.aspponline.org/docs/AI_ASPP_Presentation.pdf
Add this to your boilerplate consent for eval forms!!!
Substitute codes for personal info, then edit outside of the AI, and explore all means of protecting sensitive data.
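The code-substitution workflow above can be sketched in a few lines. This is a minimal illustration, not a vetted de-identification tool: the helper names (`redact`, `restore`) and the example names and codes are made up for demonstration, and the code-to-name mapping must stay on the local machine, never on the AI platform.

```python
# Sketch of "substitute codes for personal info": swap identifiable
# details for neutral placeholders before any text touches an AI
# platform, then restore them locally afterward.

def redact(text: str, mapping: dict[str, str]) -> str:
    """Replace each piece of personal info with its code."""
    for real, code in mapping.items():
        text = text.replace(real, code)
    return text

def restore(text: str, mapping: dict[str, str]) -> str:
    """Swap the codes back after editing outside the AI platform."""
    for real, code in mapping.items():
        text = text.replace(code, real)
    return text

# Hypothetical mapping, kept only on the practitioner's machine.
mapping = {"Jordan Smith": "STUDENT-01", "Lincoln Elementary": "SCHOOL-01"}

note = "Jordan Smith at Lincoln Elementary struggles with attention."
safe = redact(note, mapping)          # this version could be shared
assert "Jordan Smith" not in safe
final = restore(safe, mapping)        # local re-identification
```

A simple string swap like this misses nicknames, misspellings, and indirect identifiers (dates, small class sizes), so it complements, rather than replaces, a careful human read before anything is shared.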
Pick one person in your district to be the AI smartypants who stays on top of evolving updates and periodically informs the rest.
Can it be used as an individual tutor for inattentive/off-task distractible students?
I want to create a healthy, happy life.
It can be hard to work with a mind that keeps going to the problems and worries. It's time to teach children their power over thoughts and feelings.
I would like teachable exercises for: replacing thoughts that are not helpful, reasonable, or true; creating joy and emotional resilience; a Mindgarden metaphor illustrating power and choice in thoughts; a Dream Book strategy for identifying clear goals and building motivation; and a video explaining how NOT to let others or situations have the power to bring you down!