AI's entry into psychology must balance efficiency and risk
2025-12-24
The human mind is a complex and intricate universe of ever-changing thoughts, hard-to-describe emotions, and deep-seated motivations. Psychology, the discipline that explores this inner universe, is at its core about understanding the lived experiences and emotional connections behind each unique individual. That understanding has traditionally relied on distinctly human capacities: empathy, intuition, and therapeutic relationships built on trust, which, it is commonly believed, artificial intelligence (AI) trained solely on massive amounts of data cannot truly replicate. Yet as global demand for mental health care keeps rising and technology seeps into every field, AI has inevitably arrived at psychology's door. On one hand, AI tools have shown real potential to improve psychologists' efficiency and ease their administrative burden; on the other, their spread has raised deep concerns about data security, ethical bias, and clinical reliability. How AI can be integrated into this deeply human discipline responsibly and effectively, balancing efficiency with empathy, has become a core question for the profession's future.
According to the 2025 Practitioner Survey released by the American Psychological Association (APA) in December, more than half of psychologists tried AI tools in their practice over the past year, yet nearly all of them worry about how the technology may affect their patients and society. A total of 1,742 psychologists took part in the annual survey. Fifty-six percent had used AI tools to assist their work at least once in the past 12 months, up from 29% in 2024, and 29% said they use AI at least once a month, more than double the 2024 figure of 11%. These tools can support psychologists in many ways, from administrative support to enhancing clinical care.
As psychologists' understanding of AI deepens, however, so do their concerns about its risks. More than 90% of respondents (92%) expressed concern about applying AI tools in psychology; the most common worries were potential data breaches, unintended social harms, bias in inputs and outputs, a lack of rigorous risk-mitigation testing, and inaccurate or hallucinated outputs.
Dr. Arthur C. Evans Jr., CEO of the American Psychological Association, said AI can help relieve some of the pressures psychologists face, for example by improving efficiency and making care more accessible. But human oversight remains essential, and patients need assurance that they can still trust their providers. So far, few psychologists rely on AI to treat patients directly: among those who use AI, only about 8% say they use it to assist clinical diagnosis, and only 5% say they use chatbots to support patients. The most common uses are drafting emails and other materials (52%), generating content (33%), summarizing clinical notes or articles (32%), and taking notes (22%), almost all of it routine paperwork that consumes time and energy psychologists would rather devote to their patients.
Younger patients are more open to AI's entry into psychology, and one driving factor is the sustained worldwide growth in demand for mental health services, particularly among adolescents. A recent University of Edinburgh study published in the British Journal of Psychiatry documents this trend. The study tracked children born in Wales between 1991 and 2005 to measure how many received specialist Child and Adolescent Mental Health Services (CAMHS) before the age of 18. It found that only 5.8% of those born in 1991 had received CAMHS care before adulthood, while among those born in 2005 the proportion jumped to 20.2%. In other words, in less than 20 years the share of young people in the UK receiving professional psychological services before age 18 rose roughly 3.5-fold. Professor Ian Kelleher, who led the research, said the study clearly shows a surge in demand for psychological help, especially among adolescents, but that research on the phenomenon remains far from sufficient and the reasons behind it are not fully understood.
The growing number of young patients also exposes the limits of existing services. Experts warn that current provision may not meet the needs of today's young people. Kelleher stressed that, unlike oncology or cardiology, CAMHS remains under-researched and under-evaluated: psychologists want to deliver the best possible care, but they need stronger, modern evidence to guide treatment decisions. This widening gap between supply and demand underlines the case for using new technologies such as AI to improve service efficiency. From another angle, some psychologists believe the surge partly reflects that younger users are more accepting of, and more willing to trust, the advice given by AI assistants.
Introducing AI into psychology responsibly therefore requires clear norms. Facing the technology's potential and risks, along with growing demand for services, psychologists need guidance to ensure that AI is integrated into their work safely and responsibly. The American Psychological Association's advice to psychologists reflects this stance: obtain patients' informed consent by clearly explaining how AI tools are used, along with their benefits and risks; assess whether AI tools carry biases that could widen disparities in mental health outcomes; review whether AI tools comply with relevant data privacy and security laws and regulations; and understand how the companies providing AI tools use, store, or share patient data.
AI is no panacea, and in a field so closely tied to human psychology and emotion it should be used with caution. Its role should be that of a powerful auxiliary tool, under the strict supervision of human experts, helping psychologists shed heavy administrative work and devote more of their time to complex clinical judgment, building therapeutic alliances, and humane care. AI can ease the documentation burden. As Evans put it, "Psychologists enter this field because they are passionate about improving people's lives, yet they spend hours every day dealing with paperwork and the complex requirements of insurance companies. By using safe and ethical AI, psychologists can work more efficiently, allowing them to see more patients and serve them better." (New Society)
Editor: Wang Shu Ying    Responsible editor: Li Jie
Source: Science and Technology Daily