A call for responsible innovation in pediatric health care
Open Access Perspective

Venkata Sushma Chamarthi1*, Ramandeep Kaur2, Rahul Kashyap3

1Department of Pediatrics, Valley Children’s Healthcare, Madera, CA 93636, USA

2Department of Education Technology, Elsevier, Philadelphia, PA 19103, USA

3Department of Research, WellSpan Health, York, PA 17403, USA

*Correspondence: Venkata Sushma Chamarthi, Email: vchamarthi1@valleychildrens.org

ORCID: Venkata Sushma Chamarthi, https://orcid.org/0009-0008-2610-0946; Ramandeep Kaur, https://orcid.org/0009-0000-1546-5693; Rahul Kashyap, https://orcid.org/0000-0002-4383-3411

Explor Digit Health Technol. 2026;4:101178 DOI: https://doi.org/10.37349/edht.2026.101178

Received: September 01, 2025 Accepted: November 17, 2025 Published: January 06, 2026

Academic Editor: Marco Tatullo, University of Bari, Italy

The article belongs to the special issue Expert Opinions on Digital Health Innovations

Abstract

Artificial intelligence (AI) is transforming healthcare by equipping clinicians and patients with tools that support more efficient, patient-centered care. In pediatrics, however, the implementation of AI demands a higher threshold for responsibility, transparency, and family-centered engagement. This perspective explores the opportunities and challenges of AI in pediatric healthcare, highlighting the unique ethical and developmental considerations that distinguish children’s care from adult medicine. Drawing on Kaiser Permanente’s seven principles for responsible AI, the article emphasizes the importance of augmentation over automation, the need for pediatric-specific validation, and the necessity of trustworthiness and fairness in clinical deployment. It outlines how AI can support primary care providers through enhanced decision support, early screening for developmental and behavioral disorders, improved electronic health record usability, and risk prediction models. In particular, AI has the potential to create personalized developmental trajectories that move beyond static population norms, providing earlier and more precise insight into a child’s neurodevelopmental progress. However, without careful governance, AI poses risks of bias, inequity, and erosion of clinician judgment. Policy recommendations include redesigning family consent models, ensuring robust clinician training, and mandating pediatric-specific testing of AI systems with diverse, representative datasets. Ultimately, AI should function as a supportive tool that strengthens, rather than replaces, human empathy, clinical expertise, and family-centered values. Responsible innovation is essential to ensure that children benefit equitably from AI while maintaining trust, safety, and compassion in pediatric healthcare.

Keywords

artificial intelligence, pediatrics, responsible innovation, clinical decision support, family-centered care

A call for responsible innovation in pediatrics

Artificial intelligence (AI) is transforming healthcare by equipping providers and patients with advice, guidance, and supporting tools that promote efficient, effective, and patient-centered care. However, as pediatric healthcare enters this new technological era, the implementation of AI must be grounded in principles that prioritize seamless integration into clinical pathways, patient safety, clinical judgment, and family-centered care. Kaiser Permanente’s (KP) comprehensive framework for responsible AI deployment offers a roadmap for how healthcare systems can harness the potential of AI while maintaining trust and ensuring safety within the pediatric population [1].

Although AI adoption in pediatrics is expanding, its integration remains uneven across clinical settings. Current applications include decision-support algorithms for diagnosis, predictive analytics for managing chronic diseases, and digital screening tools for developmental disorders. However, most systems remain in pilot or research phases, with limited pediatric-specific validation, evolving regulatory guidance, and persistent concerns about data privacy and clinician oversight.

The expansion of AI in pediatric healthcare presents both unique opportunities and challenges. AI and machine learning (ML) algorithms are being explored for monitoring developmental disorders, enhancing diabetes management, designing clinical tools across specialties, and supporting mental health care through digital coaching [2]. Unlike adult medicine, pediatric care must account for a broader range of considerations, including social determinants of health, complex developmental stages, family dynamics, and age-specific communication needs, all of which necessitate thoughtful AI implementation. Algorithms and coding infrastructures must be adapted to meet the needs of pediatric patients, thereby avoiding unintended biases or inequities in care [3].

KP has established itself as a leader by outlining seven principles for responsible AI use, emphasizing values such as safety, reliability, trustworthiness, efficiency, user-friendliness, and privacy. Their model highlights that integrating AI into healthcare is not merely a technical task but a moral obligation. These principles are particularly crucial when AI is applied to the pediatric population.

Augmentation, not automation

AI deep learning models can analyze patient charts, enabling healthcare providers to make relevant recommendations and deliver enhanced, individualized care based on patients’ health risks. Providers can incorporate these recommendations, along with their clinical knowledge, to further personalize care. These predictive models can aid in identifying at-risk children and augment treatment plans for conditions such as obesity, diabetes, and asthma [2]. AI tools for early screening of autism and behavioral issues such as anxiety should be evaluated across diverse pediatric populations, especially in resource-limited settings. For instance, AI models that analyze longitudinal data from well-child visits (integrating growth metrics, milestone achievements, and clinical observations) can generate personalized brain development charts [4, 5]. These tools can enhance a pediatrician’s ability to detect subtle deviations from a child’s expected trajectory, prompting earlier and more targeted evaluations for conditions such as autism or cognitive delays [6]. To minimize the risk of misrepresentation, misplaced confidence, and algorithmic bias, internal and external validation of models and algorithms is necessary and should be repeated regularly [7].
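To make the trajectory-monitoring idea concrete, the minimal Python sketch below flags deviations of a child’s longitudinal scores from age-matched norms. Everything in it is a hypothetical illustration: the composite developmental score, the norm table, and both thresholds are invented for demonstration and are not the validated brain-chart models cited above.

```python
# Illustrative sketch only: flag deviations of a child's longitudinal scores
# from hypothetical population norms. The norm values, thresholds, and the
# "composite developmental score" are invented for demonstration and are not
# validated clinical parameters.

from dataclasses import dataclass

# Hypothetical population norms: mean and standard deviation of a composite
# developmental score at selected well-child-visit ages (months).
POPULATION_NORMS = {
    9:  (50.0, 8.0),
    12: (58.0, 8.5),
    18: (70.0, 9.0),
    24: (80.0, 9.5),
}

@dataclass
class Visit:
    age_months: int
    score: float  # composite score from growth metrics + milestone checklist

def z_score(visit: Visit) -> float:
    """Standardize a visit score against the age-matched population norm."""
    mean, sd = POPULATION_NORMS[visit.age_months]
    return (visit.score - mean) / sd

def flag_trajectory(visits: list[Visit],
                    drop_threshold: float = 1.0,
                    low_threshold: float = -1.5) -> list[str]:
    """Return human-readable flags when the child's trajectory deviates.

    Two simple rules (illustrative, not clinically validated):
      * a drop of more than `drop_threshold` z-units between visits
      * any visit falling below `low_threshold` z-units
    """
    flags = []
    zs = [(v.age_months, z_score(v)) for v in sorted(visits, key=lambda v: v.age_months)]
    for (age_prev, z_prev), (age, z) in zip(zs, zs[1:]):
        if z_prev - z > drop_threshold:
            flags.append(f"z-score fell {z_prev - z:.1f} units between {age_prev} and {age} months")
    for age, z in zs:
        if z < low_threshold:
            flags.append(f"score at {age} months is {abs(z):.1f} SD below the population mean")
    return flags

if __name__ == "__main__":
    child = [Visit(9, 52), Visit(12, 57), Visit(18, 58), Visit(24, 62)]
    for flag in flag_trajectory(child):
        print("Consider earlier evaluation:", flag)
```

In practice, flags of this kind would only prompt the pediatrician to consider an earlier or more targeted evaluation; they are not diagnostic outputs.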

Pediatric primary care providers are often overwhelmed with patient volume, administrative paperwork, and regulations. A well-designed electronic health record (EHR) system that uses AI can improve clinician workflow and transform pediatric primary care delivery. AI scribes can help clinicians spend less time on charting, while AI-powered EHRs can surface immunization recommendations, assist with tasks such as coding and billing, and use ML models to prompt follow-ups. The clinician remains ultimately responsible for checking, modifying, improving, and approving AI-generated notes.
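As a concrete illustration of this kind of decision support, the Python sketch below shows how an EHR rule might surface immunization reminders for clinician review. The schedule subset, function names, and data structures are assumptions made for this example and do not describe any vendor’s product; a production system would rely on the full authoritative immunization schedule and be validated before use.

```python
# Minimal sketch of an EHR-style reminder rule: compare recorded vaccine doses
# against a simplified, illustrative schedule and surface prompts for the
# clinician to review. The schedule below is deliberately abbreviated and NOT
# an authoritative immunization schedule.

from typing import NamedTuple

class DueRule(NamedTuple):
    vaccine: str
    dose_number: int
    due_age_months: int

# Illustrative subset of rules (for demonstration only).
SCHEDULE = [
    DueRule("DTaP", 1, 2),
    DueRule("DTaP", 2, 4),
    DueRule("DTaP", 3, 6),
    DueRule("MMR", 1, 12),
]

def immunization_prompts(age_months: int, doses_given: dict[str, int]) -> list[str]:
    """Return reminder prompts for doses that appear due; the clinician
    reviews, modifies, and approves every suggestion."""
    prompts = []
    for rule in SCHEDULE:
        given = doses_given.get(rule.vaccine, 0)
        if age_months >= rule.due_age_months and given < rule.dose_number:
            prompts.append(
                f"{rule.vaccine} dose {rule.dose_number} appears due "
                f"(recommended at {rule.due_age_months} months; {given} dose(s) recorded)."
            )
    return prompts

if __name__ == "__main__":
    # A 13-month-old with two recorded DTaP doses and no MMR.
    for prompt in immunization_prompts(13, {"DTaP": 2}):
        print("Suggested reminder:", prompt)
```

The design point the sketch illustrates is that the system only suggests; every prompt is routed to the clinician for review and approval, consistent with augmentation rather than automation.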

KP has tested, monitored, and refined its AI documentation tools through question-and-answer sessions and user feedback. By collaborating with its training team, conducting post-deployment monitoring, and incorporating vendor feedback, KP deployed a safe and reliable AI documentation tool across a large and varied healthcare delivery setting [8]. In doing so, it has set a high standard for how AI can be applied in clinical contexts.

Trustworthiness, fairness, and public acceptance

AI use in pediatrics must earn the trust of families and clinicians by suggesting treatment that is compassionate, fair, and sensitive to cultural differences. Testing in large systems such as KP and demonstrating positive results are crucial steps toward acceptance. A systematic review of eleven nationally representative United States (US) public opinion polls revealed that many people believe AI can significantly assist healthcare in various ways, particularly by facilitating early detection, diagnosis, treatment, and disease prediction. However, respondents also expressed concerns about the privacy of health data and how it is used to inform decisions. The survey results show that people are at once hopeful and skeptical about using AI in healthcare [9]. People are more likely to share health information if they trust the organization that collects it and if the AI’s goals and uses are clear. This raises a pivotal issue: for AI to excel in healthcare, particularly in pediatrics, it must not only pass clinical validation but also earn the public’s trust by being implemented in a transparent, equitable, and patient-centered manner. KP’s seven principles for responsible AI in healthcare provide a proactive, anticipatory framework that future clinicians can use to adopt AI with confidence [1].

Clinicians must be co-designers

Clinicians should collaborate to develop tools tailored to each child’s unique growth and developmental needs, providing valuable feedback to guide the design and implementation of AI solutions in primary care pediatrics. As AI becomes a key technology layer of healthcare, several challenges must be addressed, including accountability, data protection, public acceptance, and ethical concerns about data breaches, leaks of patient health information, and violations of the Health Insurance Portability and Accountability Act (HIPAA) [10]. The central challenge is ensuring that AI is used responsibly in healthcare. KP’s principles are constructive and should be piloted in various clinical care settings, including pediatric clinics. AI can assist with tasks such as taking notes and identifying risks, but it cannot replace human empathy or judgment. AI should support, not replace, therapeutic competence, and it should align with fundamental human values.

Policy recommendations for AI use in pediatrics

To ensure responsible AI implementation in pediatric healthcare, policymakers and healthcare leaders must prioritize several key areas:

First, pediatric-specific validation requirements should be established for all AI tools used in children’s healthcare, recognizing the unique developmental, physiological, and social factors that distinguish pediatric from adult medicine. This is especially critical for tools analyzing developmental and neurological data, where training datasets must be rigorously audited for demographic, socioeconomic, and genetic diversity to prevent systemic bias [4, 6] (an illustrative audit sketch follows these recommendations).

Second, family engagement and consent processes must be redesigned to ensure that parents and age-appropriate children understand how AI is being used in their care, with clear opt-out mechanisms that respect family preferences, an approach consistent with emerging pediatric AI ethical frameworks.

Third, clinician training programs should emphasize that AI tools must complement, not replace, the clinical judgment and empathy essential for effective pediatric care. Clinicians should help shape AI solutions for primary care pediatrics by providing feedback on their design and development, ensuring that tools address each child’s unique growth and developmental needs.
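To illustrate the first recommendation above, the short Python sketch below shows one minimal form a subgroup performance audit could take before a pediatric screening model is deployed. The records, group labels, and the 0.10 sensitivity-gap threshold are hypothetical; a real audit would also examine specificity, calibration, and the demographic composition of the training data.

```python
# Illustrative sketch of a subgroup performance audit for a pediatric screening
# model. The records, group labels, and the 0.10 disparity threshold are
# hypothetical; a real audit would use a large, representative validation set
# and prespecified fairness criteria.

from collections import defaultdict

def sensitivity_by_group(records: list[dict]) -> dict[str, float]:
    """Compute sensitivity (true-positive rate) for each demographic group.

    Each record needs: 'group', 'label' (1 = condition present), 'pred'.
    """
    tp = defaultdict(int)
    fn = defaultdict(int)
    for r in records:
        if r["label"] == 1:
            if r["pred"] == 1:
                tp[r["group"]] += 1
            else:
                fn[r["group"]] += 1
    return {g: tp[g] / (tp[g] + fn[g]) for g in set(tp) | set(fn)}

def audit(records: list[dict], max_gap: float = 0.10) -> None:
    """Print per-group sensitivity and warn if the gap exceeds the threshold."""
    rates = sensitivity_by_group(records)
    for group, rate in sorted(rates.items()):
        print(f"{group}: sensitivity {rate:.2f}")
    if rates and max(rates.values()) - min(rates.values()) > max_gap:
        print("WARNING: subgroup sensitivity gap exceeds threshold; review before deployment.")

if __name__ == "__main__":
    # Toy validation records (group labels and outcomes are fabricated).
    toy = (
        [{"group": "A", "label": 1, "pred": 1}] * 18 + [{"group": "A", "label": 1, "pred": 0}] * 2 +
        [{"group": "B", "label": 1, "pred": 1}] * 12 + [{"group": "B", "label": 1, "pred": 0}] * 8
    )
    audit(toy)
```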

Conclusion

AI has tremendous potential to enhance the quality of care and alleviate the workload for clinicians as it becomes more integrated into healthcare. However, no algorithm can respond to a child’s cry or replicate a qualified clinician’s instinct and sensitivity. AI must be monitored and governed by human values, especially in pediatrics, where trust and sensitivity are essential components of care.

Implementing a responsible innovation framework such as KP’s can accelerate the safe and equitable use of AI in pediatrics. Expected benefits include improved diagnostic accuracy, reduced clinician workload, enhanced family engagement through transparent consent processes, and strengthened public trust in AI-enabled care. Embedding these principles early will ensure that technological progress reinforces, rather than replaces, empathy and clinical judgment in pediatric practice.

Technologies should be developed with healthcare providers in mind and undergo thorough testing with real-time feedback and continuous improvement. AI is not a single tool but a collection of technologies, including ML, natural language processing (NLP), and deep learning, that are being applied across various aspects of pediatric healthcare. KP’s seven principles illustrate how to advance with fairness, transparency, and care that consistently prioritizes patients.

Abbreviations

AI: artificial intelligence

EHR: electronic health record

HIPAA: Health Insurance Portability and Accountability Act

KP: Kaiser Permanente

ML: machine learning

NLP: natural language processing

US: United States

Declarations

Acknowledgments

We gratefully acknowledge Lloyd Price, BA (Hons.), Partner at Nelson Advisors LLP, London, UK, for his editorial feedback and comments on early drafts of this manuscript.

AI-Assisted Work Statement: During the preparation of this work, the authors used Grammarly for grammar, spelling, and language refinement. After using the tool, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication.

Author contributions

VSC: Conceptualization, Writing—original draft, Writing—review & editing. R Kaur: Writing—original draft, Writing—review & editing. R Kashyap: Supervision, Validation, Writing—review & editing. All authors read and approved the final manuscript.

Conflicts of interest

The authors declare that they have no conflicts of interest.

Ethical approval

Not applicable.

Consent to participate

Not applicable.

Consent to publication

Not applicable.

Availability of data and materials

Not applicable.

Funding

Not applicable.

Copyright

© The Author(s) 2026.

Publisher’s note

Open Exploration maintains a neutral stance on jurisdictional claims in published institutional affiliations and maps. All opinions expressed in this article are the personal views of the author(s) and do not represent the stance of the editorial team or the publisher.

References

1. Kaiser Permanente’s 7 Principles of Responsible AI in Healthcare [Internet]. Nelson Advisors Blog; [cited 2025 Jun 14]. Available from: https://www.healthcare.digital/single-post/kaiser-permanente-s-7-principles-of-responsible-ai-in-healthcare
2. Can Demirbaş K, Yıldız M, Saygılı S, Canpolat N, Kasapçopur Ö. Artificial Intelligence in Pediatrics: Learning to Walk Together. Turk Arch Pediatr. 2024;59:121–30.
3. Hirani R, Noruzi K, Khuram H, Hussaini AS, Aifuwa EI, Ely KE, et al. Artificial Intelligence and Healthcare: A Journey through History, Present Innovations, and Future Possibilities. Life (Basel). 2024;14:557.
4. Bethlehem RAI, Seidlitz J, White SR, Vogel JW, Anderson KM, Adamson C, et al. Brain charts for the human lifespan. Nature. 2022;604:525–33.
5. Zhou Z, Chen L, Milham MP, Zuo X; Lifespan Brain Chart Consortium (LBCC). Six cornerstones for translational brain charts. Sci Bull (Beijing). 2023;68:795–9.
6. Fan X, He Y, Wang Y, Li L; Lifespan Brain Chart Consortium (LBCC); China Autism Brain Imaging Consortium (CABIC); Duan X, Zuo X. Profiling brain morphology for autism spectrum disorder with two cross-culture large-scale consortia. Commun Biol. 2025;8:1157.
7. Chustecki M. Benefits and Risks of AI in Health Care: Narrative Review. Interact J Med Res. 2024;13:e53616.
8. Cain CH, Davis AC, Broder B, Chu E, DeHaven AH, Domenigoni A, et al. Quality Assurance during the Rapid Implementation of an AI-Assisted Clinical Documentation Support Tool. NEJM AI. 2025;2:AIcs2400977.
9. Beets B, Newman TP, Howell EL, Bao L, Yang S. Surveying Public Perceptions of Artificial Intelligence in Health Care in the United States: Systematic Review. J Med Internet Res. 2023;25:e40337.
10. Rahman MA, Victoros E, Ernest J, Davis R, Shanjana Y, Islam MR. Impact of Artificial Intelligence (AI) Technology in Healthcare Sector: A Critical Evaluation of Both Sides of the Coin. Clin Pathol. 2024;17:2632010X241226887.