Last Updated on September 17, 2024 by Laura Turner
Results from an HPSA/SDN survey on generative AI use by prehealth and health professions students
Read my earlier article on “How AI could change admissions.” This survey focuses on “generative AI” programs and tools, like ChatGPT, Gemini, Claude, or Copilot. This article contains published information considered reliable as of its original publication date.
Over a year into the conversation about AI’s potential to transform higher education (and society), confusion about its use remains. Some students even use AI tools for simple assignments (Yip, 2024).
Between July 10 and September 2, 2024, HPSA/SDN conducted a short anonymous survey of the prehealth and health professional community about their attitudes toward, uses of, and concerns about AI in the classroom, research, workplace, and application preparation. This article summarizes some quick insights.
Applying to Health Professions Programs: To AI, or not to AI?
Educational organizations (McArdle, 2023) and admissions committees are drafting legally compliant policies to govern the use of AI tools (see box below; Higher Learning Commission, 2023). As more resources are developed and promoted, admissions committees and professional societies still need help explaining the appropriate use of AI tools to prospective applicants (Delgado-Ruiz, 2024). Although an AAMC panel recently released six principles for using AI in medical education, the landscape remains chaotic.
Prospective and current applicants are concerned about how AI could be used in admissions and residency selection processes (Chen, 2024), and the lack of published policies from admissions or residency selection committees has contributed to the anxiety. Most are focused on using AI tools to craft a competitive college application (Coffey, 2024). In the meantime, genAI detection tools are still being tested (Copyleaks case study; Shevchenko, 2024).
One can anticipate that more admissions and enrollment teams will use AI tools to manage application processing (Donaldson, 2024) or score application essays (Quah, 2024). Guidelines are being developed for residency applications; interestingly, Mangold and Ream (2024) compare ChatGPT-suggested guidelines with committee-generated guidelines. Additionally, Career Services offices are guiding students on using AI tools for job applications (Colorado) and as a means of promoting equity for applicants from marginalized populations (Dore, 2024; Evaristo, 2023). Even Walmart is interested in removing bias from the hiring process (McEwen, 2024).
Read Your Application Policies!
As of the original drafting of this article, the allowed use of AI in the application process (entering class of 2025) differs among application services. Applicants must carefully read their user agreements, which must be digitally signed, before submitting their primary application.
In the AMCAS 2025 Applicant Guide (accessed August 24, 2024), candidates are permitted to use AI tools for their personal statement:
Plagiarism or misrepresentations may result in an investigation. You may use artificial intelligence tools for brainstorming, proofreading, or editing your essays. However, it’s essential to ensure that the final submission reflects your own work and accurately represents your experiences.
In contrast, CASPA 2025 applicants receive a very different message (Applicant User Agreement and Policies, accessed August 24, 2024):
- I certify that all written passages within my CASPA application, including but not limited to, personal statements, essays, and descriptions of work and education activities and events, are my own work, and have not been written or modified, in whole or part, by any other person or any generative artificial intelligence platform, technology, system or process, including but not limited to Chat GPT (collectively, “Generative AI”).
- I am strictly prohibited from using Generative AI to create, write and/or modify any content, in whole or part, submitted in CASPA and/or provided to PA programs on my behalf through any means of communication.
- PAEA and PA Programs reserve the right to use platforms, technology, systems and processes that detect content submitted in CASPA and/or provided to PA programs that was created, written and/or modified, in whole or part, through the use of Generative AI.
Other application services also stipulate that personal essays must be written without AI tools:
- OptomCAS 2025 Applicant Responsibilities and Code of Conduct, accessed August 25, 2024:
I certify that all written passages within my OptomCAS application, including but not limited to, personal statements, essays, and descriptions of work and education activities and events, are my own work, and have not been written or modified in whole or part, by any other person or any generative artificial intelligence platform, technology, system or process, including but not limited to Chat GPT (collectively, “Generative AI”).
- PharmCAS 2025 applicant responsibilities, accessed August 25, 2024:
[I agree to] Compose an original PharmCAS personal essay without the assistance from other individuals or artificial intelligence (AI) technology (i.e., an essay generator).
- VMCAS 2025 Applicant Guide, accessed August 25, 2024 (page 10):
I certify that the content of my application is my own original work and is an accurate representation of my experience(s). I have not used any AI essay generators (e.g., ChatGPT) or resources to generate the content. I certify that I have not had assistance from other individuals beyond providing feedback on the content of my application.
The HPSA/SDN Survey on Generative AI Tools
The anonymous pilot survey, run using Google Forms, captured 94 responses from the prehealth/professional community (Figure 1). Most respondents had completed undergraduate, postbaccalaureate (community college or undergraduate), or graduate-level coursework toward a health professional program, while a few were career changers planning to begin their non-traditional journey. Other respondents included current or retired academic or research professionals.
Figure 1. Self-described education status among survey respondents
Most respondents expressed a strong interest in medicine (MD/DO), but some were pursuing dentistry, veterinary medicine, and other health professions (pharmacy, public health, psychology) (Figure 2).
Figure 2. Self-identified career interest or pursuit
Career Interest | Number | Percent |
---|---|---|
MD/DO | 68 | 72.3% |
DDS/DMD | 8 | 8.5% |
DVM/VMD | 7 | 7.4% |
Clinical Psychology PsyD | 5 | 5.3% |
Rehab (PT, OT, SLP, AuD) | 3 | 3.2% |
Other (PharmD, public health) | 3 | 3.2% |
Our respondents include non-traditional career-changers, underrepresented minorities, overrepresented minorities, socioeconomically disadvantaged students, and those from non-native English-speaking households (Figure 3). Other self-disclosed identities include military members (including veterans), disabled students, gender non-conforming students, and postbaccalaureate students.
Figure 3. Self-identified additional demographics
Self-identified Demographics | Number | Percent (from 81 responses) |
---|---|---|
Overrepresented race or ethnicity in healthcare | 28 | 34.6% |
Socioeconomically disadvantaged | 28 | 34.6% |
Underrepresented race or ethnicity in healthcare | 27 | 33.3% |
Non-Native English household | 26 | 32.1% |
Career-changer non-traditional | 23 | 28.4% |
Postbac students, including SMP | 17 | 21.0% |
Gender non-conforming (LGBTQ+) | 10 | 12.3% |
Military or veteran | 10 | 12.3% |
Disabled | 9 | 11.1% |
Applying to or attending a non-US institution | 7 | 8.6% |
Immigrant, asylee, or undocumented | 6 | 7.4% |
(No additional self-identification) | 13 | (13.8% of all 94 responses) |
How Comfortable Are Students with AI Tools?
Over two-thirds of respondents considered themselves familiar or advanced users, but over 25% self-described as novices or merely aware of generative AI tools (Figure 4). It remains unclear whether AI users gain advantages in the application process over those who minimally use or refuse to use AI tools.
Figure 4. Respondents’ confidence in using generative AI tools
AI in the Academic Setting: Permission or Prohibition?
We asked respondents to recall how they used AI tools in a classroom or research setting (Figure 5). Most used AI tools to refine their writing (spelling, grammar, tone). Other classroom/research uses included discovering new research articles for literature reviews, improving test-taking skills by designing quizzes and tutorials (such as board practice questions) with AI, and creating a personalized learning experience or study schedule. Respondents also used AI tools for creative purposes, such as making photos or content for social media. However, around 10% of respondents said they had not used, or refused to use, AI tools in an academic setting.
Figure 5. Use of generative AI tools in classroom/research settings
Tasks | Number | Percent (from 88 responses) |
---|---|---|
Fine-tuning writing (spelling, grammar, tone) | 58 | 65.9% |
Discovering new research articles or conducting literature searches | 28 | 31.8% |
Improving test-taking skills (quizzing, tutorials) | 27 | 30.7% |
Developing a personalized learning experience (study schedules) | 24 | 27.3% |
Creating new content (photos, writing, videos, computer code) | 21 | 23.9% |
(I am not currently enrolled in school.) | 17 | 19.3% |
Social connection or networking | 12 | 13.6% |
Have not used or will not use | 6 | 6.8% |
(No response.) | 6 | (6.4% of all 94 responses) |
Respondents were frustrated by inconsistent messages about AI tools. Some professors explicitly prohibited using AI tools, especially for writing essays from scratch. Other professors allowed AI for editing and refining writing. A few commented that they were allowed to use AI tools for homework but not during class time. Finally, some mentioned that medical/veterinary school faculty had developed a module on responsible use of AI. Inconsistent messaging about how AI usage might infringe on academic integrity confused students, especially as plagiarism/AI detection tools continue to produce false positives (“hallucinating plagiarism”). Our findings parallel similar student-led discussions on AI usage, especially among graduate students (University of Illinois, 2023-2024).
If these arguments sound familiar, it is because academics in the early 2000s were similarly outraged about students using Google and Wikipedia to write assigned papers (Hough, 2011; Cummings, 2019). AI’s adoption into academic and societal culture may follow a similar journey.
AI in Workplace Settings Improves Communication Efficiency
In workplace settings (clinical or non-clinical; Figure 6), respondents’ use of AI tools focused on developing effective communication, summarizing meetings or readings, and transcribing/translating materials (including speech-to-text). Other uses included more clinically oriented tasks such as data analysis, creating patient education materials, and simulation practice. However, many respondents commented that they were not yet using AI tools.
Figure 6. Use of generative AI tools in workplace settings
Tasks | Number | Percent (from 70 responses) |
---|---|---|
Developing communication (letters, emails, voicemails, etc.) | 41 | 58.6% |
Summarizing meetings, articles, or publications | 28 | 40.0% |
Scribing, transcribing, translating, or note-taking | 22 | 31.4% |
Data analysis, including medical images | 16 | 22.9% |
Developing materials for patient education | 14 | 20.0% |
Practicing hand skills or simulating cases | 10 | 14.3% |
Avoiding scheduling conflicts | 7 | 10.0% |
Have not used or will not use | 12 | 17.1% |
(No response) | 24 | (25.5% of 94 total responses) |
Our results on using AI tools for data analysis are consistent with those from an American Express survey of small business owners. The American Express survey noted generational differences between Gen Z founders, who leaned on AI tools for task automation and cash flow optimization, and older business owners, who used AI as an assistant for workplace management. Student entrepreneurs are thought to embrace AI to launch their companies more quickly.
AI in the Application Process: More Than Just Essays
Unlike in classroom settings, prospective applicants confidently leverage AI to educate themselves about their career journeys and to build strong profiles and admissions strategies (Figure 7). Most respondents used AI tools to edit their application essays and experience descriptions and to brainstorm practical answers for secondary essay prompts or interview questions. Applicants also used AI tools to draft emails to evaluators (including suggesting text for letters of recommendation) and admissions teams.
Figure 7. Use of generative AI tools in preparing applications
Tasks | Number | Percent (from 78 responses) |
---|---|---|
Refining application essays (spelling, grammar, tone) | 46 | 59.0% |
Brainstorming topics for application essays or interview responses | 38 | 48.7% |
Clarifying experience descriptions (resumes) | 36 | 46.2% |
Outlining personal statement or other application essays | 33 | 42.3% |
Preparing or evaluating interview responses (situational questions) | 25 | 32.1% |
Test preparation (developing quizzing/test materials, summarizing passages) | 23 | 29.5% |
Creating emails to admissions offices, including letters of interest/intent | 20 | 25.6% |
Creating a draft for letters of recommendation | 18 | 23.1% |
Seeking guidance about career options, paths, or programs | 18 | 23.1% |
Considering extracurricular activities and clinical options | 16 | 20.5% |
(I am not currently applying but am considering using AI as indicated in my choices.) | 9 | 11.5% |
(I am not planning on using AI to help with my application.) | 9 | 11.5% |
(No response) | 16 | (17.0% of 94 total responses) |
Over one-fifth of respondents found value in AI tools for advice about their desired career path. Some used AI tools to review suggestions about extracurricular activities and to make a “program/school list.” A few commented on how AI could jump-start one’s pursuit of a health professional career by serving as a guide for those without access to traditional advising or who cannot afford consultants and coaches (Ember, 2024). Though about 10% of applicants may still eschew AI tools, student advisors and consultants should presume their advisees have consulted generative AI programs to complement their advice or services.
Respondents were concerned about how admissions committees would use AI tools to make decisions. Most know that AI detection and plagiarism tools are problematic and that enforcement based on them may not be justifiable. Many feel that admissions committees will begin to place more value on interviews and situational judgment tests than on application essays and resumes.
Seeking Dialogue and Guidance
Where do applicants and students want guidance on AI tools? Free-response comments addressed several themes (summarized using Claude 3.5 Sonnet, free plan, accessed September 2, 2024).
- Lack of clear guidelines: Many respondents express uncertainty about their institutions’ policies on AI use, indicating a need for clearer guidelines.
- Concerns about fairness and authenticity: There are widespread concerns that AI could give some applicants an unfair advantage, particularly in writing personal statements and essays. Many worry this could lead to less authentic or generic applications.
- Mixed views on AI’s role: Opinions vary widely. Some see AI as a useful tool for brainstorming or grammar checking, while others view it as a threat to academic integrity.
- Worries about AI in admissions processes: Some fear AI might be used to screen applications, potentially introducing biases or missing important nuances in candidates’ backgrounds.
- Calls for transparency: Several comments suggest that schools should be more transparent about their expectations regarding AI use in applications.
- Concerns about plagiarism and academic dishonesty: Many worry about the fine line between using AI as a tool and committing academic dishonesty.
- Accessibility and equity issues: Some view AI as an equalizer, potentially helping students without access to traditional resources, while others worry it could widen gaps between students.
- Need for AI education: Some suggest that students should be taught AI ethics and proper use.
- Inevitability of AI: Despite concerns, many acknowledge that AI use will likely become more prevalent and that policies must adapt.
- Impact on learning and skill development: There are concerns that overreliance on AI could hinder students’ ability to develop critical thinking and writing skills.
- Varied institutional responses: Policies range from complete bans on AI use to more flexible approaches allowing limited use with disclosure.
Many themes overlap with an AI Guide for Students, published by the Association of American Colleges and Universities (AAC&U) and Elon University for the start of the 2024 academic year. With frequent human oversight and transparent review of AI-mediated processes and policies, higher education and society may eventually come to trust AI tools to serve community needs effectively.
P.S. Avoiding an IA for AI
In a white paper, “Shrinking the AI Gap in Higher Education” (2024), Google and the Chronicle of Higher Education summarized multiple surveys of higher education administrators and educators about how they plan to help students use generative AI tools appropriately. Shifting from a culture of hesitation to one of innovation, colleges and universities are entering these conversations. When using AI tools, pay attention to the consensus opinions:
Figure 8. Consensus regarding appropriate and inappropriate use of generative AI tools in academic settings.
Appropriate Uses | Inappropriate Uses |
---|---|
Personalized student support (including tutoring, academic advising) | Trusting GenAI outputs without human oversight |
Teaching assistant (including offering student feedback, improving accessibility of course materials) | Simulating human judgment (including grading student work, peer-reviewing academic work) |
Research assistant (including sorting and analyzing data, finding and summarizing literature) | Representing AI-generated work as self-produced |
Administrative assistant (including automating tasks, drafting communications like email) | Failing to cite AI for submitted generated content |
Learning analytics (including analyzing and visualizing student success data, providing student retention insights) | Making high-stakes decisions without human insight (including student admissions) |
Digital literacy education (including preparing students for a digital workforce) | Conducting invasive data collection or surveillance |
– | Relying on AI instead of human agency |
– | Giving GenAI tools unauthorized access to sensitive data |
From page 7, “Shrinking the AI Gap in Higher Education” (2024).
Many offices of student conduct and academic integrity have rapidly deployed guides for faculty and students that support responsible use and oversight (Contra Costa College, Harvard, Model Student Handbook for Illinois school principals, Ohio State, Rutgers, University of North Carolina, West Valley College Library). Pay close attention and ask how these policies are applied in academic, research, and workplace settings (Xollo, 2024).
Students should consider turning on revision histories for any assignments in case they are accused of improper use of generative AI. Grammarly Authorship is launching a beta version for students (for Google Workspace) to see if this tool can help students address potential plagiarism concerns (Shevchenko, 2024).
Acknowledgments
We appreciate other prehealth and health professional organizations, including MiMentor, F1doctors, and NAAHP, whose members promoted the survey and encouraged their advisees to participate.
The survey was inspired by these presentations at the National Association for Advisors of the Health Professions 2024 National Conference (June):
- Krysi Davis, Oregon Health Sciences University School of Dentistry. The Future of Recommendations: Committee Letters, AI, and Helping Your Advisee Gain Admission.
- Jacob Plummer, Wingate University. Should Students Use AI Programs to Write Personal Statements?
- Tony Wynne, NAAHP. AI and the Pre-Health Sphere: Transforming Health Professions Advising for the New Digital Age.
Opportunities to help HPSA
Your support of HPSA helps us further develop surveys on the applicant experience, especially among those from underrepresented or disadvantaged backgrounds. Join SDN to be notified of future surveys. Our next survey opens in October!
Resources Cited
“AAC&U and Elon University launch Student Guide to AI,” (2024, August 19). Inside Higher Education. The guide can be found at the website Student Guide to Artificial Intelligence. Accessed August 19, 2024.
AAMC Community on Artificial Intelligence and Medical Education. (n.d.). Accessed August 21, 2024.
“American Express small business survey cited generational differences in AI usage,” Fast Company. Accessed August 14, 2024.
Chen, J. X., Bowe, S., & Deng, F. (2024). Residency Applications in the Era of Generative Artificial Intelligence. Journal of Graduate Medical Education, 16(3), 254–256.
Coffey, Lauren. (2024, September 4). Can AI Help a Student Get Into Stanford or Yale? Inside Higher Ed. Accessed September 9, 2024.
Cummings, Robert. (2019, June 12). The First Twenty Years of Teaching with Wikipedia: From Faculty Enemy to Faculty Enabler. Wikipedia@20: Stories of an Incomplete Revolution. Joseph Reagle and Jackie Koerner, editors. Accessed August 21, 2024.
Delgado-Ruiz, R., Kim, A. S., Zhang, H., Sullivan, D., Awan, K. H., & Stathopoulou, P. G. (2024, September 1). Generative Artificial Intelligence (Gen AI) in dental education: Opportunities, cautions, and recommendations. Journal of Dental Education.
Donaldson, Beth. (2024, February 16). How graduate enrollment teams are using AI: results and recommendations from our new survey. EAB Blog. Accessed August 24, 2024.
Dore, Kelly. (2024, July 11). How Can Schools Manage AI in Admissions? Campus Technology. Accessed August 24, 2024.
Ember, Sydney. (2024, August 18). How A.I. Can Help Start Small Businesses. Accessed August 18, 2024.
Evaristo, Ellen. (2023, December 4). Balancing the potentials and pitfalls of AI in college admissions. University of Southern California Rossier School of Education blog. Accessed August 24, 2024.
Gillham, Jonathan. (2024, August 8). 8 Times AI Hallucinations or Factual Errors Caused Serious Problems. Originality.ai Blog. Accessed September 10, 2024.
Google Cloud. (2024). Shrinking the AI Gap in Higher Education (PDF). Accessed August 22, 2024.
GradLife at Illinois (blog article series: Search results “AI @ Illinois”). (2023-2024). Graduate College at the University of Illinois. Accessed August 24, 2024.
Higher Learning Commission (2023, November). Trend Update: How Do You Use Generative AI? Accessed August 27, 2024.
Hough, Lory. (2011, September 9). Truce Be Told. Harvard Graduate School of Education. Accessed August 21, 2024.
Mangold, S., & Ream, M. (2024). Artificial Intelligence in Graduate Medical Education Applications. Journal of Graduate Medical Education, 16(2), 115–118.
McArdle, Patrick. (2023, September 7). What your program should know about AI and admissions. PAEA blog. Accessed August 24, 2024.
McEwen, Colin. (2024, August 22). Case Western Reserve researchers collaborate with Walmart to explore whether AI can aid hiring process. Accessed September 6, 2024.
Principles for Responsible AI in Medical School and Residency Selection. (2024). Association of American Medical Colleges. Accessed July 31, 2024.
Quah, B., Zheng, L., Sng, T.J.H. et al. (2024). Reliability of ChatGPT in automated essay scoring for dental undergraduate examinations. BMC Med Educ 24, 962.
Rubin, J.D., Lombard, E., Chen, K., & Divanji, R. (2024). Navigating College Applications with AI [White Paper]. foundry10. Cited by “Study shows how students and teachers are using AI for college essays, letters of recommendation,” GeekWire. Accessed August 22, 2024.
Setting school policy about AI: A cautionary tale. (2023, March 14). Ditch that Textbook website. Accessed August 24, 2024.
Shevchenko, Alex. (2024, August 13). From AI Detection to Authorship: How Grammarly Empowers Responsible AI Use. Grammarly.com Blog. Watch their recorded presentation on YouTube: https://youtu.be/zTKXX1XANdo (recorded September 10, 2024). They will release their survey results with HigherEdDive in fall 2024.
Success Story: Oakland University. (n.d.) Copyleaks website. Accessed September 11, 2024.
Will chatbots help or hamper medical education? Here is what humans (and chatbots) say. AAMC News (2023). Accessed July 31, 2024.
Xollo, Skye (2024, April 9). Tips for Grads: AI as a tool in grad school. Graduate School of the University of Wisconsin Madison blog. Accessed August 24, 2024.
Yip, Jaures. (2024, September 8). A teacher caught students using ChatGPT on their first assignment to introduce themselves. Her post about it started a debate. Business Insider. Accessed September 9, 2024.
Emil Chuck, Ph.D., is Director of Advising Services for the Health Professional Student Association. He brings over 15 years of experience as a health professions advisor and an admissions professional for medical, dental, and other health professions programs. In this role, he looks forward to helping the next generation of diverse healthcare providers gain confidence in themselves and become successful members of the interprofessional healthcare community.
Previously, he served as Director of Admissions and Recruitment at Rosalind Franklin University of Medicine and Science, Director of Admissions at the School of Dental Medicine at Case Western Reserve University, and as a Pre-Health Professions Advisor at George Mason University.
Dr. Chuck serves as an expert resource on admissions and has been quoted by the Association of American Medical Colleges (AAMC).