TULSA, Okla. - October 13, 2025
Scot Media Tulsa founder and Emmy Award-winning journalist Jeromee Scot was the featured speaker at the Association of Fundraising Professionals Oklahoma Chapter luncheon, where he led a discussion on AI ethics in fundraising.
The presentation explored how artificial intelligence is reshaping donor communication and development strategies while emphasizing that transparency and authenticity remain essential as nonprofits begin using new technology in their daily work.
Scot shared insights from his career in journalism, communications, and nonprofit management. His talk, titled “AI, Ethics, and Fundraising,” offered practical guidance and real examples for organizations exploring artificial intelligence in their fundraising programs.
“AI can help us be more efficient and creative,” Scot told attendees. “But it can never replace human relationships. Fundraising is built on trust, and that is something a machine cannot replicate.”
Scot explained how tools such as ChatGPT, Google Gemini, and Canva AI can assist with donor communications, campaign planning, and event promotion. However, he warned that the ethical boundaries surrounding AI use must be clearly defined.
He presented an example of an AI-generated video of a community food giveaway that appeared realistic at first. On closer inspection, the flaws became obvious: faces were distorted, signs were unreadable, and movements appeared unnatural. The example illustrated how easily AI-generated visuals can blur the line between real and fabricated events.
“When donors see a video like that, they assume it really happened,” Scot said. “If they later find out it was created by AI, it damages credibility. Authentic storytelling matters, and once trust is broken, it is difficult to rebuild.”
Scot shared research showing that about 80 percent of nonprofit professionals now use some form of AI in their work, roughly double the share from just two years ago. Yet only one in four organizations has a formal policy that defines how AI should be used.
That lack of structure, Scot said, represents one of the most pressing ethical challenges facing nonprofits today.
“You have to decide where your boundaries are,” he said. “What information is acceptable to upload, what images can be created, and who is responsible for reviewing the content before it is shared. Every organization needs an AI ethics plan, not just a strategy.”
Throughout his presentation, Scot outlined specific steps for nonprofits to use AI responsibly while maintaining transparency and donor confidence.
Be transparent about AI use. If you would not feel comfortable admitting that something was AI-generated, do not use it.
Do not upload donor data such as names, contact information, or giving history into public AI tools.
Use AI to support human creativity rather than replace it. Let AI help draft letters, summarize reports, or design templates, but ensure all final materials are reviewed by people.
Create an internal AI ethics policy that defines approved tools, privacy standards, and review procedures.
Fact-check AI-generated content before publishing or distributing it.
Review materials for fairness and inclusion to prevent bias or unintentional exclusion.
Scot also discussed how AI can legitimately strengthen operations by helping teams organize donor data, improve communication efficiency, and evaluate campaign results. He encouraged nonprofits to view AI as a supportive tool, not a replacement for personal connection.
Scot reminded attendees that while technology continues to evolve, successful fundraising will always rely on genuine human connection.
“The most effective fundraisers are storytellers,” Scot said. “AI can write a script, but it cannot create emotion. The heart of why people give will always come from you.”
He also advised against overusing AI-generated visuals, voices, or statistics, noting that even small inaccuracies can raise doubts among donors. “When people start questioning whether what they are seeing is real, they start questioning everything else,” he said.
Scot’s presentation to the AFP Oklahoma Chapter highlighted a growing truth in the nonprofit world: artificial intelligence can be a valuable partner in fundraising, but ethical use must come first.
“AI is here to stay,” Scot said. “But ethics, honesty, and humanity are what will keep your mission strong.”
Jeromee Scot is available for speaking engagements at corporate luncheons, nonprofit events, and professional development sessions. His presentations are tailored to each audience and cover topics such as media strategy, AI ethics, storytelling, and communication leadership.
To learn more about availability and rates, visit the Speaking Engagement Rates page.
Scot Media Tulsa offers a range of AI services, including social media strategy for nonprofits, business plan development, employee onboarding support, and Answer Engine Optimization (AEO) services that restructure your website copy so it can be surfaced by AI-powered platforms like Perplexity, ChatGPT, and Gemini.
Ready to discover how AI can help you in your daily operations? Contact Scot Media Tulsa today to get started.
Email: jeromee@scotmediatulsa.com
AI ethics in fundraising refers to the responsible and transparent use of artificial intelligence in donor communications, campaigns, and data management. It means using AI to improve efficiency and outreach while protecting donor privacy, ensuring fairness, and maintaining honesty in storytelling.
Ethical AI use helps nonprofits protect donor trust and credibility. When organizations use AI without clear guidelines, they risk spreading misinformation, violating privacy, or misrepresenting real events. An ethical approach ensures that technology supports the mission rather than undermines it.
Nonprofits can use AI responsibly by creating an internal ethics policy, avoiding the upload of personal donor data into public AI tools, and reviewing all AI generated content for accuracy. Transparency is essential. If an organization would not feel comfortable admitting that AI was used, it should not be used in that situation.
AI can help nonprofits analyze donor data, personalize messages, and streamline communication. It can assist with drafting letters, designing visuals, and tracking campaign results. However, it should always complement human effort, not replace it.
The main risks include donor privacy violations, data leaks, biased targeting, and the use of misleading AI generated images or videos. These issues can harm an organization’s reputation and weaken donor confidence.
AI tools can automate repetitive tasks such as generating thank you letters, organizing donor lists, and summarizing campaign reports. This allows smaller teams to focus more on personal engagement and relationship building with supporters.
An effective AI ethics policy should define which tools are approved for use, what types of data are protected, who reviews AI output before publication, and how transparency is maintained with donors. The policy should also outline steps for addressing errors or misuse.
Organizations should clearly state when AI assisted in creating materials such as newsletters, videos, or graphics. They should also train staff to recognize potential bias or misinformation and regularly audit AI outputs for accuracy.
AI will not replace fundraisers. It can enhance their work by making certain tasks faster, but the emotional connection that motivates giving comes from people, not machines. Successful fundraising still depends on authentic human relationships.