Mention AI and many people picture a science fiction scenario gone awry, yet more and more stories about artificial intelligence and the opportunities it presents are appearing in the media.
Perhaps you — or members of your board — are wondering whether there’s a place for artificial intelligence in the nonprofit sector. We spoke to sector experts to find out.
What is AI?
Throughout history, people have wondered whether human intelligence could be replicated. Could machines imitate behaviour previously only associated with human intelligence, such as learning, speech and problem solving? Could machines process information and learn? Essentially, artificial intelligence can be thought of as “a broad set of disciplines and technologies that perform tasks and solve problems once only possible by humans.”
Scientific investigation into artificial intelligence began in the 1940s and 1950s, but progress repeatedly stalled in the decades that followed. By the early 21st century, however, advances in computer hardware led to renewed interest and funding in the field. The University of Toronto’s Machine Learning Lab was and remains a pioneer in this work, which is now being applied across nearly every area of business and is being described as the fourth industrial revolution, with a predicted impact on the same scale as the invention of the Internet or even electricity.
While the buzz is strong around artificial intelligence, so are some of the myths and predictions – whether it’s robots assuming human jobs or taking over the world. Dave Baran, VP, business development, CharityVillage, says that AI still needs significant demystification. He suggests that first and foremost, we need to remember that AI is “a tool to help organizations become more efficient in certain functions.”
In a recent article, Tal Frankfurt, founder and CEO of Cloud for Good, breaks down the three functions AI does best:
- Process automation
- Cognitive insights from data analysis
- Engaging with people
What this means, at least in part, is that AI can be used to do routine tasks, and especially those that require the computation of large amounts of data. As Frankfurt says, an AI is a machine and “does not get exhausted from running millions of scenarios or [being] interrupted by meetings.”
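To make that first function concrete, here is a minimal, hypothetical sketch of the kind of repetitive, data-heavy task a machine can run without tiring: scanning a large file of donation records and flagging lapsed donors for follow-up. It is a plain script rather than a learning system, and the file name, column names and one-year threshold are assumptions for illustration only, not taken from any organization quoted in this article.

```python
import csv
from datetime import date, datetime

# Hypothetical input file and threshold -- adjust to your own data.
DONOR_FILE = "donors.csv"   # assumed columns: donor_id, name, last_gift_date
LAPSED_AFTER_DAYS = 365     # assumed definition of "lapsed" for this sketch

def find_lapsed_donors(path, today=None):
    """Return donor rows whose last gift is older than the lapse threshold."""
    today = today or date.today()
    lapsed = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            last_gift = datetime.strptime(row["last_gift_date"], "%Y-%m-%d").date()
            if (today - last_gift).days > LAPSED_AFTER_DAYS:
                lapsed.append(row)
    return lapsed

if __name__ == "__main__":
    for donor in find_lapsed_donors(DONOR_FILE):
        # In a real workflow this might create a CRM task or queue an email.
        print(f"Follow up with {donor['name']} (last gift {donor['last_gift_date']})")
```

A script like this happily works through a file of two million records as readily as two hundred, which is exactly the kind of tirelessness Frankfurt describes.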
How is AI being used in the sector?
While historically Canada’s nonprofit sector has “invested poorly in…technology,” according to fundraising consultant Michael Johnston, it would be a mistake to think that the nonprofit sector is not already engaging with AI. Here are some examples:
- Kids Help Phone uses an AI chatbot that applies word and pattern recognition to identify users in more urgent need of aid (a simplified sketch of this kind of keyword triage appears after this list).
- PAWS is using modeling and machine learning to help park rangers predict and stop poachers’ actions.
- eBird, a citizen-science birding organization, is using AI to identify species in the hundreds of thousands of observations crowd-sourced from its community, a task that would take decades to do manually.
- Researchers are developing a computational model to predict extreme fire weather, a technology that will eventually be used in operational fire management.
- Your organization may use Google Analytics, Salesforce, a CRM or another analytics program, all of which rely on AI to automate the collection and analysis of data. (AI is also behind the suggestions you receive on Netflix!)
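The Kids Help Phone example above hints at how approachable the core idea of triage can be. The sketch below is not that organization’s system; it is a heavily simplified, hypothetical illustration of keyword-and-pattern triage, in which certain phrases raise a conversation’s urgency score so a human counsellor can be brought in sooner. The phrase list, weights and threshold are invented for illustration.

```python
import re

# Hypothetical phrase patterns and weights -- a real service would use a far
# richer, clinically reviewed model, not a hand-written list like this.
URGENT_PATTERNS = {
    r"\bhurt (myself|me)\b": 5,
    r"\bcan'?t go on\b": 4,
    r"\bno one (cares|would notice)\b": 3,
    r"\bscared to go home\b": 4,
}
ESCALATION_THRESHOLD = 4  # assumed cut-off for routing to a human counsellor first

def urgency_score(message):
    """Sum the weights of any urgent patterns found in the message."""
    text = message.lower()
    return sum(weight for pattern, weight in URGENT_PATTERNS.items()
               if re.search(pattern, text))

def needs_priority_routing(message):
    return urgency_score(message) >= ESCALATION_THRESHOLD

# Example: this message matches one high-weight pattern and gets flagged.
print(needs_priority_routing("I'm scared to go home tonight"))  # True
```

The point of such a system is not to replace counsellors but to make sure the people who most need one reach them faster.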
Increasingly, nonprofits are making use of AI when it comes to fundraising. Wes Moon, cofounder and COO of fundraising AI company Wisely, says, “AI doesn’t look like wholesale change but makes work easier, automating and improving our intuition.” Wisely’s prediction engine replaces the difficult regression analysis that has been traditionally used to predict donor behaviour. Rather than replacing gift officers or existing CRMs, Moon says, AI allows them to be more effective.
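For readers wondering what the “regression analysis that has been traditionally used to predict donor behaviour” looks like in practice, here is a minimal sketch using scikit-learn. It is not Wisely’s engine; it is a generic, assumed example in which simple giving features (months since last gift, gifts per year, average gift size) are used to estimate the likelihood that a donor gives again, with toy data invented for illustration.

```python
# A generic donor-propensity sketch with toy data, not any vendor's product.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed features per donor: [months since last gift, gifts per year, average gift $]
X = np.array([
    [2, 4, 50.0],    # recent, fairly frequent giver
    [18, 1, 25.0],   # lapsed, infrequent
    [3, 6, 120.0],
    [24, 1, 10.0],
    [1, 12, 75.0],
    [30, 2, 40.0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = gave again the next year, 0 = did not

model = LogisticRegression()
model.fit(X, y)

# Score a new donor: 5 months since last gift, 3 gifts per year, $60 average.
new_donor = np.array([[5, 3, 60.0]])
print(f"Probability of giving again: {model.predict_proba(new_donor)[0, 1]:.2f}")
```

A gift officer could use a score like this to decide whom to call first, which is the sense in which Moon describes AI as augmenting rather than replacing existing staff and CRMs.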
Baran has also seen AI used effectively in job recruitment: where a recruiter might have to scan through thousands of resumes, a machine can be used to provide an initial screening for keywords, concepts, and rudimentary personality fit, as well as to offer an automated response to potential candidates.
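A bare-bones version of that initial keyword screen might look like the sketch below. The required keywords, resume text and cutoff are hypothetical; real recruitment tools also handle synonyms, document structure and context rather than exact word matches.

```python
# Hypothetical first-pass resume screen: count how many required keywords
# appear in each resume and surface the strongest matches for a human to review.
REQUIRED_KEYWORDS = {"fundraising", "crm", "donor relations", "volunteer management"}

def keyword_coverage(resume_text):
    """Fraction of required keywords that appear in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
    return hits / len(REQUIRED_KEYWORDS)

def screen(resumes, cutoff=0.5):
    """Return (candidate, score) pairs at or above the cutoff, best first."""
    scored = [(name, keyword_coverage(text)) for name, text in resumes.items()]
    return sorted([s for s in scored if s[1] >= cutoff], key=lambda s: -s[1])

resumes = {
    "Candidate A": "Five years of fundraising and donor relations, Salesforce CRM admin.",
    "Candidate B": "Retail management experience, strong customer service skills.",
}
print(screen(resumes))  # Candidate A passes the screen; Candidate B does not.
```

As Baran notes, the value is in narrowing thousands of applications to a shortlist a human recruiter can actually read.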
Erin Crump, leader of strategic innovation for nonprofit benefits carrier Green Shield Canada, says that when the company noticed diabetes was becoming a high-cost disease category, it partnered with a health analytics company to identify the variables that predict whether someone might become a complex diabetes patient. The goal, Crump says, is to offer health and wellness tools that can change the trajectory of the disease as early as possible.
Opportunities
While Jason Shim, director, digital strategy, Pathways to Education and board member of NTEN, notes that we are not yet at a time when the average nonprofit has the resources for significant AI technology, there is also an argument to be made that AI can build an organization’s capacity.
A recent Stanford Social Innovation Review article claims: “Using AI in a mission-driven context could supercharge the capacities of the social change sector. Specifically, it has the potential to lower costs, improve quality, and broaden the impact of social change organizations. Think of it as transforming these organizations from a VW Beetle into the USS Enterprise.”
Benjamin Losman, acting communications manager, TechSoup Canada, agrees. “You don’t have to adopt AI but you can make your organization more efficient and pursue your mission more effectively in a way that is seamless and doesn’t drain time and resources.” He adds, “Setting up a data-informed culture allows you to see what you are doing well and what can be improved. In doing that, you are becoming more accountable to beneficiaries, board, staff, any stakeholder in achieving your mission well.”
The authors of the Stanford Social Innovation Review article also observe, “Capacity-building investments are often similar across organizations. One fundraising, accounting, or communications solution can, with a bit of tweaking, often meet the needs of many organizations. As a result, technology investments in capacity building can spread quite quickly through the nonprofit sector, and we expect this will be the case with machine learning-based solutions.”
Cautions
There are a number of legitimate reasons to be careful about how AI is developed and adopted. This is particularly true in a sector that works with vulnerable populations where data privacy is paramount. Losman says, “I’m not alone in being concerned about data privacy. People are calling data the new oil. Especially for organizations dealing with extra-sensitive data, there can’t just be mindless tracking and sharing of data, but a deep understanding of what will be done with that data.” Green Shield Canada, for instance, was diligent in protecting data privacy, anonymizing data and auditing the data security of their technology partners as if the systems were their own.
Losman raises another caution: “Nonprofit work is inherently human-based. There needs to be an element of humanity, regardless of your cause. There ultimately needs to be a person at the other end of the line. One of the risks in automating everything, relying on AI in communicating, is stripping that necessary humanity away from interactions with beneficiaries because it can derail the way you do your work.”
Another concern is that of bias. Moon says, “Bias can negatively affect a charity’s performance, and even worse, could negatively affect a relationship if a bias is incorrect. We are trying to be careful about not putting too much weight on predictions based on old processes.” Shim adds, “The dream of AI was to build machines that can think like humans, but this can be tricky because humans are subject to our own biases. Yes, we can amplify processing and efficiency but we can also amplify human bias. If we are building these systems to amplify human thinking, we would be better served to make sure first that our thinking is diverse, equitable and inclusive.”
For James Kelly, founder and director, FaithTech, this last caution points to a need for developers and users to “take the morality of AI seriously because the implications are greater than we may suspect.” For Losman it means the sector needs to develop a code of conduct and high standards around the sharing of data. A number of organizations and coalitions are beginning to think about this ethical lens (see below). Other voices are strongly encouraging nonprofits to have a seat at the table where AI is being developed and discussed.
Rhodri Davies, head of policy, Charities Aid Foundation, writes that nonprofits and other civil society organizations “represent many of the most marginalised individuals and communities in our society; and since these groups are likely to be hit soonest and hardest by the negative impacts of AI, it is vital that the organisations representing them are in a position to speak out on their behalf. If they do not, then not only will those CSOs be failing to deliver on their missions, but also the chances of minimising the wider harmful effects of AI will be significantly reduced. And this is important: the implications of getting AI wrong are so far-reaching that decisions about its future cannot simply be left up to technologists.”
A final caution is recognizing that AI doesn’t just happen. Shim says, “Many nonprofits admire what Charity:Water does in terms of their use of technology to accomplish their mission. However, it is worth recognizing that they also have numerous software engineers on staff, which is not the case for most nonprofit organizations.” Organizations wanting to have digital impact, Shim says, need to hire technical staff or train for those skills. Moon notes that organizations that do prioritize data collection and AI analysis are becoming increasingly successful, potentially pushing smaller, less digitally inclined organizations out.
Where to go to learn more or get started
- Partnership on AI
- TechSoup Canada
- NetSquared
- NTEN
- AccessNow
- The Toronto Declaration
- The Electronic Frontier Foundation
- InWithForward
- Powered by Data
- Community Foundations of Canada Data Hub
- DataKind
- Canada.ai
- Next Canada
- Creative Destruction Lab
“Reach out to your extended network. If you have corporate partners in the tech space, talk with them about AI.” – Shim
Hang out where developers are hanging out in community. Participate in nonprofit hackathons. FaithTech Sprints are an example of this, as are FaithTech Labs.
Susan Fish is a writer/editor at Storywell, a company that helps individuals and organizations tell their story well. She has written for the nonprofit sector for more than two decades and loves a good story.
Please note: While we ensure that all links and email addresses are accurate at their publishing date, the quick-changing nature of the web means that some links to other websites and email addresses may no longer be accurate.