Wednesday, July 31, 2013

Try and try and try again

How many asks does it take to get to the center of the donor base?  Your very best donors will give after one or maybe two, depending on the timing relative to their last gift and how much they have given as it relates to a tax deduction.  The very worst folks are never going to convert.  Identify the folks who fit that population and box them out - don't waste time, money, and effort on them until you have maximized your results from every other population.  The balance of your donor/constituent base falls somewhere in between.

This matters a great deal when working with limited resources.  I know, for instance, that it takes an average of nine asks for every gift I get.  That does not apply across constituents - meaning not that I get an 11% response rate (that would be awesome!) but that I need to ask each constituent, on average, nine times to get a gift.  Considering that I have a list of 40,000 folks who have given at some point and over 90,000 who have never given, it quickly becomes clear that resources have to be concentrated appropriately: getting to an average of nine asks across the donor population alone is 360,000 efforts.
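Just to make the scale concrete, here is a rough sketch of that arithmetic in Python, using only the figures above:

```python
# Back-of-the-envelope effort math, using only the figures above.
AVG_ASKS_PER_GIFT = 9      # average asks per constituent to yield one gift
PRIOR_DONORS = 40_000      # constituents who have given at some point
NEVER_GIVEN = 90_000       # constituents who have never given

donor_efforts = AVG_ASKS_PER_GIFT * PRIOR_DONORS                    # 360,000
all_efforts = AVG_ASKS_PER_GIFT * (PRIOR_DONORS + NEVER_GIVEN)      # 1,170,000

print(f"Asks to reach the average across prior donors alone: {donor_efforts:,}")
print(f"Asks to reach the average across the full base: {all_efforts:,}")
```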

In laying out the solicitation efforts for the year, I look at which segments I can bring to that saturation point.  How do I determine that?  Based upon more specific contact needs.  While nine is the average, keep in mind that the most loyal donors give much more quickly, but the timing of the solicitation matters much more - so by limiting communications with this audience to match the type and timing of their consistent support, you can reduce that count while maximizing returns.  On the flip side, the non-donor populations often take 11, 12, or even 13 asks before they give, and there is a large portion that simply won't give each year.  Designing creative, varied ways to ask these folks is where the greatest opportunity exists.
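For illustration, a segment-level version of that planning might be sketched like this - the segment names, counts, and per-segment ask targets below are made-up assumptions, not our actual plan:

```python
# Hypothetical segment plan: fewer, better-timed asks for the most loyal donors,
# more (and more varied) asks for non-donors, and a boxed-out group that waits.
# All counts and per-segment targets below are illustrative, not actual figures.
segments = {
    #                      (constituents, planned asks each)
    "most loyal donors":   (5_000,  3),   # timing matters more than volume
    "other prior donors":  (35_000, 9),   # roughly the overall average
    "non-donors, active":  (60_000, 12),  # often take 11-13 asks to convert
    "boxed out":           (30_000, 0),   # deferred until other pools are maximized
}

total_efforts = sum(count * asks for count, asks in segments.values())
print(f"Planned solicitation efforts for the year: {total_efforts:,}")
```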


Friday, July 26, 2013

Asks and external data

One of the many things I do every year as we roll from one year to the next is build an ask table.  That ask table has typically been based on a series of calculations combining recency and frequency from gift history to predict the likelihood of a prospect increasing their giving, balanced with the desire to retain them as a donor.  I have done three- and four-ask arrays and found no difference in returns on either the number or the size of gifts, so I have migrated toward three asks - no reason to do additional work without additional results!
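Purely as an illustration of the idea - the multipliers, thresholds, and rounding rule here are hypothetical, not my actual formulas - a three-ask array built off recency and frequency might look something like this:

```python
from datetime import date

def three_ask_array(last_gift_amount, last_gift_date, gifts_last_5_years, today):
    """Hypothetical three-ask array from recency and frequency.

    Recent, frequent donors get an upgrade-oriented ladder; lapsed or
    infrequent donors get a gentler, retention-oriented one.
    """
    years_since_gift = (today - last_gift_date).days / 365.25

    if years_since_gift <= 1 and gifts_last_5_years >= 3:
        multipliers = (1.25, 1.5, 2.0)    # loyal and recent: push the upgrade
    elif years_since_gift <= 2:
        multipliers = (1.0, 1.25, 1.5)    # recent enough: modest upgrade
    else:
        multipliers = (0.75, 1.0, 1.25)   # lapsing: retention first

    # Round each ask to a clean multiple of $5.
    return tuple(int(round(last_gift_amount * m / 5)) * 5 for m in multipliers)

print(three_ask_array(78, date(2013, 4, 15), 4, today=date(2013, 7, 26)))  # (100, 115, 155)
```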

I calculate these asks so that our communications throughout the entire year are consistent - if you are asked in June for $78 to the School of Education Dean's Fund, you will be asked in October and again in April for $78 to the School of Education Dean's Fund.  The exceptions are donors who have given and are being asked for a second gift, special projects that may have a different fund (though the ask amount remains constant), and prospects who are identified or qualified into the major gift process.

This year, for the first time, I purchased data to calculate asks off of two external pieces of information: a predictive model for household income, and a version of that model that provides disposable income, adjusted for cost of living (important here in the Northeast).

Using this data as a third criterion, I created a table that calculates asks based on the expectation that loyal donors will give more and be less bothered by an aggressive ask than a new or previously lapsed donor.  I then turned to data from the Chronicle (http://philanthropy.com/article/America-s-Generosity-Divide/133775/) to add a geographical component.  Next I assumed that the average donor supports five charities and used the predicted income to establish how much they should be giving away.  The biggest assumption I made is that each donor does not support all five charities equally - in fact, they give a disproportionate amount to the top one or two and the balance to the other three or four.  I accounted for that by comparing the combination of gift total, recency, and frequency against what it would be if their giving were divided equally.  If that combination was greater than 20% of the expected total, I could assume we were among the top one or two; if it was less than 20%, I could assume they were supporting us at a lower level of investment.
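A minimal sketch of that share-of-giving check, with hypothetical field names and an example giving rate standing in for the actual Chronicle figure:

```python
def classify_share(giving_to_us, predicted_income, regional_giving_rate, threshold=0.20):
    """Hypothetical check of whether we sit in a donor's top one or two charities.

    giving_to_us folds gift total, recency, and frequency into a single figure;
    regional_giving_rate is the share of income given away in the donor's area
    (the kind of figure the Chronicle data supplies), e.g. 0.03 for 3%.
    """
    expected_total = predicted_income * regional_giving_rate   # predicted giving to all charities
    if expected_total <= 0:
        return "unknown"
    share = giving_to_us / expected_total
    # An even split across five charities would be 20% apiece, hence the threshold.
    return "top one or two" if share > threshold else "lower level of investment"

# Example: $1,200 to us, $150,000 predicted income, 3% regional giving rate.
print(classify_share(1_200, 150_000, 0.03))   # 1,200 / 4,500 = ~27% -> "top one or two"
```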

That process gave me a good outline based on both capacity and history of giving - assuming, that is, that they had a history of giving.  For those folks who did not, we used household income as the dividing point: those with household income over $250,000 were placed into the leadership giving suspect pool to be asked for a gift of $1,000 or more.
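And for the folks with no giving history, the sketch is just the income cutoff (again, field names are hypothetical):

```python
def assign_pool(has_giving_history, est_household_income):
    """Hypothetical routing for the no-history population: over $250,000 in
    predicted household income goes to the leadership giving suspect pool."""
    if has_giving_history:
        return "history-based ask table"
    if est_household_income is not None and est_household_income > 250_000:
        return "leadership giving suspects ($1,000+ ask)"
    return "standard acquisition ask"

print(assign_pool(False, 300_000))   # leadership giving suspects ($1,000+ ask)
```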

This is all an experimental exercise - it has not yet produced results, good or bad, but either way, with some tweaking it provides a much richer and more robust process than simply looking at history of giving to your organization, without losing the impact or importance of that indication of inclination.

Saturday, July 20, 2013

Simple reminders

I have spent much of the last week working on the ongoing communications that make up much of the donor outreach for the 2013-14 fiscal/academic year.  This includes monthly renewals for both LYBUNT and leadership donors, quarterly asks for SYBUNTs, and the reminder process supporting the phone program.

We have tried any number of different approaches, from religious to secular, academic to administrative value-added.  We have had a volunteer (it was a trustee) sign the reminders, then a VP, and now I sign them.  I follow a 0-, 30-, 60-, 90-, 120-day schedule.  The 0-day reminder, or pledge acknowledgement, garners 60% of the responses and "comes" directly from the caller, thanking the donor for their time and asking them to complete the commitment by making a payment.  If we have an email address for that prospect, at the close of the shift they also get an email with the same message and a link to a dedicated pledge payment page within our giving site (www.stjohns.edu/paymypledge).  Every Friday, an email goes out from my office to anyone who pledged that week, providing them with similar information as well.

Fifteen days after the pledge, a third email is sent, providing little more than a thank-you and a link.  This email actually generates the highest response rate on the electronic side of the program and was intentionally designed to be simple and direct, with no images.  It says: "Thank you very much for your commitment to supporting St. John's University students this year.  Paying your pledge at www.stjohns.edu/paymypledge today ensures that your commitment is realized. Thank you." and is signed by me.

At the 30-day point you get a letter, this time from me as opposed to the 0-day letter from the caller.  This letter is very generic, calling out the best parts of the university and the impact donor support can have on them.  While this letter provides 30% of the responses, it is the area I feel has the most potential for improvement, and I am working to center the text around the funds supported rather than the university in general.  This is a huge undertaking with reminders, as the whole process runs on form letters and general merges, so getting fund-specific content into them is challenging.
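One way I could imagine chipping away at that - sketched here with made-up fund codes and template text, not our actual merge setup - is to key a short fund-specific paragraph off the fund on the pledge and fall back to generic language when there is no match:

```python
# Hypothetical fund-specific paragraphs keyed by fund code, with a generic fallback.
FUND_PARAGRAPHS = {
    "EDU-DEAN": "Your support of the School of Education Dean's Fund helps ...",
    "SCHOLARSHIP": "Your support of scholarships helps ...",
}
GENERIC_PARAGRAPH = "Your support of St. John's University students helps ..."

def reminder_body(donor_name, fund_code, pledge_amount):
    """Build the body of a 30-day reminder letter, swapping in fund-specific text."""
    fund_text = FUND_PARAGRAPHS.get(fund_code, GENERIC_PARAGRAPH)
    return (f"Dear {donor_name},\n\n{fund_text}\n\n"
            f"Paying your pledge of ${pledge_amount:,.2f} at www.stjohns.edu/paymypledge "
            f"today ensures that your commitment is realized.")

print(reminder_body("Jane Smith", "EDU-DEAN", 78))
```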

The week that includes day 45 from the point of the pledge also includes an email reminder.  Similar to the 15-day reminder in content, it produces little in terms of payments but does generate feedback.  For some reason, this piece often generates the "I didn't do that" response.  While not what we want, this does at least limit our further work and the level of aggravation that this process can generate for those who don't plan to pay.

At the 60-day point we produce a statement - having sent two letters and up to four emails, we change the look from fundraising to collections.  This provides the final 10% of the responses to the mail stream, up from less than 5% when it was a letter.  It is a form with a clear image from campus in the background and a simple table-layout summary of the commitment.

At the 90-day point we make a phone call requesting a credit card.  This does get us some card numbers while also securing a renewed commitment from a small portion of the remaining pool.  Interestingly, almost 60% of these recommitted pledges are paid, so it does work to generate support.

One hundred twenty days after the pledge is made, we send a write-off letter that thanks the constituent for their commitment but lets them know quite clearly that we are no longer expecting payment and that the next things they see from my office will be asking for a new commitment.  We get a 0.025% response to that letter as well, so it has value to the fundraising process in addition to closing out the commitment.
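Pulling the whole cadence together, here is the schedule expressed as data - the day offsets and channels are the ones described above; anyone who pays simply drops out of the later steps:

```python
from datetime import date, timedelta

# The reminder cadence described above: (days after pledge, channel, purpose).
REMINDER_SCHEDULE = [
    (0,   "letter + email", "pledge acknowledgement from the caller"),
    (15,  "email",          "short thank-you with a payment link"),
    (30,  "letter",         "generic reminder letter"),
    (45,  "email",          "follow-up similar to the 15-day email"),
    (60,  "statement",      "collections-style summary of the commitment"),
    (90,  "phone call",     "credit card request / renewed commitment"),
    (120, "letter",         "write-off letter closing out the pledge"),
]

def touchpoints(pledge_date):
    """Return the dated touchpoints for a single pledge (payment removes later steps)."""
    return [(pledge_date + timedelta(days=days), channel, purpose)
            for days, channel, purpose in REMINDER_SCHEDULE]

for when, channel, purpose in touchpoints(date(2013, 7, 1)):
    print(when, channel, "-", purpose)
```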

Areas of improvement that we are well aware of are the previously mentioned need to make the process more specific to the commitment, a need to make it more personal, and a need to make it more specific to each prospect's history and methods of giving.



Thursday, July 11, 2013

Thoughts on the start of the year

At the start of every fiscal year there is the anticipation of new and better things and the feeling of opportunity.  One of the things that occurs to me every year is that we need to do a better job in certain aspects of our profession, and the aspect that recurs to me on a regular basis is the pipeline.  In my office, I generally refer to our role at the top of the pipeline as identification and/or qualification of prospects, but that terminology does nothing to convey what it really means - starting the active fundraising process and filtering out the small percentage of folks who can make a substantial difference.

In order to accomplish the second portion of that task, we have taken a different approach this year.  We purchased NOZA data and predictive modelling providing both estimated household and disposable incomes, and have used that to create a qualification pool of alumni.  This was further targeted by removing all constituents who have already been contacted or attempted from a personal-contact standpoint, and then geographically limiting the pool to within an hour of campus (being in NY, that is a huge population, so it just isn't worth the time and effort to travel too far until we have worked the local population).
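A sketch of that pool-building logic, with hypothetical field names, approximate campus coordinates, and a straight-line distance check standing in for a real drive-time screen:

```python
import math

def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in miles."""
    r = 3958.8
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def build_qualification_pool(alumni, campus_lat=40.72, campus_lon=-73.79, max_miles=50):
    """Hypothetical filter: has an income estimate, no prior personal contact,
    and within roughly an hour (approximated here as 50 straight-line miles) of campus."""
    return [a for a in alumni
            if a.get("est_household_income") is not None
            and not a.get("contacted_by_gift_officer", False)
            and distance_miles(a["lat"], a["lon"], campus_lat, campus_lon) <= max_miles]
```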

These prospect are assigned to annual giving officers as 50% of their job with goals of attempting 100% of the pool (each has 300) at least 5 times, contacting 50% and visiting 30%.  Each visit will comprise of a engagement conversation and solicitation with an expectation of adding a small number of new annual leadership donors, increasing the major gift pool and working through the database to identify whom we should be working with in a quick and effective manner.  Low cost, high return and lots of shoe leather will take that concern about the pipeline from a negative to a positive by this time next summer.