The FellowshipOne Blog

What a 90-Day Tithing Challenge Taught One Church about Metrics

To tell this story, I need to introduce you to Nikelle Druck.

She is currently serving as IT Director at Lives Changed By Christ (LCBC), a 13,000-member, 7-campus church located in southeast Pennsylvania, where she helps executives make informed ministry decisions by providing meaningful and actionable data analysis.


A Penn State grad and information technology professional, Nikelle has 15 years of consulting experience managing IT projects in a variety of government and educational roles. Suffice it to say, Nikelle knows data. She is passionate about the very real need to understand the health and effectiveness of the church, and she loves to share how data can deliver that understanding.

A Bold Idea

In September 2014, LCBC announced its first 90-day tithing challenge. The idea was simple: if you gave 10% to the church for 90 days and didn’t see God work in some way in your life, the church would refund your tithe at the end of the 90 days. Incredibly, only about a dozen people requested a refund.

At F1, we often refer to 3D metrics, where the first dimension is a simple measurement of what happened. LCBC’s initial first-dimension approach measured first-time givers, tithing commitments, and total contributions during the campaign. To fully understand the organization’s health and effectiveness, however, and to dive into the second and third dimensions of metrics, Nikelle identified 3 basic components they would need to include next time.

3 Foundations of Quantifiable Results

1. Define the business case: the problem you’re trying to solve and how you will solve it.

Nikelle explained that defining the business case and its supporting facts provides the evidence for a recommended course of action. It gives leaders the ability to understand why a change is needed, what outcomes are expected, and, finally, whether those outcomes were actually achieved.

As she dove deeper into the metrics, Nikelle realized time and again that knowing definitively how the campaign was impacting ministry growth would have required collecting adequate data beforehand, data that could fill in the blanks the campaign revealed.

For example, as she studied the data, she wondered: who were these new givers? Were they first-time guests, regular attendees, or long-time partners (members) who hadn’t been tithing? Because the new data had been entered without a “first-time giver” attribute, she didn’t have that visibility. The gap troubled her, but it also reinforced the value of confirming up front how those working with the data would label, enter, and validate the results within their reporting tools in Fellowship One.
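For the data-minded reader, here is a minimal sketch of why capturing that attribute at entry time matters. The field names and values are made up for illustration and are not Fellowship One’s actual data model: with the attribute in place, the “who are these new givers?” question becomes a one-line segmentation; without it, the question can’t be answered after the fact.

```python
# Hypothetical illustration only: field names and values are invented,
# not Fellowship One's actual data model.
import pandas as pd

# Contribution records as they might look WITH a giver-type attribute
# captured when the gift is entered.
gifts = pd.DataFrame([
    {"household_id": 101, "amount": 250.0, "giver_type": "first-time guest"},
    {"household_id": 102, "amount": 400.0, "giver_type": "regular attendee"},
    {"household_id": 103, "amount": 300.0, "giver_type": "partner"},
    {"household_id": 104, "amount": 150.0, "giver_type": "first-time guest"},
])

# Because each record carries the attribute, segmenting new givers is trivial.
by_type = gifts.groupby("giver_type")["amount"].agg(["count", "sum"])
print(by_type)
```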


2. Adequately establish baselines.

The 90-day challenge was initially supported by an interesting baseline created by a third-party tool called MortarStone, which allowed LCBC to aggregate member and attendee addresses and estimate each campus’s giving potential.

The number of challenge commitments in the first campaign also created a valid baseline by which to measure subsequent tithing challenge campaigns, but Nikelle wanted to know what factors had influenced the new givers. Promoting the campaign from the pulpit and through other communications seemed to create a corresponding uptick in first-time giving, but there was no clear way to isolate that variant from other factors. And, she questioned, did these new numbers actually exceed current first-time giving rates? Without previously established baselines, she couldn’t tell.
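As a simple illustration, using hypothetical numbers rather than LCBC’s actual figures, an established baseline turns “did first-time giving actually go up?” into straightforward arithmetic:

```python
# Hypothetical numbers for illustration; the point is the comparison,
# not the values.

# Baseline: first-time givers per week in the months before the campaign.
baseline_weeks = [4, 6, 5, 7, 5, 6, 4, 5]
baseline_rate = sum(baseline_weeks) / len(baseline_weeks)

# Campaign period: first-time givers per week during the 90 days.
campaign_weeks = [9, 7, 11, 8, 10, 6, 9, 8, 10, 7, 9, 8, 11]
campaign_rate = sum(campaign_weeks) / len(campaign_weeks)

lift = (campaign_rate - baseline_rate) / baseline_rate
print(f"Baseline: {baseline_rate:.1f}/week, campaign: {campaign_rate:.1f}/week, "
      f"lift: {lift:.0%}")
```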

3. Identify all the moving parts and pieces upfront.

In data analysis, controls (things that stay the same) and variants (things that are new and being tested) are necessary for true comparisons and conclusions. Too many variables can render results invalid, though trends can still be hypothesized. Asking “why?” is how you begin to find those gaps.

Next time, Nikelle wants to find a way to isolate as many variants as possible to get a full picture of the campaign’s impact on tithing. For example, attendance could affect new giving rates, but attendance itself is influenced by many things, including seasonality, weather, and special events. A baby dedication ceremony could increase first-time giving and tithing commitments, and a snowstorm could do the opposite.
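One lightweight way to start, sketched below with invented fields and numbers, is simply to tag each week with the moving parts that were in play, so that later comparisons can hold some of those factors constant:

```python
# Hypothetical sketch: tag each week with the "moving parts" in play so
# comparisons can hold some factors constant. Fields and values are made up.
weeks = [
    {"week": "2014-09-07", "promo_from_pulpit": True,  "special_event": None,
     "snowstorm": False, "first_time_givers": 9},
    {"week": "2014-09-14", "promo_from_pulpit": False, "special_event": "baby dedication",
     "snowstorm": False, "first_time_givers": 12},
    {"week": "2014-11-30", "promo_from_pulpit": True,  "special_event": None,
     "snowstorm": True,  "first_time_givers": 4},
]

# Keep only weeks with no special event and no snowstorm, so pulpit promotion
# is closer to being the lone variant under comparison.
comparable = [w for w in weeks if not w["special_event"] and not w["snowstorm"]]
for w in comparable:
    print(w["week"], w["promo_from_pulpit"], w["first_time_givers"])
```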

It Takes Time

All indications are that LCBC’s 7 campuses are thriving. There were many other contributing factors besides the campaign, but the church did end up with a 15% higher donation total. Even so, every organization has room for improvement and deeper levels of data to explore.

Fortunately, Nikelle understands that it takes time to turn data into actionable steps. She isn’t discouraged when the only insight gleaned from a round of analysis is how to do it better next time; professional statisticians consider that a valid test result. Ultimately, Nikelle is committed to working with her team to produce more accurate data and to identify the actions that can impact ministry growth.

“Any data collection other than weekend averages is going to require much thought to analyze accurately,” Nikelle advises. “You have to know well in advance what you’re going to measure—and how.”

Now What?

If you’re interested in beginning this journey, a great place to start is by hearing from Nikelle herself. Along with one of our data experts, she hosted a free 60-minute webinar on this very topic of measuring ministry impact.