```r
# Load libraries
library(tidyverse)
library(httr)
library(jsonlite)
```

## Background
There are many things I find frustrating about Blackbaud’s Raiser’s Edge NXT product, but the one that has caused more headaches than anything else is the inability to query when a prospect status was applied to a record. I can click on a prospect’s record and see when a status was last changed, but there is no way to get that data for a list of prospects from within lists, queries, reports, or even a SQL backup of the database.
The only place to access that information is via Blackbaud’s SKY API. The specific API - Prospect status (Get) - returns five data points:
- `comments` - comments made when the status was changed
- `constituent_id` - the system record ID of the prospect
- `days_elapsed` - the number of days the prospect has been in the status
- `start` - the start date of the prospect status
- `status` - the current prospect status
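For a sense of what that looks like in R, a single parsed response is just a small named list. Here is a hypothetical example (every value invented for illustration):

```r
# Hypothetical parsed response from the Prospect status endpoint
# (all values are invented for illustration)
example_status <- list(
  comments = "Moved to cultivation after campus visit",
  constituent_id = "280",
  days_elapsed = 42,
  start = "2024-01-15T00:00:00",
  status = "Cultivation"
)
example_status$status # "Cultivation"
```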
## Setup
The libraries I’m using are loaded at the top of this post.
To set up the application, you will need five pieces of information:
- Authentication URL
- Token URL
- Application ID
- Client Secret
- API Key
1 & 2 are straightforward; the URLs can be found in the code below. 3-5 are a bit trickier, but Blackbaud has a good walkthrough to get started.
### Setting up an Application
When setting up your application, you may need to work with your organization’s database manager to get it connected and authorized with your system.
I recommend setting all of your application permissions to read only. That way there is no danger of accidentally overwriting or deleting information.
The website URL is listed as a required field, but that’s more for publishing your application on the marketplace. I used https://example.com in mine.
While I was focused on accessing the prospect status history, I was also interested in playing with Blackbaud’s Power BI Connector. If you do that, adding these redirect URIs to your application can be helpful:
- https://oauth.powerbi.com/views/oauthredirect.html
- http://localhost:8080
- http://localhost:1410
## The Code
### Authentication
While there are ways to securely store these data points in R, I left mine in the code since this script was just for me. Obviously, you’ll need to paste in your own application ID, client secret, and API key.
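If you do want something safer than hard-coding the secrets, environment variables set in a `.Renviron` file are a lightweight alternative. A sketch (the variable names here are my own invention, not anything Blackbaud requires):

```r
# Read secrets from environment variables instead of hard-coding them.
# Set these in ~/.Renviron, e.g. BB_APPLICATION_ID=your-id-here
application_id  <- Sys.getenv("BB_APPLICATION_ID")
client_secret   <- Sys.getenv("BB_CLIENT_SECRET")
api_key_primary <- Sys.getenv("BB_API_KEY")
```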
```r
auth_url <- "https://app.blackbaud.com/oauth/authorize"
token_url <- "https://oauth2.sky.blackbaud.com/token"
application_id <- "paste application id here"
client_secret <- "paste client secret"
api_key_primary <- "paste api key"
```

Now we’ll build the client that will connect to the SKY API.
```r
# Build the client
blackbaud_client <- function() {
  oauth_app(appname = "sky-api-demo",
            key = application_id,
            secret = client_secret)
}
```

Building the authentication function is honestly what took me the longest to figure out.
```r
# Function to get the token
get_blackbaud_token <- function() {
  # Perform the OAuth 2.0 authorization flow
  token <- oauth2.0_token(
    endpoint = oauth_endpoint(authorize = auth_url,
                              access = token_url),
    app = blackbaud_client(),
    scope = "constituent_read",
    cache = FALSE
  )
  # Store the token in the global environment
  assign("blackbaud_token", token, envir = .GlobalEnv)
  # Print token details
  print(paste("Token: ", token$credentials$access_token))
  print(paste("Expires in: ", token$credentials$expires_in, " seconds"))
}
```

This code looks for an existing authorization token. If one doesn’t exist, it opens a web browser and requests you sign in to authenticate. Once authenticated, the token will be stored in your R environment and you can close your browser.
```r
# Fetch the token if it's not already in the environment
if (!exists("blackbaud_token")) {
  get_blackbaud_token()
} else {
  # Deserialize the token JSON back to an R object
  blackbaud_token <- fromJSON(blackbaud_token)
  # Print token details
  print(paste("Token: ", blackbaud_token$credentials$access_token))
  print(paste("Expires in: ", blackbaud_token$credentials$expires_in, " seconds"))
}
```

### Accessing SKY API
Here is a function to call the prospect status API, pass it the authentication token and record ID, and get the response back.
The API references `constituent_id`, but be aware that what it actually wants is the `system_record_id`.
```r
# Define the function to get the prospect status
get_prospect_status <- function(constituent_id) {
  url <- paste0(
    "https://api.sky.blackbaud.com/constituent/v1/constituents/",
    constituent_id,
    "/prospectstatus"
  )
  response <- GET(
    url,
    add_headers(
      Authorization = paste("Bearer", blackbaud_token$credentials$access_token),
      Accept = "application/json",
      `Bb-Api-Subscription-Key` = api_key_primary
    )
  )
  if (status_code(response) == 200) {
    return(content(response, "parsed"))
  } else {
    stop(
      "Failed to retrieve prospect status. Status code: ",
      status_code(response)
    )
  }
}
```

To be polite, we need to pay attention to the rate limits that Blackbaud places on calls to its API so that our function isn’t limited or cut off.
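Since `get_prospect_status()` stops on any non-200 response, one defensive option is a generic wrapper that retries a failing call with exponential backoff. This is a sketch I didn’t need in practice, not part of my script:

```r
# Hypothetical retry helper: call `fun`, and on error wait 1s, 2s, 4s, ...
# before trying again, up to max_tries attempts
with_backoff <- function(fun, max_tries = 4, base_wait = 1) {
  for (i in seq_len(max_tries)) {
    result <- tryCatch(fun(), error = function(e) e)
    if (!inherits(result, "error")) return(result)
    if (i == max_tries) stop(result)
    Sys.sleep(base_wait * 2^(i - 1))
  }
}

# Usage: with_backoff(function() get_prospect_status(some_record_id))
```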
These options will be used in the call to break up how many IDs we send in each batch and to keep track of how many calls we’ve made.
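The batching itself is just integer arithmetic on the row number; with a toy vector of 25 rows and 10 per batch it works out like this:

```r
# Toy illustration of the batch assignment:
# 25 rows with 10 per batch -> batch numbers 1, 1, ..., 2, ..., 3
row_numbers <- 1:25
demo_batch_size <- 10
batches <- ceiling(row_numbers / demo_batch_size)
table(batches) # batches 1 and 2 have 10 rows each; batch 3 has the last 5
```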
```r
# Create API Call Process -------------------------------------------------
# Define parameters
batch_size <- 10     # Number of calls per batch
daily_limit <- 25000 # Maximum API calls allowed per day
calls_made <- 0      # Running count of calls made
```

Finally, we’re ready to call the API and get our data. Here I pass a tibble - sql_re_statuses - that contains the system record IDs for everyone in Raiser’s Edge who has any prospect status. Since our teams are more familiar with the Constituent ID, I also include other fields that my team might find helpful when working with these results, including Constituent ID and Constituent Name.
```r
# Process IDs with batching and rate limiting
time_in_stage <- sql_re_statuses |>
  mutate(batch = ceiling(row_number() / batch_size)) |>
  group_by(batch) |>
  group_map(~ {
    if (calls_made >= daily_limit) {
      stop("Daily API limit reached.")
    }
    Sys.sleep(1) # Pause for 1 second between batches
    calls_made <<- calls_made + nrow(.x) # Update the call counter
    .x |> mutate(response = map(record_id, ~ get_prospect_status(.))) # Call the API for each ID
  }, .keep = TRUE) %>% # Keep grouping structure
  bind_rows()
```

## The Results
Once the API runs (about 15-20 minutes for ~6,500 constituent records), we can take the results, apply a small amount of cleanup, and store them in a tibble for further analysis or saving.
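Each response comes back as a named list, so the cleanup is mostly about widening those lists into columns. A toy base-R version of that widening step, with invented values, looks like this:

```r
# Toy version of the widening step: a list of named lists
# becomes one column per field (invented values)
responses <- list(
  list(status = "Cultivation", days_elapsed = 42),
  list(status = "Solicitation", days_elapsed = 7)
)
widened <- do.call(rbind, lapply(responses, as.data.frame))
widened # one row per response, one column per field
```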
```r
# Combine all results
time_in_stage <- time_in_stage |>
  select(-batch) # Clean up temporary batch column

# Unlist results into a tibble
time_in_stage <- time_in_stage |>
  unnest_wider(response, names_sep = "_") |>
  mutate(response_start = as.Date(response_start))

# Clean up data before saving
time_in_stage <- time_in_stage |>
  select(
    record_id,
    constituent_id,
    constituent_name,
    response_status,
    response_comments,
    response_start,
    response_days_elapsed
  ) |>
  rename(
    prospect_status = response_status,
    ps_comments = response_comments,
    ps_start = response_start,
    ps_days_elapsed = response_days_elapsed
  ) |>
  mutate(ps_start = ymd(ps_start))
```

Something I’ve started doing is immediately saving the results out to an .RData file. This ensures I have a clean copy in case I accidentally filter the data or restart the session. I can simply load the file and the results will be back in my environment exactly as they were at the end of the previous step.
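The load half of that round trip is a single line. Here is the full cycle with a temporary file and toy data, since my real paths are project-specific:

```r
# Hypothetical round trip: save a data frame, wipe it, load it back
demo_results <- data.frame(
  record_id = 1:3,
  prospect_status = c("Identification", "Cultivation", "Solicitation")
)
tmp <- tempfile(fileext = ".RData")
save(demo_results, file = tmp)
rm(demo_results)
load(tmp) # demo_results is back, exactly as saved
demo_results$prospect_status
```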
```r
# Save an explicit copy of this data since it is so time consuming to refresh
# (here() comes from the here package)
save(time_in_stage, file = here("data - output",
                                sql_refresh_date,
                                paste(sql_refresh_date,
                                      "Time in Stage - SkyAPI.RData")))
```