
Upbeat Engagement Surveys

Date: 2023-24

Location: New York (Remote)

Roles: Head of Product

Organization: Upbeat

The Problem

Upbeat's mission is to improve teacher retention in K-12 schools. It does so through a prioritized survey instrument (the Upbeat Engagement Survey), administered population-wide, coupled with reporting and data-driven coaching. Our service depends not only on executing research best practices, but also on communicating them in our product to build trust.


The Requirements

The operationalization of data requires seamless collaboration between product, operations, and research teams. This process begins with survey items that are rigorously developed through extensive literature reviews, expert input, and cognitive interviews. Survey instruments are then tested to ensure validity and reliability, while client responses are closely monitored for quality, including the detection of satisficing behavior. Aggregate metrics are analyzed to identify and mitigate potential biases, ensuring the data’s integrity and utility.


All of this work must take place within an Agile framework, embedded into sprint-to-sprint iterations to maintain adaptability and alignment with evolving project goals.

Key Initiatives

Creating the Infrastructure to Measure Survey Quality

To detect satisficing behavior, I collaborated with the engineering team to introduce a new metadata field for survey submissions that tracked response times. This allowed us to flag complete survey responses exhibiting uniform response times across all questions and no variation in response values. This metric was adopted as a product KPI, and after refining and reducing the survey’s length, we observed a measurable reduction in satisficing behavior.
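The flagging rule described above can be sketched in a few lines. This is an illustrative reconstruction, not Upbeat's production logic; the function name, thresholds, and input shapes are assumptions.

```python
# Sketch of the satisficing flag: a completed submission is suspect when its
# per-question response times are nearly uniform AND its answer values show
# no variation. Field names and the tolerance are hypothetical.

def is_satisficing(response_times, response_values, time_tolerance=0.5):
    """Return True when response times are uniform within `time_tolerance`
    seconds and every answer value is identical (straight-lining)."""
    if len(response_times) < 2:
        return False
    uniform_times = max(response_times) - min(response_times) <= time_tolerance
    uniform_values = len(set(response_values)) == 1
    return uniform_times and uniform_values

# Example: a straight-lined submission answered at a near-constant pace
flagged = is_satisficing([2.1, 2.2, 2.0, 2.1], [4, 4, 4, 4])
```

In practice a rule like this would run over the new response-time metadata field for each submission, and the share of flagged submissions could serve as the KPI mentioned above.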

 

Survey Updates

Recognizing the need for a post-pandemic update to the Upbeat survey instrument, I led a cross-functional initiative to enhance its relevance. This process included multivariate regression analysis of historical survey responses to examine their relationship with staff retention, item correlation analyses to identify weaker items, customer interviews, and literature reviews to explore new research areas. The resulting survey instrument featured 15 fewer items, improved reliability as evidenced by a higher Cronbach's alpha, and achieved higher response rates and quality.
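For readers unfamiliar with the reliability statistic mentioned above, Cronbach's alpha can be computed directly from an item-response matrix. This is a generic textbook formula, not code from the project:

```python
# Minimal sketch of Cronbach's alpha for an item matrix
# (rows = respondents, columns = survey items); illustrative only.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly correlated items yield alpha = 1.0
data = np.array([[1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)
```

Dropping weak items can raise alpha even as the instrument shortens, which is consistent with the outcome described above.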

 

Development of an Improved Engagement Score

To make complex survey data more accessible, I redesigned a core Upbeat feature by developing a new scoring algorithm. This algorithm produced a simple 0–100 score, where schools with the most favorable responses to questions most predictive of retention achieved the highest scores. This update made it easier for schools to interpret and act on survey insights.
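One plausible shape for such a score is a weighted favorability average, where each item's weight reflects how predictive it is of retention. The weighting scheme, item names, and values below are illustrative assumptions, not the actual algorithm:

```python
# Hedged sketch of a 0-100 engagement score: per-item favorability is
# weighted by that item's (hypothetical) predictiveness of retention,
# then normalized to the 0-100 range.

def engagement_score(favorability, retention_weights):
    """favorability: dict of item -> share of favorable responses (0-1).
    retention_weights: dict of item -> predictive weight (e.g., a coefficient
    from a regression of retention on item scores)."""
    total_weight = sum(retention_weights.values())
    weighted = sum(favorability[item] * w for item, w in retention_weights.items())
    return round(100 * weighted / total_weight, 1)

score = engagement_score(
    {"workload": 0.8, "leadership": 0.6},
    {"workload": 2.0, "leadership": 1.0},
)
```

Under this scheme, a school with the most favorable responses on the most predictive items approaches 100, matching the behavior described above.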

 

Communicating Research Through Good UI

To improve how research insights were communicated, I collaborated with the UX/UI team to integrate disclosure patterns across the product. These tied each data point to its literature background and included visualizations illustrating the relationship between the data and staff retention in K-12 schools. Inspired by Carrie Conaway’s principle to “use a sentence with a number in it,” this approach ensured clarity and usability for end-users.

The Learning

  • Measuring and communicating survey reliability with item-total correlation and Cronbach's alpha

  • Applying exploratory factor analysis to examine construct validity

  • Performing entity resolution in R across over 200,000 staff roster records to calculate employee retention

  • Wrangling and cleaning large datasets (20 million+ rows)

  • Communicating research to non-research audiences to facilitate cross-functional collaboration
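The first item in the list above, item-total correlation, is straightforward to compute; this generic sketch (not project code) uses the corrected form, correlating each item with the sum of the remaining items:

```python
# Sketch of corrected item-total correlation: each item is correlated with
# the total of the *other* items, so the item doesn't inflate its own score.
import numpy as np

def item_total_correlations(items: np.ndarray) -> list:
    """items: rows = respondents, columns = items. Returns, per item, its
    Pearson correlation with the sum of all other items."""
    corrs = []
    for j in range(items.shape[1]):
        rest_total = np.delete(items, j, axis=1).sum(axis=1)
        corrs.append(float(np.corrcoef(items[:, j], rest_total)[0, 1]))
    return corrs
```

Items with low corrected item-total correlation are the usual candidates for removal when shortening an instrument.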
