Photo by Nemo, San Diego 2015
Most web services experience seasonal lifts in new user sign-ups and traffic, especially in January, when people return to school and work and resolve to change their behaviors. Many new users arrive at an app at their highest potential to learn something and do something. However, new users' readiness is often quickly lost when your product's experience doesn't present them with a compelling path forward.
RealTalk™: Over the past three years, our newest users, who differ from our early adopters, arrived at GitHub; we failed to engage the majority and lost them.
This large leak of our newest community members has led GitHub's user research team to conduct a series of exploratory and evaluative studies over the past three years. Today, we have a well-designed, three-part research program born out of both success and failure, which I'll present in this post. The sum of these efforts now helps us better understand and design for this fast-changing, ever-growing, and often-failing newcomer group.
From delving into the behaviors of Millennials in the workforce to measuring motivations, behaviors, and activity across global communities, our efforts reveal important stories about trends that characterize what our product team has been referring to as the new developer:
Made by my brilliant colleagues at GitHub (Bill, Pam, Liz, and the comms team)
What’s hard to measure
Dashboards and metrics power many decisions in product development. I find these tools impressive and useful; they tell us some things, but I can almost never learn what I need to know about our users, as humans, by looking at graphs alone. Ines Sombra eloquently shared a memorable statement in her Monitorama 2015 talk:
Example: Do you know what’s going on here?
Here’s my short list on what’s hard to measure with traditional dashboards, analytics, and KPIs:
Awareness — The features people know about and how/if they use them, especially conditional feature sets (multi-step/dependencies).
Motivations — Drivers that power people towards accomplishing tasks and reaching goals.
Workarounds — Patterns people create to accomplish tasks that the current design doesn’t surface.
Behaviors – Core (current, ingrained habits), Future (incremental workflow changes with new designs to accomplish goals), and Aspirational (leaps out into the future).
Emotions – How people feel about the product and user experience. Some companies measure this with Net Promoter Score (NPS).
Let’s explore the three ways we’re studying hard-to-measure things about GitHub’s newest users.
Learning to surf, by Nemo, San Diego 2015
Studying New User Journeys
Our research approach to studying new users and their journeys is backed by three survey instruments listed individually below. The approach is portable, so you can recreate it almost anywhere. In addition to required questions about product experience, we’ve slowly added optional personal demographic questions (human age, education, sex/gender). When you’re working on a growing app/community that has reach into both hobbyist and professional communities, you can learn a lot from human age. For example, many people on GitHub transform over time from hobbyists into professionals. How do people age with your product?
Responsibly gathering personal demographics and using them wisely is a game-changer.
We've collected data showing that formal computer science training versus being self-taught packs a predictive punch in determining who has prior experience with version control. Similarly, human age is helping us understand how and when people move from college into the workforce, the types of technology they use and professional roles they play, and how these folks are aging alongside a product that was once made for hobbyists but has become professionalized, with work-focused features of its own.
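A simple way to sanity-check this kind of demographic signal, before reaching for a full regression model, is a cross-tab of prior version-control experience by training background. The sketch below is purely illustrative; the field names, values, and sample rows are hypothetical, not GitHub's actual survey schema:

```python
from collections import Counter

# Hypothetical survey rows: (training_background, has_used_version_control).
# Both the labels and the data are invented for illustration.
responses = [
    ("formal_cs", True), ("formal_cs", True), ("formal_cs", False),
    ("self_taught", False), ("self_taught", False), ("self_taught", True),
    ("self_taught", False), ("formal_cs", True),
]

def prior_vc_rate_by_background(rows):
    """Share of respondents with prior version-control experience per group."""
    totals, positives = Counter(), Counter()
    for background, has_vc in rows:
        totals[background] += 1
        positives[background] += has_vc  # bool counts as 0/1
    return {b: positives[b] / totals[b] for b in totals}

rates = prior_vc_rate_by_background(responses)
```

If the gap between groups is large and stable across samples, the variable is worth carrying into a proper predictive model.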
Three New User Studies
GitHub's user research team is tasked with going beyond machine reporting in dashboards, connecting the what of the numbers to the why of human stories. We take a mixed-methods approach with closed-answer questions, open text fields, and interviews. Three instruments power three studies:
New User Quiz – a five-question survey delivered to 5% of newcomers at sign-up.
Longitudinal New Account Creators (NAC) Survey – one year of panel data collection with a single cohort, delivered continuously to 90,000 new account holders.
“GitHub 365” Inactive Accounts Exit Survey – a one-time cross-sectional study (a snapshot) with 3,000+ respondents.
Let’s do a deeper dive into each:
1.) The New User Quiz
We designed and launched a short five-question quiz (written in English), presented to 5% of new users from the U.S. at sign-up. When people sign up for a service you have a willing audience, and their first run is a great time to ask a few helpful questions (helpful both to you and to the end user). At GitHub we don't ask users survey questions if we can't use the data to improve their experience. Every question is deliberate: we use the data to surface insights that make life on GitHub better.
Never ask a question if you aren’t going to be able to make use of the data.
Accumulating information in and of itself is not useful. Information can’t possibly serve a purpose until we first identify what’s meaningful and then manage to make sense of it. Even once we understand the information, it remains inert until we actually do something with it. The true promise of the information age isn’t tons of data but decisions and actions that are better because they’re based on an understanding of what’s really going on in the world. –Stephen Few
The new user quiz provided insights into which attributes predicted success, as well as those that led toward churn (people failing to get value out of GitHub and leaving) within the first 30 days of a new account. We learned people's core skill levels with git and programming and what they were looking for from the product. A few key takeaways included:
- Knowing git (25% of users, but 70+% self-identified as new to git)
- Writing some code (60%)
- Wanting to get involved in open source (90%)
*There are some holes I’m not including, which reveal more about failure (that’s a longer post for another time!).
Let’s talk about you:
What questions would you ask your newest users?
How would they help you predict success/failure?
Where would they uncover intent/motivation?
What would you be able to learn about people signing up for your service today as opposed to your early adopters?
Examples: Jessica Suttles, co-founder of Current, is doing this with its early access program, as is Etsy with its shop creator survey. And, of course, if you're looking for a new user quiz, stop by OKCupid and create an account (at your own peril).
2.) New User Panel Data: The “NAC” Longitudinal Study
Where the new user quiz provides instant insights, our new user panel data project is a longitudinal study. As we gather data for an entire year, we’ll be doing prospective work, tracking variables and identifying outcomes as they occur (e.g. a person’s first merged pull request). However, at the end of the year, we’ll do a retrospective analysis looking back and identifying the variables that contributed to the outcomes we’ve documented (e.g. it took three months before a person’s first merged pull request).
Longitudinal studies are complex and can be arduous to complete; in our technical field, few research teams seem to invest in them. Gathering panel data is a continuous effort: a person may participate in the first two surveys but then miss the third, while one of their peers misses the first two but completes the third. Still, longitudinal studies yield incredibly rich data on user experiences, demonstrating how behaviors evolve over time.
In the case of GitHub, we know that it takes a long time for most new users to gain confidence with git and to learn and integrate GitHub into their workflows. We also know that we lose the vast majority of new GitHub account creators. The GitHub NAC (new account creators) study is an effort to follow this process closely among a single cohort of new users who signed up between September 2015 and September 2016, to help us better understand the most current journey from sign-up to established user.
Among longitudinal studies, there are two general types:
Cohort studies select a population based on some shared experience (e.g. birth year, exposure to a vaccine, etc), and repeatedly sample from this population to study how the experience impacts the subjects over time.
Panel studies begin with a single sample and study that same sample over time at repeated intervals. This is a particularly resource intensive method, because the same subjects must be located again, contacted, and convinced to continue participating in the study. At each wave, the recontact rate is typically around 10%.
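In practice, the recontact bookkeeping for a panel reduces to set intersections between consecutive waves. Here is a minimal sketch, assuming each wave's respondents are tracked by a stable id; the wave data below is invented for illustration:

```python
# Hypothetical wave participation: survey wave -> set of respondent ids.
waves = {
    1: {"a", "b", "c", "d", "e"},
    2: {"a", "b", "c"},
    3: {"a", "b", "f"},  # "f" joins late; late joiners are common in panels
}

def recontact_rates(waves):
    """Fraction of the previous wave's respondents who return in each wave."""
    ordered = sorted(waves)
    rates = {}
    for prev, cur in zip(ordered, ordered[1:]):
        returned = waves[cur] & waves[prev]  # ids seen in both waves
        rates[cur] = len(returned) / len(waves[prev])
    return rates
```

Tracking this per wave makes it obvious when a panel is thinning out faster than planned, before the final analysis begins.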
Born out of failure . . .
GitHub's NAC study is a panel study, and interestingly, it was born out of failure. In August and September 2015, we conducted our annual Tools & Workflows survey, which was long at 35 questions. When we looked at the more than 3,000 responses, we realized we were thin on new user respondents. The new users who did respond to the T&W survey were outliers, so we couldn't draw statistically meaningful conclusions from their participation; simply by participating, they were demonstrating behavior different from that of their cohort.
Our main recruitment mode had been an in-app prompt, so we tried emailing 5,000 new user accounts, since this group was less likely to be on the site to see the prompt, but again that didn't work. We concluded the survey was too long and asked questions better suited to people further along in their GitHub journey.
We divided the main survey into a series of smaller surveys and began sending them to a cohort of 90,000 new account creators. The result was higher, statistically meaningful (and repeat) participation, with more than 4,000 respondents organized into “Explorers” (people browsing the site) and “Creators” (people taking maker actions).
In addition to the difficulty of collecting the data, analysis of panel studies is particularly complicated because repeated observations of the same individual violate assumptions about the independence of observations that are necessary for many statistical methods, so this type of data requires specialized models. Additionally, attrition rates between waves must be examined for systematic bias, since higher attrition rates among certain types of respondents will introduce bias into the final data set. This is a long-term study without immediate insights, but we’ll share what we learn around the mid-point in March 2016.
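A first-pass check for the systematic attrition bias described above is to break dropout between waves out by respondent segment; a large difference flags a group that is leaving the panel disproportionately. The segment labels here echo the “Explorer”/“Creator” groupings, but the ids and data are toy values, not real NAC results:

```python
# Hypothetical: respondent id -> segment, plus who completed each wave.
segments = {"a": "creator", "b": "creator", "c": "explorer",
            "d": "explorer", "e": "explorer"}
wave1 = {"a", "b", "c", "d", "e"}
wave2 = {"a", "b", "c"}

def attrition_by_segment(segments, wave1, wave2):
    """Dropout rate between two waves, broken out by respondent segment."""
    dropped = wave1 - wave2
    out = {}
    for seg in set(segments.values()):
        cohort = {r for r in wave1 if segments[r] == seg}
        out[seg] = len(cohort & dropped) / len(cohort)
    return out
```

If one segment's dropout rate is far higher than another's, later waves over-represent the stickier group, and conclusions drawn from the raw data will be skewed unless the analysis reweights for it.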
Studying people over time is a special treat for researchers. If you've ever been sucked in by a “where are they now” tabloid headline, you know the intuitive appeal of finding out how things turned out. The enduring popularity of the Up series, the profound insights from the Harvard Grant Study, and the critical policy findings from the Panel Study of Income Dynamics demonstrate the cultural and scientific power of this type of study.
3.) New User Exit Study, “The GitHub 365”
I recently wrote about this study's results and structure, but I'm including it here because it's connected to both the New User Quiz and the NAC efforts.
While we're waiting on the NAC, end of year (EOY) is an excellent time for a cross-sectional study and a retrospective on your app's growth, specifically looking at where you didn't grow. At GitHub we conduct a short annual survey, dubbed “The GitHub 365.” We use the 365 to examine what happened with people who signed up for an account but at some point ceased to return.
Who are your newest users that don’t come back?
Why did they leave?
What is one thing you could have done differently to help them succeed?
Should you try to bring them back, and if so, how?
A cross-sectional study captures and depicts a snapshot of activity. In this case, we were studying why new users went inactive within their first year of account creation. From a personal perspective, the 365 is our opportunity to listen to people who left us; in other words, these are not our superfans.
The 365 study is an instrument that lets you look back at a year's worth of inactive-account data and can help your organization hit the ground running in January, when you're most likely to see a burst in seasonal sign-ups (those new year resolutions!). The study was driven by a short, well-designed survey instrument paired with account data, including activity.
Inactive, but not abandoned
When we think of inactive users, we tend to think of abandoned accounts. In this study we reached out to 100,000 inactive accounts created between December 2014 and December 2015. More than 3,000 people responded.
Q: Which best describes why you stopped using GitHub?
People shared thoughts and experiences through closed-response options and open-ended responses, revealing that more than 50% of inactive account respondents (who are humans and not bots) intend to return to GitHub someday.
The largest responding population identified with “I'll be back, I've been busy,” which challenges the notion that inactive accounts are abandoned accounts. We looked for distinguishing traits in the respondents who said they would be back and found that they are instead best characterized by their heterogeneity. They are a bit of everything, which indicates that the phenomenon of starting something and then not having time to follow through is both common and universal.
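Summarizing a closed-response exit question comes down to tallying the reason codes and computing the share who say they'll return. The codes below are paraphrased placeholders, not the 365 survey's actual option text or distribution:

```python
from collections import Counter

# Hypothetical closed-response codes from an exit survey (invented data).
answers = [
    "ill_be_back_busy", "ill_be_back_busy", "didnt_understand_git",
    "ill_be_back_busy", "chose_another_tool", "no_longer_coding",
]

counts = Counter(answers)  # tally each reason code
share_returning = counts["ill_be_back_busy"] / len(answers)
```

The open-ended responses then add texture to each code: the tally tells you which bucket is biggest, while the free text tells you why people landed in it.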
Our biggest takeaway is that listening to inactive account holders tells us that they might more fruitfully be thought of as dormant rather than abandoned, and could use a gentle git push.
Studying new users beyond dashboards is a multi-phase approach, and when paired with what your analytics surface, it can become your organization's greatest superpower in achieving growth.
Growth helps keep the lights on. How will you study your newest users this year? Maybe even shine a light into unexplored areas?