The rise in popularity of accelerators, short programs designed to connect early-stage entrepreneurs with potential investors, has been astronomical in recent years. Like any new initiative, there is little data to determine how effective these programs are at creating successful ventures.
To address this knowledge gap, Social Enterprise at Goizueta at Emory University and the Aspen Network of Development Entrepreneurs launched the Global Accelerator Learning Initiative (GALI). GALI’s main purpose is to collect and analyze data from accelerator programs around the world and examine how those programs relate to venture performance.
“The general thing [about the report] is the validation of the model,” Peter Roberts, Academic Director of Social Enterprise at Goizueta and the lead researcher on the project, says. “We compared one-year changes, the year prior to acceleration to the year of acceleration. We looked at the three categories of investment: equity, debt and philanthropy. We also looked at revenues. There was an obvious effect on investment that goes beyond equity, across all three categories.”
Village Capital, a seed-stage accelerator for entrepreneurs in impact-oriented sectors, was the first one to start sharing data in 2013.
GALI looked at three measures of entrepreneurial performance: revenues, full-time employment, and investment. They applied these three factors to the chosen fifteen programs and compared them to the performance of ventures that applied to Village Capital but were rejected.
What did they find? In one-year revenue growth, participating entrepreneurs averaged $11,329 vs. $7,934 for rejected entrepreneurs, a negligible difference. However, the real impact of the accelerator came through in one-year investment growth: participating entrepreneurs saw an average of $54,236 in new investment vs. $6,274 for rejected entrepreneurs.
GALI explored further by assembling a panel of experts to look at why some programs were more successful than others.
It came down to the following reasons:
- Partner quality improves program performance. Programs whose partners were described as “engaged,” “putting entrepreneurs first,” and “contributing to program content” performed best.
- Time spent on program-related activities lowers program performance. Programs that let entrepreneurs set aside time to work on their own ventures rather than on program content performed better.
- Quality of the applicant pool improves program performance. Less surprising, but still relevant: high-performing programs had applicants with more intellectual property and more educational and real-world experience.
“The selection result is a good one,” Roberts says. “Because there is sort of this ‘machismo’ that goes around where accelerator programs show how awesome they are by how many people they reject. Turns out that the most successful programs were a little bit more focused and targeted.”
Other factors didn’t show enough evidence, or had mixed support, as to whether they helped accelerator programs be more successful.
In the end, accelerators need partners who will get their hands dirty and help participating entrepreneurs; they should select only applicants with quality experience, create small cohorts, and build entrepreneurial networks within those cohorts.
“The accelerator program doesn’t promise forever, it just promises an inflection,” Roberts concludes. “What you really want is to show something visibly inflecting during this process.”
You can read more about GALI’s findings and the initiative here.