Crowdsourcing the CTSA Innovation Mission


This article was published by CTSI Executive Director Mini Kahlon, PhD; Leslie Yuan, MPH, director of the Virtual Home (VH) program; and Oksana Gologorskaya, MSc, VH product manager, in the April 2014 issue of Clinical and Translational Science.

CTSAs and the Mandate to Innovate

To fully meet their mission, Clinical and Translational Science Awards (CTSAs) cannot rely on old models for supporting research. Providing the services investigators have come to expect is a critical role for CTSAs, but current systems for supporting research are broken, with costs only increasing and barriers to delivering health improvements growing over the past several decades. Clearly, new models for supporting and executing clinical and translational research are required.

Top-down approaches to innovation have several advantages, including ensuring alignment with institutional goals and matching ideas to the organizational structures that can achieve them. Control brings simplicity and reduces implementation risk. However, top-down innovation may be out of touch with the needs of investigators, ultimately producing services that are underutilized. Programs may be less innovative because fewer people contribute ideas. Finally, failing to engage end users throughout the process may leave some academics less open to participation, a risk that leaders in community engagement have recognized for many years.

Traditional bottom-up approaches to innovation, such as requests for proposals (RFPs), are an attractive alternative. Investigators are accustomed to putting forward their best ideas for the possibility of funding, so engagement through this mechanism is intuitive to them. Ideas can come from a large number of people, and the community of those working on solutions can grow. However, traditional bottom-up approaches have their own limitations. Without iterative feedback, an opportunity to improve ideas is lost. Hiding proposals from other submitters eliminates the potential for new teams to come together. And treating ideas as fully owned by their proposers prevents the integration of similar proposals and the leveraging of existing programs and personnel.

A better bottom-up approach

We decided to create a mechanism to regularly infuse the Clinical and Translational Science Institute (CTSI) at the University of California, San Francisco, with fresh ideas for new initiatives. Recognizing that the RFP approach engendered the inefficiencies described above, we reworked the process. First, we opened up the submission and review process, so that all contributors could see each other's proposals and identify synergies. Second, we allowed commenting on individual proposals and iteration of proposals to incorporate improvements or new team members. Finally, we pushed for the most open approach to commenting possible, with viewing and commenting allowed by anyone, anywhere. By framing the process as a public one, we hoped to incrementally acculturate our community to this new approach to sharing proposals. Through practice, we wanted to demonstrate the net benefits of openness, even when it came with the risk of losing full control over one's initial idea. We also wanted to remove all barriers to participation for those interested: even one password or a VPN requirement, we knew, would reduce participation. The entire process was enabled by an online “Open Proposals” platform that implemented these features, built on the popular open-source Drupal software.
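
The article does not describe the platform's internals, but the core objects it names (publicly viewable proposals, tracked revisions, comments open to anyone) are easy to picture. A minimal sketch in Python, with hypothetical names standing in for whatever the actual Drupal implementation uses:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class Comment:
        """A public comment; anyone, anywhere may post, so no account is modeled."""
        author: str    # free-text display name -- no password or VPN barrier
        body: str
        posted: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class Revision:
        """One saved version of a proposal; each revision collects its own comments."""
        text: str
        comments: list[Comment] = field(default_factory=list)

    @dataclass
    class Proposal:
        """A publicly visible proposal; revising appends history, never overwrites it."""
        title: str
        proposers: list[str]
        revisions: list[Revision] = field(default_factory=list)

        def revise(self, text: str) -> Revision:
            """Incorporate feedback (or new team members) in a fresh revision."""
            rev = Revision(text)
            self.revisions.append(rev)
            return rev

The point of the structure is simply that openness is first class: nothing requires authentication, every proposal and comment is visible to everyone, and no revision is ever discarded.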

Continuous innovation at UCSF's CTSI

Over the past four years, we used evolving versions of the “Open Proposals” platform to solicit proposals for eight opportunities to add to our portfolio of activities for improving research resources at the university (Table 1A). For each opportunity, proposers submitted short (one-page) proposals using a structured template that included an articulation of the perceived need, potential partners, and impact (Figure 1). We solicited proposals and comments from a wide community, using campus-wide staff and academic listservs to invite participation. Proposers could revise their applications based on input as many times as they wished, with the system keeping track of revisions, and each revision could in turn receive new comments. This period of ‘Open Improvement’ lasted 2–4 weeks, after which the RFP was closed. A more traditional peer-review process by a committee (usually CTSI leadership) then selected the best final proposal(s) and teams for support and funding.
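
Again purely as illustration (the class names and the three-week default are ours; only the 2–4 week window, the structured template, and the close-then-review sequence come from the article), the lifecycle of one such opportunity could be modeled as:

    from datetime import date, timedelta
    from enum import Enum, auto

    class Phase(Enum):
        OPEN_IMPROVEMENT = auto()   # public viewing, commenting, and revising
        CLOSED = auto()             # RFP closed to further changes
        AWARDED = auto()            # committee has selected the final proposal(s)

    class Opportunity:
        """One call for proposals with a 2-4 week 'Open Improvement' window."""

        # The one-page structured template described in the article.
        TEMPLATE_FIELDS = ("perceived_need", "potential_partners", "impact")

        def __init__(self, title: str, opens: date, window_weeks: int = 3):
            if not 2 <= window_weeks <= 4:
                raise ValueError("Open Improvement ran for 2-4 weeks")
            self.title = title
            self.closes = opens + timedelta(weeks=window_weeks)
            self.awarded: list[str] = []

        def phase(self, today: date) -> Phase:
            """Where the opportunity stands on a given day."""
            if today < self.closes:
                return Phase.OPEN_IMPROVEMENT
            return Phase.AWARDED if self.awarded else Phase.CLOSED

        def award(self, selected: list[str]) -> None:
            """Record the committee's choice after traditional peer review."""
            self.awarded = selected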

