CKF receives funding from the Gordon and Betty Moore Foundation to transform research communication

Apr 14, 2016

Despite fundamental shifts in how humans use technology in research, mass communication and popular media, we are still publishing like it’s 1999. At the Collaborative Knowledge Foundation, we’ve set our sights on transforming the research communication sector by building shared infrastructure that will improve what we publish and increase the integrity and speed of the process.

The Gordon and Betty Moore Foundation is generously supporting this effort with a 2-year grant to support our software development and operations. Moore has a rich history of supporting research and innovation and we are honored to be named among the grantees.

“CKF’s technology approach will result in faster, more transparent, better connected, and more reproducible research communication,” said Carly Strasser, Program Manager at Moore. “I’m thrilled to be supporting this important advance in research communication.”

“Investing in shared open infrastructure will both benefit publishers and improve how research is communicated,” stated Kristen Ratan, co-founder of CKF. “Breaking down the current technology silos will lower costs and enable the industry to spend more resources on innovation.”

The Collaborative Knowledge Foundation was established in October 2015 by Kristen Ratan and Adam Hyde with the mission of changing the ways that knowledge is produced and communicated. CKF is focused on building open source technology solutions for knowledge creation and production that foster collaboration, integrity, and speed. CKF is accomplishing its goals by building community with the recognition that transformation of research publishing is only possible through collaboration. CKF is a fiscally sponsored project of Brave New Software, a 501(c)(3) non-profit organization.

The Moore Foundation press release can be found here:

Technology Slows Down Science

We’ve managed to create a bizarre bottleneck in scientific advancement. The most impactful science in the world is still limited by how well it’s communicated, by how well and how quickly it gets used by the world. The process for this communication today is called scholarly publishing; the products are typically scientific journals and books.

Science and medicine continually improve the technologies they use and achieve amazing results. But the technologies used to share scientific research haven’t evolved much since the earliest days of the internet. They are slow and costly, adding months or years to scientific progress. Worse, they degrade scientific output, dumbing it down to fit an odd little box known as an article or book chapter. These boxes have remained basically the same since the days of print, flattening rich research data into a printable table or image. Dumbing down the published record this way makes the science far less useful: research results can’t be reproduced and validated, and therefore can’t be effectively used by others.

An industry-wide investment in new open infrastructure could rapidly transform how science is shared and used. By replacing out-of-date proprietary technology silos with dynamically adapting open source platforms, we could speed up the impact of science on our lives.

There are three vectors of failure that can be addressed by better technology: time, cost, and the quality of the output itself.


Share science sooner

Faster publication time would result in the more immediate sharing of research results. Journal articles take months, sometimes years, to publish. The centerpiece of the publication process, peer review, can take two to four weeks. The rest of the time is spent managing largely manual editorial work and print-based production processes. Removing much of that noise would dramatically speed up science.

Stop skimming off the top

Lowering the costs of sharing science would remove an additional tax on our investment in research. Debates over business models (open access vs. subscription) aside, the costs of publishing are strikingly high, and those costs make it difficult for new methods of communicating research to gain a foothold.

Share science, not articles

Research articles are served up as flat HTML pages with a quaintly formatted, printable PDF on the side. The long and costly process has added little value to the output and, in many cases, has removed critical data. Were the output more closely tied to the rich data and nuanced analyses produced during the research process, it would look more like a constellation of networked objects, curated by the debate and conclusions of the authoring team.

The lack of transparency in this process means that most scientific publications do not allow for reproduction of the work, science’s centuries-old method of checking its own results. In a digital age where every transaction we make is recorded and analyzed, this opacity and lack of accountability is mystifying.

Those who fund science, largely governments and private foundations, have invested billions into improving scientific discovery tools and ensuring that the work is done with integrity, following ethical guidelines. At the point of publication, this enormous investment is turned over to the publishing industry, which is dependent on out-of-date technology platforms that are proprietary black boxes. There is no transparency as to how the hard-won scientific data are managed during the publication process or how they are transformed to a publishable state.

Social media and entertainment have achieved levels of rapid and collaborative creation that, if applied to science, might fundamentally change our world.

Building open infrastructure that transforms the process of sharing science from the ground up, fully reinventing it, would operate on all three vectors at once, giving us fair, rapid, and complete communication of science.

Post by Kristen Ratan