Time's up, bad science. An international team of researchers has just released an eight-page "manifesto" on how to improve the quality of scientific research, and it's pretty heavy stuff.

Throughout 2016, there was widespread concern among the scientific community that we're in the midst of a 'reproducibility crisis' - meaning many results that are being published can't be replicated, even when scientists repeat the exact same experiment. And today's 'publish or perish' research culture is only making the problem worse.

But the new document, "A manifesto for reproducible science", suggests there are ways we can fix the flaws in the current scientific process, and save science before it evolves into something "shoddy and unreliable".

"There is a way to perform good, reliable, credible, reproducible, trustworthy, useful science," said one of the researchers, John Ioannidis, from Stanford University School of Medicine.

"We have ways to improve compared with what we're doing currently, and there are lots of scientists and other stakeholders who are interested in doing this."

First, the current problems. In order to keep their jobs, researchers need to continuously publish new work, and the easiest way to get published is with findings that are new and sensational.

That means fewer researchers are taking on the crucial job of fact-checking and replicating other people's work, because very little publicity comes from a repeat discovery.

It also means there's pressure to 'p-hack' results - to manipulate statistical analyses until you get the result you want, and massage data to make findings seem more impressive than they really are.

And if no one is taking the time to go back over your study to catch a dodgy p-hack… well, you can see how that's a problem.
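
To get a feel for how little 'manipulation' it actually takes, here's a minimal sketch in Python (all the numbers are invented for illustration, not taken from the manifesto): simulate a treatment that does nothing at all, measure 20 unrelated outcomes, and report only the best-looking one.

```python
# A minimal sketch of p-hacking: there is NO real effect in this data,
# yet cherry-picking the best of 20 outcomes still produces a
# "significant" result most of the time. All parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 1_000  # hypothetical repeated studies
n_outcomes = 20        # unrelated outcomes measured per study
n_subjects = 30        # subjects per group

false_positives = 0
for _ in range(n_experiments):
    # Both groups come from the same distribution: the null is true.
    p_values = [
        stats.ttest_ind(rng.normal(size=n_subjects),
                        rng.normal(size=n_subjects)).pvalue
        for _ in range(n_outcomes)
    ]
    # The p-hack: report only the most impressive outcome.
    if min(p_values) < 0.05:
        false_positives += 1

print(f"Studies with a 'significant' finding: {false_positives / n_experiments:.0%}")
```

With 20 independent shots at a 5 percent threshold, roughly 1 - 0.95^20 ≈ 64 percent of these no-effect studies come out looking 'publishable'.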

A study last year showed that these pressures are causing science to evolve into something unreliable. That not only means questionable results are making their way out to the public - it also means we're all wasting a lot of money funding inaccurate research.

Each year, the US government alone spends nearly US$70 billion on non-defence research and development, including US$30 billion for the National Institutes of Health. 

But research has shown that as much as 85 percent of biomedical research effort is wasted.

The new manifesto, published in the inaugural issue of Nature Human Behaviour, says this is partly because there's a whole bunch of researchers scrambling to find meaningful patterns in data in order to get published.

Just like when we try really hard to see faces and animals in the patterns of clouds, if you throw enough money and research at data, you'll eventually find a pattern. 

And this problem goes way beyond the scientists themselves - the manifesto suggests that some of the biggest changes need to happen at the stakeholder level, urging research institutions, scientific journals, funding bodies, and regulatory agencies to change their approach. 

"Most of the changes that we propose in the manifesto are interrelated, and the stakeholders are connected as if by rubber bands," said Ioannidis.

"If you have one of them move, he or she may pull the others. At the same time, he or she may be restricted because others don't move."

So what's their solution? The eight-page paper looks into four categories that need to be improved upon: methods; reporting and dissemination; reproducibility; and evaluation and incentives. 

There's a lot of detail in there, but here are some of the most notable recommendations:

Pre-registering study design: Scientists need to design studies that minimise bias - so that means not telling patients, doctors, and other participants what they're testing for until the research is done.

That's already pretty standard for many studies, but the manifesto takes things one step further by recommending that all scientists register their study design before the research even begins. That means the team can't go back later and tweak their analysis to fit their desired outcome.
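
What does 'registering a study design' actually pin down? Here's a hypothetical sketch in Python - the field names are invented for illustration, not drawn from any real registry - showing an analysis plan being frozen and fingerprinted before any data are collected.

```python
# A hypothetical sketch of what a pre-registered analysis plan locks in
# before data collection starts. Field names are illustrative only.
import hashlib
import json
from datetime import datetime, timezone

preregistration = {
    "hypothesis": "Treatment X lowers systolic blood pressure vs placebo",
    "primary_outcome": "systolic_bp_change_mmHg",
    "sample_size": 120,  # fixed in advance, not adjusted after peeking
    "statistical_test": "two-sided independent t-test, alpha = 0.05",
    "exclusion_criteria": ["age < 18", "prior hypertension medication"],
    "registered_at": datetime.now(timezone.utc).isoformat(),
}

# Hashing the frozen plan gives a tamper-evident record: any later
# tweak to the analysis would change the fingerprint.
fingerprint = hashlib.sha256(
    json.dumps(preregistration, sort_keys=True).encode()
).hexdigest()
print(f"Registered plan fingerprint: {fingerprint[:16]}…")
```

In practice, registries such as ClinicalTrials.gov and the Open Science Framework play this role: once the plan is on record, any later deviation is visible to reviewers.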

Overcoming the "file drawer problem": The file drawer problem is basically when researchers cherry-pick what they report - choosing to highlight positive and notable findings, and leaving the rest of their results sitting in a file drawer.

"The consequence," write the authors, "is that the published literature indicates stronger evidence for findings than exists in reality."

The answer to this is getting universities, journals, and funding agencies to all commit to seeking the truth over just publishing something noteworthy that might not tell the whole story. This could involve employing independent committees that help guide researchers, but have nothing to gain from their work.

Promoting open science: The authors suggest that researchers should be sharing their results with other teams, and journals should be providing full papers to the public for free, so it's easy for everyone to access and interpret for themselves.

Shaking up peer review: The authors also endorse pre-print sites such as arXiv.org and bioRxiv to increase the speed at which researchers can assess and review each other's work.

Instead of relying on a slow and private peer review process, other scientists can publicly poke holes in each other's work.

"The opportunity for accelerated scholarly communication may both improve the pace of discovery and diversify the means of being an active contributor to scientific discourse," writes the team.

Of course, these are all just suggestions, and pretty bold ones at that. It's one thing to outline what needs to be done to fix science, but another to actually get it done.

But now that one group of researchers has put out their guidelines for making science better, it gives other groups a chance to disagree, discuss, and build upon them - and then hopefully translate that into real action.

"When we are doing science, we are trying to arrive at the truth," said Ioannidis.

"All these measures are intended to expedite the process of validation - the circle of generating, testing and validating or refuting hypotheses in the scientific machine."

You can read the full paper (open access, naturally) here.