But is the reaction to this from the scientific community rational? Here's the first clue that some sacralization may be happening.
Hypersensitivity around sacred values, like climate change, or around possibly mundane actions done by "unclean" actors, is a clue that quasi-religious dynamics are at play. Is "science" at risk of quasi-religiosity? Let's look at an outlier - climate change.
Science or Religious Dynamics
The dynamics here do not look pretty in this regard. Those who question any tenet of climate change are labelled deniers and ostracized from academia. This may or may not be reasonable. Certainly a physicist who favours a luminous ether over gravitational space-time may not be able to function in the field. Indeed, this is the way most people view those who question climate change.
But sharp critique is what advances science. Science works because even the most cherished idea is always subject to factual counters. Climate change really isn't subject to question. It is happening. But are all attacks about it "happening", or are some attacks based on questions about structural bias? (See this David Friedman lecture, for example.) Are some based upon resource prioritization? What if resource prioritization is erroneously phrased in terms of climate change factuality? Does the position suddenly become "wrong" because of imprecise language? What if the terms of the argument are purposefully off to facilitate persuasion? Is it now "wrong"? Are the decisions of a populace subject to persuasion (from both sides) illegitimate if they pick the "wrong" answer? Are populace decisions only "right" when there is equal balance? Is it only "right" if the balance occurs in proportion to the factual validity upon which external arbiters judge things to lie?
We can tell where on the quasi-religious spectrum we sit based upon the answers to these questions. This is similar to the basic technique Jonathan Haidt uses to investigate the ex post facto rationalization aspects of disgust. Is an answer "wrong" based upon its conclusion or its premises?
Scientists tend to say both conclusions and premises have to be right. For the field, this makes sense. But does it make sense in a representational democracy?
First off, there is no way to know what individuals think when they vote. You can certainly study a sample of voters to try to ascertain their thinking, but often voters aren't fully aware of all their reasons. As Haidt would say, complex decision-making events are more emotional than rational. So should irrationality defer to expertise in areas of public policy?
If the March for Science were framed in these terms, I would be really happy. Instead it seems framed in terms of "scientists are right and Trump supporters are idiots". To be more charitable, it seems framed in terms of limiting the role popularism can have on scientific agencies. This seems reasonable. Reasonable until you remember that scientific agencies are also responsible for policy... Application of action from fact is not as clean-cut as some Marchers make out. This is especially true in ideological monocultures.
Over the last few decades an increasingly untenable elected representational system has offloaded many of its decision-making duties to the bureaucratic/technocratic level. This may be wise: this is where technical expertise lies. It may be unwise: technocracy tends to become increasingly detached from "reality".
Technocratic detachment from reality fosters revolution. Parental governments that always know what is best for people (based upon aggregate averages) eventually run into independence backlash. "Yes, I know having a gun is statistically unsafe, but in my particular circumstance, with my particular abilities, with a psychotic stalker, I don't agree." This is where the policy application of science runs into popularism.
Scientists are likely to say they have the right premises. Popularists are likely to say they have the conclusion they like. The problem is, the language the two sides use to communicate has a very minimal cross-sectional area. Thus, at least to me, the fight is really over power.
This is not because of the facts (which are pretty black and white), but because of the morality ascribed to those facts. For instance, is CO2 of 400ppm "bad"? Was it "bad" in the Jurassic? Is it "bad" because of how it will affect population diversity? The latter is a moral question. It is clearly based upon a decision about what outcome is wanted. How is that different from popularist reasoning?
Where the Conflict Comes From
And that, I think, is where the conflict comes from.
Because this is a fight, the chances of it running its course without an arms race are slim. It is structurally likely that each side will begin to employ tools designed to win. Thus groups of scientists and scientific agencies grow more and more likely to leave out key contextualizing details (Jurassic CO2, dogmatic outcome preferences, positive black swans, etc.). Popularists grow more and more likely to rely on persuasion rather than fact (or pure appeal to dogmatic preference). At some point the dogma comes out and groups rally around their poles. Inter-group competition ensues, boundaries solidify, and mediating cross-overs become sellouts. Memetically fit concepts become sacralized (like climate change). Religious dynamics take over because they tend to be very fit and resonate with adaptive group formation.
The end isn't pretty.
There seems to be an unspoken assumption that progressivism should and will win; that science tied to a factually correct reality is more adaptive than popularism. To me, this is where the danger lies. It assumes a religious-like blessing of factual reality. This type of determinism facilitates inter-group competition: the other side is wrong, so let's just jump to the end-game.
Unfortunately practical reality (popularism) is very evolutionarily fit. The ability to rally groups for action isn't guaranteed to lose out to technical ability.
So if you want to avoid needless escalation in the current social wars, think about what role you really want popularism to play.
At this point most people would say: science should win. And, as discussion often does, group identity and righteousness have been reinforced.
So let's imagine a department of justice which is ideologically/politically pure. Say it is as homogeneous as the EPA or National Parks Service. Say 99.9% pure. This department is very concerned that racial disparity in crime will lead to societal collapse. They've got very good models on this. They conduct and release lots of scientific studies (at a grant ratio of 1000:1 for the positive position). Some of their facts seem robust, like IQ by race or quasi-racial grouping. Obviously some people don't like them, but their critique is often about conclusions, not the science. Although, they often frame their attacks as a rejection of the science. Aspects of these "facts" can certainly be questioned, but if you do, you'll never get a job, never advance in your career, and you will be socially ostracized as a regressive.
Now imagine that a new government was elected and that it decided elimination of racial disparity was not a priority. Perhaps they felt locking up lots of people of one race to reduce crime rates was not worthwhile. Perhaps this is what the people voted for. Perhaps not. Perhaps the voters were just racists. But the department of justice and its scientists feel their expertise on crime policy should be minimally subject to dogmatically determined popularism.
Back to Quasi-Religion
While all analogies are necessarily poor and subject to lots of issues, the main point of this one is that ideologically homogeneous institutions create their own dynamics that eventually careen into the wall of populist dogma. The limitation is that race is easier to delegitimize than "the Jurassic wasn't so bad for life". Science is agnostic about application. Application may be informed by fact, but it is never amoral. The degree of morality varies, but group dynamics act to accentuate morality. Thus application is rarely disentangleable from group dynamics and quasi-religious behavioural wells.
So, to me, the best thing you can do if you see a scientific meme becoming sacralized is to refuse to employ quasi-religious tools to defend or sell it. De-sacralize it by publicizing critiques. Make them as open as possible, and let the weakness of such arguments be their own downfall.
If nefarious persuasion techniques are used against your position in order to rally the masses, be aware that you may become a doppelganger of the dynamics you're criticizing. Realize that public policy decisions in representational democracies are almost always based on outcome rather than expert thinking.
If you can't fathom why that may not be as catastrophic as you think, read some judgement aggregation literature, eat a dose of humble pie, and get used to the sacrifices required for pluralism. Large groups are not stable under a total absence of corruption. Large plural societies are only stable when they give "enough" space to corruptors with erroneous ideas. Don't assume a 49% Trump vote means 49% of people deny the facts of climate change. Some may deny the prioritization of actions. Some may have voted for other policies. Appearing to enforce homogeneity of thought weaponizes inter-group competition, which is as likely to make you operate religiously as it is to reinforce adaptive group dynamics on the other side.