Where it can go wrong is when it inflates our impression of understanding. Mental representations aren't inherently bad; they're only bad when they falsely make us believe we know more than we actually do, when we start substituting easier questions, choosing the more plausible explanation, and so on. Relates to mental representations from R: Peak by Anders Ericsson and substitution from R: Thinking Fast and Slow by Daniel Kahneman
I think we're at a higher risk of this fallacy in Extremistan environments. In predictable environments with lots of feedback, this bias is diminished.
So, to avoid this fallacy we need to, first, refrain from forming stories, conclusions, and explanations about complicated subjects, and second, use empiricism and practice to get closer to the truth.
Further, knowing about this fallacy, you should avoid noisy data as much as possible. Each input is a potential trigger for the mental machinery to start creating narratives, and when those narratives are built on false data, the problem only multiplies.