Light isn’t RGB. We just have receptors that react to certain wavelengths. I suspect the sensors on the telescope are sensitive to a particular range of wavelengths. It would be a straightforward translation to map that range onto the visible spectrum without varying the relative intensities to accentuate certain aspects of the image.
The IR may be in a very narrow band. Visible wavelengths have different colors because a range of wavelengths stimulates the different cones in our eyes, which roughly correspond to red, green, and blue sensors. If you shift the IR frequency up into the visible range, you would just get a luminance image (like grayscale) centered on one visible wavelength, like red.
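To make that concrete, here's a minimal sketch (the function name and test values are made up for illustration) of what "shifting" a single narrowband exposure into the visible range amounts to: every pixel gets the same hue, so only luminance survives.

```python
import numpy as np

def shift_to_red(ir_band):
    """Render one narrowband IR exposure as if shifted to a single visible
    wavelength (pure red here). Because every pixel shares the same hue,
    the result carries only luminance -- a tinted grayscale image."""
    lum = np.asarray(ir_band, dtype=float)
    peak = lum.max()
    if peak > 0:
        lum = lum / peak                 # normalize intensities to [0, 1]
    rgb = np.zeros(lum.shape + (3,))
    rgb[..., 0] = lum                    # all intensity lands in the red channel
    return rgb
```

The green and blue channels stay zero everywhere, which is exactly why a naive frequency shift can't produce the colorful images the telescope teams publish.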
False color imaging sometimes applies colors to different luminance levels, and sometimes it takes multiple images at different wavelengths and assigns an RGB value to each wavelength. The results are informative but require some editorial / aesthetic decisions to produce the best results.
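The second approach (one image per wavelength band, each assigned a color channel) can be sketched as below; the per-channel gains stand in for the "editorial" decisions mentioned above, and all names and values here are invented for illustration:

```python
import numpy as np

def false_color(band_long, band_mid, band_short, gains=(1.0, 1.0, 1.0)):
    """Combine three narrowband exposures into a false-color RGB image.

    The longest wavelength maps to red, the middle to green, the shortest
    to blue. `gains` are per-channel multipliers -- the aesthetic knob that
    accentuates some features over others.
    """
    channels = []
    for band, gain in zip((band_long, band_mid, band_short), gains):
        b = np.asarray(band, dtype=float) * gain
        peak = b.max()
        if peak > 0:
            b = b / peak                 # stretch each band to full scale
        channels.append(b)
    return np.clip(np.stack(channels, axis=-1), 0.0, 1.0)
```

Nothing about the mapping is physically "true" color; it's a design choice about which structures in the data you want the eye to pick out.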
That's not how vision works. You see a heavily post-processed image that is far removed from the original light that hit your retinas. There's nothing privileged about shifting something into the visible spectrum directly and looking at the result. You're just making an image that your visual system isn't good at understanding. It's not pure, it's garbage: you would hallucinate things that aren't there, and you would miss obvious things that are. For you to really comprehend something, the transformation needs to be designed.