If you were designing the JWST today, you would probably also put onboard a GPU. That could be programmed to do some of the scientific work in space to reduce the amount of data that needs to be downloaded.
This would allow new types of science (for example, taking far shorter exposures and stacking them on board to do super-resolution and to average out vibrations in the spacecraft structure). It would also add redundancy in case the data downlink malfunctions or is degraded - if you have preprocessed the data, you can still get lots of useful results back over a much smaller engineering link.
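To make the stacking idea concrete, here's a rough sketch (Python/NumPy, purely illustrative - nothing like actual flight software) of aligning many short, jittery exposures against a reference frame and co-adding them; the frame source is a made-up placeholder.

```python
# Rough shift-and-stack sketch: estimate each frame's whole-pixel offset against a
# reference via FFT cross-correlation, shift it back into alignment, and average.
# A real pipeline would refine to sub-pixel precision and reject bad frames.
import numpy as np
from scipy.ndimage import shift as nd_shift

def alignment_shift(frame, reference):
    """Whole-pixel (dy, dx) shift that best aligns `frame` onto `reference`."""
    corr = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(frame)))
    peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # wrap peak coordinates into the +/- half-size range
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

def stack(frames):
    """Align every frame to the first one and return the co-added average."""
    reference = frames[0]
    acc = np.zeros_like(reference, dtype=np.float64)
    for frame in frames:
        dy, dx = alignment_shift(frame, reference)
        acc += nd_shift(frame.astype(np.float64), (dy, dx))
    return acc / len(frames)

# frames = read_short_exposures(...)   # hypothetical on-board frame source
# deep_image = stack(frames)           # single co-added image to downlink
```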
Obviously, if that GPU malfunctions, or there isn't sufficient power or cooling for it due to other failures, data can still be directly downloaded as it is today.
It's hard to say -- I might disagree with doing something like that. Most often you want to keep the raw data as long as you can, in anticipation that some future technique, calibration, or processing pipeline may improve on what's possible today. Or that you might find something (or go looking for something) you didn't expect.
That goes especially for a scientific instrument whose usage patterns, operating conditions, and discoveries may change over time (the sensors, too). Note that for an instrument like this, the researcher time spent studying the data afterwards is many times greater than the time spent taking it. The data's value per hour of observation is incredibly high, so you want to keep it in as future-usable a state as possible.
Once you process something on board for a particular purpose and discard the raw data (setting aside very low-level integrity checks that are practically mandatory), you lose the chance to reprocess it in the future.
So unless you are really transmission-constrained, I think they would prefer not to do it -- also because of the additional complications involved. Once "higher functions" become an obligation of the telescope's operations, the satellite/defense contractors who have to launch and operate the thing start imposing requirements that are very difficult to live with.
I don’t know… that’s a lot of power and heat that needs to be dealt with for an onboard GPU. Heat is probably the biggest factor, as it might be enough to affect the image sensors (speculation). Plus, needing a radiation-hardened GPU might be an issue. If only for reprocessing purposes, I’d want copies of the rawest data on the ground.
> If you were designing the JWST today, you would probably also put onboard a GPU. That could be programmed to do some of the scientific work in space
That's just not how science is done in astronomy. People want the raw data so they can analyze it for decades in different contexts. There's not much you could do on board that would make you not want that data copied back anyway.
The article alludes to laser comms - NASA is developing[0] laser-based comms systems (as opposed to radio frequency) that would allow gigabit-speed downlinking of data. Hardening that technology to send back more raw data is probably a lot more straightforward than trying to do image processing on board.
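For a sense of scale, a back-of-envelope comparison - the daily data volume and link rates below are rough, commonly cited approximations, not official figures:

```python
# Rough downlink-time comparison for one day of JWST science data.
# All numbers are approximate assumptions for illustration only.
daily_data_gb = 57       # ~daily science data volume (approximate)
ka_band_mbps = 28        # ~current Ka-band downlink rate (approximate)
laser_mbps = 1000        # hypothetical ~1 Gbps optical link

bits = daily_data_gb * 8e9
for name, rate_mbps in [("Ka-band", ka_band_mbps), ("laser", laser_mbps)]:
    hours = bits / (rate_mbps * 1e6) / 3600
    print(f"{name}: {hours:.1f} hours to downlink one day of data")
# Ka-band comes out around 4-5 hours; a gigabit link would take minutes.
```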
Basically, it adds a lot of flexibility.