You can upscale a 170x170 image yourself if you're not familiar with what that looks like. The only high-frequency detail you have after upscaling is artifacts. This thing pulled real detail out of nowhere.
You can try to guess where the edges are and enhance them after upscaling, but it's guessing, and when the source has the detail level of a 170x170 moon photo, a large proportion of those guesses will inevitably be wrong.
And in this case it would take a pretty amazing unblur to even get to the point where it could start looking for those edges.
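To make that concrete, here's a minimal sketch of what plain upscaling actually does, assuming Pillow is installed; the filename and the 8x factor are just illustrative stand-ins:

```python
# Plain 8x upscale of a small source image using Pillow.
# "moon_170.jpg" is a hypothetical 170x170 input.
from PIL import Image

src = Image.open("moon_170.jpg")
up = src.resize((src.width * 8, src.height * 8), Image.Resampling.LANCZOS)
up.save("moon_upscaled.png")
# The interpolation kernel can only blend the ~29k pixels that
# already exist. Any "crisp" high-frequency content in the result
# is ringing or stair-stepping from the kernel, not recovered
# lunar detail.
```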
I think if you paste our conversation into ChatGPT it can explain the relevant upsampling algorithms. There are algorithms that will artificially enhance edges in a way that can look like "AI", for example the processing done on Pixel phones prior to ~2023.
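A classic example of that kind of edge enhancement is an unsharp mask: it boosts local contrast wherever it thinks an edge is, which is exactly the guessing described above. Again just a sketch with Pillow, continuing from the previous snippet; the parameter values are arbitrary:

```python
# Edge "enhancement" via unsharp masking: blur the image, treat the
# difference from the original as an edge estimate, and amplify it.
from PIL import Image, ImageFilter

up = Image.open("moon_upscaled.png")
# radius/percent/threshold are arbitrary illustrative values.
sharp = up.filter(ImageFilter.UnsharpMask(radius=4, percent=200, threshold=2))
sharp.save("moon_sharpened.png")
# Wherever the interpolated image has a brightness gradient, this
# exaggerates it into a hard edge, whether or not a real edge
# existed in the 170x170 source.
```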
And to be clear, everyone, including Apple, has been doing this since at least 2017.
The problem with what Samsung was doing is that it was moon-specific detection and replacement.