There are certainly web apps that could not function without JavaScript. This initiative is more of a protest against having functionally useless JavaScript shoved down your throat when you browse a news site or read an article.
JavaScript should progressively enhance those types of websites, and not be a strict requirement.
Nitpick: This is not a todo list or a calendar in Doom; this is Doom running in a todo list or a calendar. As for a todo app: we already have Doom rendered using checkboxes [1], and integrating that into a todo app is left as an exercise for the reader.
Sad, but true. Unfortunately, Safari still has some ways to go in terms of feature support.
https://ios404.com/ has a great list of stuff that's still missing.
In my opinion it is very fortunate that most of these APIs are not supported. I don't want a webpage to make my phone vibrate, for instance. Nor do I want to be able to grant a webpage permission to do it, for that matter.
Neat. I wasn't aware that there's an endpoint you could use to retrieve JSON data without an API key.
If this project were to gain more traction, I suppose they would interfere with that, right?
It's sad to see that every free Reddit client is forced to be a browser nowadays.
> Neat. I wasn't aware that there's an endpoint you could use to retrieve JSON data without an API key.
Call me naive, but I always thought that's what the apps were using. One thing that took me by surprise in the whole fiasco was that the apps were actually using their own API keys -- I always thought I was just logged in to reddit with my own account. I figured they might be using some backend service for storing app metadata, but I didn't realize they needed to communicate with reddit using the app developer's credentials on my behalf.
Based on folks using ReVanced to patch the shut-down apps, it's an OAuth client ID that was specific to each app. Communication was with reddit's API directly.
The JSON endpoints are pretty neat: you can take any reddit page, be it a subreddit, submission comments, or a username, and append `/.json` to the URL and tadaa, JSON data. Although, like you said, I'm pretty sure they will axe it at some point, and I'm honestly surprised they haven't already.
I can imagine, however, that old.reddit.com scraping/parsing layers will pop up left and right as soon as that happens (and those, in turn, could be the final nail in the coffin for old.reddit.com).
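The append-`/.json` trick above can be sketched as a tiny helper. This is my own illustration, not code from any client; the function name is made up, and note that reddit tends to rate-limit requests that don't send a descriptive User-Agent header.

```javascript
// Turn any reddit page URL into its JSON endpoint by appending "/.json"
// to the path, as described above. Works for subreddits, submission
// comment pages, and user pages alike.
function toJsonEndpoint(pageUrl) {
  const url = new URL(pageUrl);
  // Strip a trailing slash so we don't end up with "//.json".
  url.pathname = url.pathname.replace(/\/$/, "") + "/.json";
  return url.toString();
}

// toJsonEndpoint("https://old.reddit.com/r/programming/")
//   → "https://old.reddit.com/r/programming/.json"
```

The same data is then one `fetch` away, no API key required (for now).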
I think spez completely forgot that they had a way to export data as JSON everywhere. That's basically 80% of what a reddit client needs, and just about all an LLM needs for training.
Every time I see this site, it still amuses me that, despite all the valuable UX information, it is very annoying to use on a desktop because it feels like it was designed entirely for tablets or phones.
I've been in the same boat with these discussions.
When I do encounter a component with countless re-renders as described, it is usually painfully obvious what causes them and easy enough to fix.
I think we're going too far if we expect React, or any framework, to be foolproof to the point that we can throw "whatever" we want at it and expect peak performance.
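To make the "painfully obvious" re-render cause concrete: the usual culprit is a prop recreated on every render, which defeats the shallow comparison that `React.memo` performs by default. This is an illustrative sketch in plain JavaScript (my own, not from the discussion): `shallowEqual` mimics that default comparison without pulling in React.

```javascript
// Mimics React.memo's default prop comparison: same keys, and each
// value identical by reference (===).
function shallowEqual(a, b) {
  const keys = Object.keys(a);
  if (keys.length !== Object.keys(b).length) return false;
  return keys.every((k) => a[k] === b[k]);
}

// An inline object literal is a fresh reference on every render, so the
// memo check fails and the child re-renders each time:
shallowEqual({ style: { color: "red" } }, { style: { color: "red" } }); // false

// Hoisting the value (or wrapping it in useMemo/useCallback) keeps the
// reference stable, so the check passes and the re-render is skipped:
const style = { color: "red" };
shallowEqual({ style }, { style }); // true
```

Which is why these fixes are usually easy once you spot the unstable reference.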