
I imagined that there would be better ones made by real c64 democoders, but I can include a .prg. Also, if you make a C project in the web-based IDE I linked and copy-paste the last .c file, it will compile it, run it in an emulator, and give you the option to download the compiled .prg from there too.
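For the curious, the classic "demo fire" effect boils down to something like the following. This is a generic C sketch of the well-known algorithm, not the actual code from the post; the buffer size and the heat-to-colour mapping are placeholders:

    #include <stdlib.h>

    #define W 40   /* c64 text mode is 40x25 characters */
    #define H 25

    static unsigned char heat[H][W];   /* one "heat" value per cell */

    /* Seed the bottom row with random heat, then propagate it upward,
       averaging neighbours and cooling the flame as it rises. */
    static void fire_step(void)
    {
        int x, y;
        for (x = 0; x < W; ++x)
            heat[H - 1][x] = (unsigned char)(rand() & 0xFF);

        for (y = 0; y < H - 1; ++y)
            for (x = 1; x < W - 1; ++x) {
                int avg = (heat[y + 1][x - 1] + heat[y + 1][x]
                         + heat[y + 1][x + 1] + heat[y][x]) / 4;
                heat[y][x] = (unsigned char)(avg > 2 ? avg - 2 : 0);
            }
        /* map heat[y][x] to screen and colour RAM here (platform specific) */
    }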


It's just how I decided to generate the previews for my blog in general. Not my best idea, but I don't hate it enough yet to change it.


There are both mechanical keyboards for the c64 (mechboard 64) and c64-inspired mechanical keyboards for the PC (8bitdo sells one, probably others do too).


Cool, that 8bitdo one would have been a perfect holiday gift. Looks so good.


Eh, if only I had one. I have some relatives living next to me though, and I think I remember an old TV in their basement. I might check it out, that's a good idea.


Fwiw, the c64 is pretty robust, if you don't use the original power supplies.

I'm surprised that people find this to be an example of clickbait. If I cared about views, I'd imagine an honest title like - "I turned my c64 into a digital fireplace" - would have probably been more appealing, no?


You’re surprised that people find a title of “Help! My c64 caught on fire!” to be clickbait in a case where your c64 did not catch on fire and you don’t need help?!

It’s an interesting article, but the title is a textbook example of clickbait and I’m surprised that you’re surprised.

https://en.wikipedia.org/wiki/Clickbait


i understand them, but considering the project and its nature it's a punny / good kind of clickbait :D. lovely lil fireplace btw!


thanks


Author, fwiw, I don't do/care about click-bait, as I never cared about clicks. Since I moved to my bespoke blog system (previously I was on blogspot) I don't even track page views. But I thought it was somewhat funny.


I think not enough people today have ever seen the message "printer on fire".


Not GP, but I was expecting to read about a C64 on fire and was disappointed when it was just a post about an unoptimized fire demo


Author - yes, it's "aesthetic", albeit not my best work, and I might revert that decision at some point. I was inspired by lowtechmagazine, but they did a much, much better job.

I do care about the blog being snappy and also working on very low-end, vintage hardware though, so the choice somewhat serves that goal as well.


I like the aesthetic choice


I wrote that it doesn't need VR, not that it wouldn't benefit from it. My 2 cents is that currently, and for the foreseeable future, VR is not a big asset. I think it's clunky and riddled with fundamental problems we don't know how to solve even in the realm of academic research, much less in products we can actually make and sell at reasonable prices and so on. That said, it's orthogonal to the point I really wanted to talk about, which is why I didn't elaborate further :)


Author here.

DLSS is not the end-solution to rendering, well, nothing is, but it's an AMAZING piece of technology. All these temporal and dynamic resolution techniques are here to stay as they -improve- the look of games no matter the HW.

What do I mean? Obviously dynamic resolution and temporal reprojection are worse than say, a fixed 8k rendering at 240hz! Yes, true! But, that's not the correct math.

The more correct math would be: on a given piece of hardware, say a 3080, would you rather spend the power to render every single pixel exactly, or "skip" some pixels and have smart ways to recover them for a fraction of the price, almost equal to the real deal, so that you now have extra power to spend somewhere else?

Of course, if you just do less work with DLSS or similar technologies, you're losing something; that's bad. But that's never the equation. The real equation is that no matter how powerful the HW, the HW is a fixed resource. So if you spend power to do X, you cannot do Y, and you have to choose whether or not X is more valuable than Y.
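To put some (illustrative, made-up) numbers on it, here's the back-of-the-envelope pixel budget for rendering internally at 1440p and reconstructing 4k, ignoring the upscaler's own cost (comparatively small, though not free):

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative resolutions, not benchmarks. */
        const double native_4k    = 3840.0 * 2160.0; /* pixels shaded at native 4k     */
        const double internal_qhd = 2560.0 * 1440.0; /* pixels shaded before upscaling */

        double fraction = internal_qhd / native_4k;

        printf("shading cost vs native 4k:   %.0f%%\n", fraction * 100.0);
        printf("budget freed for other work: %.0f%%\n", (1.0 - fraction) * 100.0);
        return 0;
    }

That's roughly 44% of the per-pixel shading work, with the other ~56% of the frame budget free to spend on whatever X you value more than raw resolution.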

Makes sense?

Now, all that said, it's also true that sometimes you max out everything in a game; you cannot have more of anything because that's literally all the game has to render, and at that point, sure, it's reasonable to spend power even on things that are not great bang-for-the-buck, because you literally cannot do anything else anyway! So for the very top PC HW, you end up doing silly things, like rendering in native 4k, because you cannot use that power in any other way.

But that's in a way "bad", a silly thing that we have to do because there is no other option! If we had the option, it would be much better even on a 3080 to render at, say, 2k or 1080p upscaled to 4k via DLSS, and use the remaining power to have, say, 2-3 times the detail in textures or geometry, or the number of shadow-casting lights, etc. etc...


> almost equal to the real deal

Here is our disagreement. To my eyes DLSS is unacceptably blurry (even in ”Quality” mode). You can look at it and find it OK, but that's only one opinion. You can spend 1 minute on Google and find quite contrary opinions. Your whole long comment is based on the idea that upscaling is an optimization - you are forgetting that upscaling is a tradeoff.

My main complaint about DLSS ”hype” marketing is exactly this: do not promise ”incredible quality” when under the hood it's just a pitiful upscaler. Some HW is not good enough, some games are not optimized enough - that's fair and okay, there are things to sacrifice and there are workarounds. Just don't lie.


Respectfully, I don't care about "your eyes" - all people are different and that's ok. Nor do I care -specifically- about DLSS; obviously technologies are always evolving, and it's not that DLSS is the best this concept can ever be. Also, its implementation varies among games.

What I meant to say is that we live at a time where rendering every single pixel, all the time, is simply a waste of resources that could be better spent somewhere else.

And you're still saying it's "blurry" - that's not the point. Certainly temporal reprojection will -always- be blurrier than not using it. But you're not considering what you're -gaining- by that blur. The real question is: would it be better to have, say, a world with 1 million objects at 4k, a bit blurry, or a perfectly sharp image at 2k with 100k objects...

Temporal reprojection saves time that then can be invested in other things.

Lastly: CP2077 ALWAYS uses temporal reprojection. ALWAYS. If you disable DLSS it uses its own TAA instead. If you disable TAA (which cannot be done in the settings menu, but there are hacks to force it) it STILL USES temporal reprojection for most of its rendering before the final image.
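In case it helps to see what "temporal" means concretely: at its core it's just a per-pixel blend between this frame's sample and the previous frame's result, fetched along the motion vectors. A bare-bones sketch follows, not CP2077's or DLSS's actual pipeline:

    typedef struct { float r, g, b; } Color;

    /* current: this frame's (jittered/aliased) sample for the pixel.
       history: last frame's resolved colour, sampled at the pixel's
                reprojected position (pixel - motion vector).
       blend:   e.g. 0.1 -> mostly history, refreshed a little each frame.
       Real implementations add neighbourhood clamping, disocclusion
       rejection, etc. to reduce ghosting. */
    Color temporal_resolve(Color current, Color history, float blend)
    {
        Color out;
        out.r = history.r + (current.r - history.r) * blend;
        out.g = history.g + (current.g - history.g) * blend;
        out.b = history.b + (current.b - history.b) * blend;
        return out;
    }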


> all people are different and that's ok

I'm happy to see at least one person on HN can agree with this.

After re-reading your comments and mine, I concluded that my real issue is the lack of settings in CP2077. I do respect the opinion of people who want to use TAA and DLSS; I'm just upset that I cannot pick something I like, can't decide what to sacrifice and what to prioritize.

Your article is quite interesting and I’m grateful for it. Please keep writing things like this.


To help you better understand me: I feel like vomiting when I remember how the TAA+DLSS image looks. I just can't force myself to use it; it's like torture. FPS drops or aliasing are much smaller problems.


Depending on the engine, yes, it might need to be made relatively early. But in practice it's, like everything, a design decision. Some games want that element in their gameplay, some do not. It changes things a lot, as lighting is used to make things visible (obviously) and to steer attention; it's a fundamental part of level design. So the game you're playing changes considerably depending on whether or not you want to allow lighting to change.

