
A few notes.

1. Rasterization is, conceptually at least, the painter's algorithm: you paint the "backmost" triangle first, then paint the triangles "on top" of it afterwards. As long as you paint from furthest back to front, you will get the right image. The textbook algorithm sorts all triangles from furthest back to front, but an O(n log n) sort is obviously too slow for the 100 million triangles at 60 frames per second that video gamers want.
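A minimal sketch of that back-to-front ordering, assuming each triangle is a list of (x, y, z) vertices and that larger z means farther from the camera (the triangle format and camera convention are assumptions for illustration):

```python
# Painter's-algorithm sketch: sort triangles back-to-front by a
# representative depth (here, the centroid's z), then "paint" in that order.

def centroid_depth(tri):
    """Average z of the triangle's three vertices."""
    return sum(v[2] for v in tri) / 3.0

def painters_order(triangles):
    """Return triangles sorted farthest-first: the order they must be drawn."""
    return sorted(triangles, key=centroid_depth, reverse=True)

tris = [
    [(0, 0, 1.0), (1, 0, 1.0), (0, 1, 1.0)],   # near
    [(0, 0, 9.0), (1, 0, 9.0), (0, 1, 9.0)],   # far
    [(0, 0, 5.0), (1, 0, 5.0), (0, 1, 5.0)],   # middle
]
ordered = painters_order(tris)
# The farthest triangle comes first, so nearer ones painted later cover it.
```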

2. Culling modifies the algorithm in #1 by removing triangles from consideration. Traditionally, this is done by ASIC parts of a GPU. (I mean, traditionally, the whole GPU was an ASIC and non-programmable. But even just a few years ago, hardware culling was still not done in shaders.) If Triangle #5 is completely covered by Triangle #200, then you can "optimize" by never drawing #5 to begin with, and instead just drawing Triangle #200. The GPU's hardware can detect cases like this and automatically skip the drawing of Triangle #5.
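The mechanism behind "skip the covered triangle" is the depth test: before a fragment is shaded, its depth is compared against what's already stored at that pixel, and it's discarded if something nearer was drawn. A triangle whose every fragment loses this test contributes nothing and could have been culled outright. A toy sketch of that per-pixel test (the dict-as-depth-buffer representation is an assumption for illustration):

```python
import math

def draw_fragment(depth_buffer, x, y, z):
    """Standard depth test: keep the fragment only if it is nearer (smaller z)."""
    if z < depth_buffer[(x, y)]:
        depth_buffer[(x, y)] = z
        return True   # fragment survives and would be shaded
    return False      # fragment is occluded and skipped

# A 2x2 "framebuffer" initialized to infinite depth (nothing drawn yet).
buf = {(x, y): math.inf for x in range(2) for y in range(2)}
draw_fragment(buf, 0, 0, 1.0)              # near fragment is kept
survived = draw_fragment(buf, 0, 0, 5.0)   # farther fragment is rejected
```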

3. Primitive Shaders / NGG and other features on AMD from Vega onwards allow data to be passed between stages of the rendering pipeline in new ways. This seems to enable *software* culling.

4. It's not too hard to do software culling per se. What's hard is doing software culling that is worthwhile (aka: faster than the hardware ASIC culling). The claims here suggest that software culling is finally worthwhile thanks to these new shading units and new ways of passing data back and forth between stages of the GPU. With these new datapaths, it is possible to implement a software culler that matches (or exceeds) the speed of the hardware culler.
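One of the classic fine-grained tests such a software culler performs is back-face culling: with counter-clockwise winding for front faces, a triangle whose projected screen-space area is non-positive faces away from the camera and can be dropped before rasterization. A sketch of the math (2D screen-space input is an assumption for illustration; a real shader-side culler works on batches of triangles in parallel):

```python
def signed_area_2d(a, b, c):
    """Twice the signed area of triangle (a, b, c) in screen space.
    Positive for counter-clockwise winding, negative for clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

def is_front_facing(a, b, c):
    return signed_area_2d(a, b, c) > 0.0

# Counter-clockwise winding: front-facing, keep it.
keep = is_front_facing((0, 0), (1, 0), (0, 1))
# Clockwise winding (same triangle, reversed): back-facing, cull it.
cull = not is_front_facing((0, 0), (0, 1), (1, 0))
```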

5. That's what "shader culling" is: software culling that's faster than the ASIC paths of the GPU. Because it's fully defined in software, you can make it more flexible and better tuned for your specific video game than the hardware.

6. Video games do often have CPU-side culling before sending the data to the GPU. It's also culling, but in a different context. I believe that "shader culling" implies the low-level, fine-grained culling that the GPU hardware was expected to do, rather than the coarse-grained "character X is on the wrong side of the camera, so don't draw X or the 200,000 triangles associated with X" culling that CPUs have always done.
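A common form of that coarse CPU-side test is checking a whole object's bounding sphere against a camera plane (e.g. a side of the view frustum) and dropping the object, and all its triangles, in one go. A sketch, where a plane is (normal, d) and points with dot(normal, p) + d >= 0 count as "inside" (that convention is an assumption for illustration):

```python
def sphere_outside_plane(center, radius, normal, d):
    """True if the bounding sphere lies entirely on the negative
    (outside) side of the plane, so the whole object can be culled."""
    dist = sum(n * c for n, c in zip(normal, center)) + d
    return dist < -radius

# Camera-facing half-space z >= 0: plane normal (0, 0, 1), d = 0.
behind  = sphere_outside_plane((0.0, 0.0, -5.0), 1.0, (0.0, 0.0, 1.0), 0.0)
visible = not sphere_outside_plane((0.0, 0.0, 3.0), 1.0, (0.0, 0.0, 1.0), 0.0)
# One cheap test per object replaces per-triangle work for thousands of triangles.
```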

7. On #6's note: there are lots of kinds of culling done at many stages of any video game engine today, from the software/CPU side all the way down to the low-level GPU stuff. Since this seems to be a low-level GPU driver post, you can assume they're talking about the low-level GPU hardware culling specifically.

8. I'm not actually a video game programmer. I just like researching / learning about this field.



Great explanation, thank you.



