Not sure what category of ecomm sites you were scraping, but I scrape >10 million ecomm URLs daily and, honestly, in my experience compute is not a major issue. Eight times out of ten you can use API endpoints and/or session stuffing to avoid needing a browser for every request. And in the two out of ten sites where you really do need a browser for every request, it's usually to circumvent aggressive anti-bot measures, which means you're very likely going to need full Chrome or Firefox anyway - and you can parallelise quite effectively across tabs.
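For the curious, the direct-endpoint version looks roughly like this. It's just a sketch: the endpoint URL, params, and JSON field names are all hypothetical - you'd pull the real ones out of devtools' network tab for whatever site you're targeting.

```python
# Sketch: many ecomm pages hydrate from a JSON API you can call directly,
# skipping the browser entirely. Everything below is a placeholder.
import requests

session = requests.Session()
session.headers.update({
    "User-Agent": "Mozilla/5.0",      # mimic a normal browser UA
    "Accept": "application/json",
})

resp = session.get(
    "https://shop.example.com/api/v1/products",  # hypothetical endpoint
    params={"category": "shoes", "page": 1},     # hypothetical params
    timeout=10,
)
resp.raise_for_status()
for product in resp.json().get("products", []):  # hypothetical field names
    print(product.get("sku"), product.get("price"))
```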
One niche where I could definitely see a use for this though is scraping terribly coded sites that need some JS execution to safely get the data you want (e.g. they do some bonkers client side calculations that you don't want to reverse engineer). It would be nice to not pay the perf tax of chrome in these cases.
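If anyone wants to try that lightweight route today, one option is embedding V8 directly via py-mini-racer instead of spinning up Chrome. A rough sketch, assuming the calculation lives in an inline script; the script id and function name are made up for illustration:

```python
# Sketch: run a site's own client-side calculation without a full browser,
# using py-mini-racer (embedded V8). Selector and function are hypothetical.
import requests
from bs4 import BeautifulSoup
from py_mini_racer import MiniRacer

html = requests.get("https://shop.example.com/item/123", timeout=10).text
script = BeautifulSoup(html, "html.parser").find("script", id="pricing")  # hypothetical id

ctx = MiniRacer()
ctx.eval(script.string)                      # load the site's own JS as-is
price = ctx.call("computeFinalPrice", 123)   # hypothetical function in that script
print(price)
```

You only get the JS engine this way, no DOM, so it works when the bonkers logic is self-contained maths rather than something that pokes at the page.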
Having said all of that, I have to say from a geek perspective it's super neat what you guys are hacking on! Zig+V8+CDP bindings is very cool.
Fully agree here - using a browser for everything is the dumb way. Usually you just use it to circumvent the blocking, then reuse the cookies to call the endpoints directly.
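Concretely, the pattern is one real browser session to clear the challenge, then plain HTTP for the bulk of the requests. A sketch with Playwright + requests, URLs being placeholders:

```python
# Sketch of the cookie-reuse pattern: drive a real browser once to get past
# the anti-bot check, then hand its cookies to a plain HTTP session.
import requests
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://shop.example.com/")   # placeholder; solves the JS challenge
    cookies = page.context.cookies()
    browser.close()

session = requests.Session()
for c in cookies:
    session.cookies.set(c["name"], c["value"], domain=c["domain"], path=c["path"])

# From here on, hit the endpoints directly -- no browser per request.
resp = session.get("https://shop.example.com/api/v1/products?page=1", timeout=10)
print(resp.status_code)
```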
It might work if you only need to handle a few websites. But this reverse-engineering approach is not maintainable if you want to handle hundreds or thousands of them.