
True, but you don’t need 150GB/s delimiter scanning in that case either.


As the other comment said, it's about getting chunk quality that is good enough. We focus on making chunks as big as possible (the largest we can without hurting embedding quality) as fast as possible. In our experience, retrieval accuracy is mostly driven by embedding quality, so perfect splits don't move the needle much.

But as the number of files to ingest grows, chunking speed does become a bottleneck. We want everything to be faster (chunking, embedding, retrieval), but chunking was the first piece we tackled. Memchunk is the fastest chunker we could build.
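To make the "biggest chunks, fast" idea concrete, here is a minimal sketch of delimiter-preferring chunking over a byte buffer. It uses the memchr crate for the fast newline scan; the function name, parameters, and the greedy packing logic are illustrative assumptions, not Memchunk's actual API.

    // Sketch only, not Memchunk's API. Requires the `memchr` crate.
    // Greedily packs whole lines into each chunk, cutting at the last
    // newline before `max_bytes` would be exceeded. Lines longer than
    // `max_bytes` are kept whole here; a real chunker would hard-split them.
    fn chunk_by_newlines(text: &[u8], max_bytes: usize) -> Vec<&[u8]> {
        let mut chunks = Vec::new();
        let mut start = 0;          // start of the chunk being built
        let mut last_delim = start; // position just past the last newline seen

        for pos in memchr::memchr_iter(b'\n', text) {
            if pos + 1 - start > max_bytes && last_delim > start {
                // Adding this line would overflow the chunk: cut at the
                // previous newline and start a new chunk there.
                chunks.push(&text[start..last_delim]);
                start = last_delim;
            }
            last_delim = pos + 1;
        }
        if start < text.len() {
            chunks.push(&text[start..]); // trailing remainder
        }
        chunks
    }

    fn main() {
        let doc = b"alpha\nbeta\ngamma\ndelta\n";
        for c in chunk_by_newlines(doc, 12) {
            println!("{:?}", String::from_utf8_lossy(c));
        }
    }

The point of the design is that the hot loop is just a SIMD delimiter scan plus a couple of integer comparisons, which is why chunking throughput can be pushed far beyond what a token-level splitter manages.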



