This is a weird take. If it's not opt-in, or you're shoehorning it into a browser, then that sucks. But nobody is getting enraged that an app for running local LLMs downloads data to do so.
Although you can opt out, and in some cases even disable the download feature when you build them yourself, most local LLM tools are too download-happy by default.
Disagree with this. When cost becomes an important factor, or when the free-but-worse option becomes compelling and accessible (e.g. an on-device agent with Apple-style UX), there has been a significant shift in user behavior toward local. Think about things like removing backgrounds from photos or running OCR on PDFs: who uses paid services for casual usage of these?
It's not so much the agents' throughput I'd be worried about; I more meant to imply that at such speed, large parts of this are pretty much guaranteed to go completely unsupervised and unchecked. Like literal "LGTM + god bless + fuck it we ball" tier.
Zig is much more type-aligned with Bun than TypeScript is. And there's a common interface via C FFI, so you could imagine porting it modularly while keeping the test suite in Zig.
Claude doesn't write Rust like a champ. It's still miles ahead at JS and Python compared to Rust. It can do macros and single-file optimizations, but it's gotten really stuck in type hell and tried to `dyn` everything on multiple occasions for me.
Claude struggling at Rust: not getting types correct, using the wrong abstractions, not implementing things correctly
Claude struggling at Zig: all of the above, plus memory-safety issues if you run "fast" mode.
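For readers who haven't hit it, a minimal sketch of what "`dyn` everything" looks like next to the generic alternative (the `Shape` trait and `Square` type here are made up for illustration):

```rust
trait Shape {
    fn area(&self) -> f64;
}

struct Square(f64);
impl Shape for Square {
    fn area(&self) -> f64 { self.0 * self.0 }
}

// "dyn everything": boxed trait objects, dispatched through a vtable at runtime.
// Necessary for heterogeneous collections, but overkill when every element is
// the same concrete type.
fn total_area_dyn(shapes: &[Box<dyn Shape>]) -> f64 {
    shapes.iter().map(|s| s.area()).sum()
}

// The generic version: monomorphized at compile time, no boxing or vtable.
fn total_area_generic<S: Shape>(shapes: &[S]) -> f64 {
    shapes.iter().map(|s| s.area()).sum()
}
```

Both compile and give the same answer; the complaint is about reaching for the first form by default when the second is the idiomatic fit.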
It is generally true that Rust code tends to be written in a way that lets the compiler catch issues at compile time. The same is not as true for Zig, Python, or JS.
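A tiny example of that compile-time pressure (the `parse_port` function is hypothetical): Rust forces the error path to be handled at the call site, whereas the equivalent bad-input case in Python or JS only surfaces at runtime.

```rust
// parse() returns a Result, and match must be exhaustive: deleting the Err
// arm below makes this a compile error, not a runtime surprise.
fn parse_port(s: &str) -> u16 {
    match s.parse::<u16>() {
        Ok(p) => p,
        Err(_) => 0, // fallback for unparseable input
    }
}
```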
I think they said the ad vendors wouldn't see it, but the matching algorithm would still be aware of it. Which IMO is the bare minimum requirement for ads to be anything more than magazine-style ads.