Did it? Just checked and my feed is still completely untranslated. I have my settings set to English. I hope they don't do the weird YouTube thing of translating content from languages you know into the language you set. Multilingual people exist.
Utaite. You'll barely find any anywhere else. Thankfully, if you're in one of those sub-communities, you never get recommended anything political or American.
I've personally found the repairability to be worth the price. I got the baseline $999 model back when it launched & have done stupid things like spill a whole gallon of milk on it. Had to take it apart & clean it, as well as replace the keyboard, but it's still chugging along.
Used to own a MacBook & the keyboard started dying after a year with a failed A key. It was very expensive to replace, so I just remapped caps lock to A. Then the screen started developing weird color issues and dead pixels.
A MacBook Neo does look attractive though. Probably better performance.
They were solid before the butterfly design too. It was Apple's refusal to admit the new design was shit, and their hubris in believing they'd engineer their way to a solution, that dragged things out until the whole world became aware of the issue when mainstream journalists started writing about it in major publications. The Wall Street Journal article written without the letter 'e' was brilliant.
Stars occasionally correlate with quality but more often it's timing and naming. I have a total of 40k stars on GitHub, and I know the code is shit in most of those repos (many written back when I was 16-18 as I was just learning to code). Jumping on hype trains before they start is how you get stars.
- The default seems to be making the payment without confirmation. What stops an endpoint from changing the payment amount between an inspect request and the actual request?
- Will adoption of this payment protocol ever grow large enough for anyone to implement this on either the client or server?
- Bots have more of a financial incentive to crawl sites than a human. I doubt this will actually stop anything
- I see an AGENTS.md. How much of this is vibe coded? It's near impossible to get a sense of how carefully LLM output was reviewed. Hard to trust with money.
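To make the first bullet concrete, here's a minimal sketch of the inspect-then-pay race (all names here are hypothetical, not from the actual protocol): the client validates the advertised price, but nothing binds the subsequent payment to that quote.

```python
class MaliciousEndpoint:
    """Hypothetical paid endpoint that quotes one price and charges another."""

    def inspect(self):
        return {"price_cents": 100}  # advertised price, looks fine

    def charge(self, wallet):
        wallet["balance_cents"] -= 10_000  # actual charge, 100x the quote
        return "<content>"


def naive_fetch(endpoint, wallet, max_price_cents=150):
    # Check the quoted price...
    if endpoint.inspect()["price_cents"] > max_price_cents:
        raise ValueError("too expensive")
    # ...but the payment isn't tied to that quote, so the server is free to
    # apply a different amount here. One mitigation would be embedding the
    # quoted cap in a signed payment authorization the server can't exceed.
    return endpoint.charge(wallet)


wallet = {"balance_cents": 20_000}
naive_fetch(MaliciousEndpoint(), wallet)
print(wallet["balance_cents"])  # drained far beyond the quoted 100 cents
```

Whether the real protocol closes this window depends on whether the payment authorization carries a client-set cap; the spec would need to say.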
What does production ready even mean? The problem with AI is that there isn't an obvious way to prove how much human attention/care was actually put in, & thus no signal on quality. Nobody is gonna review 1M lines. Also, the 1M line number shouldn't really be a boast. More lines != higher quality or more features
Fair points. "Production-ready" was probably too strong for v0.1.0 — "API-stable" is more accurate. The crates compile, pass their test suites, and the public API surface is locked, but real production readiness comes from users hitting edge cases we haven't. (no warnings policy etc.)
On line count — you're right that more lines isn't inherently better. I mentioned it as a scale indicator, not a quality claim. The meaningful numbers are the 92 crates, the codec/container/protocol coverage, and the test results.
Happy to discuss any specific module if you want to dig into the details.
I've been working on a project lately as my bachelor's dissertation, which I plan to keep working on long term, on exactly this issue.
The basic premise is a secure package registry as an alternative to npm/PyPI/etc. where we use a bunch of different methods to try to minimize risk: e.g. reproducible builds, tracing execution and finding behavioral differences between release and source, historical behavioral anomalies, behavioral differences from a known-safe baseline package, etc. And then rather than having to install any client-side software, you just do a `npm config set registry https://reg.example.com/api/packages/secure/npm/`
eBPF traces of high-level behavior like network requests & file accesses should catch the most basic mass supply chain attacks like Shai-Hulud. The harder case is xz-utils-style attacks, where it's a subtle backdoor. That requires tests we can run reproducibly across versions while tracing exact behavior.
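As a toy illustration of the release-vs-source diffing idea (the trace format, package name, and hostnames are invented for this example; a real eBPF tracer would record far richer events):

```python
# Hypothetical trace entries: (kind, target) tuples standing in for what an
# eBPF-based tracer might record while installing and importing a package.
source_build_trace = {
    ("connect", "registry.npmjs.org:443"),
    ("open", "/app/node_modules/leftpad/index.js"),
}
release_artifact_trace = {
    ("connect", "registry.npmjs.org:443"),
    ("open", "/app/node_modules/leftpad/index.js"),
    ("open", "/home/user/.npmrc"),           # credential file read
    ("connect", "evil.example.com:443"),     # exfiltration attempt
}


def behavioral_diff(baseline, candidate):
    """Events the published artifact performs that a build from source doesn't."""
    return candidate - baseline


suspicious = behavioral_diff(source_build_trace, release_artifact_trace)
for kind, target in sorted(suspicious):
    print(f"FLAG {kind} {target}")
```

The real work is everything this sketch hand-waves: making the traces deterministic enough that a set difference is meaningful, which is where the reproducible-builds piece comes in.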
Hopefully by automating as much as possible, we can make this generally accessible rather than expensive enterprise-only like most security products (really annoys me). Still definitely need a layer of human reviews for anything it flags though since a false positive might as well be defamation.
Won't know if this is the right direction until things are done & we can benchmark against actual case studies, but at least one startup accelerator is interested in funding.
I registered it about 40 minutes ago, but it seems the DNS has been cached by everyone as a result of the Wikipedia hack & not even the NS records are propagating. Can't get an SSL certificate.
I had looked into its availability on a provider too, just out of curiosity, before reading your comment. At least it's been taken by someone from the Hacker News community and not a malicious actor.
Do keep us updated if anything relevant develops from your POV.
I'd suggest giving the domain to the Wikipedia team, as they might know the best use for it, if possible.
Not quite sure which channels I should reach out via but I've put my email on the page so they can contact me.
Based on the timing, it seems Wikipedia wasn't really at risk from the domain being bought, as everything was resolved before the NS records could propagate. I got one hit on the URL that would've loaded the script, and nothing since.
It's misinformation that the malicious script loaded that domain. The malicious script did have a URL with that domain in it, but it wouldn't load JavaScript from it (possibly due to a programming mistake/misunderstanding by the author; it's kind of unclear what the original intent was).
> they are using Apple's Wi-Fi positioning service, but proxying it through their own servers
My concern with this system is that their proxy is (afaik) compatible with Google's format, which by default is less privacy-respecting, as it does the location calculation server-side and doesn't allow the client to cache.
I'd much prefer if they called out to Apple's servers directly (or through a direct pass-through proxy) & cached the AP data locally, so that over time it works offline.
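A minimal sketch of that client-side approach (BSSIDs and coordinates made up; a real implementation would pull per-AP lat/lon from Apple's lookup API and do something smarter than a centroid):

```python
# Local cache mapping access-point BSSID -> (lat, lon), filled in whenever we
# do have connectivity, so later lookups need no network request at all.
ap_cache: dict[str, tuple[float, float]] = {}


def remember(bssid: str, lat: float, lon: float) -> None:
    ap_cache[bssid] = (lat, lon)


def locate(visible_bssids: list[str]):
    """Centroid of cached APs; computed on-device, nothing leaves the client."""
    known = [ap_cache[b] for b in visible_bssids if b in ap_cache]
    if not known:
        return None  # only in this case would a remote lookup be needed
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return (lat, lon)


remember("aa:bb:cc:dd:ee:01", 52.5200, 13.4050)
remember("aa:bb:cc:dd:ee:02", 52.5204, 13.4058)
# Unknown APs are simply ignored; position comes from the two cached ones.
print(locate(["aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02", "ff:ff:ff:ff:ff:ff"]))
```

The point isn't the math, it's the data flow: once the cache is warm, the common case resolves with zero requests to anyone's servers.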