Hacker News | new | past | comments | ask | show | jobs | submit | ubixar's comments | login

I've long suspected DJT is on a rampage of radical, ragebait, newsworthy actions to pull the news cycle away from the Epstein files. I hate that it's working and that many people have to suffer because of it.


C# gets close to this with records + pattern matching; F# discriminated unions are even better, with algebraic data types built right in. A Result<'T,'Error> makes invalid states unrepresentable without any ceremony. C# records/matching work for now, but native DUs will make it even nicer.


Notepad had one job: display text. Microsoft decided it needed an attack surface instead.

The year of the Linux desktop doesn't need to arrive - it just needs Windows to keep shipping.


More like the year of macOS (or the MacBook). Once the market saturates with cheap M-series machines, you'll see everyone switching.


The most interesting thing here isn't the CVE - it's the invisible coordination. A backbone provider acted on advance knowledge of a critical flaw, implemented filtering at scale, and the rest of us didn't notice until GreyNoise's data showed the drop. The vulnerability got patched at the network layer before it ever reached the application layer. This is what mature security ecosystems look like - the boring, quiet fixes that happen before the press release.


Stop spamming AI slop


?


Your comment reads as very AI-generated: from the "it's not X, it's Y" framing to the overdramatization of completely normal events (i.e. key infrastructure providers are notified of CVEs before they're disclosed, so impact is minimized).


because it reads like Claude output?

and also the pattern:

> The most interesting thing here isn't the CVE… This is what mature security…

> The most interesting finding isn't that hyperbolic growth… This is Kuhnian paradigm…

both comments in the last 24h


AI or not, the point is valid, and I had a similar train of thought reading TFA. This comment section took a different turn, but hey, what can be used for good can be abused for bad. Gee whizz!


The most interesting finding isn't that hyperbolic growth appears in "emergent capabilities" papers - it's that actual capability metrics (MMLU, tokens/$) remain stubbornly linear.

The singularity isn't in the machines. It's in human attention.

This is Kuhnian paradigm shift at digital speed. The papers aren't documenting new capabilities - they're documenting a community's gestalt switch. Once enough people believe the curve has bent, funding, talent, and compute follow. The belief becomes self-fulfilling.

Linear capability growth is the reality. Hyperbolic attention growth is the story.


Though this is still compatible with exponential or at least superlinear capability growth if you model benchmarks as measuring a segment of the line, or a polynomial factor.
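To make that concrete, here's a toy pure-Python sketch (made-up numbers, nothing from TFA) showing that an exponential observed over a narrow window is nearly indistinguishable from a straight line, which is why linear-looking benchmark scores don't rule out superlinear underlying growth:

```python
import math

# Toy "capability" curve: smooth exponential growth.
def capability(t: float) -> float:
    return math.exp(0.05 * t)

# A benchmark only observes a narrow segment, say t in [0, 10].
ts = [i * 0.5 for i in range(21)]          # 0.0, 0.5, ..., 10.0
ys = [capability(t) for t in ts]

# Secant line through the segment's endpoints: the best "linear story".
slope = (ys[-1] - ys[0]) / (ts[-1] - ts[0])
line = [ys[0] + slope * (t - ts[0]) for t in ts]

# Maximum relative deviation of the exponential from that line.
max_rel_err = max(abs(y - l) / y for y, l in zip(ys, line))
print(f"max relative deviation over the segment: {max_rel_err:.3%}")
```

Over this window the deviation stays within a few percent, i.e. well inside typical benchmark noise, so the segment reads as linear even though the curve is exponential.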


For those exploring browser STT, this sits in an interesting space between Whisper.wasm and the Deepgram KC client. The 2.5GB quantized footprint is notably smaller than most Whisper variants — any thoughts on accuracy tradeoffs compared to Whisper base/small?


Just when you thought it was safe to use Opus 4.5 at 1/3 the cost, they go and add a 6x "bank-breaking mode", so now accidental bankruptcy is just one toggle away.


One of my favorite YouTube creators (Josean Martinez), great for finding super-productive dev/terminal tools, has just made the jump from macOS to Arch Linux / Hyprland (à la Omarchy). It's a great channel for learning a pro Hyprland setup tuned for terminal productivity.

https://www.youtube.com/watch?v=IOsJr5EB2zc


Are YouTube creators incentivized to keep switching and introducing new tools?


First time I've seen Josean switch to another OS full-time; he'd always been on macOS since I started watching him.

He's a Vim / terminal superuser, and I was always surprised he stuck it out on macOS for so long. IMO Arch Linux / Hyprland is the best choice for Vim keyboard warriors.


Fair.


Content grist for the algorithm mill, so I would say so. Also gives the appearance of staying on the bleeding edge which may attract viewers.


been following OpenCiv3 with interest. curious if you've been using any AI coding assistants to speed things up, or if it's been mostly vanilla dev? the codebase looks pretty clean


been playing around with world models for sim-to-real transfer lately. the waymo approach looks solid, but curious how you're handling the distribution shift between generated scenes and real sensor data. any tricks for that besides the usual domain randomization?

