Hacker News | stlava's comments

At the end of the day there has to be a tradeoff between ease of use and performance. Having spent a lot of time optimizing high-throughput services in Go, it always felt like I was fighting the language. And that's because I was... sure, they could add arenas, but that feels like what it is: a patch over the fact that you're working alongside a GC.
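A minimal sketch of the kind of workaround this alludes to: reusing buffers through `sync.Pool` so a hot path stops feeding the garbage collector. The function names here are illustrative, not from any particular service.

```go
package main

import (
	"bytes"
	"fmt"
	"sync"
)

// bufPool recycles byte buffers so the hot path reuses memory
// instead of allocating (and later GC-ing) a buffer per call.
var bufPool = sync.Pool{
	New: func() any { return new(bytes.Buffer) },
}

// render formats a message using a pooled buffer.
func render(msg string) string {
	buf := bufPool.Get().(*bytes.Buffer)
	defer bufPool.Put(buf)
	buf.Reset()
	buf.WriteString("event: ")
	buf.WriteString(msg)
	return buf.String()
}

func main() {
	fmt.Println(render("hello")) // event: hello
}
```

It works, but it is exactly the "fighting the language" the comment describes: correctness now depends on remembering `Reset` and never holding a pooled buffer past `Put`.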


It's more like fighting ideology. Each language goes a long way to teach its idiomatic ways, but when it comes to performance, most languages break down. Writing fast code makes you feel dirty, but the fault lies in the constant signalling of DON'T DO THAT.


The movie seems like a fluff piece once you find out what has transpired at DeepMind since, from slowing down the publishing of research to “selling out to product”, which the founder was hell-bent against in the documentary.


I had some experience with RTK and sensor fusion about 13 years ago on a college project. At the time, the only people using RTK in real-world applications were tractor companies, because precision really matters when you're seed drilling. I'm not sure how far RTK has come since, but it had pitfalls back then, like base-station drift when the set of visible satellites changed as they went over the horizon.

Tractors don't go _that_ fast but I'm not sure I'd rely on it in an actual car for anything but slightly better GPS.


My worry is we're going to have a generation of engineers that have not built up the necessary critical thinking/pattern matching skills needed to weigh tradeoffs and bring in context to ask the right questions and interpret the answers.

Sure, we can segment this into code generation models and code review models, but are engineers really going to want to be questioned by a code review tool on "what are you trying to do?", or are they just going to merge and pull the slot-machine lever again?


I’m impressed they managed to make an app that makes me not want to use it or my phone to take photos! After I used it for about 5 mins I resolved to dust off my older DSLR and use it instead.


I feel that if you need an LLM to help pivot between existing data, it just means the observability tool has gaps in user functionality. This is by far my biggest gripe with DataDog today. All the data is there, but going from a database query to front-end traces should be easy and is not.

Sure, we can use an LLM, but for now I can click around faster (if those breadcrumbs exist) than it can reason.

Also the LLM would only point to a direction and I’m still going to have to use the UI to confirm.


One of the interesting things an agent can do that no individual telemetry tool does effectively is make deductions and integrate information across data sources. It's a big open challenge for us here; in any given incident, we're looking at Honeycomb traces, OpenSearch for system logs, and Prometheus metrics in a VictoriaMetrics cluster. Given tool calls for each of these data sources, an agent can generate useful integrated hypotheses without any direct integration between the data sources. That's pretty remarkable.
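A toy sketch of the shape being described: one tool-call wrapper per telemetry backend, fanned out and collected into the per-source evidence an agent would reason over. Everything here (`Source`, `gather`, the fake backends) is hypothetical, not any real agent framework's API.

```go
package main

import "fmt"

// Source is a hypothetical tool-call wrapper around one telemetry
// backend (traces, logs, or metrics).
type Source interface {
	Name() string
	Query(q string) string
}

// fakeSource stands in for a real backend client in this sketch.
type fakeSource struct{ name, answer string }

func (s fakeSource) Name() string          { return s.name }
func (s fakeSource) Query(q string) string { return s.answer }

// gather fans one question out to every source and collects the
// per-source evidence, with no direct integration between sources.
func gather(q string, sources []Source) map[string]string {
	out := map[string]string{}
	for _, s := range sources {
		out[s.Name()] = s.Query(q)
	}
	return out
}

func main() {
	srcs := []Source{
		fakeSource{"traces", "p99 latency spike on checkout"},
		fakeSource{"logs", "connection refused from db-2"},
		fakeSource{"metrics", "db-2 CPU saturated"},
	}
	for name, ev := range gather("why is checkout slow?", srcs) {
		fmt.Printf("%s: %s\n", name, ev)
	}
}
```

The interesting part is what the comment says: the cross-source deduction happens in the model, not in any `gather`-style plumbing.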


VictoriaMetrics team member here.

Have you had a chance to play with VictoriaLogs? If not, I highly recommend testing it. Our team is also working on implementing traces on top of Jaeger and VictoriaLogs; see https://victoriametrics.com/blog/dev-note-distributed-tracin...


The whole point of Datadog is that you can seamlessly go between products for the same event source. Doesn’t it just highlight a setup issue?


We've discussed getting backup power periodically since moving (most of our neighbors have something). After the big bomb cyclone that hit the PNW, this became a priority and I did a very similar tradeoff matrix.

Here were some of my consideration points against a generator (I had a list for batteries too):

1. the previous owners electrified the house so having a tank just for a generator didn't make sense

2. per 1, a small tank wouldn't last very long and if we're out of power for multiple days gas delivery is unlikely to be happening anyway.

3. tank + generator didn't have a practical placement location for us

4. smaller portable generators didn't make sense from a maintenance perspective since they don't auto-test.

5. what happens if it fails a self test right before a storm?

6. we get intermittent power cuts / fluctuations / outages throughout the year and the generator + ATS wouldn't protect sensitive electronics well

Edit: if ATS + batteries can play nice together, then I might look at adding a small portable generator as aux backup.


> the previous owners electrified the house so having a tank just for a generator didn't make sense

The PNW has been on a campaign of hate against natural gas lately and unfortunately that delusion has spread to its residents. One of the most reliable, low-maintenance appliances you could own is a direct vent natural gas fired water heater. It requires no external power source and quietly delivers hot water. The previous administration outlawed them, but I understand that ridiculous law has now been reversed. You were trading a shred of efficiency for much lower reliability and resiliency. I would hear a lot of whataboutism that natural gas delivery requires electricity for the pumps, but I have never seen or heard of a single example of that happening, certainly not in a typical PNW windstorm.

That leaves the essentials:

* Fridge/Freezer

* Heat

With an Energy Star fridge and a high efficiency natural gas fired furnace you could easily power this off a Costco-grade portable generator and have enough power left over to power the phone chargers and Internet (assuming that isn't out too).

The PNW's environmental crusading hate-boner against natural gas is going to end very poorly for its residents. Before forcing everyone onto heat pumps and electric everything they need to figure out how to keep the lights on more consistently than Puerto Rico.


Y'all have natural gas lines run to the home but use electric to heat it? That seems so backwards.


> In the year 2030, no one will remember Kubernetes.

I highly doubt that. Maybe there will be an evolution to k8s but fundamentally it solves a whole host of challenges around defining the environment an application runs in.


Nice! I'm one of the authors of pg-bifrost, which is in the same space. Have you thought about / solved sharding consumption across multiple slots / multiple consumers to increase throughput? This is on my radar but not something I've investigated yet.

The issue we've run into is that some team at work decides to rewrite an entire table, and things get backed up until they stop updating rows.
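One possible sharding scheme (an assumption, not pg-bifrost's actual design): hash the table name to pick a replication slot, so a bulk rewrite of one table only backs up its own shard while consumers on the other slots keep draining.

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// shardSlot maps a table name to one of n replication slots using
// FNV-1a. The mapping is stable, so each table's changes always
// land on the same slot and ordering within a table is preserved.
func shardSlot(table string, n uint32) uint32 {
	h := fnv.New32a()
	h.Write([]byte(table))
	return h.Sum32() % n
}

func main() {
	for _, t := range []string{"users", "orders", "events"} {
		fmt.Printf("%s -> slot %d\n", t, shardSlot(t, 4))
	}
}
```

The obvious cost of per-table sharding is losing global transaction ordering: a transaction touching two tables is split across two slots, which is fine for some consumers and a deal-breaker for others.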


pg-bifrost looks solid.

> Have you thought about / have solved sharding consumption across multiple slots / multi consumers to increase throughput?

Not yet; there hasn't been any performance work yet, as the project is still quite young.


Sounds like the author is missing an interface to abstract the DB backend.
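For illustration, such an abstraction in Go could be as small as an interface that callers depend on instead of a concrete database client; the `Store`/`memStore` names here are made up for the sketch.

```go
package main

import "fmt"

// Store abstracts the DB backend, so callers depend on behavior
// rather than a concrete database client.
type Store interface {
	Put(key, value string) error
	Get(key string) (string, bool)
}

// memStore is an in-memory implementation, handy for tests; a
// Postgres- or SQLite-backed type would satisfy the same interface.
type memStore struct{ m map[string]string }

func newMemStore() *memStore { return &memStore{m: map[string]string{}} }

func (s *memStore) Put(k, v string) error { s.m[k] = v; return nil }

func (s *memStore) Get(k string) (string, bool) { v, ok := s.m[k]; return v, ok }

func main() {
	var s Store = newMemStore()
	s.Put("a", "1")
	v, ok := s.Get("a")
	fmt.Println(v, ok) // 1 true
}
```

Swapping backends then means writing a new type that satisfies `Store`, with no changes to calling code.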

