Hacker News | teiferer's comments

How does loading up on social media help?

Maybe turn off any gas stove, secure any dangerous tools, stop your car, that kind of thing.


Modern gas stoves have safety sensors that shut them off automatically. I had to reset my water boiler when I got home.

It's not that social media helps, it's that there's not really more to do. It's just another day on the ring of fire.

In practice for anything short of the very biggest earthquakes, if you're close enough for the earthquake to truly be a big deal you're only getting a few seconds of warning. It's not a task list, it's stop doing the immediate dangerous thing you might be doing and grab immediate cover.


Quite clear that people here work in low-risk jobs. Anybody working with heavy machinery, drills, saws, knives, etc. will immediately know how to use those 45 seconds well. Those trades typically don't let you read HN all day.

In the age of agentic programming and the ever increasing pressure to ship faster, I'm afraid this kind of knowledge will become more and more fringe, even more so than it is today. Who has the time to think through the intricacies of parallel data structures? Clearly we'll just throw more hardware at problems, write yet another service/API/HTTP endpoint and move on to the next hype. The LLM figures out the algorithms and we soon lose the skills to develop new ones. And we tell each other the sci-fi BS myth that "AI" will invent new data structures in the future so we don't even need humans in the loop.

AI is like a genie: be careful what you wish for or you'll get what you asked for.

Lately at work I've used C++ optimization tricks like inplace_map, inplace_string, and placement new to inline map-like iterators inside a view adapter's iterators, putting that byte buffer as the first member of the class so it doesn't incur std::max_align_t padding against the other members. At a higher architectural level, I wrote a data-model binding library that can serialize JSON, YAML and CBOR documents to an output iterator one byte at a time, in most cases without incurring heap allocation.

This is because I work on an embedded system with 640 KiB of SRAM, and given the sheer amount of run-time data it will have to handle and produce, I'm wary not only of heap usage but also of heap fragmentation.
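The inline-storage trick above can be sketched roughly as follows. This is not the commenter's actual code (which isn't shown); `InplaceSlot` is a hypothetical name, and the sketch only illustrates the core idea: a byte buffer as the first member, aligned for the payload type rather than std::max_align_t, with objects constructed into it via placement new.

```cpp
#include <cassert>
#include <cstddef>
#include <new>
#include <string>

// Hypothetical sketch: fixed-capacity inline storage with no heap allocation.
// The aligned byte buffer is the FIRST member, so the object's alignment is
// driven by alignof(T) and no leading padding is inserted before it.
template <typename T, std::size_t N>
class InplaceSlot {
    alignas(T) std::byte storage_[sizeof(T) * N]; // first member: payload buffer
    std::size_t count_ = 0;                       // bookkeeping after the buffer

    T* ptr(std::size_t i) {
        return std::launder(reinterpret_cast<T*>(storage_ + sizeof(T) * i));
    }

public:
    InplaceSlot() = default;
    InplaceSlot(const InplaceSlot&) = delete; // sketch: non-copyable for brevity

    ~InplaceSlot() {
        while (count_ > 0)
            ptr(--count_)->~T(); // manual destruction, reverse order
    }

    template <typename... Args>
    T& emplace(Args&&... args) {
        assert(count_ < N);
        // Construct in place inside the inline buffer -- no heap allocation.
        T* p = ::new (static_cast<void*>(storage_ + sizeof(T) * count_))
                   T(static_cast<Args&&>(args)...);
        ++count_;
        return *p;
    }

    T& operator[](std::size_t i) { return *ptr(i); }
    std::size_t size() const { return count_; }
};
```

Because the buffer comes first and uses `alignas(T)`, a `InplaceSlot<char, 16>` stays compact instead of being padded out to std::max_align_t boundaries by a preceding pointer-sized member.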

AI will readily identify such tricks, it can even help implement them, but unless constrained otherwise AI will pick the most expedient solution that answers the question (note that I didn't say answers the problem).


This is also the reason why we have two polar opposite views on AI. “Slop generator” vs “Best thing since sliced bread”.

With SOTA models it all depends on how you drive them.


All my old software before AI was self documenting and didn't need comments -- it just was obvious. Today my prompts never make slop. I'm a really good driver.

I think the opposite is the case. We increasingly need to care more about performance and know how to leverage hardware better.

The market is telling us that through increased hardware prices.

LLMs being very powerful means that we need to start being smarter about allocating resources. Should chat apps really eat up gigabytes of RAM and be entitled to cores, when we could use that for inference?


People underestimate the effect of knowledge accumulation that happens when learning from high quality sources imo.

LLMs aren't yet anywhere near the knowledge-distillation capacity a human has.


Or..? A golden era for people who want to think of new things and test out their ideas quickly by having AI code it up.

The last point in your intro description can't be stressed enough: this allows for safe handling of rounding errors in floating point operations.

Though you are inherently losing precision: there are values in the output interval that no input maps to.
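The library under discussion isn't shown here, so as an assumption take one common mapping of this kind: sending 32-bit integers onto [0, 1) by dividing by 2^32. Every output is a multiple of 2^-32, yet a double can represent far smaller positive values, so plenty of points inside the interval are unreachable, which is the precision loss described above.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative assumption, not the actual library: map a uint32_t onto
// [0, 1) by scaling with the exact constant 1/2^32. Both the constant
// (a power of two) and any uint32_t value are exactly representable in a
// double, so the product is exact -- outputs land only on multiples of 2^-32.
double to_unit_interval(std::uint32_t x) {
    return x * (1.0 / 4294967296.0); // x / 2^32
}
```

The smallest nonzero output is 2^-32 ≈ 2.33e-10, so a value like 1e-10 lies inside the output interval but is produced by no input.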


"Isolation" and "full rights" are mutually exclusive, contradictory properties.

The problem is that any data now becomes effectively an executable.

> I think I have an almost complete knowledge of all the attack vectors.

That's exactly the kind of hubris where the greatest danger lies.


> I mean, who the hell wants to be 10X more productive without a commensurate 10X compensation increase?

The person who realizes that everybody around them is now at 10X and that, if they don't follow suit, they will soon be out of a job.


Your first paragraph is so short-sighted that its message didn't even make it beyond the next one. It's a race to the bottom, and your "doing whatever the fuck I want" will obviously never materialize.

The typical work week today is 40 hours. Just like it was 80 years ago. The typical worker is dramatically more productive than 80 years ago yet "doing whatever the fuck I want" time has not increased. Why would it? Employers don't need to pay such that 20 hour work weeks give you the same income. Because everybody around you is ok with working 40 hours.

This won't be different with AI, no matter if the overall effect is 1.1x or 10x or 100x productivity. Because it's not a technological problem but a sociological one.


> I think most people are going to say they don't want it. I mean, why would anyone want a tool that can screw up their bank account? What benefit does it gain them?

I'm not so sure. Matter of marketing and social pressure, big time.

Consider this: "Always-on pervasive google/fb/... login? I think most people are going to say they don't want it. I mean, why would anyone want a tool that would track their every move on the internet?" That could easily have been a statement 20 years ago. And look where we are.


McDonald's. Homogeneous everywhere in the world. US, Italy, Japan, Brazil, same stuff.

Good pizza in Italy, good ramen in Japan, grilled picanha in Brazil: that's why you go there and want it different/original.

But in software UIs this is often overdone. I want the pizzazz in my audio software in what it produces, not in how the UI looks.


McDonald's is extremely different around the world. Different menu, different price.

I second that. There is a world of difference between the sorry excuses for burgers called Big Tasty and McCrispy in the Netherlands and the already far better proportioned and fresher ones you get in Germany, up to the better ones in Italy.

Besides the bun, it is noticeable in every part. The amounts and quality of the sauce, vegetables, and meat. And finally how the burger is presented.

So if this difference can occur within 1000km of each other in the same continent, I fully accept that it is even more varied in the whole world.


Maybe your sample size is too small? I've lived close to the NL/D border for a while and the McD quality was indistinguishable on both sides of the border. The variation between restaurants in the same country and also between different days/times in the same restaurant was much greater than between countries.

That is, if you happen to go to a random McD in some country and the Big Mac was great that day, and you go to a different restaurant in a different country on a different day and the Big Mac was bad, then that difference likely has little to do with them being in different countries. It's not like they actually use different recipes.


Okay, granted, maybe it is. In NL it is mostly in cities (Amsterdam, Utrecht, Rotterdam, Dordrecht, Lelystad), in Germany in smaller places, and in Italy only in touristy places like Siena and Genoa. So maybe it is just a problem with McDonald's in Dutch cities.

I’d say the only place I’ve experienced McDonald’s to be ‘extremely’ different is in India due to the obvious prevalence of vegetarianism and outlawing of beef.

In other countries they do have a lot of additional meals specific to the local taste (rice/fried chicken/different sauces), but the core burgers like the Big Mac and McChicken and sides such as fries are there.


"extremely different" is an exaggeration. It's mostly the same with some local differences.

Considering that people expect literally the same thing, I can understand how even small regional differences can seem extreme. Like not finding any beef on the menu in India, or any bacon in the Middle East.

"extremely"

Counterpoint: winamp was strictly more fun than any other audio software

And all those Delphi programs (ok rn I can only think of the crackz but there must have been others).

What made these Delphi programs so unique in their UIs?


Delphi shipped with its own, pretty complete, library of UI components.

McDonald's is homogeneous within a country, but very different in different countries.

The American McDonald's is an order of magnitude worse than the European (all of them), Australian or New Zealand ones. The menu is different in every country. However, it's getting more uniform. The cheeseburger is basically the same everywhere outside of America, but not there. As somebody who got used to the European McDonald's and tried it in about 30 countries all around the world, American McDonald's is inedible. So there are differences. I completely understand the American sentiment about it, because it's really, really terrible there.
