
I've also seen these marketed as "Kodi boxes".


Alternatively, it could become more efficient at creating inefficiencies.

Imagine pushing your vibe-coded changes and triggering a CI hook that uses an LLM to create a TPS report per commit and pipeline in a merge request, which another LLM-based bureaucrat could use to decide whether the MR should be approved, generating its own TPS reports in turn, which some middle manager could then have their LLM summarize.

Think of all the shareholder value we could create by going all-in on AI!
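For flavor, here's roughly what such a hook might look like. A purely illustrative sketch: the OpenAI client call pattern is real, but the model name, the prompt, and the GitLab-style CI variable are all stand-ins for the bit:

    import os
    import subprocess
    from openai import OpenAI  # assumes the openai package is installed

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def tps_report(sha: str) -> str:
        """Bureaucratize one commit into a TPS report."""
        diff = subprocess.check_output(["git", "show", sha], text=True)
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # hypothetical choice; any chat model works
            messages=[
                {"role": "system",
                 "content": "You generate TPS reports. Be exhaustive. "
                            "Always mention the cover sheet."},
                {"role": "user",
                 "content": f"TPS report for this commit:\n{diff[:8000]}"},
            ],
        )
        return resp.choices[0].message.content

    if __name__ == "__main__":
        print(tps_report(os.environ.get("CI_COMMIT_SHA", "HEAD")))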


The number of vibe-coded PRs clogging up the review queues is going up. When you ask the submitter what some nonsense line of code is doing and what it's for, they have no idea. It slows the whole process down.


It definitely takes me more time to review PRs now, and I have more PRs to review. Agent-created code seems “fine” but also seems like 30% more than is needed (total SWAG).


The worst part is how it destroys your implicit trust that the author ran and understood the code. The generated code looks fine but doesn't actually work because it was never run. So to properly review PRs today you have to spend a lot longer assuming nothing works and every detail needs to be carefully tested.


If there are no screenshots of it working, or logs from a console showing it working, I kinda presume it won't. In a previous role that was more front-end facing (tablet experience for self-driving cars), I had a few coworkers who would put out PRs that I knew could not have worked and were not tested. My nudge was “oh, can you add a screenshot to the PR?” and they would realize it didn't work and get back to me in a few days. Been dealing with this since before gen-AI-produced code haha.


The US also has outsized influence in this arena due to the USD being the world reserve currency. Which isn't to say that might makes right, but it's easier to get your way when you can dictate the terms by which banks and nations can interface with the global economy. The British pound doesn't have quite the same level of soft power, so it must be wielded more strategically to avoid completely losing that which it still possesses.

I don't imagine going after Imgur would be a worthwhile exercise of that soft power.


Intel's iGPUs don't seem very at risk because the market for low-power GPUs isn't very profitable to begin with. As long as Nvidia is able to sell basically any chip they want, why waste engineering hours and fab time on low-margin chips? The GT 1030 (Pascal) never got a successor, so that line is as good as dead.

Even before the Pascal GTs, most of the GT 7xx cards, which you would assume were Maxwell or Kepler from the numbering, were rebadged Fermi cards (4xx and 5xx)! That generation was just a dumping ground for all the old chips they had laying about, and given the prominence of halfway decent iGPUs by that point, I can't say I blame them for investing so little in the lineup.

That said, the dGPUs are definitely somewhat at risk, but I think the risk is only slightly elevated by this investment, given that it isn't exactly a cash cow and Intel has been doing all sorts of cost-cutting lately.


Aren't a lot of those cards sold for the audience that needs more display heads rather than necessarily performance?

This has been somewhat improved-- some mainboards will have HDMI and DisplayPort plumbed to the iGPU, but the classic "trader desk" with 4-6 screens hardly needs a 5090.

They could theoretically sell the same 7xx and 1030 chips indefinitely. I figure it's a static market like those strange 8/16Mb VGA chipsets that you sometimes see on server mainboards, just enough hardware to run diagnostics on a normally headless box.


Agree. Not only would there be no money in it to try to replace Iris graphics or whatever they call them now -- it would be ultra pointless because the only people buying integrated graphics are those where gaming, on-device AI, and cryptocurrency aren't even part of the equation. Now, that is like 80%+ of the PC market, but it's perfectly well served already.

I saw this move more as setting up a worthy competitor to Snapdragon X Elite, and it could also probably crush AMD APUs if these RTX things are powerful.


Intel sells discrete cards, and their next card was set up to do AI and games competently. They were poised to compete with the low-to-mid-range Nvidia cards at HALF the cost.

It was definitely going to upset the market. Now I understand the radio silence on a card that was supposed to have been coming by Xmas.


Oh for sure. Arc is in jeopardy. Though tbh it was already, wasn't it? Can't you see an alternate universe where this story never happened, but Intel announced today "Sorry, because our business is dying in general and since Arc hasn't made us a ton of money yet anyway, we need to cut Arc to focus on our core blah blah blah".

I just meant their integrated GPUs are what's completely safe here.


I doubt it's safe; it competes directly with Nvidia on handhelds.

Also, Arc wasn't in jeopardy; the Arc cards have been improving with every release, and the latest one got pretty rave reviews.


It wasn't in jeopardy for being no good, it was in jeopardy because Intel is so troubled. Like the Bombardier C-Series jet: Everyone agreed it was a great design and very promising, but in the end they had no choice but to sell it to Airbus (who calls it the A220), I think because they didn't really have the money to scale up production. In like manner, Intel lacks the resources to make Arc the success it technically deserves to be, and without enough scale, they'll lose money on Arc, which Intel can hardly afford at this point.


Calling BS on "gaming not part of the equation". Several of my friends and I exclusively game on integrated graphics. Sure, we don't play the most abusively unoptimized AAA games like RDR2. But we're here and we're gaming.


RDR2 is quite optimized. We spend a lot of time profiling before release, and while input latency can be a tad high, the rendering pipeline is absolutely highly optimized as exhibited by the large amount of benchmarks on the web.


This is why I love HN. You get devs from any software or hardware project you care to name showing up in the comments.


RDR2 ran beautifully on Linux for me. If you were part of the team, excellent work.


Sorry, I'm happy for you, and I do play Minecraft on an iGPU. I just meant that about 80% of the PCs sold seem to be for "business use" or Chromebooks, and the people writing those POs aren't making their selections with gaming in mind.

(And also, I'm pretending Macs don't exist for this statement. They aren't even PCs anymore anyway, just giant iPhones, from a silicon perspective.)


RDR2, Ghost of Tsushima, Black Myth: Wukong. These games will play at 40-50+ fps at 1080p low-to-medium settings on the Intel Arc iGPUs (no AI upscaling).

To anyone actually paying attention, iGPUs have come a long way. They are no longer an 'I can play Minecraft' thing.


That performance is not surprising, Arc seems pretty dope in general.

I hadn't realized that "Arc" and "Integrated" overlapped, I thought that brand and that level of power was only being used on discrete cards.

I do think that integrated Arc will probably be killed by this deal, though, not for being bad (it's obviously great), but for being a way for Intel to cut costs with no downside for Intel. If they can make RTX iGPUs now, with Nvidia and RTX being the strongest brands in the gaming space... Intel isn't going to invest the money in continuing to develop Arc. Even if Nvidia made it clear that they don't care, it just doesn't make any business sense now.

That is a loss for the cause of gaming competition. Although having Nvidia prop up Intel may prove to be a win for competition in terms of silicon in general versus them being sold off in parts, which could be a real possibility it seems.


"Gaming" = "real-time-graphics-intensive application". You could be playing chess online, or emulated SNES games, but that's not what "gaming" refers to in a hardware context.


> Sure we don't play the most abusively unoptimized AAA games like RDR2.

Wait, RDR2 is badly optimized? When I played it on my Intel Arc B580 and Ryzen 7 5800X, it seemed to work pretty well! Way better than almost any UE5 title, like The Forever Winter (really cool concept, but couldn't get past 20-30 FPS, even dropping down to 10% render scale on a 1080p monitor). Or with the Borderlands 4 controversy, I thought there'd be way bigger fish to fry.


It would be amusing to see nVidia cores integrated into the chipset instead of the Intel GPU cores. I doubt that is in the cards unless Intel is looking to slash the workforce by firing all of their graphics guys.


Out of 9 desktop GT 7xx cards, only 2 were Fermi; the rest were Kepler.

Out of 12 mobile GT 7xx cards, only 3 were Fermi (and 2 of those were M, not GT); the rest were Kepler.


> I could go into more specifics if interested (storage at scale/speed is my bread and butter), but this post is long enough.

I would read an entire series of blog posts about this.


Not my work, but ask and ye shall receive: https://www.storagereview.com/

They primarily focus on storage (SSDs and HDDs) but also evaluate storage controllers, storage-focused servers/NAS/DAS/SAN/etc., and other storage-adjacent stuff. For an example of the various factors that differentiate different kinds of SSDs, I'd recommend their article reviewing Micron's 7500 line of SSDs[0]. It's from 2023 but still relevant, and you don't have to read the whole thing. Heck, just scroll through the graphs and it's easy to see this shit is far from simple, even when you're accounting for the same storage controllers, systems, testing methodologies, and whatnot.

If you want to know about the NAND (or NOR) flash itself, and what the differences/use cases are at a very technical level, there's stuff like Micron's "NAND Flash 101: NAND vs. NOR Comparison"[1].

If that's too heavy for you (it is a whitepaper/technical paper, after all) and you want a lighter read on some of the major differences between enterprise and consumer flash, SuperSSD has a good article on that[2], as well as many other excellent articles.

Wanna see some cool use cases for SSDs that aren't so much about the low-level technicals of the storage device itself, but rather how they can be assembled into arrays and networked storage fabrics in new and interesting ways? ServeTheHome has some interesting articles, such as "ZFS without a Server Using the NVIDIA BlueField-2 DPU"[3].

Apologies for responding 2 days late. I would be happy to answer any specific questions, or recommend other resources to learn more.

Personally, my biggest gripe is that I've not really seen anyone do a proper analysis of the thermal dynamics of storage devices and the impact they have (especially on lifespans). We know this absolutely has an effect just from deploying SSDs at scale and seeing in practice how otherwise identical drives, within the same arrays and in the same systems, have differing lifespans, with the number one differentiating factor being peak temperatures and temperature deltas (a high delta-T can be just as bad as, or worse than, a plain high temp, although that comes with a big "it depends").

I haven't seen a proper testing methodology really try to take a crack at it, because that's a time-consuming, expensive, and very difficult task; it's far harder to control for the relevant variables than with GPUs, IMO, due in part to the many different kinds of SSDs: different NAND flash chips, different heatsinks/form factors, wide variety in where they're located within systems, etc.

Take note that many SSDs, save for those explicitly built for "extreme/rugged environments", have thermal limits much lower than other components in a typical server. The operating-range spec is often something like -10C to 50C for SSDs (give or take 10C on either end depending on the exact device), whereas GPUs and CPUs can operate at over 80C, which, while not a preferred temp, isn't out of spec, especially under load. Then consider that the physical packaging of SSDs, and where they sit in a system, often means they don't get adequate cooling. M.2 SSDs are especially prone to issues here, even in many enterprise servers, both because of where they sit relative to airflow and because some NIC/GPU/DPU/FPGA often sits right above them, or a nearby onboard chip(set) dumps heat into the board and raises the thermal floor/ambient temps. There's a reason the new EDSFF form factor has so many different specs to account for larger heatsinks and cooling on SSDs.[4][5][6]
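If you wanted to start collecting that kind of data yourself, the logging side is at least easy to sketch. A minimal example, assuming smartmontools 7.0+ for the -j (JSON) flag; the device list and poll interval are placeholders:

    import json
    import subprocess
    import time

    DEVICES = ["/dev/nvme0", "/dev/sda"]  # placeholder device list
    POLL_SECONDS = 60                     # arbitrary interval

    def drive_temp_c(dev):
        """Read the current drive temperature from smartctl's JSON output."""
        # smartctl's exit status is a bitmask (warning bits can be set even
        # on success), so don't treat a nonzero return code as fatal.
        proc = subprocess.run(["smartctl", "-a", "-j", dev],
                              capture_output=True, text=True)
        data = json.loads(proc.stdout)
        return data.get("temperature", {}).get("current")

    # Track min/max per device; peak temp and delta-T are the numbers
    # that seem to correlate with lifespan in practice.
    seen = {}
    while True:
        for dev in DEVICES:
            t = drive_temp_c(dev)
            if t is None:
                continue
            lo, hi = seen.get(dev, (t, t))
            seen[dev] = (min(lo, t), max(hi, t))
            print(f"{dev}: {t}C (observed delta-T: {seen[dev][1] - seen[dev][0]}C)")
        time.sleep(POLL_SECONDS)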

I've barely even touched on things like networked arrays, the explosion in various accelerators and controllers for storage, NVMeoF/RoCE/storage fabrics, storage-class memory, HA storage, transition flash, DRAM and controllers within SSDs, wear leveling and error correction, PLP, ONFI, SLC/MLC/TLC/QLC, and the really fun stuff like PCIe root topologies, NVMe zoned namespaces, computational storage, CXL, GPUDirect Storage/BaM, cache coherency, etc.

0: https://www.storagereview.com/review/micron-7500-pro-7500-ma...

1: https://user.eng.umd.edu/~blj/CS-590.26/micron-tn2919.pdf (Direct PDF link)

2: https://www.superssd.com/kb/consumer-vs-enterprise-ssds/

3: https://www.servethehome.com/zfs-without-a-server-using-the-...

4: https://americas.kioxia.com/en-us/business/ssd/solution/edsf...

5: https://americas.kioxia.com/en-us/business/ssd/solution/edsf...

6: https://members.snia.org/document/dl/27231 (Direct PDF link, Technical Whitepaper on the Standard if you really want to dive deep into what EDSFF is)


Just to clarify, do you mean that UUIDv4 in general is worse, or just this 7->4 obfuscation?


I'm not saying anything about better or worse. I'm saying that UUID v4 by definition has high entropy and UUID v7 does not. You can always go from low to high entropy, but not the other way around.
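To make the structural difference concrete, here's a minimal sketch. Newer Pythons are growing a stdlib uuid7, so this builds the v7 bit layout by hand per RFC 9562 just to show where the timestamp sits:

    import os
    import time
    import uuid

    def uuid7() -> uuid.UUID:
        """Hand-rolled UUIDv7: 48-bit ms timestamp up front, random bits after."""
        ms = time.time_ns() // 1_000_000  # 48-bit Unix timestamp in milliseconds
        value = (ms << 80) | int.from_bytes(os.urandom(10), "big")
        value = (value & ~(0xF << 76)) | (0x7 << 76)  # version = 7
        value = (value & ~(0x3 << 62)) | (0x2 << 62)  # variant = 0b10
        return uuid.UUID(int=value)

    # Two v7 IDs minted close together share their leading hex digits (the
    # timestamp), so anyone holding them learns roughly *when* each was
    # created. Two v4 IDs share nothing.
    print(uuid7())
    print(uuid7())
    print(uuid.uuid4())
    print(uuid.uuid4())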


The longer the route, the harder it is for the food to stay fresh and warm/cold/frozen. It's a trade-off between efficiency, price, and customer satisfaction.


It always felt like a weird business model to me. If you lack a physical presence, the only thing you have over a decent prepared section at the grocery store is variety (and freshness, at least in theory). You don't even have convenience on your side since Instacart exists, and because the lower rent was predicated on leasing in more remote areas, the food is even less likely to be warm by the time it arrives than if you got groceries delivered.

And for the providers of the ghost kitchens, while they are selling a shovel of sorts, their bet was there would be a continuing market for their shovels. That space isn't likely to be used for any other restaurants because of the lack of foot traffic, but it also isn't likely to be used in large-scale food production because the facilities usually aren't large enough to be re-tooled for anything beyond catering companies. Commercial kitchen build-outs are not cheap, so investing in large scale small kitchen spaces is a risky bet.


I mean, they could get better bags that keep the food warm, or a "heating spot" downtown to reheat food that's still prepared in the kitchen? I don't see how it's impossible; it's just that nobody is willing to invest in a version of the business model that isn't a scam.


I think inequality is part of the story here too, even with restaurants generally. 60 years ago, it was reasonable to save some money at your union factory job to open a restaurant. If it didn't work out, you could go back to the plant and finish out your 30 years and retire with a full pension.

Now, I'm a top-decile professional and would basically have to bet my whole net worth, including my retirement money, if I wanted to open a real restaurant. No wonder chain restaurants rule the day and the only interesting things happening in food in most of the country are in food trucks. Ghost kitchens, at least a few years ago, seemed like a logical next step after the food truck: an even less capital-intensive way to get into the food service business.

The same forces will push someone who has this ambition to go the ghost kitchen route. Hopefully failing this way instead of with a fully staffed restaurant has saved at least one family from total ruin (downgraded instead to partial ruin).


I have a painting of a commuter train I used to ride to work every day. It's not a conversation starter (or rather, I'm always the one who starts talking about it) but I love it because it reminds me of when I discovered that I didn't need to be a slave to my car and how freeing that was.

It cost me $50 and I've taken it with me every time I've moved.

I really don't get dropping thousands on a single piece, I've never felt any work speak that loudly to me.


I think it depends on the piece. I have a piece that I love and spent about $5k on. It's relatively large and has a lot of detail. I wouldn't be surprised if that is the equivalent of a month's work full time for the artist so the price seems reasonable to me.


Yeah art has objective value and subjective value. Every once in a while you'll find something with a lot of subjective value. Finding something with both is also a thing but it's not easy to find them for a low price.

The reason that expensive art exists is because there's a market. The fact that the market is weird and in decline doesn't change the fact that wealthy people find art to be a worthwhile thing to buy.


I didn’t end up buying it - it went for a price I could afford but did not want to pay - but I have seen an original Al Bean painting. Twelve men walked on the moon. One painted it. If it had gone for $10k it would have been mine.


I'm sure there is some art you'd shell out for. Maybe an original prop from your favorite movie or a collectors item from a time period you're nostalgic for...


It was done to try and make the bill somewhat pencil out and make the national debt increase less egregious. Everyone just assumed it would be delayed forever or reversed before it could take effect, but those negotiations failed, triggering massive waves of layoffs.

https://blog.pragmaticengineer.com/section-174/
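As a back-of-the-envelope illustration of why it hit software companies so hard (made-up numbers; assumes the 5-year domestic schedule with the mid-year convention, so only 10% is deductible in year one):

    # Hypothetical software company; every number here is invented.
    revenue = 2_000_000
    dev_salaries = 1_500_000
    tax_rate = 0.21  # US federal corporate rate

    # Before the change: R&D salaries expensed immediately.
    tax_before = (revenue - dev_salaries) * tax_rate        # $500k taxable -> ~$105k

    # After Section 174: amortize over 5 years, mid-year convention,
    # so only 10% of the cost is deductible in year one.
    tax_after = (revenue - dev_salaries * 0.10) * tax_rate  # $1.85M taxable -> ~$388k

    print(f"before: ${tax_before:,.0f}  after: ${tax_after:,.0f}")
    # Same cash flows, roughly 3.7x the year-one tax bill.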

