Hacker News | hu3's comments

I'm with the other commenter. There's no way to port 5k lines in a day with confidence unless using LLMs + strong unit tests.

I won't even ask for a counterexample, but feel free to provide a repo where a human did that.


related: itchio has multiple jams starting every day!

https://itch.io/jams


1060 is a sweet card for multi monitor. good on you for gifting him.

I upgraded to a 4070 super last year. I ran both cards at the same time for a little bit, but it got really frustrating to keep the wrong card from being assigned to a particular task with llama. I really should’ve taken an R&D tax credit on my AI research but I’m still able to expense it for the business.
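For anyone hitting the same wrong-card problem: CUDA respects a couple of environment variables that let you pin a process to one GPU. A minimal sketch in Python (the llama-server binary path, flags, and port are assumptions for a typical llama.cpp build; adjust for your install):

```python
import os
import subprocess

def gpu_env(index):
    """Environment that exposes exactly one CUDA device.

    CUDA_DEVICE_ORDER=PCI_BUS_ID forces stable device numbering
    (by default CUDA orders devices fastest-first, so indices can
    shuffle when you add or remove a card)."""
    return dict(os.environ,
                CUDA_DEVICE_ORDER="PCI_BUS_ID",
                CUDA_VISIBLE_DEVICES=str(index))

def launch_llama(model_path, gpu=0, port=8080):
    # Hypothetical invocation: llama.cpp's HTTP server is usually
    # built as ./llama-server; -m and --port are its real flags.
    return subprocess.run(
        ["./llama-server", "-m", model_path, "--port", str(port)],
        env=gpu_env(gpu))
```

With the env pinned this way, the process literally cannot see the other card, so nothing gets assigned to it by accident.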

can't agree. this name has logical meaning

Apple might have convinced some gullible customers that this was something new.

But to the rest of the world, variable refresh rate had existed for years by then. As with most Apple "inventions".

In this case the patent goes back to 1982: https://patents.google.com/patent/US4511892A/en


Indeed but:

1) That is comparatively very slow.

2) It can also be done, even more simply, with SoTA models over an API.


Right, this works with any models. To me, the most interesting part is that you can use a smaller model that you could run locally to get results comparable to SoTA models. Ultimately, I'd far prefer running local, even if slower, for the simple reason of having sovereignty over my data.

Being reliant on a service means you have to share whatever you're working on with the service, and the service provider decides what you can do and can change their terms of service on a whim.

If locally running models can get to the point where they can be used as a daily driver, that solves the problem.
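The sovereignty point is easy to demo: llama.cpp's server and Ollama both expose an OpenAI-compatible chat endpoint, so a few lines of stdlib Python can talk to a model without anything leaving the machine. A sketch (the URL and model name are assumptions for a typical local setup):

```python
import json
import urllib.request

# Assumed local endpoint; llama-server defaults to port 8080,
# Ollama to 11434. Nothing here goes to a third party.
LOCAL_URL = "http://localhost:8080/v1/chat/completions"

def build_request(prompt, model="local-model", temperature=0.2):
    """Build the JSON body an OpenAI-compatible server expects."""
    return json.dumps({
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    })

def ask_local(prompt):
    # The request goes to localhost only: your data stays on your box.
    req = urllib.request.Request(
        LOCAL_URL,
        data=build_request(prompt).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the wire format matches the hosted APIs, swapping between a local model and a SoTA provider is mostly a matter of changing the URL.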


Can confirm. I have a Ryzen 9800X3D with RTX 5070, 128GB of RAM and TBs of Gen 5.0 NVMes.

Not only is it screamingly fast (the fastest on earth for some workloads), but I can upgrade it easily. And it is dead silent too.

The best thing is it runs native Linux and it just works.


And a 9800X3D is not even the fastest CPU out there, nor even the fastest CPU you could use with your specific motherboard. A 9950X3D is essentially two of the 9800X3Ds combined, and would be a drop-in replacement.

Wrong. See benchmarks. Many games and single-threaded workloads run faster on 9800X3D.

There are various reasons for this, the major one being that the 9800X3D has more L3 cache per thread than the 9950X3D.

It's also wrong that a 9950X3D is two 9800X3Ds combined. A quick glance at the specs shows why: the 9950X3D has 128MB of L3 cache shared between twice as many threads, while the 9800X3D has 96MB for half as many threads, so more L3 per thread.
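The per-thread arithmetic, spelled out with the published specs (96 MB of L3 across 8 cores / 16 threads on the 9800X3D, 128 MB across 16 cores / 32 threads on the 9950X3D):

```python
def l3_per_thread(l3_mb, threads):
    """MB of L3 cache available per hardware thread."""
    return l3_mb / threads

# 9800X3D: 96 MB shared by 16 threads
r9800x3d = l3_per_thread(96, 16)   # 6.0 MB/thread

# 9950X3D: 128 MB shared by 32 threads
r9950x3d = l3_per_thread(128, 32)  # 4.0 MB/thread

print(r9800x3d, r9950x3d)  # 6.0 4.0
```

So despite the bigger total cache, the 9950X3D has a third less L3 per thread, which is exactly the kind of difference cache-sensitive games notice.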

And most of the time, even when a 9800X3D loses to a 9950X3D in games, it's within a 1-4% margin.

It's a monster for games and some workloads.

It's funny that people who blindly buy a 9950X3D for gaming+office workloads without checking benchmarks often end up with similar or slower performance.

Much smarter to spend the price difference on other hardware: faster NVMes, efficient silent cooling, a faster GPU, etc.


Source? They actually just added 16-bit support, something not even Windows supports anymore.

> I don’t believe that’s true. Things are moving constantly, and in the right direction.

Hah! I'll use that argument if I ever get PIP'd.

No but seriously, constantly moving doesn't mean fast enough. Swift took too long to get cross-platform support.

And it is still uber-slow to compile, to the point that language servers give up analyzing it and time out.


Not just uber slow to compile, because as a Rust dev I could take that. But it rejects correct programs without telling you why! The compiler will just time out and ask you to refactor so it has a better shot. I understand that kind of pathological behavior is present in many compilers but I hit it way too often in Swift on seemingly benign code.

Did that happen recently (the compiler just bailing out)?

Because they got much better at that, and it’s been a long while since that happened to me. Like “I don’t even remember when was the last time it happened” long.


The last time I used Swift was 4 months ago. It was recent enough that I'm still salty about it! :P

If cross-platform support took so long, it's a major red flag.

Plus Swift is arguably too unnecessarily complex now.

And there's Rust/Zig so why use Swift for low level?



> Plus Swift is arguably too unnecessarily complex now.

I would argue the allegations of complexity against Swift are greatly exaggerated. I find the language to be very elegant and expressive in syntax, high in readability, and fairly terse. Other than that, Swift feels near-identical to every other OOP language I have used.

