
Similar to my experience: it works well for small tasks, replacing search (most of the time) and doing a lot of boilerplate work.

I have one project that is very complex, and for that one I can't and don't use LLMs.

I've also found it's better if you can get it to generate all the code in a single session; if you switch between LLMs or sessions, the output quickly degrades. That's when you start to see duplicate functions and dead-end code.


Perhaps you need to semi-randomize the file size? I'm guessing some of the bots have a hard limit on the size of the resources they will download.

Many of these are annoying LLM training/scraping bots (in my case, anyway). So while an 800 KB zip bomb might not crash them, it will at least waste compute on their end.
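Something like this rough Node sketch is the idea; the sizes, port, and gzip approach are just assumptions for illustration:

    // rough sketch: serve gzip "bombs" whose size varies per request
    const http = require("node:http");
    const zlib = require("node:zlib");
    const crypto = require("node:crypto");

    // pre-build a few payloads at startup; each decompresses to N MB of zeros
    // but compresses down to a few hundred KB
    const bombs = [64, 128, 256, 512].map(
        (mb) => zlib.gzipSync(Buffer.alloc(mb * 1024 * 1024))
    );

    http.createServer((req, res) => {
        // (in practice you'd only route suspected scrapers here)
        // semi-randomize which size a given bot gets
        const bomb = bombs[crypto.randomInt(bombs.length)];
        res.writeHead(200, { "Content-Encoding": "gzip", "Content-Type": "text/html" });
        res.end(bomb);
    }).listen(8080);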


This just moves the issue elsewhere, though. I do agree that the extra step of having to notarize documents will filter out many people.

But beyond that, someone determined can still issue fake documents even at this level of provenance.

With driver's licenses, for example, you can (illegally) buy the printing machine and blank cards, so you actually need to check with the registrar in that jurisdiction.


Has anyone actually measured this yet?

Much of this feels like the studies on people who take mushrooms: they feel like they're more productive, but when you actually measure it they aren't. It's just their perception.

To me the biggest issue is that search has been gutted, so for many questions the best results now come from asking an LLM. But that's far different from using one to generate entire codebases.


No; there have been a couple of attempts, but no one would call the outcomes conclusive.


This is also why I think the current iterations won't converge on any actual kind of intelligence.

It doesn't operate on the same level as (human) intelligence; it's a very path-dependent process. Every step you add down this path increases entropy as well, and while further improvements and bigger context windows help, eventually you reach a dead end where it degrades.

You'd almost need every step of the process to mutate the model itself, updating global state from that point on.

From what I've seen, the major providers use tricks to approximate this, but it's not the same thing.


Even Google ended up adding more stuff to their homepage in the end. For a long time they tried to keep it super minimal, and it still mostly is, but there are footer links, a signed-in header, and a whole bunch of other links as well.

Mind you, in the early days pages used to have hundreds if not thousands of text links all over the place; the only sites that still do this are the hardcore conspiracy sites where the author just adds several new links a day.

So in this dimension at least web UIs have changed for the better.


When divisions are rewarded with prestige, and that prestige is denominated in public visibility, you either need a site per division or a very, very busy homepage, where the links aren't organized by user need but by political clout.

It’s almost like your aunt and uncle who always bicker at family reunions. Keep that drama shit out of public spaces.


Except that nothing is illegal now; Biden showed that with blanket pardons. So for all intents and purposes they will do what they like, and Trump can just wipe it clean at the end.


Do you mean “SCOTUS ruled that the president has ultimate immunity”?


It goes further than that, too: suppose the one working on the wooden part is slow and the one on the metal part is faster. And surely the value of each part is also different, even though it's the combined value that's relevant.

Suppose as well that there are a thousand people lined up to make the wooden part but hardly any for the metal; then surely the ones who work on the metal part will (try to) command a higher wage too.


I actually think that, out of any language, async/await makes the most sense for JavaScript.

In the first example: there is no such thing as a blocking sleep in JavaScript. What people use as sleep is just a Promise wrapper around a setTimeout call. setTimeout only schedules a task on the event loop rather than blocking, so calling such a sleep inline (without awaiting it) does nothing to halt execution.

I do agree that dangling Promises are annoying, and Promise.race is especially bad because it doesn't do what you'd expect (settle the fastest promise and cancel the others). All of the promises still run to completion; you just only get the first result.

Realistically, in JS you write your long-running async functions to take an AbortController wrapper that also provides a sleep function. In your outer loop you check that the signal isn't aborted, and the wrapper class handles calling clearTimeout on any pending wrapped sleeps so the sleeping/pending setTimeouts stop and your loop/function can exit.
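Roughly like this sketch (the class and method names are made up):

    // wrapper around an AbortController that also tracks pending sleeps
    class TaskController {
        constructor() {
            this.controller = new AbortController();
            this.pending = new Map(); // resolve -> timer, for in-flight sleeps
        }

        get signal() {
            return this.controller.signal;
        }

        // promise-based sleep that can be woken early on abort
        sleep(ms) {
            return new Promise((resolve) => {
                const timer = setTimeout(() => {
                    this.pending.delete(resolve);
                    resolve();
                }, ms);
                this.pending.set(resolve, timer);
            });
        }

        abort() {
            this.controller.abort();
            for (const [resolve, timer] of this.pending) {
                clearTimeout(timer); // stop the pending setTimeout
                resolve();           // wake the sleeper so its loop can exit
            }
            this.pending.clear();
        }
    }

    // long-running function: check the signal on each pass of the outer loop
    async function poll(task) {
        while (!task.signal.aborted) {
            // ... do one unit of work ...
            await task.sleep(5000);
        }
    }

    const task = new TaskController();
    poll(task);
    // later, from the outside: task.abort();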


> async/await makes the most sense for javascript.

More like: Has no alternative. There are no threads in JS.


Yes there are. They're called "Workers" (Web Workers), though. It's a semantic game with a different API, but the same concept.

https://www.w3.org/TR/2021/NOTE-workers-20210128/
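A minimal inline sketch, assuming a browser context (the Blob URL is just to keep it in one file):

    // worker code as a string so this stays self-contained
    const workerSource = `
        onmessage = (e) => {
            // runs on a separate thread; heavy work here won't block the page
            postMessage(e.data * 2);
        };
    `;

    const blob = new Blob([workerSource], { type: "application/javascript" });
    const worker = new Worker(URL.createObjectURL(blob));

    worker.onmessage = (e) => console.log("doubled:", e.data); // "doubled: 42"
    worker.postMessage(21);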


Note that the underlying C++ libraries for Node are perfectly capable of using threads; only the final user-space programming model is single-threaded by default.
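For what it's worth, Node also exposes threads to JS via the worker_threads module; a minimal single-file sketch:

    const { Worker, isMainThread, parentPort } = require("node:worker_threads");

    if (isMainThread) {
        // main thread: run this same file again as a worker
        const worker = new Worker(__filename);
        worker.once("message", (msg) => console.log("worker said:", msg));
    } else {
        // worker thread: executes in parallel with the main thread
        parentPort.postMessage("hello from another thread");
    }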


I rather like things like MobX's generator-based flow actions: you yield promises rather than awaiting them, and use yield* to splice in other functions that follow the same approach.

You can see a nice example of using that approach standalone here: https://axisofeval.blogspot.com/2024/05/delimited-generators...
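A minimal sketch of that style (from memory, so treat the details as approximate; the endpoint is made up):

    import { flow } from "mobx";

    // a plain generator sub-routine; its yields are promises
    function* loadJson(url) {
        const res = yield fetch(url);
        return yield res.json();
    }

    // flow() drives the generator, resolving each yielded promise for you
    const loadUser = flow(function* (id) {
        // yield* splices the sub-generator into this one
        return yield* loadJson(`/api/users/${id}`);
    });

    const promise = loadUser(1);
    // unlike an async function's promise, this one can be cancelled:
    // promise.cancel();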


    // busy-wait "sleep": spins until amt milliseconds have passed,
    // blocking the event loop (and, in a browser, the whole UI)
    function blockingSleep(amt) {
        let start = Date.now()
        while (start + amt > Date.now()) {}
    }


This is horrible - I love it.

I did forget you could do this, though I'm fairly sure that in a browser this blocks the entire UI (at least it used to).


fsync doesn't work here, right, because Unix pipes are in memory? Elsewhere I've had luck with Node.js writable streams that refuse to flush their buffers before a process.exit() by calling fsync on the underlying file descriptors.


Well, yes:

    EINVAL    fd is bound to a special file (e.g., a pipe, FIFO, or
              socket) which does not support synchronization.
Or as POSIX puts it,

    [EINVAL]
        The fildes argument does not refer to a file on which this operation is possible.
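A quick Node sketch of that distinction (the file path is just a placeholder):

    const fs = require("node:fs");

    // fsync on a regular file descriptor works fine
    const fd = fs.openSync("/tmp/out.log", "w");
    fs.writeSync(fd, "data\n");
    fs.fsyncSync(fd);

    // but on a pipe/FIFO (e.g. stdout piped into another process) it fails
    try {
        fs.fsyncSync(1); // fd 1 = stdout
    } catch (err) {
        console.error(err.code); // 'EINVAL' when stdout is a pipe
    }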

