
For some reason this sort of thing bothers a lot of people. I think it’s great that we have a new tool in the toolbelt.


Gowers wrote about AI in the late '90s. He predicted a short golden age during which mathematicians would still be useful to the AI. We are in that golden age now, apparently. The AI will soon eclipse all humans in mathematics, and the art form of mathematics will cease in its present form.


Mathematics in its current form, you say? Like, for example, when we transitioned from doing things manually to using calculators/computers?


Could you elaborate on your last sentence please?


Think of it like software development. That art form also died because of AI. Remember the famous painters and hackers essay? It's no longer relevant.


Go read Gowers' essay.


Form your own independent thoughts


I was sharing Gowers' thoughts. You clearly don't know how to read. It's not surprising considering the intellectual quality of the average commenter here.


I still have no idea what "eclipse all humans in mathematics" and "the art form of mathematics will cease in its present form" mean.


Did you read the essay?


Egh this is pretty "use unalived not died".

"Get out an English thesaurus and recreate Mona Lisa in different words."

If you really wanted to be a cognitive maverick, you would encourage them to make up their own creole, syntax, and semantics.

Still, the result is describing the same shared stable bubble of spacetime! But it's a grander feat than merely swapping words with others of the same relative meaning.

You totally missed the point of "put this in your own words" education. It was to make us aware we're just transpiling the same old ideas/semantics into different syntax.

Sure, it provides a nice biochemical bump, but it's not breaking new ground.


How about a link to it?


It's because I still need to earn a living and this technology threatens my ability to do so in the near future.

It also significantly changes my current job to something I didn't sign up to.


I personally like AI but it has definitely shifted my job. There is less "writing code", more "reviewing code", and more "writing sentences in English". I can understand people being frustrated.

To me it's like a halfway step toward management. When you start being a manager, you also start writing less code and having a lot more conversations.


> To me it's like a halfway step toward management. When you start being a manager, you also start writing less code and having a lot more conversations.

I didn't want to get into management, because it's boring. Now I got forced into management and don't even get paid more.


Yes, reviewing robot code submitted by other humans. Oh, the joy of paying bills.


> It's because I still need to earn a living and this technology threatens my ability to do so in the near future.

That's certainly not the reason most HNers are giving - I'm seeing far more claims that LLMs are entirely meaningless because either "they cannot make something they haven't seen before" or "half the time they hallucinate". The latter even appears as one of the first replies in this post's link, the X thread!


given that there has never been a technological advancement that was successfully halted to preserve the jobs it threatened to make obsolete, don't you see the futility of complaining about it? even if there were widespread opposition to AI - and no, there isn't - capital would disregard it. no ragtag team of quirky rebels is going to blow up this multi-trillion-dollar death star.


> don't you see the futility of complaining about it?

I'm not complaining to stop this. I'm sure it won't be stopped. I'm explaining why some people who work for a living don't like this technology.

I'm honestly not sure why others do. It pretty much doesn't matter what work you do for a living. If this technology can replace a non-negligible part of the white collar workforce it will have negative consequences for you. You don't have to like that just because you can't stop it.


Well yeah, but school also tried to educate us for the unforeseen future.

Or at least my school system tried to (Netherlands).

This didn’t fully come out of the blue. We have been told to expect the unexpected.


> This didn’t fully come out of the blue. We have been told to expect the unexpected.

It absolutely did. Five years ago people would have told you that white-collar jobs were mostly un-automatable and that software engineering was especially safe due to its complexity.


In a concrete sense, what actually happened came out of the blue. I fully agree with that. But that's not what I meant.

> We have been told to expect the unexpected.

In that sense, this didn't come out of the blue. What happened is unexpected, and we've been told to expect exactly that.

I understand that that's very broad, but the older people teaching me had a sense of how fast technology was accelerating. They didn't have a TV back in the day. They knew that work would change fast and that the demands of work would change fast.

The way I was taught at school was to actually be that wary and cautious: you need to pivot, switch, and upskill fast, again and again.

What are humans still better at than AI? So far, I've noticed it's being connected to other humans. So I'm currently at a job that pivots more toward that. I'm a data analyst + software engineer hybrid at the moment.


The problem with automation is that it can suck the soul out of a job and turn something fulfilling and productive (say, a job as a woodworker) into something much more productive but devoid of fulfillment (say, working as a cog in a furniture factory).

In the past this tradeoff probably was obvious: a farmer's individual fulfillment is less important than feeding a starving community.

I'm not so sure this tradeoff is obvious now. Will the increased productivity justify the loss of meaning and fulfillment that comes from robbing most jobs of autonomy and dignity? Will we become humans who have literally everything we need except the capacity for self-actualization?


Humans are risk averse and loss averse. You see the downsides and are fearful, but you can't yet see the upsides, or you underestimate them. Why not make the same argument for the internet and computers? Would we have been better off without them? If AI makes doctors more efficient, would you have your child die to make the doctor's life more fulfilling?


To be honest, yes. Given the relatively high standard of medical care that we already have, I might choose to be in a world with no AI, accepting the possibility that some things could have been improved.


> For some reason

> _brief_ but enjoyable era


It's not even that enjoyable to review AI slop all day. So it's a brief and unenjoyable era before the long and miserable one.



