
I think that XP was the only true agile methodology. Agile just got more and more corrupted over the years through stupidity.

Clearly AI programming allows you to quickly close feedback loops. I don't think everything needs a comprehensive set of unit tests though.

But if people can go back and understand the core concept of XP (which again is about feedback loops to me) and take advantage of LLM-based agent systems to create those tight closed feedback loops, then that will be an advance for software engineering.


Starting with XP, in shops doing XP quite intensely, has ruined me. I simply can't stomach working in "Scrum" shops where a whole pile of stuff is taken as "agile" dogma, most of it ritualized, meaningless bastardizations of things that XP pioneered, turned inside out.


I think the ideal scenario is usually two paired programmers using a shared set of AI agents on the same working branch. It's an ideal feedback loop of paired planning, reviewing, building, and testing.


> I don't think everything needs a comprehensive set of unit tests though.

There's a difference in the tests of that era though. Around the XP era, unit tests were for a unit of functionality, not per method.


That’s not really true.

“Unit tests are small tests, each one exercising a little piece of functionality. The units tested are usually individual methods, but sometimes clusters of methods or even whole objects.”

Kent Beck, Extreme Programming Explained: Embrace Change, 2nd Edition (2004)


You're agreeing with me there: "each one exercising a little piece of functionality". In Beck's approach, a test is added at each step for a small functionality addition, which is then implemented in the project to make the test pass. These days unit tests are more commonly used as "every method is a unit to be tested completely", often after the implementation is already there.


I don’t see how?

You said: “Around the xp times, unit tests were not per-method.”

Beck said: “Unit tests are usually individual methods”


That quote is saying the same thing as GP.


I also wonder if this is written from a statically-typed perspective. In dynamic-typing land there are so many more stupid little things that can break that the compiler would otherwise catch for you.

Either that or tracing/logging/debugging, but other than specific niches like parsing (of specific bug repros) I think integration tests are generally a lot more bang for the buck.

Anyway, if you want to go down a related-but-unrelated rabbit hole, J.R. Thompson's lecture on the Space Shuttle Main Engines is a good one. You can probably watch it at higher speed to smooth out the many, many "uh"s (believe me, it's bad):

Integrated testing: https://youtu.be/uow6v1EuybE?t=1292

Test to failure: https://youtu.be/uow6v1EuybE?t=3135

https://ocw.mit.edu/courses/16-885j-aircraft-systems-enginee...

--

There's this more modern link, but in true modern fashion you can't really link to specific things, presumably because it's all JavaScript muck: https://openlearninglibrary.mit.edu/courses/course-v1:MITx+1...


I think a more accurate version of this is: unit tests were not only per-method but also per functionality. This was often called BDD (Behavior Driven Development), e.g. Ruby's cucumber. Your intuition here is correct though.


I disagree with the "not only". The idea in xp is to write the test first. http://www.extremeprogramming.org/rules/testfirst.html You don't know how many methods/functions (if any) you're going to add to make it pass, so they're explicitly per-functionality.
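To make the distinction concrete, here's a minimal sketch of a test-first, functionality-level unit test in Python. The function and test names are hypothetical, not from XP literature: the point is that the test pins down a behavior, and how many functions end up implementing it is an implementation detail.

```python
import unittest


# Hypothetical functionality: split a bill in cents evenly, with any
# remainder pennies going to the first payers. In a test-first flow
# this function would be written AFTER the tests below.
def split_bill(total_cents, n_people):
    base, remainder = divmod(total_cents, n_people)
    return [base + 1 if i < remainder else base for i in range(n_people)]


class TestBillSplitting(unittest.TestCase):
    # Each test exercises a little piece of functionality,
    # not a particular method of a particular class.
    def test_splits_evenly_when_divisible(self):
        self.assertEqual(split_bill(900, 3), [300, 300, 300])

    def test_distributes_remainder_to_first_payers(self):
        self.assertEqual(split_bill(1000, 3), [334, 333, 333])


if __name__ == "__main__":
    # exit=False so the test run doesn't terminate the interpreter.
    unittest.main(argv=["xp_tests"], exit=False)
```

Whether `split_bill` stays one function or gets refactored into several, the tests don't change, which is the per-functionality point.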


Depends on how accurately AI can close the loops.


Really? Because there is nothing agile about not shipping half your code to users (unit tests).


If you're not trolling, you're doing a great impression.


Pure pretension.


Absolutely. One of the footnotes even uses the phrase "deeply cohere their attentional field" as if that actually means something. Barf.


There are other models that do that, such as photogrammetry models.

But someone could possibly extend the work so that it uses a few photos rather than just one or many. The way you ask the question makes it sound like you think it was a trivial detail they just forgot about.



There is a fundamental misunderstanding about popularity. People think that popularity is directly related to merit or rationality.

Technical things are largely popular for the same reason non-technical things are popular: trends. In other words, they are popular because other people perceive them to be popular. Humans are herd animals.

async is harder and is associated with Node.js/JavaScript, which probably makes it uncool for a certain influential Python subculture.

But actually FastAPI has basically taken over, and I think people should recognize that this means async IS popular in Python at this point.
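As a toy illustration of why async took over for I/O-bound Python web workloads, here's a plain asyncio sketch (not FastAPI-specific; the delays and names are made up):

```python
import asyncio
import time


# Simulated I/O-bound call, e.g. a database query or HTTP request.
async def fetch(name, delay):
    await asyncio.sleep(delay)
    return name


async def main():
    start = time.monotonic()
    # Three 0.1s "requests" run concurrently, so the total is
    # roughly 0.1s, not the ~0.3s that sequential blocking calls
    # would take. This is the feedback-loop win async buys you.
    results = await asyncio.gather(
        fetch("a", 0.1), fetch("b", 0.1), fetch("c", 0.1)
    )
    elapsed = time.monotonic() - start
    print(results, f"{elapsed:.2f}s")


asyncio.run(main())
```

A FastAPI endpoint declared `async def` gets this same concurrency from the event loop without the handler author managing it directly.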


Location: Texas

Remote: Yes

Willing to relocate: No

Technologies: AI agents, LLMs, Python, web components, Linux, etc.

Resume/CV: Available on request

Email: jason@mindroot.io

GitHub: https://github.com/runvnc

Very experienced full stack engineer with a focus on AI agents for the past few years. Have built complex agents for things like: real estate cash flow analysis and presentation generation, medical necessity review, grade school math tutoring and illustrated story generation, sales agents, website generation, interactive avatars, etc. Created the MindRoot comprehensive open source agentic application platform.

Also experienced with Node.js and other web and desktop technologies. I've built numerous different types of applications end-to-end: TV news production, auctions, GIS decision support, infrastructure as a service, complex database systems, 3d retro computer programming platform, etc.


I know this is not really what they mean, but in case you want to give AI a literal VM, I have a service that deploys a VM with an open source AI agent on it that can act as a sysadmin. It also gets its own IP address. https://mindroot.io


Also a good option for experimenting with local AI: the GMTec EVO-X2 AI Mini PC (AMD Ryzen AI Max+ 395).


The GMTec EVO-X2 AI Mini PC (AMD Ryzen AI Max+ 395) seems pretty similar and is only $2000.

Would be interested to see head to head benchmarks including power usage between those mini PCs and the Nvidia Thor.


It's not even close, dude. The Nvidia stuff is like 2000 TOPS vs the 50 you get from the AI Max+ 395.


True, but this advantage is strictly for AI inference, and only when using very low-precision FP4 and sparse matrices.

When using bigger number formats and/or dense matrices the advantage of Thor diminishes considerably.

Also, the 50 TOPS figure is only from the low-power NPU. When distributing the computation across the GPU and CPU as well, you get much more. So for a balanced comparison one has to divide the Thor value by 4 or more and multiply the Ryzen value by a factor that might be around 3 or even more.

The Ryzen CPU is significantly better, and the GPU is about the same size but has a much higher clock frequency, so it should also be faster. For anything except AI inference, a Ryzen Max at half the price will offer much more bang for the buck.
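Taking the comment's own rough figures and correction factors at face value, the back-of-envelope comparison looks like this. Every number here is an assumption from this thread (marketing TOPS and guessed adjustment factors), not a measurement:

```python
# Headline marketing figures, as cited in the thread (assumptions).
thor_sparse_fp4_tops = 2000  # Nvidia Thor, sparse FP4
ryzen_npu_tops = 50          # AI Max+ 395, NPU only

# The comment's proposed corrections (also assumptions):
# divide Thor by ~4 for dense matrices / larger number formats,
# multiply Ryzen by ~3 for NPU + GPU + CPU combined.
thor_dense_adjusted = thor_sparse_fp4_tops / 4
ryzen_total_adjusted = ryzen_npu_tops * 3

print(thor_dense_adjusted, ryzen_total_adjusted)  # -> 500.0 150
```

Even under these generous-to-Ryzen adjustments, Thor still comes out ahead on raw inference throughput, which is why the price-per-TOPS framing matters.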


You are forgetting that Thor has a CPU too. So far we haven't seen how good the Ryzen CPU is at matrix operations, which makes the given factors of 4 and 3 questionable at best.

The best thing would be to run a real test. For robotics, however, it's obvious: Thor, with its camera interfaces, is unbeatable. They are used for (stereo) vision and obstacle avoidance. Add to that the software stack.


Seems like a great concept. Hope they are commercially successful.

It reminded me of another geothermal energy idea: dig about 3 or so miles straight down and harvest the heat that is there already. I guess that's a lot harder than making a dirt pile. But maybe it could become practical if there were enough commercial effort and large-scale manufacturing of the equipment.

Kind of brings it around full bore though. Why do that kind of project when you can just harvest actual fuel like oil or gas?

I think this stuff can become practical with more scale, wider manufacturing of equipment, and the development of efficient techniques. But it requires you to do a lot of upfront work based on principle rather than the bottom line.

So anyway again great idea because it eliminates a lot of challenges and costs that come with concepts like "Journey to the Center of the Earth" etc.


> another geothermal energy idea: dig about 3 or so miles straight down and harvest the heat that is there already

Deep geothermal ought to work. Deep drilling is hard, but it's been done. Eavor-Deep got down to where the water reached 250 °C. [1] That was back in 2023. Not much new since. The problem seems to be that when you drill into really hot rock, most drilling techniques run into trouble. The rock becomes plastic and clogs things up. The drilling tools have problems with the heat. Progress continues, slowly.

There are these guys trying to drill with microwaves: [2] On September 4, they're going to do a public demo and try to drill a 100-meter hole.

[1] https://eavor.com/eavor-deep/

[2] https://www.quaise.com/


The other issue is that the steam loses energy as it's pumped over a long distance. Iceland and a few other locations are big users since they are on fault zones with hot rocks relatively close to the surface. Even if deep drilling is demonstrated, it's unclear if geothermal power will become geographically independent.


> Why do that kind of project when you can just harvest actual fuel like oil or gas?

How can that still be a question in this day and age? Unless somebody doesn't "believe" in climate change caused by greenhouse gas emissions.


Forget climate change, the best reason is national security. Russia's war plunged Europe into gas shortages and price hikes. Many countries would love to not be dependent on them. And since WW2, all war requires vast quantities of oil; maybe drones will reduce that a bit, but you still need to move stuff (and people) around a lot. So you need a reliable source of energy both in peacetime and wartime.

For the US, the best reason is sustainable energy. Gas, oil, and coal are not renewable, so you eventually need to adopt a new form of energy. Just transporting it is problematic, with most communities rejecting pipelines. In the meantime you're polluting your local environment and putting workers at risk. Whereas if your energy plan is largely "the sun shines", "the wind blows", and "dirt holds heat", that is ridiculously more sustainable.

The biggest problem we have is we demand too much energy. AI has made this problem way worse. Nuclear is the only thing that's going to fill the gaping chasm of demand.


> Forget climate change,

And that's why we are in trouble.

> the best reason is national security.

Sure, if that's what convinces people to do the right thing, so be it. Though the continuation of that thinking tends to cause behavior that's not very friendly w.r.t. climate or environment.


I think Austin Vernon has spent some time investigating geothermal, which is likely why they've arrived at storing energy in dirt.


Geothermal already does the “harvest energy within the earth” but it’s closer to the surface. What are the challenges with digging 3 miles down?

