JSR_FDED's comments

Pfff, this is nothing. The $20,000 Neo folds clothes much slower and comes with free remote operators.

Lua is very fast - even without the JIT it makes Python feel like wading through molasses.

Lua is so small and simple (but not simplistic) that you can keep it completely in your head. Even if you only get to work on your project once every weekend you won’t have to relearn half of it every time.


What is the exact threat being addressed here? If the proxy detects a sensitive key and places it in a .env file, wouldn't Claude Code pull the value from the env file and still send it to Anthropic's servers?

Correct, it doesn't protect against a malicious LLM actor, but rather places guardrails on the developer as well as the agent.
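To make the guardrail concrete, here's a minimal sketch of the kind of pattern-based secret detection such a proxy might run on outgoing text. This is purely illustrative (the pattern set and the `find_secrets` helper are my own invention, not the tool's actual implementation):

```python
import re

# Illustrative patterns for common credential formats; a real scanner
# would have a much larger, maintained set.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "generic_api_key": re.compile(r"(?i)\bapi[_-]?key\s*[:=]\s*['\"]?([A-Za-z0-9_\-]{20,})"),
}

def find_secrets(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_text) pairs for anything that looks like a secret."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for m in pattern.finditer(text):
            hits.append((name, m.group(0)))
    return hits

sample = "config = {'API_KEY': 'AKIAABCDEFGHIJKLMNOP'}"
print(find_secrets(sample))  # → [('aws_access_key', 'AKIAABCDEFGHIJKLMNOP')]
```

Presumably the proxy then replaces the match with a placeholder and stashes the real value in .env, which is exactly why it guards against accidental leakage rather than a malicious model.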

I saw that kimi.com offers a cloud-based version of OpenClaw, using their Kimi2.5 frontier model. There’s also a “link to existing OpenClaw” option.

Interesting, haven't tried it. I'm afraid Anthropic will be cracking the whip here and there. On the other hand, it makes sense that other providers would move fast on this. Curious, have you tried it with Kimi2.5?

A friend of mine used this as motivation for switching to a used MacBook Air M1 (today he'd probably buy a Neo) and started using Pages to write his papers. He watched a bunch of YouTube videos and googled every time he got stuck. A week later he was a happy camper.

Pages takes me back as I really enjoyed all the Apple-first products (remember Sofa and Cultured Code?) but nowadays I can’t justify getting into macOS.

This is so great!

China has the most sophisticated grid in the world and is spending $100B a year on expanding and upgrading it. They have a uniquely high share of generation from renewables. It runs at 800 kV and will go higher after the upgrade. The first Small Modular Reactor will come online this year. If you think that's all just being built in random factories without standards, you're very much mistaken.

China's energy generation mix is not uniquely high in renewables! Where on earth did you get that idea? China gets about the same share of generation from renewables as Australia, roughly a third. Brazil is at 80%, Norway at 90%, and some countries are near 100%.

What China does have is a very high carbon emission intensity of electricity generation, thanks to over half of its capacity coming from coal.


How does the carbon intensity affect transformer operation, or are you just adding non sequiturs?

I was replying to a comment that brought up renewables. I can be of further assistance to help you follow the chain of comments if you let me know what part you're struggling with exactly.

There are no “good guys” amongst the top tier AI companies.

I spent decades completely happy with Cmd+Tab. Now I’m helping someone develop a trading system and I need to see several log files simultaneously, a broker GUI, and neovim.

Once I realized that in order to answer a single question I needed to Cmd+Tab at least four times, often more, I added two monitors and it’s dramatically lowered my stress level.

FYI, on older MacBooks you can’t add more than one extra screen, but if you get a DisplayLink dongle it works perfectly.


>but if you get a DisplayLink dongle it works perfectly.

LOL

I'm currently typing this on a work-issued MacBook that's about two years old at this point, and 40% of the time when I plug in a cable, it decides it wants to turn HDMI output on and off in rapid succession.


I always use DisplayPort over USB-C DP-Alt (or Thunderbolt on some displays) and I literally never have a problem across various LG, Dell and Apple Studio Displays.

MacBook Pro M1 Pro or MacBook Pro M5

Sounds like something is really broken in your setup?

On the other hand, sleeping/waking Thunderbolt displays on my ThinkPad with Linux regularly leads to kernel panics, across several kernel versions.


Display over USB-C is very janky (as in, it very much depends on the hardware in the display or the docking station you use).

The only saving grace was that the MacBook has an external HDMI port, which works flawlessly, just as it has on any laptop for the past decade. But not all Mac models have had HDMI out. My last one did not, and it was a piece of crap that somehow also ended up swelling its battery.


The M1 non-Pro could only support one external screen through Thunderbolt, and I think that carried on through at least the M2 Air. It would also frequently get my Dell screen into a weird hung state after suspending and reconnecting, often requiring a power cycle of the screen (not even connecting a Linux laptop to it would fix it). At some point it seems to have been fixed, and I am not seeing it anymore.

Linux, however, has worked great ever since I got a USB-C DP Alt Mode screen around 8 years ago, across my ThinkPad X1 Carbons over the years. I do have trouble getting a stable 8K at 60 Hz through it with Iris Xe (gen 13), but that does not work with Macs either.

Linux did have issues with using different scaling factors on multiple connected screens, but I only ever used one monitor so it never bothered me.

On top of that, Linux still supports subpixel rendering, and you can even tune the pixel layout (RGB, BGR...) for VA and OLED panels, so text never looks crappy or janky like it can on Macs with low-DPI screens (e.g. large 4K screens of 40"+, though it's noticeable even on a 32" 4K).


Recent ThinkPads are somewhat shit-tier laptops, and Linux doesn't help much (it's not Linux's fault).

For personal use, I gave up after almost twenty years of ThinkPad+Linux and got a MacBook Neo. So far it's been great, much much better than my shit-tier Ryzen-based X13g1 with 8c/16t and 32 GB RAM. (Edit: it's also more reliable at driving my 34" 1440p external display.)


I had used Linux since mid-1990s and gave it up for Apple Silicon. Not fighting my hardware/software has been great despite the diminution of Apple’s software stack.

DisplayPort over DP Alt Mode != DisplayLink. DisplayLink is a way to send compressed video streams over a normal USB connection.

Yes, I never said it was. I was suggesting DP Alt Mode instead (in case they have a MacBook without HDMI; even if it had HDMI, I'd still prefer DP Alt Mode).

Great example of "what works for you" I think - I'm in the camp of:

I did multiple monitors for a long time, and probably my best productivity move was switching to one AND ONLY ONE sufficiently big 4K. It basically lets me switch between "one focused monitor most of the time" and "the equivalent of 2 or 3, maybe even 4" when I need it.


The reason I love my old cheap 1080p monitors so much is that they carry less organizational overhead than a large 4K monitor, where you constantly have to fix UI scaling bugs, zoom in and out, force different fonts on shitty web pages, etc.

I am never gonna sway away from i3 [1]; a notification-free tiling window manager is just way too convenient. When I have to boot up a Windows VM for work (I am a malware analyst most of the time), I lose my mind over the constant notifications and blocking popup windows. I have no idea why people tolerate this as their work setup. It is design that is hostile to its users.

I use my computer to work. I don't want a computer that works me all the time.

[1] For desktop/GUI apps I use a mixture of GNOME forks and LXDE apps. Anything that produces popups while running in the background is avoided.
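For anyone who hasn't tried i3: the workspace side of this workflow comes down to a few lines of config. An illustrative fragment (these bindings mirror the stock defaults; adjust $mod and the workspace numbers to taste):

```
# jump straight to a workspace instead of cycling through windows
bindsym $mod+1 workspace number 1
bindsym $mod+2 workspace number 2

# send the focused window to another workspace
bindsym $mod+Shift+1 move container to workspace number 1
bindsym $mod+Shift+2 move container to workspace number 2

# choose how the next window splits the current one
bindsym $mod+h split h
bindsym $mod+v split v
```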


“compared to a large 4k monitor where you constantly have to fix UI scaling bugs and zoom in/out, force different fonts for shitty web pages etc.”

Counterpoint: this doesn't appear to be the case with Apple, as they have now defaulted their OS entirely to Retina-level density and removed subpixel rendering, so anything non-5K may look off (and you need to jump through hoops to make it look good).

As such, I'm typing this on a MacBook with 3x5K displays connected.


Subpixel rendering has nothing to do with any of this: it was only messing up on panels with non-RGB pixel layouts, like VA and OLED, and it used to be a simple setting in GNOME (unfortunately hidden these days).

Still, even 5K at 27" is not free of noticeable jagged edges on diagonal lines and text characters if your visual acuity (with or without correction) is around 20/20 or better (mine is better, with glasses/contacts). I've only tried 4K at 24", but that's a similar DPI and angular resolution at the same viewing distance.
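For reference, the pixel densities in question can be checked with a little geometry: PPI is the diagonal pixel count divided by the diagonal size in inches. A quick back-of-the-envelope sketch (my own check, not from the thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2880, 27), 1))  # 5K at 27": ~217.6 PPI
print(round(ppi(3840, 2160, 24), 1))  # 4K at 24": ~183.6 PPI
```

Angular resolution additionally depends on viewing distance, which is why two densities in this ballpark can look comparable at a typical desk.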

I hate how the text looks with a Mac on a 4K 32" screen, let alone 4K 42" screen.


And I love my multi-monitor setup, because each monitor has its own set of apps, which eliminates a lot of window switching.

I put my browser on the 2K monitor, so there's no need to fight with resolution and other things,

but the IDE is always on the 4K monitor, no scaling, slightly larger font size, so I can see more code. And the logs and note app are on a third 1080p monitor.

And GNOME on Wayland was pretty solid for me, until recently gnome-shell started eating over 2/3 GB on long runs. I've switched to niri for the time being, which has been pretty solid.


In sway + Wayland, these UI scaling bugs are fixed

KDE + Wayland is fine, too... except in some Java apps and LibreOffice with its ancient crap toolkit.

Firefox, MS Edge (my MS Teams sandbox) and any GTK apps do work.


Yeah, I don't have UI scaling bugs with Niri + Wayland.

... while other bugs are introduced :-/

Can't switch because of old hardware and vulkan/mesa legacy reasons.


1080p is plenty for text-based work if you're not using Chinese or Japanese. It looks as sharp as 4K when you disable text anti-aliasing.

It absolutely doesn't look as sharp as 4K, even on a 22" screen.

I've seen 4K monitors in person. Disabling anti-aliasing (with full hinting enabled) on a 1080p monitor increases text sharpness to equal that of anti-aliased text on a 4K monitor. The only drawback is that the font no longer approximates the shape of printed text. This doesn't matter except in the outlier cases of Chinese and Japanese, which use some extremely visually intricate characters.
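On Linux, the combination described (anti-aliasing off, full hinting on) is typically configured through fontconfig. A sketch of the relevant fragment, e.g. in ~/.config/fontconfig/fonts.conf (these are standard fontconfig properties; treat the exact values as a starting point, not a recommendation):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <!-- no anti-aliasing: hard black-and-white glyph edges -->
    <edit name="antialias" mode="assign"><bool>false</bool></edit>
    <!-- full hinting snaps stems to the pixel grid -->
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
  </match>
</fontconfig>
```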

Yeah, I've moved back and forth between multiple monitors and a single one many times throughout my professional career, depending on what I'm doing. Game development usually has me using at least two monitors; just programming frontend/backend usually works best with a single monitor (for whatever reason); producing music and editing video also work best with at least two monitors; but other creative things like writing work best with one, and so on.

At least personally, there is not a single setup that works for everything, I'm switching basically as often as I change what I work on.


Workspaces solve this problem better. Cmd + 1,2,3,4,5,6,7,8,9,0: it's like having 10 monitors in a single one. Though if I remember right, the switch is inefficient on macOS, but there's some workaround for that.

I was about to post something very similar: the degree of benefit you get from having multiple displays depends a lot on the amount of multi-tasking that you have.

If you can focus most of your time on a single window then a single monitor is just fine.

But when you have to reason across multiple windows very very often then multiple displays help a lot.

For me it's a bit messy: I am a cloud engineer and the kind of work I do varies multiple times a day. At one point I'm writing Terraform code and all I need is my editor and a shell (sometimes my editor is in my shell), while ten minutes later I might be doing incident response, and then I need a multitude of windows (shell, web browser showing logs, web browser showing metrics, web browser showing the AWS console, web browser showing the meeting with the other people handling the incident with me, another shell, other stuff)…

So yeah, it really depends.


Similar job. My solution was a single 4K monitor and Stage Manager. I can tile a log view next to a terminal, then just pop back to a browser when I need to. Plus, the terminal can have tabs.

When monitors were 1024 by 768, I needed more than one monitor. Now that everything is designed to be one’s only window at 1920 by 1080, I need a 4k monitor. I imagine that when 4k becomes the default, I will need a 16k monitor.


I am somewhat surprised by how long 1080p has stuck around as the standard.

People often seem shocked that I use mostly 4K screens, but I've had one of them for almost 10 years now.

It also seems that 8K has died for now. I think we still have time.


M-x on a tiling window manager. I agree Cmd+Tab / Ctrl+Tab is inefficient. Linear vs constant time context switch.

They look for pincer marks

It's a pair of ragged claws.
