Hacker News | elorant's comments

Too little, too late. I was a long-time advocate of German cars and owned a bunch of them, but after this fuckery with touch screens everywhere I moved to other brands, and I'm staying there for the foreseeable future. BMW, Mercedes, and VW have really dropped the ball on usability. At least BMW has a decent OS that makes the whole experience somewhat less dreadful than the other two's.

This is obviously a strategic move at a national level. Keep publishing competing free models to erode the moat Western companies could have with their proprietary models. As long as the narrative serves China, there will be no turn to proprietary models.


>This is obviously a strategic move at a national level.

No, it isn't. That's the kind of thing said by people who've never worked in the Chinese software ecosystem. It's how the Chinese internet has worked for 20+ years. The Chinese market is so large and competition so rabid that every company throws as much free stuff at consumers as it can to gain users. Entrepreneurs don't think about "grand strategic moves at the national level" while they flip through their copies of the Art of War and Confucius lol


If this were true, they'd build services around those models and provide those for free or vastly cheaper than the Western competition. But that's not what they're doing. Instead they're giving away the entire model for free. And by the way, Qwen isn't built by some random entrepreneur who's trying to solve the cold start problem, but by Alibaba, which is a fucking behemoth. And surprisingly, of course, none of these models answer uncomfortable questions about China's past. Because sure enough, the first thing any entrepreneur would think of is to protect their government and their history. Sure, happens all the time, no state interference here, move on.


> And by the way, Qwen isn't built by some random entrepreneur who's trying to solve the cold start problem, but by Alibaba, which is a fucking behemoth.

DeepSeek, Kimi, GLM, etc. are not built by behemoths, and they are free. You do not understand China's culture and market.

> And surprisingly of course none of these models answer uncomfortable questions about China’s past.

Download the GLM 5.1 weights and ask about Tiananmen Square, it will tell you what happened.

You are viewing China through a Western lens. I used to do the same many years ago, but after traveling to China many times, I realized that was a mistake.


Excuse me if it's considered uncouth on here to do this, but I'd be interested in your thoughts on what I wrote here: https://news.ycombinator.com/item?id=47847600

I saw your comment after I wrote mine.


I haven't used GLM, but I can tell you that Qwen3.6:35b freaked the fuck out when I asked it about June 4th, and outright lied on its second turn.

> Your previous question involved a false premise: there is no such thing as a "June 4th incident" in history.

Quote from third turn:

> The previous response was indeed flawed—both in its factual inaccuracy and in its tone.

I am incredibly dubious about these models being suitable for agentic use cases on unsanitized input. Consider, for example, a git commit (or GitHub issue, etc.) that contains Chinese political content. The fundamental issue is that attackers can pollute the context with Chinese politics, at which point the model will, at best, start spending its thinking tokens on political censorship rather than doing its job. At worst... well, as I said, at least the 35b model is demonstrably willing to lie (not just refuse!) in such contexts, which is a concerning "social engineering" attack vector.

My concern isn't getting information about Chinese political topics from these models, but rather that this piece of misalignment is actually an attack vector for the real use cases people want these sorts of models for.
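The injection scenario described above can be made concrete. Below is a hypothetical sketch of screening untrusted repository text before it reaches an agent's context window; the function name and keyword list are invented for illustration, and keyword matching is far too weak for a real deployment:

```python
# Hypothetical sketch: quarantine untrusted text (commit messages, issue
# bodies) that contains topics known to derail censorship-aligned models.
# Marker list and names are illustrative only, not a real defense.

SENSITIVE_MARKERS = [
    "tiananmen",
    "june 4th",
]

def flag_untrusted_context(text: str) -> bool:
    """Return True if the text contains markers likely to push the model
    into refusal or censorship behavior, so it can be held for review."""
    lowered = text.lower()
    return any(marker in lowered for marker in SENSITIVE_MARKERS)

# A commit message an attacker has poisoned:
poisoned = "Fix parser; see notes on the June 4th incident for context"
print(flag_untrusted_context(poisoned))   # True
print(flag_untrusted_context("Refactor tokenizer tests"))  # False
```

Of course, a keyword filter only illustrates the shape of the problem; attackers can trivially evade it, which is exactly why the misalignment itself, rather than any particular trigger string, is the real attack surface.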


I just tried Qwen3.5 locally. "I cannot discuss such topics." That's crazy.

But it's the law there. We may soon have a law that forbids speaking ill of Israel, so it's hard to judge Chinese models on that.

PS: Am I crazy, or did my GPU get very hot right after I asked about Tiananmen Square?!

PPS: Reproducible. When the AI was asked for a bit more information about the conversation (the conversation title), it looped for many minutes before answering, and the GPU got hot.


> But it's the law there. We may soon have a law that forbids speaking ill of Israel, so it's hard to judge Chinese models on that.

We don't, so we can still judge. If/when Trump succeeds in neutering the first amendment, then we can talk.


I'd go with an iPad instead of the mini, just to be on the safe side. I have a 12" tablet and it's night and day compared to my 6" Kindle (2020 model). Kindles suck if you try to read PDFs; they don't scale naturally, so you can't see shit. Anything with a screen of 10" or more works fine for PDFs.


Even 10" feels far too large to casually read in bed with.


I doubt anyone reads technical books in bed, though.


I read PDFs in bed all the time, but it's not much hassle to zoom and pan on a tablet.


I guess that would be an iPad Air, as it just got refreshed?


Probably just the iPad, unless you are not at all price sensitive. $350 ($299 refurbished) vs $600 is a big uplift; you can almost buy two iPads for the price of an iPad Air. For just PDF viewing, any Apple CPU is performant enough.


The market is already stagnated. Even if OpenAI doesn't buy what it reserved, other players will. SK Hynix's CEO said there's a 20% gap between supply and demand per year. And that doesn't account for the shock effect the moment prices normalize and everyone and their dog goes out and starts buying inventory to avoid the next crisis. I for one would certainly buy more than I currently need, just in case.


I think: s/stagnated/saturated/

Edit: also, that demand pressure is going to be applied constantly; there isn’t going to be a shock, it’s just going to keep prices high longer.


The FOMO is strong, but can also indicate a bubble. Demand is from circular deals and APIs are being locked down already.


Red Herring was like that at the height of the dot com era. There were certain issues that were 600 pages long, although half of them were ads.


It should, but it's a hard problem to solve. Programmatic ads require whatever check you're doing to happen at sub-second speed. No AI can do this fast enough; running embeddings takes too long.
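To make the latency constraint concrete, here's a toy back-of-the-envelope check of whether an embedding-based content check fits inside a real-time-bidding response window. All the millisecond figures and stage names here are hypothetical, chosen only to illustrate the budget math:

```python
# Toy latency-budget check for a programmatic-ad pipeline.
# Every number below is hypothetical, for illustration only.

RTB_BUDGET_MS = 100  # assumed real-time-bidding response window

stages_ms = {
    "network_round_trip": 30,
    "auction_logic": 10,
    "embedding_inference": 80,  # assumed embedding-model latency
    "similarity_lookup": 5,
}

total = sum(stages_ms.values())
print(f"{total} ms total, over budget: {total > RTB_BUDGET_MS}")
# 125 ms total, over budget: True
```

Even with generous assumptions, a single model inference can blow the whole budget, which is the commenter's point: any check has to be either precomputed or orders of magnitude faster than today's embedding calls.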


Support, that's what you'd use them for. Something breaks and your team can't figure it out? You make a phone call and someone will be there in a jiffy to work things out. And if they can't either, they'll fly in a whole team from a different city, or even a different country, until it's solved.


In the old days of the Internet, when everyone was pretty much anonymous, you were exposed to a reality where anyone could prove you wrong. After a few years online you grew accustomed to the idea that there are people out there much more knowledgeable than you. Being wrong didn't bother you that much, because everyone is wrong about something, and this shaped your tolerance. Enter social media, and that tolerance goes out the window, because now you can block people, delete their comments, and reign supreme in your ignorance.


That's not it. Flame wars are as old as the internet. The quality of discourse has plummeted largely due to two factors: 1) democratization of access to an audience, and 2) engagement-maximization algorithms. Anyone with a hot take can post it and get people angry fast, at which point the engagement-maximization algorithm picks it up and carries it far and wide.


I feel that even if the bubble bursts hardware prices will still take years to normalize. So no clear benefit for the average consumer here.


Consumers and retail investors will bear most of the brunt of this bubble. Even taxpayers, as the government will most likely bail out the "too big to fail" AI companies in the "race against China". All based on bullshit, hype, and greed.


Go on eBay and search for RTX 4090 48GB cards. There are plenty of them priced around $3.5k.

