Hacker News | layer8's comments


> The only exception is when there would exist some digital documents that would completely replace some traditional paper documents that have legal significance, like some documents proving ownership of something, which would be digitally signed, so forging them in the future could be useful for somebody, in which case a future-proof signing method would make sense for them.

This very much exists. In particular, the cryptographic timestamps that are supposed to protect against future tampering are themselves currently using RSA or EC.


Yes, though we do know how to solve this problem by using hash-based timestamping systems. See: https://link.springer.com/article/10.1007/BF00196791

Of course, the modern version of this is putting the timestamp and a hash of the signature on the blockchain.
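A toy sketch of the hash-linking idea (illustrative only, not the exact Haber–Stornetta construction; the record format and field names here are made up): each new timestamp digest commits to the previous one, so altering any earlier document changes every later digest in the chain.

```python
import hashlib

def link(prev_digest: bytes, doc: bytes, ts: str) -> bytes:
    """Chain a new timestamp onto the previous digest.

    The digest commits to the previous link, the timestamp string,
    and the document hash, so no earlier entry can be altered
    without invalidating all later links.
    """
    record = prev_digest + ts.encode() + hashlib.sha256(doc).digest()
    return hashlib.sha256(record).digest()

# Build a two-entry chain from a fixed genesis value.
genesis = b"\x00" * 32
d1 = link(genesis, b"contract.pdf bytes", "2024-01-01T00:00:00Z")
d2 = link(d1, b"deed.pdf bytes", "2024-01-02T00:00:00Z")
```

Note that nothing here depends on RSA or EC: the security rests only on the hash function, which is why such schemes are considered more future-proof against quantum attacks.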


> you have to cut off previous technologies because virtually everyone's moved to something better.

It’s hard to argue that having to manage a smartphone and its ever-changing apps and UI flows for purchasing and handling tickets is simpler than buying a paper ticket with paper money. Is it really better?


It's better for the company, not the customer.

This. It's just another form of hidden inflation at play.

Smartphones, appification, and self-service are usually a downgrade from the immediately preceding solutions for everyone except young folks who are money-poor and time-rich, and so think nothing of wasting the latter. But this state flips for most around the time they start their career, or at the latest when they start families.


I really appreciate that sentiment, but on the other hand 98% of the books I buy I won’t read a second time (because reading a new book will almost always trump rereading an old one), so I’m actually fine with not owning most of them, especially at $1.99 prices. The few that I deeply care about I buy a physical copy of.

In TFA, the analysis shows that the customer is using more tokens than before, because CC has to iterate longer to get things right. So at least in the presented case, “dialing down the power” appears to have been counterproductive.

Microgpt isn’t a product either. Are you saying that differences between cool projects aren’t worth thinking and conversing about?

> Identifying a criminal is ethical.

I agree that “doxxing” is being misused in TFA, but criminals have privacy rights like anyone else. Violating these rights requires specific justification, it’s not automatically ethical.


They put the person on a wanted list.

My comment isn’t about this specific case. It’s about the general claim.

I mean, "doxxing" is totally incorrect. Say, for example, a person was filmed near a crime scene. Even though the police know they weren't directly involved, there is no violation of privacy in the US if the police post their picture and ask them to come forward, or even later find out their name and look for them publicly.

The term GPU was first coined by Sony for the PlayStation with its 3D capabilities, and has been associated with 3D rendering since. In some products it stood for Geometry Processing Unit, again referring to 3D. Purely 2D graphics coprocessors generally don’t fall under what is considered a GPU.

It has been associated with 3D rendering, but given that things like the S3 86C911 are listed on the Wikipedia GPU page, saying "Accelerated GUIs don't need GPU" feels like attempting to win an argument by insisting on a term definition that is significantly divergent from standard vulgar usage [1], which doesn't provide any insight to the problem originally being discussed.

[1] Maybe I've just been blindly ignorant for 30 years, but as far as I could tell, 'GPU' seemed to emerge as a more Huffman-efficient encoding for the same thing we were calling a 'video card'


I don’t agree with what you state as the vulgar usage. “Graphics card” was the standard term for a long time, even after graphics cards generally carried a (3D) GPU. Maybe up to around 2010 or so? There was no time when 2D-only graphics cards were called GPUs, and you didn’t consciously buy a discrete GPU unless you were interested in (3D) games or similar applications.

In the context of the discussion, the point is that you don’t need high-powered graphics hardware to achieve a fast GUI for most types of applications that WPF would be used for. WPF being slow was due to architectural or implementation choices.


That's the real takeaway: WPF should have degraded gracefully (read: full-speed performance without the bling), but it didn't.

Nonsense. Do you read and write your email using the command line? I use Mutt and Vim for that, and that’s not the command line. GUI with power-user support is just as efficient as Mutt and Vim. Did you use curl to read this thread and submit your comment? I use Firefox with Vimium C, which allows most web pages to be navigated and operated efficiently by keyboard.

Wait, mail clients other than mutt exist?

Some have, but it depends on the profile used, and also on the country: https://meta.wikimedia.org/wiki/Have_the_patents_for_H.264_M...

