Hacker News | brokenkebab2's comments

Frankly, it doesn't look like it's ready to be useful. As an example, I tried "Braze notifications": the first result was about Brave, then two mildly relevant ones, and then a long stretch of "Who's hiring?" threads, which seem to match only on "notifications".


The title mentioning “Brave” seems to be a red herring: there’s someone in the comments talking about Braze, though it looks like a typo. Similarly the “Who’s hiring” posts do actually have job listings for Braze, but you have to click through the More link at the bottom to find them. (Because I load HN directly from a data dump, the search doesn’t know about the pagination.)

I think the main problem here is that my index is relatively small: it has only (!) 30 million pages, and it looks like Braze just isn’t popular enough for me to have run into it with the right keywords yet.


Overgeneralization detected! I live tiny, and spend less than I used to. I'm not in the US, though. But I think the general principle is the same everywhere: many of those who move to alternative homes do it not to save money (especially those from the IT crowd), but for the fun of trying something different, and they happily spend a lot. But some do it to practice self-restraint. Others to keep themselves busy with DIY. Many different goals, many different compromises, all of it leading to different budgets.


In my experience, guests fiddling with their phones doesn't depend on QR menus.

Also, waiters can be helpful and friendly even when there are no paper menus. In my current city most restaurants are QR-enabled, and they serve food as nicely (or not) as the paper-based ones. It's exactly the same experience except for how you look at the menu. I've just stopped noticing the difference.

EDIT: Maybe an important note: I'm neither in the US nor in an EU country.


Interesting idea. Though the maximally-simple claim doesn't look warranted to me. It feels more complex than most Forth incarnations.


Right? Its creator obviously knows Forth, but it feels like they've gone to extreme lengths to make it prefix rather than postfix, at the price of making it much more confusing than it needs to be? It's really not obvious to me that if they were both equally complete, there'd be any advantage at all to choosing this over Factor...

(That said, I did a lot of Forth programming in the old days, and while I am utterly intrigued by Factor, I've never managed to write anything useful in it...)


The advantage is that the language isn't stackful.

This is about more than just underflow: there isn't a stack at all. An arity mismatch is a syntax error, not a mistake in reasoning about the program state.

I'm not sold on Om's solution, I doubt the author was either since it's frozen in time. I have long wondered about a good prefixed concatenative language, without really getting anywhere. Om seems too much like inside-out Joy, although I'm not sure I could justify that impression in detail.


Stackfulness and the possibility of stack underflows are not actually constrained in any significant manner by whether or not a language uses prefix or postfix notation, and it would be more-or-less trivial to reject arity mismatches in a postfix language as well.
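The claim that rejecting arity mismatches in a postfix language is more-or-less trivial can be sketched concretely: if each word declares how many values it consumes and produces, a single linear pass over the program can catch underflow and leftover values before anything runs. A minimal Python illustration follows; the word table and its arities are made up for the example, not taken from any particular language.

```python
# Minimal sketch of static arity checking for a postfix (concatenative)
# program: each word declares how many values it consumes and produces,
# and one linear pass tracks the simulated stack depth.

# Hypothetical word table: name -> (values consumed, values produced).
WORDS = {
    "dup":  (1, 2),
    "swap": (2, 2),
    "+":    (2, 1),
    "drop": (1, 0),
}

def check_arity(program: str, expected_results: int = 1):
    """Return None if the program is arity-correct, else an error string."""
    depth = 0
    for token in program.split():
        if token.lstrip("-").isdigit():   # numeric literals push one value
            depth += 1
            continue
        if token not in WORDS:
            return f"unknown word: {token!r}"
        consumed, produced = WORDS[token]
        if depth < consumed:
            return f"{token!r} needs {consumed} value(s), only {depth} available"
        depth += produced - consumed
    if depth != expected_results:
        return f"program leaves {depth} value(s), expected {expected_results}"
    return None

print(check_arity("1 2 +"))      # ok: returns None
print(check_arity("1 dup + +"))  # underflow at the second '+'
```

Of course, real concatenative languages complicate this with combinators and row-polymorphic stack effects, but the basic depth-tracking pass shows that postfix notation per se doesn't prevent the check.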

The one real advantage that prefix notation has is that it is familiar, since it is so widespread in mathematical notation. Though that particular advantage would optimize the experience of using a language for people new to programming or that particular language, whereas it might realistically make more sense to optimize for the experience of people already familiar with a language, since those will constitute the vast majority of development time in any serious language, and I would argue that postfix notation has an advantage there in most domains.


That happens too often with Forth. Someone gets interested, thinks it would be sooo awesome to add this and that "modern" feature, and then they end up with something that combines the disadvantages of Forth with the complications of what they've added. An oddity nobody wants to use, and yet another bit-rotting carcass in some CVS.

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration [1]

I think that Forth only works for people who started with assembly programming. Someone who started with a high-level language has suffered the sort of brain damage that makes them unable to feel complexity even when it coils around them and gives them a hard time breathing.

[1] https://www.cs.virginia.edu/~evans/cs655/readings/ewd498.htm...


> I think that Forth only works for people who started with assembly programming. Someone who started with a high-level language has suffered the sort of brain damage that makes them unable to feel complexity even when it coils around them and gives them a hard time breathing.

What I find interesting about Forth is that it is a language that is simultaneously a high level language and a low level language. It encourages the programmer to write high level programs but if they need to write low level assembly routines for maximal performance, they can (though they may need to write their own Forth interpreter to do so).

I believe that the viral spread of c and unix is possibly the worst thing that ever happened to computing. There is a quote attributed to a physician named Vincent Felitti in the context of addiction: it's hard to get enough of something that almost works. This is the story of c. It is neither a high level language nor a low level language and is thus the worst of both worlds. We've added a staggering amount of complexity to work around the limitations of c, which have also infected most of our other supposedly high level languages, e.g. java, go, rust, etc. To your point though, because we know no other way, we can't even see that much of this complexity is largely unnecessary.

Much of this goes back to Gabriel's observations in the rise of worse is better: https://www.jwz.org/doc/worse-is-better.html. What I think Gabriel doesn't fully articulate is that worse is better is an essential feature of market capitalism. The most profitable customers are addicts. Most of the time, you can't just sell them something that is complete garbage. The best thing to sell is something that is almost good enough and that the customer is not empowered to improve themselves. Then they have to keep coming back to you for updates. Why would you sell a customer a complete program that does exactly what it is meant to do and nothing else when you could sell them something incomplete that you can keep repeatedly selling back to them?

This is the water that we swim in, and it sure seems to me that we're all having a hard time breathing.


I’ve sold many an incomplete product and made money from improving it over time.

From my perspective, exploiting addicts and capitalism never entered into it; that’s just how MVP and customer-focused development work. Making money from the MVP validates the general direction and developers are incentivized to improve utility to generate additional cash flow.

Projects that seek to deliver a complete and optimal solution day one generally fail because the creators don’t have a complete and optimal understanding of customer need. Plus it takes forever and creators need to eat while working.


We have a Factor Discord if you want some help starting out!

https://discord.gg/QxJYZx3QDf


Wow, somehow I thought Factor died. I know Slava stopped working on it several years ago.


How is this important for a project which is obviously an experiment in the PL field? You're either interested in such things or you're not; nothing else applies.


I don't know, maybe someone wanted to try writing a program in it, which seems not irrelevant to programming languages. To most people, even an experimental programming language isn't just an objet d'art; it's more like an ornate tennis racket, and it doesn't look like this one will ever be strung.


Thank you. Aptly written. Even if the idea is fantastic, if not much happens for years, it's almost safe to assume the product is sort of dead. But it kind of depends! An old academic language like Standard ML might not see a lot of action on its repo (though it actually gets updated frequently), but I'd still use it, as it forms the basis of several successor languages.


Just because it hasn't seen activity in awhile doesn't mean it can't be used at all.

Sure, you're not going to want to start up a major project with a long-abandoned language experiment, but frankly it's not much of a step down from even an active language experiment.

I suppose it's got a greater chance of being harder to get going, but so much of that depends on how it was built in the first place that simply being older isn't enough information to make much of a guess on that either.


The home page is careful to point out that it lacks a lot that would be needed to make it capable of doing interesting and useful things:

> …the software is currently at a very early "proof of concept" stage, requiring the addition of many operations (such as basic number and file operations) and optimizations before it can be considered useful for any real-world purpose.


> objet d'art

Why French?


Why do you think it's French? The term has developed a distinct meaning among speakers of English. It has slightly different connotations than alternative phrases, as the sibling comment points out.

It may be derived from French, but it has a distinct meaning in English. You might as well ask why someone is using French when they say they're going to see a ballet. It's a feature of English (and really, most languages) that foreign words and phrases are often imported and given connotations distinct from both the meaning in the language of origin and from other words or phrases in English. Once that adoption is widely recognized, is it really not English?


That's a reasonable question, and one I asked myself. I considered some options:

- "Piece of art" didn't communicate engagement with the item as art

- "Work of art" is often used to mean a masterpiece

- "Artwork" is a neologism of the sort I avoid

Finally, "objet d'art" communicates appreciation as art, but not necessarily any artistic quality, and it's a phrase people use in this context.

While we're asking irrelevant questions, why are the majority of your comments questions and criticisms of trivial matters of wording? Obviously, we agree that words are important, but it's equally important to engage with the intended meaning of those words.


The website mentions that it is "not complete" and that "the intent is to develop it into a full-featured language".

This is pretty different from your characterization as a finished experiment.


>System wide function keys (play, pause, volume up, etc.) that work everywhere

What's so special about it? The only difference between my Asus running Linux and a Mac in this regard is that on the Mac I have to use the touch strip, which is an inferior experience compared to physical keys.

I also see no discernible difference in many other things you mention like wifi stability.


My WiFi is fine too (as long as you have the right chip!) but the touchpad thing alone almost made me replace my thinkpad with an m1 air.

Now I'm on an unholy win/linux/ios setup and don't really get to use all the Apple integrations, but I have to admit I wish I could, not for gnarly dev stuff, but simply to make my life easier.


I can easily agree that the Mac's touchpad is nice, as are many other pieces of its hardware. macOS, however, is not as good as it used to be (comparatively), because other OSes have advanced significantly while retaining better configurability.

Btw, any touchpad should only be used for limited stretches, say when traveling. If you use it so much that it becomes a decisive factor in your choice, you're voluntarily marching towards RSI.


I appreciate the concern. I mostly use an ergonomic mouse and rarely use the touchpad (the one in my ThinkPad is utter trash anyway, but that's a different story). It's just that for me personally, build quality and UX will always be decisive factors, and I will pay extra for them with no hesitation.

macOS' weirdness is actually what's preventing me from buying into this ecosystem although I love how it looks. Pretty sure it's not a very 1337h4x0r thing to say but I like nice colors and consistent designs more than tweaking configs and writing cursed scripts. YMMV


The macOS UI nowadays feels quite outdated compared to, e.g., default KDE. And I don't even like KDE that much.


I didn't like FB well before it became fashionable, but I don't understand your logic here: what exactly is it that you can't trust WhatsApp to send, but can trust Apple or your mobile carrier with?


Apple's iMessage is secured end-to-end, your mobile carrier can't see the message. As soon as the box goes green then any intermediary, including your mobile carrier and your friend's mobile carrier, can read the message and/or make it available to law enforcement.

In a U.S. court of law Apple has stood up to authority and pointed out they can't provide the messages requested as they're locked even to Apple. Facebook on the other hand has acquiesced to U.S. law enforcement. That tells me that even if Facebook has secured end-to-end messaging, which I don't think they've ever claimed, they have backdoors.


WhatsApp is end-to-end encrypted. WhatsApp actually uses the Signal Protocol, probably the best protocol they could use. It's described in pretty good detail in a whitepaper publicly available on the WhatsApp website: https://www.whatsapp.com/security/WhatsApp-Security-Whitepap...


You got me to look into this further. I was seeing if there was a backdoor - and it appears there isn't. https://signal.org/blog/there-is-no-whatsapp-backdoor/

Given that, I don't see why more of us Americans aren't using WhatsApp! :)


Of course, WhatsApp is not open source, as opposed to Signal (the app). You can never be truly sure the protocol is implemented without backdoors, because you can't verify the code. But I agree, WhatsApp is a pretty solid app.


WhatsApp definitely claims end-to-end encryption. There’s a bubble telling you so at the top of every new WhatsApp thread.


If that's the case, how do they detect if a message has been forwarded too many times?


It's all about metadata, PRISM et al


I don’t know.


I'm not really answering your question, but in the States, there is a lot more trust placed in Apple over Facebook. This isn't that surprising - Apple has a much stauncher stance on privacy than alternatives.

I wish Signal would be more popular, but it's just not. I can't really ask new people I meet to communicate over Signal because it's a burden to get others to install some new application.


DMing on Instagram is super popular in USA. Most of the people don't give a shit about privacy to begin with.


I'm tired of super secure apps that ask for your phone number and even publish it to your contacts. Wake me up when Signal doesn't depend on a phone number any more.


I don't know about GPs logic, but my logic is pretty simple. I don't use any infrastructure that Mark Zuckerberg had his fingers on. That was clear to me after the first time I saw him.


Facebook's goal is to extract as much information from you as possible. Apple wants to sell you a new phone.


It's absolutely not like that. Apple extracts profits in lots of ways, including ones that rely on gathering personal data.

https://news.ycombinator.com/item?id=32539762


I must note that the switch from the novelty phase to the boring phase is a crisis every growing project goes through once it starts to expand its workforce. I've seen it many times in teams with a very average tech stack.


>If I ask you to count the number of red balls in a bag with only 3 yellow balls, then the initial count in your head is 0,

Sorry, no. Humans count from 1. That's just a basic fact reflected in the history of numbers, which at early stages often didn't treat 0 as a number, but as a special case. And if you said "I counted zero red balls," most people would find it unusual wording. The normal way of saying it doesn't involve mentioning 0 at all: "It's empty," "There are no red balls," etc.


That special casing in English of no/none/empty is as much an artifact of lost Germanic grammar cases in English as it is anything "natural" or inherent to how English speakers count.


I'm a member of a multilingual family, and I'm inclined to insist it's not about Germanic grammar cases, because the same is true for non-Germanic languages as well.


I just went "nearest ancestor up the stack" as a short hand, because the evolution of languages is a huge tree and a lot to talk about. If we want to get into it deeper, Proto-Indo-European had some truly fascinating grammar cases from what we think we've reconstructed of them. Most of the stuff that PIE did seems like "natural laws" simply because of how many modern languages we regularly see branched from it and how deeply rooted a tree in the language forest it is. But then we also have had chances to study non-PIE rooted languages and the "universals" are fewer than we think they are.


I mentioned the history of numbers, and it doesn't start with PIE-speaking peoples, so I'm a bit lost as to what exactly your point is.

