Hacker News

>Something weird happened over the past decade.

I blame it on social media monopolies, or at least effective monopolies, oligopolies, and the network effect. Once everyone was sucked into these silos and unable to leave, the platforms could do whatever they wanted with the user experience without repercussions. And once everyone was used to it and took it as a given, it was only a matter of time before it leaked into desktop apps and even Windows itself, and people just accepted it. What could reverse the user-hostility trend is competition, which would require breaking the oligopolies and the network effect, which in turn means the rise of the Linux desktop or Fediverse platforms; that's going to be a while, but not impossible. And of course there are limits to how far they can degrade the user experience before they start losing users, so I don't think it can get much worse than it is now, but we're still stuck with the status quo.



I blame it on web development, pure and simple. The relentless drive for analytics and advertising is what brought the acceptance of spying.


I blame marketing departments being given too much (read: almost any) control over product development.

No developer cares about the level of analytics being pushed today, and unless they're profit sharing they probably don't care about the ads either.

Those anti-features are there because marketing departments want them there and have enough power to get what they want.

Don't let marketing make product decisions.


Back up even farther, though - SO MANY PRODUCTS are literally indistinguishable outside of marketing.

I think the issue is much deeper than "don't let marketing make product decisions".

Browse through any app store - click a category, and it's a sea of apps that provide essentially the same capabilities.

Just like your grocery store has a sea of jars filled with slightly varying salsa.

So take the diagram in the article:

Customers asked for it: Check.

Customers would benefit from it: Check.

We built/tested/shipped it: Check.

What's the missing step? Did ANYONE fucking buy it?!?!

And it turns out none of the other steps actually matter compared to the last one, if the goal is to remain a functioning company.


> Customers asked for it: Check.

> Customers would benefit from it: Check.

The key is in what the word "it" means. The answers are positive if by "it" you mean "this category of product". They may very well be negative if by "it" you mean "our particular product".

The customers want a jar of salsa. They don't want, and never asked for, your particular variation of a jar of salsa, essentially identical to 10 other variations except for a differently designed label.

Another tricky bit is in the "asked for" part. For most products on the planet, customers don't really ask for anything. The market isn't structured this way. Products are just dumped on the market, and those that sell survive. This is wasteful, but has some benefits. It would just be better if marketing wasn't there to meddle with things, artificially sustaining more variations of a product than needed.


I'm gonna use one of those examples that irks me personally. Do we really need 19 different formulations of fabric softener available in another 15 different scents from 12 different manufacturers?

I'd say it's a solved problem. Make - dunno - 3 different formulations (normal, sensitive, extra soft?), and provide the scents as small and separate ampules, and let the user mix and match the apple and cinnamon, and make the place smell of a bloody apple pie after a load of laundry for all you care. People are usually already familiar with those, and have their preferences, plus it would even be fun to experiment, and make one's own blends.

Anyway... what I wanted to say is that I hate having too much choice (especially artificial choice), and I specifically hate choice paralysis.

And I'm writing this as a European. Finding myself in search of a cereal and ending up in a typical USican giant aisle consisting of nothing but cereals is my idea of a nightmare. I thank the FSM for Lidl's existence.


Customers didn't ask for an iPad, but they sure bought the hell out of it.


They asked for the PADD from Star Trek: The Next Generation, and the iPad was the closest the market could give them.


The underlying cause here is an unbounded and often unregulated profit motive, together with the farce that competition self-regulates. At some scales competition does self-regulate, but at the scale we see now, it simply doesn't. There are too many barriers to entry and too strong a foothold in existing markets.

As a result it trickles down: how can we improve our revenue stream? More data, more ads, more nickel-and-diming of consumers. How can we lock down control of this product/service, charge more for the same, and even more for less?

Developers are, in my opinion, just along for the ride, not so much making these decisions as allowing and enabling them to happen. In the world of professions, software engineering pays quite well, and it pays well for a variety of reasons. People take lucrative positions and decide, reasonably, that what they're being told or pressured to do isn't that bad. It's not like the Holocaust, where they'd be turning a blind eye to genocide; they're turning a blind eye to corporate, monopolistic, and oligopolistic market abuses, because at the end of the day they get to live comfortably.

I often develop garbage I don't agree with. I reduced my comp level to have more leverage to haggle against questionable practices, but even then I still have to do some questionable things. For developers it's a choice between following along and being paid well, or taking a hit and working somewhere that pays a bit less but doesn't produce hostile products. I have nothing against those who choose to enable these business practices, because they're building financial security in a world we've created that says these practices are OK. Businesses are sort of doing the same, but they're more proactive in shaping the policy that allows these practices, so they bear real responsibility here. Consumers have a responsibility as well, by continuing to buy garbage they don't need that uses these practices. Voters have some responsibility for electing politicians who bend to the will of businesses to allow deregulation, or to prevent regulation of these practices. Politicians have blame for the ethical flexibility that lets lobbyists and businesses incentivize them to represent businesses more than their voter base.

We have a mess on our hands, with everyone carrying a little bit of blame, but the biggest responsibility, I believe, falls on large businesses and the capital holders behind them, who set most of this in motion.


If you build your culture on an ethic of competitive individualism, this is what you get.

Hardly anyone is really happy. Not even those with huge piles of money.

They're comfortable and (largely) immune to everyday threats. But the system as a whole continues to be made of traps and sharp edges. And a lot of people fall through them, never to be seen again.

Not a few were convinced it couldn't possibly happen to them, until it did.


As a career front-end developer, I take offence at this. I have never argued in favour of any of the shit people face on the web on a daily basis. I'm close to wanting to get out of the industry because the product is so toxic these days. The people are largely great, and I love my current team... but every month we're told to add more tracking, or advertising (from Google, of all people). I could leave over political differences, but where am I going to go that's different (in London)?

Biz: We need to track our users. Stick GA on it.

Me: We could use a privacy-friendly alternative which brings the data in-house. It would lower our GDPR burden so our cookie notices would be simpler, and at the same time make it easier to link our user data with other metrics (I work in streaming video at the moment).

Biz: GA is free.

Me: Longer term, our overall cost of development will be lower because the complexity will be lower, and you won't be leaking data about your customers.

Biz: But GA is free and works out of the box, with more analysis than we'd use.

Me: Do you see how that actually makes it more complex, over-engineered and unfit for OUR purposes? It's also a dog for the data people to use, and they will ask for a different tool because they can't change GA.

Biz: It's free. The deadline is three weeks.

Biz: We want to make more money, so we'll sell advertising.

Me: OK, but content-based advertising would give us more control over what we get linked with, doesn't track users, lets us set our own pricing, and lets us sync better with our own content (because presumably we'd be able to control the manifests better).

Biz: But GA gives us an admin panel and we don't have to think about it.

Me: But the integration will take months, and half of it's out of our hands because Third Party.

Biz: Here's the admin key they gave us…

(OK, so I didn't actually have these conversations, and TBH I only learned the detail of sharing manifests with a third party after I joined the team... but you get the idea.)
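For what it's worth, the "bring the data in-house" argument doesn't require much machinery. Here's a minimal sketch, assuming only the standard library; `PageviewCounter` and its methods are hypothetical names, not any real product's API. The point is that aggregating counts per page per day, without ever storing IPs, cookies, or user IDs, covers a lot of what teams actually look at in GA, while keeping the GDPR surface tiny.

```python
from collections import Counter
from datetime import datetime, timezone
from typing import Optional


class PageviewCounter:
    """Aggregates pageviews per (path, UTC day); stores no user identifiers."""

    def __init__(self) -> None:
        self._counts: Counter = Counter()

    def record(self, path: str, when: Optional[datetime] = None) -> None:
        # Keep only the path and the day -- no IPs, cookies, or user agents.
        when = when or datetime.now(timezone.utc)
        self._counts[(path, when.date().isoformat())] += 1

    def views(self, path: str, day: str) -> int:
        return self._counts[(path, day)]
```

In a real deployment this would sit behind a first-party beacon endpoint and persist to your own database, but the shape of the data stays the same: counts, not people.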


Don't take a critique of the job title as a slight against you personally. We, frankly, don't know you.

And as a whole industry, front-end devs have implemented atrocious dark patterns and all manner of disgusting anti-user choices. Have you? Only you can answer that, but I sincerely don't care about your personal choices. This discussion was never aimed at "the_other".


Honestly, I knew that it wasn’t meant at me. But I am DEEPLY frustrated at the lack of power I have in these situations.


I understand that you may feel affronted (and I apologise for making you feel that way), but I'm actually criticising web development as an industry and as a set of broad trends, rather than individual developers. Sometimes developers may have leverage, but in my experience in backend dev, that's rarely the case.


Start your own company


And yet almost every example of malware in the article is a native app.


Electron is not native


It's more native than a web page.

Is an app in C# not native? Or Lua? Or Python?


No it is not. From https://en.wikipedia.org/wiki/Native_(computing):

"In computing, native software or data-formats are those that were designed to run on a particular operating system. In a more technical sense, native code is code written specifically for a certain processor.[1] In contrast, cross-platform software can be run on multiple operating systems and/or computer architectures."


If it runs in a DOM, it's not native. Native (to me) means it uses a layout engine that's significantly different from a web browser's, uses native controls relentlessly, and integrates well with the OS. Electron cannot do that. Qt, C#, Lua and Python can.


I blame Javascript and the "hipster" devs that fueled its rise unnaturally.


I've never worked with a Javascript developer who was responsible in any way for their employer's adtech and dark UX patterns strategy. In my experience, those decisions are always made by PMs and approved by the board.

Your comment seems like an opportunistic attack on a type of developer you don't personally like rather than something that is rooted in reality.


> for their employer's adtech and dark UX patterns strategy

Well, generally a JavaScript developer is just a frontend developer, so I'm assuming at least some have experienced a PM saying: "too many people are doing X, which we don't want them to do; how do we keep them from doing X?" And the developer then makes helpful suggestions.


> which means the rise of Linux desktop or Fediverse platforms, and that's gonna be a while, but not impossible.

I like to hope you're right, but I don't believe you are.

Most people don't want to learn more about tech. They just want tech to be intuitive enough to pick up and use.

The mobile market appeared because Windows is too big and cumbersome. That's why people advertised Windows as a skill on their resumes. It isn't fun; Windows is a chore to most people.

Nobody puts "experienced Android user" on their resume, because it's expected that just by being alive you should be capable of using every function of an Android device. Very little functionality is hidden away in 20-year-old UIs or shell commands. There is a button for everything, and the button makes sense.

So if Windows didn't stand a chance, there is absolutely zero chance Linux will catch on at the scale it needs to for your post to come true.


Ignoring the fact that Android puts its buttons in basically arbitrary places, the fact is that it's just not a very useful platform for more complex tasks. It can do less, so there's not much to master. Windows and Linux can do more, so there is more to master because they allow more complicated interactions between things inside the computer. In other words, listing an OS on your resume means that you are sort-of competent enough to pull off such tasks at all.


The reason you don't list android user on your resume is that it's very difficult to use android (or iOS) for anything beyond passive consumption of content which is not something employers want.


Also, I think Windows/Office is kind of a leftover from 20-30 years back, when it wasn't fully standard knowledge... so it's an entirely meaningless buzzword used to fill up CVs.


I agree. Based on what I've seen, those who want to move either already have, or are exploring their options.

The average user is perfectly content with what they have, because it requires zero new knowledge and zero extra work. Linux is still anything but that.

Edit: come to think of it, it's a good thing. The web became mainstream, and look what happened to it.


A little too literal, IMHO. There's no reason a FLOSS mobile OS can't exist; Android proves Linux is technically capable.



