Hacker News

Spot on. Note that the issues are different (and not as bad) for Reddit and HN type things.

> It ought to be able to work out who your close friends are, and what kinds of things you normally click on, surely?

I wish this didn't even seem right to anyone, even from the beginning. There's a difference between happening to click on or being curious about something once and wanting to see more of it. If I go to a movie, that's not evidence that it's my sort of movie; I don't know if I'll like it until I see it! Priority really has to be on how I evaluate things and my explicit desire for more, not on reading things into the fact that I was open to something in the first place.

With the what-I-click approach, one day I am prompted for some reason to click one sort of thing… then I'm shown that more the next day… then one pretty-arbitrary starting point is turned into a defining filter for me forever. This reality means that simply clicking something out of curiosity threatens to define you and your experience for years.

The ethical way to handle this is to do some mix of (A) giving control back to readers over what they explicitly choose to follow (not whatever they happen to click, and not even what they "like", because liking should not equal following) and (B) doing the opposite of bubbling by actively inserting some mix of stuff they don't usually click, i.e. novelty, so that people are actually exposed to new ideas and perspectives they might otherwise not even know exist.
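To make (A) and (B) concrete, here's a minimal sketch of what such a feed builder could look like. All the names (`build_feed`, `followed`, `stories_by_topic`, `novelty_ratio`) are hypothetical, not from any real system; it just illustrates drawing most of the feed from explicit follows and reserving a fixed slice for topics the reader never chose:

```python
import random

def build_feed(followed, all_topics, stories_by_topic, size=10, novelty_ratio=0.2):
    """Blend stories from explicitly followed topics with deliberate novelty.

    followed          -- set of topics the reader explicitly chose to follow
    all_topics        -- full topic catalog
    stories_by_topic  -- dict mapping topic -> list of candidate stories
    """
    n_novel = int(size * novelty_ratio)
    n_followed = size - n_novel

    # (A) Reader control: draw most of the feed from explicit follows only,
    # never from incidental clicks or "likes".
    followed_pool = [s for t in followed for s in stories_by_topic.get(t, [])]
    feed = random.sample(followed_pool, min(n_followed, len(followed_pool)))

    # (B) Anti-bubble: fill the reserved slice from topics the reader
    # does NOT follow, so novelty is guaranteed rather than filtered out.
    novel_pool = [s for t in all_topics if t not in followed
                  for s in stories_by_topic.get(t, [])]
    feed += random.sample(novel_pool, min(n_novel, len(novel_pool)))
    return feed
```

With `novelty_ratio=0.2` and a feed of 10, two slots always go to unfollowed topics, no matter how the reader's click history looks.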



Your general concern is well taken, but clicking on something once will not "define you and your experience for years." The system will respond to your initial click by introducing similar content to your feed, but if you don't click on those stories, the effect will be transient. Failure to engage is itself a form of feedback, for better or worse.
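The transient-effect behavior described here is often modeled as a per-topic interest score that decays on every ignored impression. This is a hypothetical sketch (the function and constants are illustrative, not any platform's actual logic) of why one stray click fades if you don't keep engaging:

```python
def update_interest(score, clicked, boost=1.0, decay=0.7):
    """One update step for a per-topic interest score.

    A click boosts the score; an ignored impression only decays it,
    so failure to engage acts as negative feedback over time.
    """
    return score * decay + (boost if clicked else 0.0)

# One curiosity click, then five ignored impressions:
score = update_interest(0.0, clicked=True)   # score becomes 1.0
for _ in range(5):
    score = update_interest(score, clicked=False)
# score has decayed to 0.7**5, roughly 0.17
```

Under this model the complaint in the parent comment corresponds to a decay factor that is too close to 1, or to the score being re-boosted by signals the user never gave.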


I don't doubt it works that way sometimes... but man, somewhere along the line I googled a Nebraska Cornhuskers football score, and Google brings it up again and again and again... even when I give it feedback that I don't want that information, it comes back later on.

I feel like I've seen this behavior on other sites and systems as well. I've no doubt the prioritization initially works, but there seem to be other factors at play that keep bringing up old data.

In my case my team is not nearly as popular as Nebraska so I suspect the logic very roughly ends up something like "Ok he watches college football and oh hey this Nebraska story is really popular and stories about that team are big now so here ya go..."


After a university in another state sent me an unsolicited email offer (never opened), my google feed started showing me everything about that school, even going as far as setting up notifications for its class registration calendar. The feed started to loop in college football and greek life news, which are incredibly off base for my actual and actively searched interest history.

I spent a year slowly cultivating a pretty decent feed of relevant content on my Pixel, and it went full Netflix 2010 seemingly overnight.

Early Netflix suggestions, no matter how selective and consistent your selections over many years, could be instantly subverted by your lonely sister-in-law getting on your account one Saturday night and watching a few foreign language romance flicks. After that, you'd never really stop getting recommendations for "movies starring sexually aggressive male leads" or "films with actors who look like Antonio Banderas".

I think I'm done being used to train everyone's algorithms.


Heaven forbid you entertain your 6yo niece for a day, and find she's polluted your Netflix with the most insanely numb drivel that passes for animation today. It takes months to expunge.

What's wrong with Netflix anyway, that one off-topic movie can sway your 'preferences' so drastically?


I find Netflix doesn't recommend anything based on my behavior at all anymore. Now literally the first 5 or so rows of content on the home page are a random smattering of "originals". But they definitely do select the title image based on my past behavior. So, not helpful.


I hear ya man. These algorithms seem to fall into stupid uselessness after just one error or prioritization change... like code normally does... but it is telling that the "intelligence" behind these things is still ultra fragile: as far as the user is concerned, one bad signal and they simply cease to function.

HAL 9000 failed because he was told to lie... You'd think prioritization would be easy from the outside. Granted, it was a story, but it seems to fit: a minor change results in a collapse of function.


Yeah, and it goes beyond that. What if you weren't totally opposed to continuing that interest? If you're half interested in the next article, the whole thing gets reinforced. You find yourself genuinely half-interested on an ongoing basis, and this becomes part of your life.

It's one thing when something you don't care about keeps showing up. It's another to think, "I'm actually interested, but if I click this once, I'm liable to commit it to a permanent place among the half-interesting things that fill my life from here on!"


Amazon is probably better placed to make an informed decision, since it's most likely the products you look at but don't buy that you're actually interested in.

Unfortunately, in reality that doesn't seem to be the case. Last week I bought some wireless headphones, and now my feed shows a bunch of those. It also keeps showing me books by an author I bought a series from but never finished (on Kindle). And a selection of Spanish books that my wife needed once for her studies. And, for some reason, a load of large kitchen appliances.


Apple News actually follows the model that you describe as the ethical way to do this.

In my experience, it produces better and more pleasing recommended content in its feed thanks to this control.

Inference models such as YouTube's always feel like you are being pushed into certain content, which actually makes me personally hesitate to sample or click on certain videos.

In the YouTube model, simply sampling a single video starts up the recommendation engine pushing similar videos, and as a user you feel railroaded into being profiled.


EXACTLY!

And thanks for the note about Apple News. I must admit that, having left Apple over their walled-garden iOS direction (turning instead to GNU etc.), they have stuck to a higher road in a lot of regards compared to the other big companies. They're still nowhere near as awful as Google, Facebook, or Microsoft... if only Apple weren't sabotaging copyleft and hadn't pioneered the in-app advertising, sales, and tracking that are actually so much of what's wrong...




