
Not sure it would happen that way by choice, but the role can get unexpectedly thrust upon someone, experienced or not (think startups desperate to make ends meet with a handful of inexperienced engineers, or even medium-sized companies: remember back when Google had "20% projects"?). Startups especially don't have the money to go out of their way to hire an overpriced "experienced software architect"...

It goes something like: "Hey, new engineer, I need you to work on experimental feature X." The engineer then builds system Y to provide feature X, probably poorly architected and not scalable, not expecting it to go anywhere. Some months later, the app containing feature X gets distributed to thousands of users, or goes viral and hits millions.

Congratulations, like it or not, you have now become the lead architect of the now-widely-deployed "experimental" system Y, and you are going to either crash and burn or become a damn good software architect by maintaining it out of sheer necessity. Later, you find yourself putting "Lead Software Designer" on your resume, having earned that role and title.

A lot of successful software was originally designed "by mistake" and grew far, far beyond its original purpose. In contrast, there is plenty of well-designed, properly architected software that never went anywhere because it never saw the light of a successful deployment at scale.


I think you've accurately described the vast, vast majority of production code that hasn't been through a second wave of engineers/management and been rewritten. If it works, and requirements aren't changing, then your MVP has become the gold standard.


What future work can we see here? Is this sort of approach really pushing the state of the art, or is it just an attempt to squeeze another few percent out of brotli at the expense of more CPU?

My understanding is the main novel idea here is splitting compression into independent subproblems. Is there potential for this idea to become the basis for all new modern (lossless) codecs (e.g. redesigns of FLAC or PNG)?
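
For the curious, here's roughly what I picture "independent subproblems" meaning, as a toy sketch: split the input into fixed-size chunks and compress each one with no shared state, so the chunks can be handled in parallel. The 1 MiB chunk size and the use of stdlib lzma are my own placeholders, not anything from the actual DivANS design.

    import lzma
    from concurrent.futures import ProcessPoolExecutor

    CHUNK_SIZE = 1 << 20  # 1 MiB per subproblem; an arbitrary illustrative choice

    def compress_chunk(chunk: bytes) -> bytes:
        # Each chunk is a self-contained subproblem: no shared state,
        # so chunks can be compressed (and later decompressed) in parallel.
        return lzma.compress(chunk)

    def compress_split(data: bytes) -> list[bytes]:
        chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
        with ProcessPoolExecutor() as pool:
            return list(pool.map(compress_chunk, chunks))

    def decompress_split(blocks: list[bytes]) -> bytes:
        return b"".join(lzma.decompress(block) for block in blocks)

The obvious cost is that each chunk loses the context of its neighbors, which hurts ratio; presumably the interesting part of the DivANS work is keeping that cost down.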


How does DivANS compare to state-of-the-art compression algorithms? Is Brotli widely acknowledged as the state of the art, or do you just compare to it because it is fast?

My understanding is Zstd was designed to be fast, not necessarily efficient, and the other algorithms listed (e.g. 7-zip, gzip) are quite ancient.


Blog co-author here: with the settings in the blog post, the DivANS algorithm skews towards saving space over speed, while Brotli skews heavily towards decompression speed without sacrificing much ratio. The reason we focus so much on comparing to Brotli is that Brotli does extremely well on the data stored at Dropbox.

However, I don't think the current performance is a fundamental limit; there are clearly some clever optimizations yet to be done.

I was very surprised that the lzma-brotli mashup outperformed both. This leads me to think that, with enough community involvement, we could discover some really clever heuristics and algorithms for compression in general.
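
For anyone who wants to poke at this themselves, here's a minimal sketch of the kind of ratio-vs-throughput measurement involved (not our actual benchmark harness; the corpus path and the quality/preset settings are placeholders). Decompression speed, where Brotli shines, would be timed the same way:

    import time
    import lzma
    import brotli  # third-party bindings: pip install brotli

    def benchmark(name, compress_fn, data):
        # Report compression ratio and compressor throughput.
        start = time.perf_counter()
        compressed = compress_fn(data)
        elapsed = time.perf_counter() - start
        ratio = len(data) / len(compressed)
        print(f"{name}: ratio {ratio:.2f}, {len(data) / elapsed / 1e6:.1f} MB/s")

    data = open("corpus.bin", "rb").read()  # placeholder corpus
    benchmark("brotli q11", lambda d: brotli.compress(d, quality=11), data)
    benchmark("lzma preset 9", lambda d: lzma.compress(d, preset=9), data)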


I think the global reach is the problem here: kind of like how phonebooks, public records intended for the use of the local population, get scraped and sold to ad businesses and credit bureaus.

I'm pretty sure that some of the content posted to Nextdoor could be of public interest, and it would make sense that a local newspaper or TV station may wish to report on it, just as they may report on the minutes of city council meetings or organized events.

I think the issue is when you combine a local newspaper with the global reach of the Internet: something that was meant for a small audience is now shared way outside its original target audience. I sure hope people don't go posting the minutiae of my local neighborhood to BuzzFeed or reddit.

Regardless, this is just for show. This sort of policy isn't going to make any difference. We've seen these sorts of scraping issues in the past with other walled gardens: site owners make a big deal about scraping, shut down their APIs, etc. It hasn't stopped content from being scraped and reported on. Reporters will continue to report; users will continue to make anonymous story tips; and new accounts will sign up after old accounts get locked out.


These platforms have hyper-optimized their products to show as much stuff as possible that people will engage with (share, click), so that they also engage with ads and stay active.

The problem is that there are often stretches when there is no news. "No news" isn't interesting, so it doesn't get shared as much as "news". Take the moments immediately following a disaster. Two types of news articles will show up:

"There was a shooting. We don't know anything yet, but we will keep you posted". Boring.

"We know who the killer is. Bobby Bobertson done it". Woah!

Now imagine you are a heavy social media user. You don't want to stay silent on such a big news story, so you want to share something showing you are engaged. Which story do you share?

(edit: removed duplicate "the problem is")
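
To make the incentive concrete, here's a toy ranking sketch (my own simplification with invented weights; no platform's real formula):

    # Toy engagement-ranked feed; the weights are invented for illustration.
    posts = [
        {"title": "Shooting reported; no details yet, updates to follow",
         "shares": 40, "clicks": 2_000},
        {"title": "We know who the killer is",
         "shares": 5_000, "clicks": 90_000},
    ]

    def engagement_score(post):
        # Only predicted engagement enters the objective;
        # whether the story is accurate never does.
        return 2.0 * post["shares"] + post["clicks"]

    for post in sorted(posts, key=engagement_score, reverse=True):
        print(f'{engagement_score(post):>9.0f}  {post["title"]}')

Nothing in the objective rewards being right, only being engaged with.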


This comment is absolutely right. The algorithms are not optimized to inform people, they are optimized to maximize the number of ads people click on while consuming whatever content the algorithms promoted to the top.

This is at best a conflict of interest when it comes to content that ought to be viewed as news. But there is significant pressure these days to blend news with entertainment (which simply means engagement, which entails ad sales).


I was never so happy to stay off social media as I was last night. I was awake, but not online, for about 2 hours after the LV shooting. My phone had a headline about 2 people dead when I went to bed. My wife woke up at 5am and shared the news with me when I woke up. By the time I engaged with the news sites, it was 8am this morning and most of the initial bullshit had been filtered away. Saved myself from a lot of unnecessary anxiety.


Science can't tell you whether something is "natural" or good/bad. Science merely tells you what is, i.e. what we observe (with some probability of being wrong, subject to bias, etc.).

Whether something is good/bad is up to us as a society to define, not science. What is considered good/bad depends on each of our individual goals/ethics/morals, which could differ from one another. This is why spoonfeeding people information saying "X is good" / "X is bad" is dangerous, and it becomes more dangerous when social media platforms allow such information to be amplified.


What usually happens with these phones is they have something like 8GB or even 16GB, but they are poorly partitioned. It could be something like 5-6GB for /system (the OS), with somehow only 2GB left for the /data partition.

I've seen a phone like this running Lollipop. The best part is when it decides to auto-update all the built-in (Google) apps, collectively consuming a total of 1-1.5GB of /data, at which point Google Play refuses to let you install any apps because you do not have enough free space available.
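
The arithmetic, spelled out (ballpark figures from above, not measurements of any specific device):

    # Rough storage math for the scenario above (illustrative numbers, in GB).
    flash_total  = 8.0
    system_part  = 5.5   # /system: OS plus preloaded apps
    misc_parts   = 0.5   # bootloader, recovery, /cache, etc.
    data_part    = flash_total - system_part - misc_parts  # ~2.0 GB for /data

    auto_updates = 1.3   # built-in Google apps updating themselves into /data
    free_space   = data_part - auto_updates                # ~0.7 GB left

    print(f"/data free: {free_space:.1f} GB")
    # Once this drops below Play's free-space threshold, installs are refused,
    # even though most of the 8 GB of flash is technically there.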


> What usually happens with these phones is they have something like 8GB or even 16GB, but they are poorly partitioned. It could be something like 5-6GB for /system (the OS), with somehow only 2GB left for the /data partition.

Oh, so you read bigbugbag's comment (https://news.ycombinator.com/item?id=15049992)

> > wow those people are lucky and high on consumerism, my phone had 2GB.

as saying that it's 2 GB after the system installation? That seems to make it a strange response to itg's comment (https://news.ycombinator.com/item?id=15049947)

> > > Considering how many people still have 8GB/16GB phones, I doubt it. On iOS sometimes you have to delete apps, especially ones that build up large caches like Facebook.

which is clearly referring to the pre-OS available space.


Wow, that video deserves more views than it has. Really interesting, and still relevant for a two-year-old talk.


I won't pretend to know about or have any background in medicine, but based on a cursory reading of the article, it seems to be a term for applying the scientific method to medicine. The idea that we only started doing this 50 years ago is terrifying, to say the least.

It seems to me that the problem here is that an industry built around throwing expensive drugs at problems, paying for results, and lobbying governments and insurance companies is ripe for abuse of "science".

That said, "pure" science is about more than published papers: it's about taking the data and observations you have and constructing the most likely theories and explanations around the observed evidence. If we could separate funding (and emotions) from research, we might be able to produce good results given enough open data (which is itself a challenge).

As an uninformed software developer, I think medicine is a field where machine-learning-based tools will shine: ethical (HIPAA) issues aside, we might eventually be able to feed all observed data, from diagnosis to outcomes years after treatment, into computer systems that can make sense of it and let us draw conclusions unbiased by personal and business incentives.

Obviously our current economic-political climate is strewn with roadblocks, but I just wanted to put it out there that science doesn't have to lead us down this path if done right.


> The idea that we only started doing this 50 years ago is terrifying to say the least.

It's also a flawed notion. There has been a concerted push in the last 50 years, but applying the scientific method to medicine is considerably older. Koch's Postulates, for example, were published in 1890.


You're talking about applying the scientific method to understanding disease, which is not Evidence-based Medicine.

Evidence-based Medicine is applying the scientific method to the practice of medicine.

Please read the link before spreading misinformation.


I picked one example, but there are others that well predate the term "Evidence Based Medicine". Semmelweis comes to mind. But I've also seen Koch's Postulates used in evaluating the practice of medicine, in addition to the study of disease.

You could ask for clarification before simply assuming I don't know what I'm talking about.

Furthermore, I'd suggest that understanding disease and understanding the practice of medicine are far more entangled than your division suggests.

