adrianco's comments | Hacker News

It’s a bubble. Buy low and sell high. Right now, holding cash is the safe option. You can predict that the bubble will burst, but not when. However, with the US being run by someone who is poking everything with sharp implements, I think it bursts sooner.


Cash is just another asset class, and since becoming a pure fiat currency it’s never been a good one: https://fred.stlouisfed.org/series/CPIAUCSL

Given the fiscal dominance we’re in now, which pretty much guarantees higher rates of inflation in the future, I think there are much better places to put your money if you want to diversify. At the very least, cash equivalents would be better.


The parent comment is probably just talking about temporarily timing the bubble pop.


How are you so sure it is a bubble?


It’s hard to understand why so many California houses are built of wood. After a fire all that’s left is a few chimneys. However, I happen to live in California in a house made of concrete bricks, and we don’t get a higher valuation per square foot or a discount on our fire insurance. There should be more incentives to build fireproof houses.


Earthquakes?


I think there are two things that change. One is that Amazon would be controlling output scheduling of part of the power station rather than the grid operator. The other is that this would be behind the meter for Amazon, like rooftop solar, so would be accounted for differently in the annual report.


This is super interesting and useful. I tried reading the code to understand how GPU workloads worked last year and it was easy to get lost in all the options and pluggable layers.


Someone came up with the name for this: “Disagree and commute”.


I wonder if the other Boars Head plants also have issues. Avoid the entire brand? What brands have better hygiene?


I always thought Boars Head _was_ the premium brand.


Pretty scary how many people get fooled by "premium" branding. Think about what made you think that Boars Head was "premium". Was there any evidence that their meats were higher than average quality, that didn't ultimately come from the company itself? Or was it all their product positioning, price, what stores they were found in and so on (in other words, Marketing).


Compared to other brands available in the supermarket, my family finds that Boar's Head deli meats consistently taste better. There is something "cheap" that I find hard to quantify in the taste of generic grocery branded deli meats.

I'm not educated enough to know what the difference is here, but I don't think the fact that Boar's Head costs more is entirely a marketing device.


> There is something "cheap" that I find hard to quantify in the taste of generic grocery branded deli meats.

They tend to be watery and under-seasoned. I can only assume it’s to make them as inoffensive as possible to accommodate the widest possible audience - but there’s no character to cheap deli meat, no striking taste.


Watery, yes, that's definitely a component of it!


To my unsophisticated palate, Boar's Head tastes like it has less filler. Will never purchase their product ever again. The findings were so egregious, it makes my blood boil.


Not to discount your experience, but taste is so context sensitive and subjective that just believing that you're consuming a higher quality product is often enough to make it "taste better". There's a great Penn & Teller's Bullshit episode that illustrates this phenomenon for fancy water[1].

[1] https://www.youtube.com/watch?v=v2qydjVbLJk


Yup, that's completely true. But as somebody who typically prefers to buy "cheap" brands, and is usually completely satisfied by them, the fact that I experience such a wide gulf between Boar's Head and other brands makes me think it's not a marketing mind trick.


Ryan - You aren't wrong, but I'd note that in my local grocery stores, it's generic brands and your Smithfield / Oscar Mayer / Hormel in the cold case. You find Boar's Head at the in-store deli, sliced to order, and priced higher. So the illusion of premium here extends beyond marketing dress.


I honestly thought Boars Head tasted better and offered more variety than other brands, but then again I never did a blind taste test. It could have all been affected by their marketing. They do a LOT to separate their products from other deli meat - even down to having separate displays and their own cooler units.


Did you ever try the previous brands available at your grocer before Boars Head cornered the market? For my local grocers it was an immediate reduction in variety, increase in price and equal-to-worse quality. Nothing about that move was better for the customer.


There's a German proverb that goes something like "a butcher will only eat sausage he has made himself". Now you know the background.


Where I live, both of the two major supermarket chains (Publix x 3, Kroger x 3) with stores within an epsilon of my house feature Boar's Head next to their house brands. Other than the odd Food Depot there are no other grocery corp brands within probably 25 miles.

I have historically frequently bought Boar's Head meats from the deli counter because I can get them sliced prosciutto thin and they have the texture to support that. And the overall flavor experience when buried in a creatively built sandwich is "not terrible". Absolutely revolting when compared directly against handmade pastrami, ham, corned beef etc., but not bad in a sandwich.

However: in the last year I have noticed Boar's Head moving more of their cured charcuterie meats into "nitrate free", which is a lie (they use celery instead), but of course that degrades the texture and flavor a bit.

So now I just buy the house brand meats when I'm slumming, and make my own pastrami, ham, and corned beef from time to time. For two olds a 5lb batch of cured meat is a lot. Like 6-12 months. But homemade charcuterie is so astoundingly good.

So no, I don't consider Boar's Head a premium brand. At least, not anymore. I smell enshittification at work.

Also, everybody should have the chance to try out real charcuterie.


Any tips on how to get started?


It is expensive to go in blind on buying artisanal charcuterie without knowing how to optimize for what you want. Kinda like California cheese. Good, but... $30/lb for cheddar?

I have eaten at restaurants that have "charcuterie boards" but if they're reasonable in price... you get what you pay for. An easy out is to put "prosciutto" on the board as the prize. Quotes because there are orders of magnitude of difference in price and flavor quality across "prosciutto" analogues.

My recommendation, if you are at all adventurous in the kitchen, is to buy a copy of Michael Ruhlman & Brian Polcyn's "Charcuterie", and just start making stuff that you think looks interesting. I have had quite a few dinners where I contributed stuff out of that book and people raved. Nothing special about me, I just followed the recipes. It's an extraordinary cookbook. The vegetarian rillettes are great but I've had people swoon over the duck rillettes that started with me and a dusty farm and a couple of ducks (still quacking). (The book assumes you have duck parts, not live ducks.)

In the olden tymes I would say drop me a line if you needed help/more advice, but in this weird world that doesn't seem to work anymore.


Thank you for the info. I’ll definitely check out the book especially since I only have duck parts and not live ducks.


From this, and linked articles in the New York Times, the suggestion is that lax food safety standards are endemic across the entire processed food industry.


It requires engagement from the employees...a culture. On this, everyone is failing. Our leaders have contorted and conflicting allegiances. Their minds are filled with numbers swelling and cash multiplying. The focus is not on delivering value for all stakeholders.


And how many of the employees feel comfortable blowing the whistle? A chunk of them are bound to be undocumented/illegally in the US, and it seems like those folks are being taken advantage of already because they have so much to lose. Blowing the whistle on some weird looking green slime or leaks or whatever can mean getting deported. Not saying the worker is a bad person for looking the other way, but I am saying that we are doing this to ourselves and these are some really obvious consequences of awful policy.


From 2009: https://www.nytimes.com/2009/10/04/health/04meat.html

Slaughterhouses not testing their products for E. coli because it's too expensive. Slaughterhouses blacklisting customers who do their own testing. The USDA has no effective regulatory powers to change that.


Single-page archive: <https://archive.is/OKqta>



If I remember my US history, the USDA was a direct response to The Jungle. Society at the time was appalled and demanded government action.

Similar to the response to Silent Spring and the standing up of the EPA.


I’ll admit, if arguments about nitrates being carcinogenic aren’t a convincing reason to avoid processed meat, this might just be one.


[flagged]


The article paints a picture of lack of enforcement, not regulation.


Enforcement is part of regulation


How do you distinguish these behaviors?


There are a lot of regulations. They were inspected. They failed inspections. Then the inspectors did nothing. All the additional regulations can be added that we want, but if the inspectors don't close the plant, they are meaningless.


Sure, but regulation without enforcement is a meaningless concept. This just looks like de facto lack of regulation to me.


Agreed. I’m for fewer regulations, but enforced.

In this case, we have several deaths. Perhaps the head of the factory and the inspectors that failed to keep the plant safe should be tried for third-degree murder or something.


You can't. It's meaningless. If the government tells you you have rights, but there's no practical way to claim those rights, then you have no rights and the government is lying in your face.


I avoid pork entirely. The business is outrageously cruel and dirty even by industrial farming standards.


What does pork specifically have to do with this? BH supplies all sorts of deli meat, not just pork


Dietz & Watson?

I have no idea if they are cleaner or not, but I like their name. :)


You can always just go to a butcher.


In order to do so, one has to have a local butcher who is still in operation.

For many, any such local butchers were driven out of business years ago by the lower cost options in the supermarkets produced by the bulk producers, so the option to "just go to a butcher" is no longer available, at any cost.


Butchers are found throughout much of the US, at least in urban/suburban areas. If your only local food option is WalMart or Dollar Store, you might have a long trip before you, though with a chest freezer, and an icebox and dry ice for the trip itself, meat tends to freeze and preserve well.


It's available. Just be prepared for Boars Head price shock. Hell, the butcher probably carries Boars Head.


I’ve never lived in a city or suburb that doesn’t have one locally. Is it a small town thing?


I recently ate hot dogs from a local butcher.

Oh my God -- so much better.

Definitely go to a local butcher. You have no idea how much better it is. Comparatively, even the seemingly high-quality supermarket hot dogs taste like watery plastic with salt, compared to the deliciously meaty, spongy-textured sausage sticks that are local butcher hot dogs. It's seriously a whole lot better.


18 months ago when I moved cross country to the Atlanta area I was delighted and excited to find out that my house was 10 min from a real butcher! OMG, what I always wanted. Then I talked to the counter man (no butcher on site), bought the meats, asked some questions about the curing process, and discovered that the reason the meats tasted like Boar's Head is that the butcher shop chain (3 stores) explicitly taste-competed against Boar's Head. To the point of being "nitrate-free" - oops, celery is in the label.

No point paying the premium (I would have paid double for authentic charcuterie), and then I tried to buy a goose during the Holidays and nope, can't do that. Haven't been back.


Keep looking around. In my small town in the middle of nowhere, we have two legit butchers that draw from regional small farms. In any large city I have to assume there are options available. That stated - if there is any way you can participate in a CSA / connect directly with the rancher, I find that the meat is not only superior quality but the prices often are as well. Case in point - my partner recently bought ground beef at Walmart that was $3/pound higher than what I buy from a long-time family friend who raises his beef in an absolutely splendid grass-covered mountain paradise.


Here’s the initial AWS response to the license change that they made in 2018, which I helped write. At the time we didn’t think a new license made sense, as AGPL is sufficient to block AWS from using the code, but the core of the issue was that AWS wanted to contribute security features to the open source project and Elastic wanted to keep security as an enterprise feature, so rejected all the approaches AWS made at the time. https://aws.amazon.com/blogs/opensource/keeping-open-source-...


Out of curiosity, did you pursue a rev share model with Elastic (the company) for your Elastic managed service? I guess that's not something that can be discussed openly, but recognizing you probably had 10x their revenue in the managed service and another 10x their revenue in compute behind the OSS, I wonder if there could have been a proactive happy middle ground found years ago.

I suppose that they might not have accepted something that was too small percent wise and hence might have preferred to go head to head no matter where that might have gone.

My real sense for why they've struggled to out-maneuver AWS is their lack of execution on their managed service (9 years in market, still a minority of their revenue); while you had a head start, and I'm sure that's what they point to as preventing execution, if they had really focused there they might be more like Confluent in terms of being considered the well regarded SaaS leader in their segment.

But I do think it'd be a good look for AWS to proactively help these companies. I didn't think the approach taken with Grafana Labs was right... that looked more like a Faustian bargain to an outside observer (e.g. we'll cut you down at the knees and directly compete, but offer your more expensive version on our paper). It looked incredibly humiliating.


From previous threads, my understanding is that Elastic has a pretty arduous enterprise sales process, which turned a lot of small/mid customers away.

High vendor management overhead is a huge pain for smaller companies that don't have robust IT to manage those relationships.

The smaller/mid size startups I've worked at almost never acquire "enterprise" software and always leverage pay-by-credit-card type SaaS

Besides operational overhead there's also a much longer acquisition time (no one wants to spend 3-6 months working with a sales team to sign a contract for a project with a 2-3 month timeline)


On this topic: at my company we had to choose a managed solution for logs, and there were several contenders. I strongly wanted to use the service offered by Elastic, the company, as we were already managing a lot of biggish clusters and we thought that going with the company behind the product would be the best thing to do.

But they made it very difficult to try it out at scale (we generate quite a lot of logs) and at one point they only wanted to talk to the CTO instead of the persons in charge of the PoCs.

That move made them untrustworthy to me, and they were disqualified from the process. If they wanted to compete on selling the solution to non-technical people, that told us all we needed to know about them and what support would be like. We ended up choosing managed OpenSearch on AWS, which was a shitshow on several fronts. I wish we had given Loki a bigger chance at that time. We've ended up migrating to it anyway.


Totally, and that goes back to their lack of execution as a managed service/SaaS offering, because the GTM for those is different and more self-service. If you can't unwind yourself from selling a heavyweight, legacy-style enterprise software package, you struggle to execute in SaaS, and you end up being understood by your customers as the opposite of modern.


Why? Does Amazon rev share with Red Hat? Does Google rev share with Linux? Or is it the other way around, should perhaps Linux instead pay Android for putting their product in front of billions of eyeballs? I'm sure there's a way of monetizing a billion users even for a kernel.

The answer is no, because Linux is open source. It is a multi-stakeholder model where no single actor is allowed to control other actors' use of the product. There is ownership, but it is separated from control. This is implemented with the GPL, but the license is only a tool to achieve the outcome, a multi-stakeholder product.

This is why Amazon, Red Hat and Google all can justify to employ hundreds of engineers all contributing to a common product. Amazon can work on security functionality with no risk that Red Hat will veto it because it threatens its revenue stream. And while none of the top kernel developers have made billions from their important work, they still earn well, and all the mentioned companies have grown to become billion dollar companies.

Everyone knew this in the 90s; that's why we have the philosophy around open source. Now the discourse has changed. It is suddenly immoral to earn money from someone else's product, because if you start a project then you own it outright. Not only that, but you also have a right to become rich from your work. Discussing how immoral it is for a large company to use a piece of software without paying the original author has become a completely normal thing to do, never mind that they would have no problem reinventing that particular wheel in a heartbeat.

Companies like Elastic have latched on to this, and call their product open source, but call foul when other people build products and make a living from their software. They're not actually interested in a multi stakeholder model at all.

How big would Wordpress have been without every cloud actor out there offering to host instances for a cheap fee?


> This is why Amazon, Red Hat and Google all can justify to employ hundreds of engineers all contributing to a common product. Amazon can work on security functionality with no risk that Red Hat will veto it because it threatens its revenue stream. And while none of the top kernel developers have made billions from their important work, they still earn well, and all the mentioned companies have grown to become billion dollar companies.

This can't be stressed enough. OSS does monetarily work for the developers too. FAANGs love paying top dollar for OSS maintainers.


As a matter of fact, yes, AWS rev shares with Red Hat, SUSE, Canonical and the likes.


There were several proposals made to Elastic at the time, but their attitude was that they controlled the project and didn’t want AWS to make big contributions to the open source distribution that would reduce their differentiation. They were also mixing licenses in the code base and deliberately making it hard for AWS to use.

I was also assisting in the deal with Grafana, which I thought was a good deal on both sides, setting up a framework for AWS and Grafana to work together over a longer timeframe. Ash Manzani who negotiated the deal for AWS later joined Grafana to run Corp Dev for them.


> But I do think it'd be a good look for AWS to proactively help these companies.

But how much value does "a good look" have to AWS?


Depends on who sits in the antitrust seat. It's pretty incredible to realize the one who does today wrote this a few years ago: https://www.yalelawjournal.org/pdf/e.710.Khan.805_zuvfyyeh.p...


I'm fairly skeptical that Amazon would seek out a "good look" like the one here, solely in hopes that it will save them from antitrust scrutiny.


AWS tends to optimize for whatever its customers want, rather than what its partner ecosystem wants. However we did spend some time trying to help open source partners figure out how best to work with AWS and to leverage the marketplace etc. to be successful by leveraging the platform rather than fighting with it. Mongo is a good example of how that can work.


... Yes, they submitted labor-backed security fixes as their form of rev share.


Unfortunately many companies charge extra for security, where security should be the default. Truth be told, there are some situations where extra security costs could be justified, but there are not many, and if a charge is necessary it should be considered a temporary measure. My $0.02.


I'm not sure what you mean by "security" in this context.

Identity management, role-based access control, useful audit logs: all "enterprise" features, all probably very expensive to implement, and all make for obvious "up-charge" product segmentation.

I suspect there's some combination of "the community doesn't add useful implementations of these features" and "we can't possibly risk our reputation based on some community contribution" and "we can use this to segment our product to sell to some and give it away to some."

This set of features always seems to get put in the "enterprise, only for licensed / supported customers" bucket, and it stinks.... I can understand why, though, and none of these are strictly speaking "security" as much as "compliance"


Using your yardstick, we wouldn't have any open source software; everything costs time to implement. That's the point of open source: we donate time to the collective community. All those security features are not enterprise-specific; they are rudimentary for any modern open source product.


I'm saying that companies that opensource their products tend to distinguish "enterprise" and non-enterprise based on things like RBAC and audit mechanisms, neither of which is "security" as much as "compliance".

The original license owner, if a commercial enterprise trying to sell the product alongside the "open" version, has less incentive to accept those features from the community as it would reduce their sales of the enterprise version of the same thing, and may not align with their long-term product roadmap.

In open source, the team managing a codebase isn't under any obligation to accept contributions from the community, and you are welcome to fork the project if you like.


RBAC is absolutely a practical security control, even for non-commercial users. Least necessary privilege is not a checkbox, it will 100% save your butt in a breach by limiting blast radius.


I'm not really sure what you're talking about.

Let's say you work at a company that uses Elasticsearch. Let's say you're running a newspaper and you've got your logs in Elasticsearch. Let's say one of your reporters ends up getting chopped up while they're visiting the Ostrich embassy to get a marriage license. Now let's say you're then asked "who looked at the CMS logs, and who searched for and found the IP address that was used by that reporter on October 1st 2018"

That example, purely hypothetical, is an example of "security" but not the typical security you'll see in some open source application -- it's an enterprise "compliance" feature that won't be trivial to implement and will be judged not just on completeness but on user interface, ease of use, ease of implementation, etc.

"security" means different things to different people


Sure. But for smaller users it's easier to create the accounts by hand rather than manage an entire Active Directory installation.


RBAC and Active Directory are different things.


I think it's fair to say most security is built in by your average developer. However, the security side of things needs effort from a far smaller pool of experts who can make your tool as secure as possible. I don't imagine security experts or security developers are as cheap to find as feature developers. Practically speaking, cheaping out on security might cost you the reputation of the app, so doing it well will be expensive but worth it - though it might never be worth it if you offer everything by default and in the end only a fraction of your customer base uses the features.


Including the recent trend of access to SOC2 reports requiring an "Enterprise" tier subscription.


We got a SOC2 cert at our bootstrapped small SaaS company. Then we hid the report behind Enterprise subscriptions because it takes too much time, effort, and money to obtain and maintain it.

We did not get certified because we wanted it; we did it because enterprise-scale customers forced us to, due to their internal bureaucracy.


I have the same problem at the moment with Supabase. We're a startup trying to get ISO 27001 certified and need to upload Supabase's SOC2 report to Vanta, but we can't because we're on the Pro tier and they don't give access to that, even after emailing them. It's ridiculous.


It is even more ridiculous because it costs them nothing to issue an extra copy of this pdf report. They need to certify anyway because their enterprise customers will demand it.


(I work at/started Vanta. Email support@vanta.com and they should be able to give you guidance and help out. If that doesn't work, email me -- christina at vanta)


In fact I like the change. This allows them to make almost everything free of charge to individuals/small companies, but fund it from the revenue of larger organizations, which generally don't have a problem paying.


I don't mind them requiring a paid tier to get the detailed compliance-level reports, but requiring the most uber expensive "call us" plan is probably too much for many smaller companies that might still benefit from easier SOC2 compliance.


And what of small companies that need things like SOC2 reports from vendors?

If you want to work with large companies, being SOC certified makes it easier. Part of that is ensuring your vendors are also compliant with good standards and that's best done with SOC reports.


Getting SOC 2 compliance alone takes ~10k USD, apart from vendor reports. Yes, they may be small in employee count, but when I said small I just meant someone running something for a small set of users for free or close to free. Not someone working with other enterprises.


My point is that even small companies may need SOC reports from their vendors but still not be able to financially support enterprise level plans with every one of them. By being supportive of hiding those reports behind enterprise level contracts you are effectively supporting pricing those companies out and potentially making them unable to work with larger clients.


SOC reports are only needed for SOC compliance, and compliance costs 10k USD. It depends on the subscription cost, but if the company could afford the compliance they could afford an extra 100 USD/month. No one expects small companies to pay a few thousand dollars per month.

Although a few companies have a minimum ticket size for enterprise clients, and that is a bad thing IMO.


Or worse, "SSO" as an Enterprise feature. You're a 2-3 person startup, you set up GSuite, you want to set things up right, oh, "$Call us" for a tier with SSO. Nope, I guess disparate users for now. Not the worst in the world to be clear, but an entirely arbitrary gate, in my experience.


Yeah, the SSO gates are common and borderline criminal. "The only way you can use our software is insecurely"


I have long held that view also, although the post below about the cost of supporting SSO was interesting. Unlike withholding SOC2 reports, which costs nothing incremental to give to your lowest tier, SSO may increase the cost of support. I wonder how it would go offering SSO as an add-on to entry-level tiers that covers the incremental support cost.

https://news.ycombinator.com/item?id=41304228

This SSO cost post is also interesting:

https://news.ycombinator.com/item?id=40752518


[flagged]


Software should be secure by default. No defense of honor is necessary.

> This line of thinking has led to many foreign wars of choice, where we send young men to die and our nation received nothing in exchange. "It was the right thing to do" is uttered by those who did nothing

I am not able to find any references to the war of regression or the battle of CVE-2021-44228, so I'll have to call nonsense on this one.


Why?

What was paid that now this is owed?


We've become too accepting of trivial bugs and logic issues that could have been identified through proper quality controls.


You should protest.

Demand your money back.


Security should be free (or rather, things should not be released if they aren’t reasonably secure) for a couple reasons.

We’re all on the same internet; people getting taken over and used as DDoS nodes, leveraged for further attacks, or leaking PII is a pain for everybody.

Skimping on security is always easier, and security is hard to detect for the end user. We shouldn’t have a race to the bottom on this stuff.

For volunteer projects, like a lot of open source, we can’t really make demands. But I think it is still unethical to release an open source project that invites itself to be used in an insecure manner. It is like an “attractive nuisance” (typical example: in some jurisdictions, you might be responsible for an un-fenced pool on your property if a kid falls in it, even if getting to it required trespassing, because we don’t want a society where uninformed people die avoidably). Without a customer service relationship, open source developers don’t have an obligation to make something useful, but nobody should put harmful things out into the world.


> Why is security free?

People don't want to get hurt, physically, emotionally, financially.

> Not just in software but in general?

People being harmed is very expensive.

> you need to pay tribute to those who can

This is a very primordial view of things. Security and safety are literally the underpinning of modern, western society. The cost of that security is baked into prices for services and products, taxes and law.


> People don't want to get hurt, physically, emotionally, financially.

People don't want to be hungry, people don't want to be cold, people don't want to be bored.


Security is a basic non functional requirement for all software.


Then don't use it? It's non-functional right? I don't get where the complaints come in.

Side note: Security in these discussions is often something more like "It works with my single sign on system" or "It lets me check this box on my audit form". Security doesn't only have to happen at the app layer, and it's completely doable to isolate any software in a way that is secure despite itself. So it's less security and more convenient security that is being demanded for free, most often by people who offer nothing for free themselves. The entitlement is really extreme.


> Security doesn't only have to happen at the app layer

Agreed, and once an application has differentiation between a “super user” and users with fewer privileges, it needs an application security model. Additionally, once there are differences in which data a user may access, it needs a data security model.
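
As a purely illustrative sketch (generic Python, not any particular product's model), the bare minimum of such an application security model is a role-to-permission mapping that every privileged operation is checked against:

    # Minimal role-based access check -- purely illustrative, not a specific product's model.
    ROLE_PERMISSIONS: dict[str, set[str]] = {
        "superuser": {"read", "write", "delete", "manage_users"},
        "editor":    {"read", "write"},
        "viewer":    {"read"},
    }

    def is_allowed(role: str, action: str) -> bool:
        """Return True if the given role is permitted to perform the action."""
        return action in ROLE_PERMISSIONS.get(role, set())

    assert is_allowed("editor", "write")
    assert not is_allowed("viewer", "delete")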


Not only do I demand secure software by default, but I actively work to terminate relationships with companies who feel how you do. They can have whatever ideals they'd like, just none of the money I steward.


In that case it isn’t really security at all, right? Integrating with some SSO system is fine to charge a premium for, as long as the default form of authentication is reasonably secure.


"as AGPL is sufficient to block AWS from using the code"

I have taken this position in another thread a while ago, but the responses seemed to indicate that this is not a clearly cut situation at all. If it was, what is the point of the "source-available" licenses in the first place? I mean, the idea that they were invented to cut out AWS is pretty prevalent, no?


Well, the comment from OP isn't necessarily complete. The AGPL is not about preventing someone from using source code (indeed that would be contrary to the spirit of all liberal and copyleft licenses), but rather the condition under which source code modifications need to be made available.

Specifically, if you offer the software for "Remote Network Interaction" (AGPLv3 section 13), well, "if you modify the Program, your modified version must prominently offer all users interacting with it remotely through a computer network (if your version supports such interaction) an opportunity to receive the Corresponding Source of your version".

I think the original challenge with AGPLv3 vs (to grossly generalize) the VC-backed open source corporate ecosystem was not around source code, but around monetization as SaaS by the hyperscalers. The problem there is even if the hyperscalers publish source code modifications (which they probably have no problem with) they have such sales efficiency and gravitational pull that they will end up eating your business.


AGPL doesn't forbid Amazon from providing a competitive service using the software. Elastic License/SSPL/BSL all do. That's the difference.


It also ensures Amazon can’t add any secret sauce to the code they offer - everything must remain open.


AWS at the time had AGPL on its list of licenses that couldn’t be used. There were other clouds, especially in China, ignoring the AGPL provisions, and I think the SSPL was used to try to be more explicit.


There's enough legal uncertainty about API calls being considered linking that it keeps coming up. Minio are probably at the forefront of claiming this somewhat implicitly while referring you to your lawyer (or their pricing page, preferably) when asked about how they understand the AGPL.

FSF/GNU have an example of an AGPL proxy becoming compliant by serving a page with an offer to download the source code on the first request, which is pretty far off from reality if you ask me. That's also the other big issue: the AGPL is unclear about conveyance over a network. Does a header work? Does a link to the source repo work, or do you need to offer hard copies? What do you do if the "networking" is a highly specific protocol that simply can't make that offer over the wire?

I much prefer the clarity of intent of the EUPL.


Nah, the AGPL is pretty clear (and way clearer than the GPL and LGPL due to combined/derived work fuzziness). The issue with it isn't anything to do with the mechanism of the license itself, because it is pretty clear what the criteria are (and offering an API over the network definitively constitutes Remote Network Interaction) and how you can fulfill the source distribution. The real issue is that the AGPLv3 doesn't preclude a third party from commercializing the software (whether modified or not).


The problem with Minio is how many layers of indirection "interacting with an API" constitutes. If you write a webapp that uses Minio in the background, Minio has stated their belief that your webapp is subject to the viral part of the AGPL.


That's interesting. I was reading their licensing compliance FAQ at https://min.io/compliance and it doesn't allude to that; in fact it suggests that for instance calling a REST API doesn't imply derived work (modulo the specificity piece), referencing the GPL. The omission of the over-the-network AGPL provision is notable. I wonder if it's obscure on purpose?


Perhaps even stranger, MinIO have publicly stated they have revoked an Apache 2 license grant to a third party, Weka: https://blog.min.io/weka-violates-minios-open-source-license...

Not sure what their counsel is thinking there..


MinIO has taken (and still is taking) contributions without CLA, so they likely don't even have the ability to sell license exceptions.

They seem to have at least fixed their compliance page. It used to read:

"If you are an Original Equipment Manufacturers, a Reseller, or an Independent Software Vendor that combines and distributes commercially licensed software with MinIO software and do not wish to distribute the source code for the commercially licensed software under GNU Affero General Public License, Version 3.0 (AGPLv3), you must enter into a commercial license agreement with MinIO, available at https://min.io/pricing."

And that's opposed to "FOSS".

https://web.archive.org/web/20210415185046/https://min.io/co...


translation: AWS refuses to provide unmodified open source hosted software as a service or to open source the changes they make to host it.

It's not a "can't" it's a "won't".


I don't know if this was part of the issues but adding authentication to Elastic APIs and Kibana is so confusing and complicated that it is almost impossible to do unless you go for a managed solution. I'm sure that one factor alone motivates a lot of users to buy the service instead of hosting their own using the available source.


Yeah, this is an underrated aspect of all the managed hosting options out there. If vendors made it easy to deploy their code, folks would be far more willing to run it themselves. But just rolling out a simple production-ready cluster of most software is a nightmare of complexity. (Note that while open-source software is often not great at this, proprietary software is often just as bad or worse. This is not a side-effect of open-source. It's a failure of prioritization of the operator experience.)


But why couldn't you or AWS donate or pay Elastic for what they created to get those features in? I understand the security features you mentioned are very necessary, but Elastic will lose revenue because of this, and they are not a trillion-dollar-cap tech giant like AWS that can support the project for free.


AWS could easily comply with the AGPL, why is AWS blocked from providing services using software licensed under the AGPL?


I tried to do this a few months ago (for videos, not shorts) and found that YouTube will only let you download transcripts for your own uploads. You have to authenticate to the API. Also, the transcripts aren’t that great. I ended up downloading the video and running Whisper on it.
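
For reference, a minimal sketch of that workflow, assuming yt-dlp, the openai-whisper package, and ffmpeg are installed; the URL and model size below are placeholders, not from the original comment.

    # Download the audio with yt-dlp, then transcribe it locally with Whisper.
    # Assumes: pip install yt-dlp openai-whisper, plus ffmpeg on the PATH.
    import subprocess
    import whisper

    url = "https://www.youtube.com/watch?v=XXXXXXXXXXX"  # hypothetical video URL

    # Fetch audio only; writes audio.m4a into the current directory.
    subprocess.run(["yt-dlp", "-f", "bestaudio", "-o", "audio.m4a", url], check=True)

    model = whisper.load_model("base")   # larger models are slower but more accurate
    result = model.transcribe("audio.m4a")
    print(result["text"])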


I was a fly on the wall as this work was being done and it was super interesting to see the discussions. I was also surprised that Jepsen didn’t find critical bugs. Clarifying the docs and unusual (intentional) behaviors was a very useful outcome. It was a very worthwhile confidence building exercise given that we’re running a bank on Datomic…


Given that Rich Hickey designed this database, the outcome is perhaps unsurprising. What a fabulous read - anytime I feel like I’m reasonably smart it’s always good to be humbled by a Jepsen analysis


A good design does not guarantee the absence of implementation bugs. But a good design can make introducing bugs harder / less probable. That seems to be the case here, and then it's a case to study and maybe emulate.


What bank is that, if I may ask?



First Brazilian fully digital bank, got pretty big in a decade.

I'd love to hear the story from the first engineers, how they got support for this, etc. They never did tech blog posts though...


Ed Wible is a founder of Nubank and chose Datomic. He and Lucas Cavalcanti gave a talk on it at Clojure/conj 2014.

https://www.youtube.com/watch?v=7lm3K8zVOdY


Ed's post when Cognitect joined Nubank is still a great read: https://building.nubank.com.br/welcoming-cognitect-nubank/


There are some videos, both of the start and of their progress. Some of the most impressive work I have ever seen, remarkable.


> I was also surprised that Jepsen didn’t find critical bugs.

From the report..."...we can prove the presence of bugs, but not their absence..."


In practical terms, if you are a database and Jepsen doesn't find any bugs, that's as much assurance as you are going to get in 2024 short of formal verification.


Formal verification is very powerful but still not full assurance. Fun fact: Testing and monitoring of Datomic has sometimes uncovered design flaws in underlying storages that formal verification missed.


Is there anything I can read about what capabilities Datomic requires of the underlying storages it uses?


What kind of flaws? I would expect performance problems.


To start with, you usually perform verification on a behavioral model and not on the code itself. This opens up the possibility that there are behavioral differences between the code itself and the model which wouldn't be caught.


Could you offer an example?


If you work through the TLA+ tutorial[1] it will help you get a good idea of the benefits and limitations of verification.

[1] https://learntla.com/


The work Antithesis has been doing here has me really excited as well.


That's consistent with the usual definition of "finding" anything.


"Absence of evidence is not evidence of absence."


Thank you. I've updated my initial guess of p(critical bugs | did not find critical bugs) from 0.5 to 0.82 given my estimate of likelihood and base rates.


If you've looked, it is. The more and the better you look, the better evidence it is.


If you run it through Bayes' theorem, it adjusts the posterior very little.


If a test almost always finds something, then the failure of that test to find something is strong evidence.


I'd be happy to see your numbers for estimated likelihood, prior, and marginal probability if you have them. I'm curious what you get.


s/evidence/proof/.

Evidence of absence ("we searched really carefully and nothing came up") does update the Bayesian priors significantly, so the probability of absence of bugs can now be estimated as much higher.
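
A toy calculation makes the point concrete. The numbers below are made up purely for illustration (they are not estimates about Jepsen or Datomic): the more likely you think the test would have caught a real bug, the more a clean run lowers the posterior.

    # Toy Bayesian update -- illustrative numbers only.
    p_bugs = 0.30                 # prior: P(critical bugs exist)
    p_find_given_bugs = 0.80      # P(test finds something | bugs exist)
    p_find_given_no_bugs = 0.0    # assume no false positives

    # P(no finding), by total probability
    p_no_finding = (1 - p_find_given_bugs) * p_bugs + (1 - p_find_given_no_bugs) * (1 - p_bugs)

    # Bayes: P(bugs | no finding)
    posterior = (1 - p_find_given_bugs) * p_bugs / p_no_finding
    print(round(posterior, 3))    # 0.079 -- a clean run cuts the 0.30 prior substantially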


Did you not do this work yourself before you started running the bank on it?


I doubt any organization that isn't directly putting lives on the line is testing database technology as thoroughly and competently as Jepsen. Banks' job is to be banks, not to be Jepsen.


I would have thought they would be more rigorous, since mistakes for them could threaten the very viability of the business? Which is why I assume most are still on mainframes. (Never worked at a bank)


Banks have existed since long before computers, and thus have ways to detect and correct errors that are not purely technological (such as double-entry bookkeeping, backups, supporting documentation, different processes). So a bank can survive a db doing nasty things at a low enough frequency that it is not detected beforehand, so they don’t need to “prove in Coq” that everything is correct.


Anyone who has worked in a bank and is glad of its solutions is either a fool, clueless, or a politician.

Banks have to answer to regulation, and they do so by doing the bare minimum they can get away with.


Mistakes don't threaten them that much. When Equifax (admittedly not a bank) can make massive negligent fuckups and still be a going concern, there isn't much heat there. Most fuckups a bank makes can be unwound.


Mainframe systems aren't tested to the Jepsen level of standard just because they were built on mainframes in the 70s/80s. In fact, quite the opposite.


Banks are not usually run by people who go for the first fad.js they see ahead; they usually also can think further ahead than 5 min.

Also, I'm sure they engineer their systems so that every operation and action is logged multiple times and have multiple redundancy factors.

A main transaction DB will not be a "single source of truth" for any event. It will be the main source of truth, but the ledger you see in your online bank is only a simplified view into it.


Not a big surprise. Garman has been working up to this for a while. It will be interesting to see how the VPs reshuffle and whether anyone else who thought they might get the CEO job leaves in a huff…

