This is strange; I wasn't expecting such a difference on CPU-only benchmarks. In my experience as a software engineer, Linux is much faster than Windows, mainly because its file system is more performant (e.g. builds run noticeably faster). For CPU-intensive tasks I always assumed the two were quite similar.
On the other hand, since upgrading from kernel 6.6 to 6.11 I've noticed a great improvement in UI responsiveness on my old X1 Tablet, and I doubt this Ubuntu even uses a recent kernel.
Disclaimer: I haven't used Windows in a long time.
That's quite a question. Lambda might take your email, mainly for notifications (this feature is still underway). As for privacy, I can guarantee you that Lambda does not generate feeds the way traditional social media apps do. Lambda just takes the list of users you follow and shows their latest posts. That way it has no need for an AI model that generates a feed driven largely by click-through rate.
You should try that when things are going well, not after purchasing power has collapsed by tens of percent in under five years and 80% of under-35s cannot afford to buy a flat.
"SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax near '' at line 21."
when line 21 is 175 characters long. Also the fact that anything other than InnoDB is barely real ACID, the stagnant development, and the lack of support for JSON, UUID, and many other things.
MySQL being user-friendly (and especially beginner-friendly) is a massive hoax.
Just because I spent a few hours in the postgres parser guts lately:
Postgres doesn't quite have the span of the error, just a single location :). We can of course measure the length of the token at the error point, but that isn't quite the same, since the cause of an error doesn't have to be a single token.
postgres[122581][1]=# SELECT * FROM kdjfkdj;
ERROR: 42P01: relation "kdjfkdj" does not exist
LINE 1: SELECT * FROM kdjfkdj;
                      ^
Wow. So much effort reinventing the wheel. The transactional approach is obviously not suited for integration testing, since commits/rollbacks are part of the package too.
There is testcontainers + flyway/liquibase. Problem solved.
Testcontainers is anything but fast in my experience. Even with the image pre-pulled, the most basic MySQL testcontainer takes about 5 seconds from calling RunContainer to being ready to go.
I use Testcontainers in my Java Spring Boot application, coupled with Liquibase. I have it set up so that the PG instance is created once, with Liquibase taking it from vanilla PG to a fully created, empty application database; then all the integration tests reuse the same instance by overriding the stop() method with a no-op.
Running the full suite of tests takes about 5-10 minutes on a relatively low-powered build server, and about 3 minutes on my more robust development workstation. Getting a Testcontainer running an empty DB for integration tests ends up being a very small portion of that time.
Most of the time any work performed is done in a transaction and rolled back after the test completes, though there are some cases where I'm testing code that requires multiple transactions with committed data needing to be available. The one ugly area is that in these outlier test cases I need to add code to manually clear out that data.
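The rollback pattern described here boils down to wrapping each test in an explicit transaction. A minimal sketch (table and data names are made up for illustration):

```sql
BEGIN;

-- test setup: insert fixture data
INSERT INTO users (name) VALUES ('fixture-user');

-- the test's assertions run here; the row is visible inside this transaction
SELECT count(*) FROM users WHERE name = 'fixture-user';

-- instead of COMMIT, throw everything away so the next test sees a clean DB
ROLLBACK;
```

This only works when the code under test doesn't COMMIT on its own, which is exactly the outlier case mentioned above.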
You're describing a pretty different testing strategy, where you use only one database and ensure isolation through transactions. It's a good strategy for your use case, but it's not what the article talks about (they do mention the transactional approach and explain why it doesn't work for them).
The article is discussing the use case where each test/set of assertions gets a clean DB, i.e. hundreds of DBs per test run.
Are you talking about a single DB reused for each test? That is of course no problem...
Our suite takes 3 to 10 minutes to run too. It provisions about a hundred databases (each DB template clone takes about 1 second; the wall clock comes from running things in parallel).
Also, every time we change code and run a related test locally, we create a new DB for that test. If that took more than 1 second I would go crazy.
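A one-second clone like that presumably relies on Postgres template databases, roughly like this (database names are illustrative):

```sql
-- done once: build the template with all migrations applied
CREATE DATABASE app_template;

-- per test: a cheap copy of the template
CREATE DATABASE test_421 TEMPLATE app_template;

-- after the test
DROP DATABASE test_421;
```

One caveat: the clone fails if any session is still connected to app_template, so the template database has to stay idle between clones.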
https://book.mixu.net/distsys/ (Distributed systems: for fun and profit)
You should definitely start nowhere but here. It's meant to be a getting-started doc and does a great job at that.
The EU is not working for Firefox; I'm not sure why you'd expect that. Now people can choose, and that includes the choice to stay in the same garden (less walled now).
What makes you think they expect the EU to work for Firefox? That isn't indicated anywhere in the comment. Is your world view already limited to WebKit-, Blink-, or Gecko-based browsers?
We steered into a corner that is hard to get out of...
Who cares about the piping underneath? The only important thing is whether they all display the same thing for the same HTML & JS.
Chrome has been becoming the new IE, with websites giving the best experience on Chrome only. That's the risk.
What pipes the browser makers choose to rely on is between them and the pipe makers, I guess. The pipe makers should probably focus on the developer experience.
It's all fun and games until the websites you need (bank, local government, etc.) only support the pipes built by a tech corp in a faraway country. Diversity is good and healthy.
>until the websites you need (bank, local government, etc.) only support the pipes built by a tech corp in a faraway country
They don't get to decide this if push comes to shove. Banks and governments in European jurisdictions can obviously be forced to comply with European laws, and if there were some geopolitical question about security, you could just force them to switch to a local fork of Chromium, which, given that it's open source, is technically relatively trivial.
It's essentially the same as Linux. The overwhelming majority of commits comes from Red Hat, Huawei, Samsung, and other international corporations, which is fine because there's always the implicit option to fork. We don't need fifty different kernels, given that we're talking about open source software. In the olden days of Internet Explorer and dependence on proprietary software this argument made sense, because you could theoretically be squeezed without an ad hoc alternative, but that's not the case any more.
Everyone should care. Chromium is built to further Google’s interests.
From long-lasting first-party cookies to the ease of fingerprinting, the engine is designed to make advertising more effective. But hey, who cares about privacy.