I see where the SDL author is coming from, but is it really that hard to get a local GitLab instance running? Or even just to use hosted GitLab? I saw other replies about how GitHub has a network of open-source developers, but that didn't bother the SDL author before, and I don't see why it bothers him now.
You missed a key point the SDL author raised: they're tired of maintaining all the ancillary software around a project. Every minute they waste troubleshooting a CI/CD failure, a Bugzilla limitation, etc. is a minute taken away from their actual project. And since this is an OSS labor of love and not their full-time job, it's even more infuriating to waste their free time janitoring services.
Self-hosted GitLab is fantastic, but it's far from "set it and forget it". There's significant technical overhead in keeping it updated and secure, migrating its database across major releases, scaling out the underlying hardware as project demands increase, and so on. Moving to a hosted platform like GitHub solves those problems and gets them back to being productive on their project.
What the authors are really lamenting is that they have to give up control of their project's source hosting and tie themselves to a new commercial offering, something history has shown us over and over will inevitably lead to more technical debt as platforms fade (just ask any former SourceForge or Google Code user).
I don't blame them. It's sad to see that the state of the art in self-managed OSS source hosting never progressed much beyond "become a domain expert in server operations and kludge together a suite of tools with wildly different UIs, management, and operations; also, the documentation is non-existent or wildly out of date".
If open-source organisations like RedoxOS, GNOME, and GTK (who run their own self-hosted GitLab instances) are able to self-host, surely SDL can too, thanks to the new tooling out there that automates this.
When your entire project sits on GitHub like nearly everyone else's and something goes wrong, don't be surprised when you have to tweet at the CEO of GitHub because the repository was falsely flagged for "some reason", or when your GitHub Actions, pull requests, and packages are experiencing degraded service.
I used to think Google was the "good guy" by providing so many resources for free, but I've realized how they've exploited their market dominance by essentially manipulating the masses with their ad service. We need to move away from conglomerates and start using software that doesn't sell your soul to the devil.
You were helping google, but you were also helping everyone who used their maps from that day into the future. Your comment gives off the air of someone upset they helped the wrong party in a zero-sum game. IMO, this is clearly a positive sum one, and you shouldn’t despair at improving little corners of the world just because actors you don’t like will also benefit.
I DO see it as zero sum. OSM and Google maps compete for a limited supply of users. It's the "wrong" team because when Google gets users, they make money. When OSM gets users, the map gets better and people are freer from abuse and manipulation.
When you pay, you may still be a product. There is no (economic) law that dictates that when you pay, your data may not be sold.
Paying customers might disappear if they find out you make additional profit through data mining or data sales. So there is more incentive not to sell or mine data, but it is no guarantee.
The only guarantee is when the technology ensures the service provider never has the data at all, e.g. through end-to-end (E2E) encryption.
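As a toy illustration of that guarantee (a one-time-pad XOR sketch, not a real E2E protocol; real systems use authenticated ciphers and key exchange, but the principle is the same: the provider only ever stores ciphertext):

```python
import secrets

def xor_bytes(data: bytes, pad: bytes) -> bytes:
    # One-time-pad XOR: the pad must be at least as long as the data
    # and must never be reused.
    return bytes(a ^ b for a, b in zip(data, pad))

plaintext = b"private note the provider never sees"
client_key = secrets.token_bytes(len(plaintext))  # lives only on the client

ciphertext = xor_bytes(plaintext, client_key)  # encrypted client-side
server_storage = {"blob": ciphertext}          # provider holds opaque bytes

# Only the client, holding the key, can recover the plaintext.
recovered = xor_bytes(server_storage["blob"], client_key)
assert recovered == plaintext
```

Without `client_key`, the blob on the server is statistically indistinguishable from random noise, so there is nothing for the provider to mine or sell.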
Exactly. It is more lucrative to provide a product for free and make money mining data than to sell a paid product, and this makes it even harder for competitors to enter the market without dying instantly. The next logical step is not paid products, but open-source ones with distributed data storage.
Are you aware of software that's developed as a hobby, or as part of an actual product (without making the user the product), or as a limited version demonstrating the full product?
The code doesn't make sense, weird-ass abstractions, but it works beautifully. I never could understand where the dependency injection parameters came from lmao
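For anyone else mystified by that: no framework is named here, but most DI containers do roughly the same trick, inspecting the constructor's type annotations and building each parameter for you. A minimal Python sketch of the idea (the classes are hypothetical examples, not from the codebase above):

```python
import inspect

def resolve(cls):
    """Toy DI resolver: construct `cls`, recursively building every
    constructor parameter that carries a class annotation."""
    sig = inspect.signature(cls.__init__)
    kwargs = {
        name: resolve(param.annotation)
        for name, param in sig.parameters.items()
        if name != "self" and param.annotation is not inspect.Parameter.empty
    }
    return cls(**kwargs)

class Database:
    def __init__(self):
        self.dsn = "sqlite://memory"

class UserService:
    # `db` is never passed explicitly anywhere in application code;
    # the resolver injects it by matching the `Database` annotation.
    def __init__(self, db: Database):
        self.db = db

service = resolve(UserService)
print(service.db.dsn)  # the "mystery" parameter came from the resolver
```

Real containers add registration, scopes, and caching on top, but that reflection step is where the parameters "come from".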
Absolutely. Really, its only limitation is where the devs spend their time: what would give current Godot projects more support while also introducing new features.
The hardest part of 3D is the shading pipeline and making it configurable, scriptable, or both, in a way that allows artistic freedom and expression.
My core gripe with Unity when it first launched was that you could tell a game was made with it (you can still kinda tell today): all the published games felt the same in how they ran, played, etc.
Those that made their games look unique and gave them that polish were the better sellers, for sure.
Godot will get there. I spent a long chunk of my career doing a 3D game engine side-project so I get it.
The good news is that with PBR, how a rendering pipeline works is becoming a bit more standardized.
I just think it's really hypocritical of the government to advocate for freedom while literally making the market less free. The only markets that should be regulated are the ones that would otherwise be in market failure, like medicine.