I stopped releasing binaries for a number of my tools because I didn't want to pay the $100 a year for the right to do so, and I got tired of explaining how to run them without signing.
The same thing exists on Windows: developers have to code sign their binaries. It's even worse in my experience, because you have to use a token (a USB key with cryptographic signing keys in it), and that's impractical if you want your CI/CD to run in a datacenter. At my company we had a Mac mini with a Windows VM and a code signing token plugged in, just for the purpose of signing our macOS and Windows binaries.
Another solution, not mentioned in the article, is that users of both macOS and Windows should be able to easily trust the certificate of a third-party developer, through a process integrated into their OS that explains the risks but can also be understood and trusted, so that developers can self-sign their own binaries at no cost without needing the OS vendor's approval. Such a tool should ideally be integrated into the OS, but ultimately it could also be provided by a trusted third party.
I struggled with a similar problem recently. You can use osslsigncode to sign Windows binaries from Linux. It is also possible, with some pissing about, to get everything to work hands-off.
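For anyone going this route, a minimal sketch of the signing step (the certificate file, password variable, product name, and timestamp URL below are all placeholders, and flags can vary a bit between osslsigncode versions):

    # Sign a Windows PE binary from Linux using a PKCS#12 certificate
    osslsigncode sign \
      -pkcs12 codesign.pfx -pass "$CERT_PASSWORD" \
      -n "My Application" -i "https://example.com" \
      -h sha256 \
      -ts http://timestamp.digicert.com \
      -in myapp-unsigned.exe -out myapp-signed.exe

    # Sanity-check the result
    osslsigncode verify -in myapp-signed.exe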
In the end we went with DigiCert KeyLocker to handle the signing, using their CLI tool, which we can run on Linux. For our product we generate binaries on the fly when requested and then sign them, and it's all done automatically.
Azure Key Vault - even in the ‘Premium’ HSM flavour - can’t actually prove the HSM exists or is used, which doesn’t satisfy the requirements the CA has. In theory, it shouldn’t work - but some CAs choose to ignore both the letter and the spirit of the rules.
Even Azure’s $2,400-a-month managed HSM isn’t acceptable, as they don’t run them in FIPS mode.
Nope. Notarization is not code signing. It’s an extra step, after code signing, where you upload your software to Apple’s servers and wait for their system to approve it. It’s more onerous than code signing alone and, with hindsight, doesn’t seem to have been offering any extra protection.
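For reference, the macOS flow looks roughly like this (the identity, profile, and app names are placeholders, and it assumes credentials were stored beforehand with xcrun notarytool store-credentials):

    # 1. Code sign with the hardened runtime and a Developer ID identity
    codesign --force --options runtime --timestamp \
      --sign "Developer ID Application: Example Corp (TEAMID1234)" MyApp.app

    # 2. Zip the app and upload it to Apple's notary service, waiting for the verdict
    ditto -c -k --keepParent MyApp.app MyApp.zip
    xcrun notarytool submit MyApp.zip --keychain-profile "notary-profile" --wait

    # 3. Staple the returned ticket so Gatekeeper can verify it offline
    xcrun stapler staple MyApp.app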
It's not the same, but in practice it's also not so different. Microsoft keeps track of how many times a certain executable has been run, and only after a certain threshold does the executable become openable without hunting for tiny buttons. The kicker: this also applies to signed binaries.
Microsoft will upload these executables to the cloud by default if you use their antivirus engine ("sample collection").
In a way, Microsoft is building the same "notarisation database", but it's doing so after executables have been released rather than before. Many vendors and developers will likely add their executables to that "database" simply by running them on a test system.
On the other hand, SmartScreen can be disabled pretty easily, whereas macOS doesn't offer a button to disable notarisation.
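(For what it's worth, the machine-wide off switch is a registry value; the exact key and value name below are from memory and may differ between Windows versions, so treat this purely as a sketch:)

    rem Disable SmartScreen checks for apps and files (value name from memory,
    rem may vary by Windows version); run from an elevated prompt
    reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer" ^
      /v SmartScreenEnabled /t REG_SZ /d Off /f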
The important part is that once you have a code signing certificate, you can sign your executable independently, offline, without involvement from Microsoft, which isn’t possible with Apple’s notarization.
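For illustration, offline signing with a locally held certificate looks something like this (file names, password variable, and timestamp URL are placeholders; the optional timestamp contacts the CA's timestamp server, not Microsoft):

    rem Sign with a local PFX file; nothing is uploaded to Microsoft
    signtool sign /f codesign.pfx /p %CERT_PASSWORD% ^
      /fd SHA256 /tr http://timestamp.digicert.com /td SHA256 MyApp.exe

    rem Verify the Authenticode signature afterwards
    signtool verify /pa MyApp.exe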
Microsoft's notarisation sounds fully automated and transparent, while Apple's is more political and hands-on. Individual apps having their notarisation slowed to a glacial pace because the platform owner doesn't like them doesn't seem to happen in Microsoft land.
Wasn't there even a story some time ago about how some completely legit, legal, above-board app to virtualize old (pre OS X) versions of Mac OS got rejected by Apple's notarization process?
I'm honestly not even sure it's about denying competitors anything. It feels more like denying their users. Apple has a long history of intently denying users the ability to do what they want LONG before any potential App Store competitors appeared.
Notarization is the same for macOS and iOS AFAIK. Both platforms have a separate app store review process that's even more strict than the notarization process.
> Notarization is the same for macOS and iOS AFAIK.
Assuming the basic facts are straight, the linked story explicitly proves this is false:
> UTM says Apple refused to notarize the app because of the violation of rule 4.7, as that is included in Notarization Review Guidelines. However, the App Review Guidelines page disagrees. It does not annotate rule 4.7 as being part of the Notarization Review Guidelines. Indeed, if you select the “Show Notarization Review Guidelines Only” toggle, rule 4.7 is greyed out as not being applicable.
Rule 4.7 is App Review Guidelines for iOS, so this would be a case of failing notarization for iOS App Review Guidelines, which means the policies (and implementation) are different between platforms.
(Of course there's no such thing as "Notarization Review Guidelines" so maybe this whole story is suspect, but rule 4.7 is the App Review Guidelines rule that prohibits emulators.)
The point is that notarization plays the same role for both platforms: checks whose purpose is to make sure that the software won't harm the user's device, unrelated to the App Store review process. Both platforms have an additional App Store review process which is significantly more strict, and the notarization process isn't supposed to involve App Store review for either platform.
When Apple denies notarization for bullshit reasons on one platform, it makes me highly suspicious of their motivation for notarization on all platforms.
Their decision to use the same word for both is enough for me to treat them as the same. Apple has tried to convince people that notarization exists for the user's benefit; the iOS implementation of notarization has convinced me that that's not the case.
The bigger difference is that Apple isn't just checking for malware, it's checking for conformance with various APIs, manifest requirements and so on. Not as strict as the iOS App Store, maybe, but it will refuse to notarize if it detects use of unsanctioned API calls.
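If you're curious what a check like that could plausibly look at, you can inspect your own binary's imports locally before submitting; this is only a rough local approximation, not Apple's actual process, and the app path is a placeholder:

    # Frameworks and libraries the binary links against
    otool -L MyApp.app/Contents/MacOS/MyApp

    # Undefined (imported) symbols -- what the binary actually calls into
    nm -u MyApp.app/Contents/MacOS/MyApp | sort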
You don't even need signing for Microsoft's system to do what it does - it can operate on unsigned code, it's all hash based.
Is there a concrete example of this? We know this isn't blanket policy, because of a recent story (https://news.ycombinator.com/item?id=45376977) that contradicts it. I can't find a reference to any macOS app failing notarization due to API calls.
I have experienced it myself, but that was some years ago, so it may not be current. I think it was things they were trying to deprecate, which are now fully gone; it was around the time they introduced the Hardened Runtime, 2018-19 ish.
Notarization doesn't blanket block all access to private APIs; but the notarization process may look for and block certain known accesses in certain cases. This is because notarization is not intended to be an Apple policy enforcement mechanism. It's intended to block malicious software.
So in other words, using private APIs in and of itself isn't an issue. Neither is it an issue if your application is one that serves up adult content, or is an alternate App Store, or anything else that Apple might reject from its own App Store for policy reasons. It's basically doing what you might expect a virus scanner to do.
Yeah, I don't disagree with any of that, but I'm looking for explicit evidence that it's true (right now it sounds like it's just an assumption): e.g., either examples of apps failing notarization due to API calls, or Apple explicitly saying that they analyze API calls. Without that it sounds like we're just guessing.
I have the opposite experience: on macOS you can guarantee what users will see when you distribute your notarized app, while on Windows you cannot for an indeterminate amount of time (until the reputation threshold is reached).
How often do you notarize your apps? Why does the speed matter at all? In my case it takes 2 seconds for the notarization to complete.
The length of time notarization takes depends primarily upon how large and complicated your app is, and how different it is from previous versions of the same application you've previously notarized. The system seems to recognize large blocks of code that it's already analyzed and cleared and doesn't need to re-analyze. How much your binary churns between builds can greatly influence how fast your subsequent notarizations are.
A brand new developer account submitting a brand new application for notarization for the first time can expect the process might take a few days; and it's widely believed that first-time notarizations require human confirmation, because they definitely take longer if submitted on a weekend or on a holiday. This is true even for extremely small, trivial applications. (Though I can tell you from personal experience that whatever human confirmation they're doing isn't very deep, because I've had first-time notarizations on brand new developer accounts get approved even when notarizing a broken binary that doesn't actually launch.)
And of course sometimes their servers just go to shit and notarizations across the board all take significantly longer than normal, and it's not your fault at all. Apple's developer tooling support is kinda garbage.
“Notarize your macOS software to give users more confidence that the Developer ID-signed software you distribute has been checked by Apple for malicious components. *Notarization of macOS software is not App Review.* The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”
⇒ It seems notarization is static analysis, so they don’t need to launch the process.
Also, in some sense a program that doesn’t launch should pass notarization because, even though it may contain malware, that’s harmless because it won’t run.
It's more akin to an enforced malware scanner, at least in principle; a kind of mandatory VirusTotal with a stapled certificate.
In practice though they use it to turn the screws on various API compliance topics, and I'm not sure how effective it is realistically in terms of preventing malware exploits.
> doesn’t seem to have been offering any extra protection.
How would this be measured?
Since no one has pointed it out here, it seems obvious to me that the purpose of the notarization system is mainly to have the code signatures of software so that Apple can remotely block any malware from running. (Kind of unsavory to some, but probably important in today's world, e.g., given Apple's reach with non-technical users especially?)
Not sure how anyone external to Apple would measure the effectiveness of the system (i.e., without knowing what has been disabled and why).
There's a lot of unsubstantiated rumors in this comment thread, e.g., that notarization on macOS has been deliberately used to block software that isn't malware on macOS. I haven't seen a concrete example of that though?
Disabling malware via hash or signature doesn't require the Notarization step at all. The server can tell clients not to run anything with hash xxyyzz and to delete it. I mean, just think about it. If disabling stuff required the Notarization step beforehand, no anti-malware would have existed before Notarization. Nonsense.
I think notarization is just a more automated way to do this approach; otherwise Apple has to hunt down all the permutations of the binary themselves. It seems like it just simplifies the process? (It makes it a whitelist rather than a blacklist, so it's certainly more aggressive.)
I highly suggest trying Azure Trusted Signing on a CI system with Windows boxes (I use GitHub). Windows signing was an expensive nightmare before, but is now relatively painless and down to $10/mo (which isn't cheap, but is cheaper than the alternatives).
Azure Trusted Signing is a crapshoot. If you can get it running, it's easy and fast and great. But if you run into any problems at all during the setup process (and you very well might, since their onboarding process is held together with duct tape and twine), you're basically left for dead, and unless you're on an enterprise support plan, you're not going to get any help from them at all.
Last time I checked it's still US/Canada only. Luckily I only needed code-signing for an internal app, so we just used our own PKI and pushed the certs over MDM.
It’s also limited to companies that have a proven lifespan of at least 3 years IIRC (you have to provide a DUNS number). They may have reopened it for individuals, but that means your personal name is attached to every binary.
> The same thing exists on Windows, developers have to code sign their binaries.
> Another solution that is not mentioned in the article is that users of both macos and windows
The article is actually about notarization on iOS, which is vastly different from notarization on macOS. On iOS, every app, whether in the App Store or outside the App Store, goes through manual Apple review. But apps distributed outside the App Store have fewer rules.
FTA: “Apple’s complete review of apps – known as “notarisation” process - a mandatory step for distributing any software on its platforms, represents the very gatekeeping behaviour the DMA was written to prevent.”
Notarization doesn’t involve a complete review (https://developer.apple.com/documentation/security/notarizin...: “Notarization of macOS software is not App Review. The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly.”)
I also expect Apple will argue that requiring code to be notarized is explicitly allowed under the DMA, based on section 6.7:
“The gatekeeper shall not be prevented from taking strictly necessary and proportionate measures to ensure that interoperability does not compromise the integrity of the operating system, virtual assistant, hardware or software features provided by the gatekeeper, provided that such measures are duly justified by the gatekeeper.”
So, the discussion would have to be on whether this is strictly necessary and proportionate, and whether Apple duly justified that.
I think “strictly necessary” is a bit at odds with defense in depth (https://en.wikipedia.org/wiki/Defense_in_depth_(computing)), where you explicitly add redundancy to improve security, so we’ll see how a judge rules on that, but I can see them accepting it if Apple argues they’ll implement a similar feature on-device instead if they have to.
Suffered that back in the day with an Electron desktop app. Not to mention that the notarization and signing integration itself is completely broken. The first time you submit a binary it can take DAYS to process, and setting everything up to work properly with GitHub Actions CI/CD is absurdly time-consuming. It's ridiculous, and if you add this new notarial verification policy on top of that... In the end it's just Apple being Apple.
Can the FSFE also sue Google, to try to prevent them from forcing the registration of all developers who want to install apps on any Android phone outside of the Play Store?
Again, I would happily donate to such an initiative before it is too late!
I suppose this kind of notarization across all digital platforms will take on even more importance once the EU CRA (Cyber Resilience Act) takes full effect at the end of 2027.
Free/libre refers to user freedom. Mandatory licensing would restrict developer freedom in favor of user freedom, a common feature of consumer protection laws.
As an iOS user, I love this and you are free to hate me for it. It keeps my grandma safer from scams. This is why I bought her an iPhone.
I don't want to hear any of the usual "don't use sideloading if you don't like it". I don't want it to exist so nobody can talk my grandma into installing a fake bank app over the phone, like they did to her once when she had an android phone and stole all her money.
Yes this is not foolproof still, some scam apps might make it past notarization. Just like cover fees in clubs and gates in gated communities -- it does not keep all the riff-raff away, but it helps.
"Fabricated hypotheticals"??? How did you like living through the 1990s and early 2000's when Windows was an unfettered vector for viruses. Your position is elitist at best. Only the anointed few who know how to make keep their systems safe from exploits shall have access to computing. Ask your friends in who are not in the software business how they like checking the cryptographic signatures of the binaries that are about to install from the command line. What they don't know how? Well no compute for them.
It's not "elitist", it's principled. The only question that matters is if a business practice is violating fair market principles and relevant laws or not. "What about my grandma" is not an argument and not relevant to the judges' judgement. The world doesn't revolve around OP's grandma.
Furthermore, the most potent attack vector was, is and will always be social engineering, which is much more likely on smartphones than on dumb phones. So if it's not concern trolling, then the obvious move is to buy a dumb phone for grandma instead of depriving everybody else of their freedoms and rights.
>Yes, i imagined this happening, my grandma imagined her bank account being empty, and the police imagined filing a report.
People have vivid imaginations and still none of that is relevant to what constitutes an anti-competitive business practice that is in violation of fair market principles and relevant laws.
>I cannot buy her a dumbphone because we use whatsapp to keep in touch and google photos to share photos with her.
Good news! Yes you can! There are dumb phones with whatsapp and you can share images on whatsapp too! https://www.dumbphones.org/ - check the "Whatsapp Support" filter option.
WhatsApp does not handle photo albums. Google Photos does. And she is perfectly happy with her iPhone. If you aren’t, Google makes Android phones you can use, where you can sideload anything you wish.
Sure, whatever excuses you can conjure up to keep the propaganda going.
>And she is perfectly happy with her iPhone. If you aren’t, Google makes Android phones you can use where you can side load anything you wish.
It is irrelevant what system I am personally using, since it's about the principle and not about personal preferences; which you clearly have no interest in understanding. This is also where your "just get an Android" propaganda falls apart. Google learned from Apple that they can get away with much more anti-competitive BS, so they followed suit: https://www.makeuseof.com/androids-sideloading-limits-are-an.... This is where big tech shills lead society: towards depriving everybody of their rights and freedoms with their bogus apologia.
Finally, the world still does not revolve around your grandma. And if she can use all those apps, then she can also learn to just keep using the Apple Store. None of your contrived fear mongering narratives hold up to scrutiny, any astute observer will see right through them.
The submitted article is about iOS, not macOS. Apple unfortunately used the same word "notarization" on both platforms, but the processes are not even remotely similar. Perhaps the confusion was deliberate, but in any case, many commenters here are confused and mistakenly believe that iOS notarization is like macOS notarization.
iOS notarization is still manual review by Apple, but with fewer rules and restrictions.
> If you’ve opted into alternative distribution for customers in the European Union, you can choose to make your app version eligible for distribution on alternative app marketplaces or websites only by selecting to have it evaluated based on the Notarization Review Guidelines (a subset of the App Review Guidelines). Otherwise, App Review uses App Review Guidelines to evaluate your app version to make it eligible for distribution on the App Store, alternative app marketplaces, and websites if approved.
DMA is about increasing competition between app stores. It is not about giving "freedom" to people. Notarization is an independent process from running an app store on Apple's platform.
Then if Apple chooses to serve this market demand by allowing unnotarized apps to be sold in their store, they must allow third party app stores to also sell unnotarized apps.
The key thing here is that the Apple App Store and third-party app stores must be on a level playing field to compete on.
The problem with this reply is it starts with the phrase "if Apple chooses". My point is it stopped being their choice when they sold the hardware to someone else.
Notarization doesn't involve any sort of editorial control. It's just a virus scanner that's run up front and then stapling an attestation to your application that it passed the scan. It does not involve looking at the content of your app and making any value judgements about it; it's purely an automated static analysis system checking your application for known malicious code.
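On macOS at least, you can see that attestation from the command line (the app path is a placeholder, and the exact spctl output wording may vary by OS version):

    # Check that a notarization ticket is stapled to the app
    xcrun stapler validate MyApp.app

    # Ask Gatekeeper how it would assess the app; a notarized app is typically
    # reported with a source along the lines of "Notarized Developer ID"
    spctl -a -vv MyApp.app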
UTM wasn't denied notarization because some virus scanner found that it was a virus, but because it violated App Store guidelines. That's editorial control.
You're talking about notarization on macOS. Notarization on iOS is vastly different. On iOS, notarization is more or less App Store review but with fewer rules.
Honestly, iOS notarization really muddied the waters. IMO, because Apple decided to name them the same and thus presumably considers them the same, we should be just as critical of and worried about notarization on the Mac as we are of notarization on iOS.
That doesn't matter as it also gives editorial control over the Apple App Store itself. The DMA is not about giving full editorial control to competitors. It's about allowing for competition on a level playing field for alternate app stores. Since the Apple App Store also has to only sell notarized apps, they do not have an unfair advantage.
Everything you can get in an alternative app store has to be approved by Apple and they only approve stuff they'd allow in their store, making it not an alternative.
To be an alternative to the Apple App Store, it just needs to be able to match the abilities of the Apple App Store. Again, the DMA is not about freedom, but about fairness between app stores on the platform. Apple can define the playing field, such as selling notarized apps only, but it must be a level playing field among all app stores.
Software freedom, at least for end users, is a smokescreen, too. I can turn your argument around: "you want more ransomware because of a few OSS enthusiasts?" What we need is a way to curb the excesses, such as high entrance barriers to the store.
A phone/tablet is a tool, with very intense usage, and huge privacy value, not an engineer's toy.
The real smokescreen is this freedom vs security false dichotomy. If you give up freedom for the promise of security, you get neither. Look at the App Store. It's full of harmful garbage designed to extract value and waste your time by any trick necessary. It's one step short of ransomware. Oh, unless you use an app for your important documents, then it comes under new management and demands you start paying monthly or lose your stuff. Suddenly that lack of freedom to continue using an old version of the app or to dig around its internals and pull out your data becomes a loss of security. It's fine though, because this type of ransomware is totally legal and inline with your benevolent platform dictator's policies.
Your argument falls apart when you consider iPhones' 60% market share. People have spoken out about whether they want dangerous, uncontrolled third-party apps on their phones.
This is called the tyranny of the majority, where you're arguing that because most people don't care about freedom, therefore freedom doesn't have value. It's not a sound argument, much like saying freedom of speech doesn't matter because most people have nothing to say.
Editing to add: it seems particularly ironic that you think iPhone users make great purchasing decisions when they buy the phone, but are incapable of making good decisions when selecting software. What accounts for the discrepancy?
I don't care what the riff-raff think; it is morally wrong and defies human freedom and dignity to require everyone to walk around with a locked-down surveillance device in their pocket in order to function in the economy.
60% of society could be raptured tomorrow and the world would be better off.
Just in case you unironically don't understand this and aren't just playing it up:
Allowing third-party installations does not mean uncontrolled third-party apps. It merely means users have the option to install software on their phones - which continues to limit the software's capabilities until the user is prompted to allow each one.
You could argue "but a braindead person can randomly go on a phishing website, randomly download some .app file and suddenly - through magic go through a theoretical installation dialog to finally explicitly grant this malware problematic permissions... And I'm sure there are going to be people that will do exactly that... But without it, they'll still manage to do the same to the same effect, just without the app installation by inputting their bank credentials in a phishing site or similar
The thing you're citing as a problem solved by disallowing app installs isn't actually solved - and it would not become more problematic either.
Finally, the fact of the matter remains that almost nobody would actually use the capability to install from third party stores, as you've correctly insinuated. But if anything, that should be another proof that allowing third party installs doesn't reduce security.
People just like to have everything provided to them from a single source, and will usually pay a premium for that.
Most people are stupid and short-sighted. Pointing to the stupid in support of your argument doesn't help it.
And, the app store does absolutely nothing to prevent "dangerous" apps. Apple doesn't review the code. In fact, if your code is reviewable, it's even harder to get it on the app store.
At the end of the day, the App Store and Play Store are filled with adware, spyware, and other malware - because Apple and Google like it that way. That's what they want. They don't give a single flying fuck about your security. They care about extracting 30% while simultaneously doing as little as possible. That's completely at odds with security, yes, and they know that. They just don't care.
What point are you even trying to make? That's not a counter-argument unless you assume that people in aggregate always make great purchasing decisions. Wait until you hear about cigarettes, heroin, slot machines, snake oil, tulips, and the rest of the effectively infinite list of fun and unique ways people make terrible choices or are bamboozled into acting against their own and others' interests. This is a comment thread about protecting people from scams. The premise acknowledges that people make widespread poor decisions. Is it so unthinkable that buying an iPhone is one of them?
They are using it as a proxy for "people with low technical skills" (which is a specious argument since it was a friend of my parents who got me into programming and he remains one of the best I've ever known) and making the usual argument that we should limit control of our devices to make it safe for them.
I actually don't have (much) of an issue with walled garden approaches as long as the wall has a gate that is easily opened, give me an OS level toggle with a warning of "Here be dragons" and I can live with it - it's not ideal but it's not a terrible trade off.
It's something Android has had previously (but they seem to be trying to lock that gate) and iOS less so.
How about instead of a single os level toggle you get a trillion dollar company, renowned for their high quality design, invested in providing the best possible UX while respecting the user as the owner of the device?
Which is something I find very annoying, because I know a lot of people who are parents or grandparents and have greater technical skills than their children.
Right, if we could educate users on the tools they use, and if the trillion dollar companies could provide tools to help community members protect each other, we wouldn't be here. Apple doesn't have to be a dictator if they would help the community support each other. Instead they took the easy way out of stripping freedoms from everyone so they can control every device out there. It's a minor inconvenience to be involved in protecting vulnerable people in our community, it's tragic that people just said Apple should take that role.
They don’t. You can still run any software you’d like. You just get warnings, so people like parents don’t just randomly open malicious programs from the internet.
App developers do know. I can't say that I've ever worked on an app where this request has been made. Neither the App Store Connect Agreement[0] nor the Apple Developer Agreement[1] stipulates that the developer can be compelled to surrender their source code.
All the relevant agreements can be found here, so if there's something that specifies this kind of overreach, I'd both be very surprised and interested.
“If you are required by law, regulation, or court order to disclose any Apple Confidential Information (which can include requests related to legal investigations or audits), you agree to give Apple prompt notice and to cooperate in seeking a protective order or confidential treatment of such information”
They haven't read the document properly. Here's the definition:
> any information disclosed by Apple to you in connection with Apple Events will be considered and referred to as “Apple Confidential Information” and are subject to the confidentiality obligations of this Agreement
The definition of Apple Events:
> As an Apple Developer, you may have the opportunity to attend certain Apple developer conferences, technical talks, and other events (including online or electronic broadcasts of such events) (“Apple Events”).
> I still don’t see why you would want your parents to run untrusted software on their devices, but you do you I guess.
I don't trust Apple's App Store review. They've approved countless scams that have tricked Apple users out of a lot of money, perhaps $billions in total.
Sadly, about 98% of real-world users are going to fall for scams, ransomware and the like. They are not mentally challenged; there are just so many traps/fakes/tempting things that we as IT people are more aware of (but even we still fall for some).
We also can't expect every person to be able to check every single thing they do: how do you check whether some food or drug you get is good or not? You can't, really; you have to trust someone who knows.
It’s a bit like the Elizabeth Warren toaster analogy. If you bought a toaster with shoddy wiring and it caught fire and burned down your house, everyone would blame the manufacturer and not sneer at you online for not learning electrical engineering and not checking the wiring yourself before using it.
It's more like if I buy a reliable toaster, but I buy bread that's secretly poisoned by the manufacturer and hurt myself. I'm not gonna demand the toaster maker add a poison sensor to the toaster and say "how dare they didn't protect me!"
I don't buy this in the first place. It is reasonable to expect consumers to do some background research into the products they buy. In fact, it is the only way capitalism can function as a meritocracy.
Society should be more dangerous as a means to force people to learn more about technology they rely on.
I'm not sure what you think is so harmonious about it. I think there are gobs of iPhone users that wanted a free store, created Cydia, had it shut down, and have been fighting ever since with tools like AltStore to try to restore their own ability to install software of their choosing on their phones. Simply searching any search engine for "build and install iOS apps locally" results in gobs of discussions where people are trying to figure out how to actually get control back of their device and work around all of Apple's restrictions.
Further, the state of affairs has steadily gotten worse over the years as Apple tightens their restrictions, adds more barriers to running apps of your choosing, and having agency over what programs you can actually run.
This is a war on general purpose computing. And sure, there are gobs of people who don't care, but there are many who do, and they're fighting for the rights of all of us. My own mother-in-law who spent thousands of dollars on Kindle books didn't understand that she couldn't ever read any of those books using anything other than Kindle, and that she could never give them to somebody else to read (like my son who doesn't have an Amazon account). These people making these decisions are not well-informed. They assume they're not being screwed over, but they're in for a rude awakening.
We are rapidly moving to a world where there are no options for people to run software of their choosing on mobile devices. And we already know that the mobile manufacturers operate at the behest of the US government. This is not a pattern that I think is going to serve us well in the coming decades.
At some point, you have to figure out who your mobile devices are working for.
How can we trust software anymore? Open source projects are being sold to bad actors. Python default repos are full of malware. Originally blessed and trusted apps are being bought by software companies in dodgy countries. It seems like we can only trust big software companies like Microsoft and Oracle.
I'm building an application that allows you to send a file to your colleagues. That's hardly a revolutionary or unusual use case, and it definitely requires network access and full access to the local file system. I also need the ability to lock files, writing file locks anywhere on the system, and I need to be able to index the contents of files.
Not only are all of these functions and corresponding permissions completely standard for all kinds of applications, they belong to the core of what any system that calls itself an "operating system" should deliver to developers and end users.
You can see it in action. I have an M1 Ultra Mac Studio, an insanely powerful machine, and when building open source software, actual compilation flies but the autoconf step crawls, because it has to build test binaries to test OS features, and notarization slows that down dramatically.
Notarization is completely optional when building any OSS software on a Mac, and not part of any default build process I know. A Mac can sign builds for running locally, a process which is fast, completely local, and doesn't require building test binaries or anything like that. Even a Mac building for an iPhone in developer mode has a local cert it can use, and doesn't require notarization.
Notarization is only needed when distributing binaries to others. Personally I do it once a month for the Mac app I distribute.
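For local-only builds, an ad-hoc signature is all you need and it never touches Apple's servers; a quick sketch (the binary path is a placeholder):

    # Ad-hoc sign a locally built binary ("-" means no real signing identity)
    codesign --force --sign - ./mytool

    # Confirm the signature is valid
    codesign --verify --verbose ./mytool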
The post I wrote to point people at anyway:
https://donatstudios.com/mac-terminal-run-unsigned-binaries
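(The underlying mechanism, in any case: Gatekeeper only intercepts files carrying the quarantine attribute, so clearing it per file is one way through. The path below is a placeholder.)

    # List extended attributes; downloaded files carry com.apple.quarantine
    xattr -l ./some-downloaded-binary

    # Remove the quarantine attribute so Gatekeeper no longer intercepts it
    xattr -d com.apple.quarantine ./some-downloaded-binary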