Yes, I find it VERY hard to believe that this went on and nobody mentioned it at all. Not one disgruntled employee, not anyone mentioning it to a friend at a bar? No way.
Uhm. Just because you didn't believe it doesn't mean that it wasn't known. The CCC was talking about this stuff years ago, some of it conjecture, some of it information from old sources/anonymous sources and the like.
Now we have proof. So that's that.
Also, given the strong "disincentivization" for whistleblowing that is going on at intelligence agencies, I am not at all surprised that it took a few years before somebody went public with it.
Hell, it used to take decades before this kind of stuff became public. Just this year we were made aware that the West German spy agencies opened _all_ cross-border inter-German mail. You know, the kind of thing that we in West Germany were told only the East Germans would do. That was kept secret for more than 20 years after it stopped being done, even though thousands of people were involved.
Well, it is easy to dismiss people saying this kind of stuff as 'crackpots' or 'tin-foil-hatters'.
I personally keep to a simple rule: if something is technologically feasible[0], economically viable, and someone has an incentive to do it, it will happen, period. This rule of thumb successfully filters out weather-control crackpottery while correctly identifying surveillance capabilities (and no, it's not hindsight bias; the only thing that surprised me about the Snowden revelations was how underwhelming they were).
[0] - and keep in mind that pretty much anything is.
In what way were the revelations underwhelming? For me, what is overwhelming is their ability to coerce companies, and their willingness to do so in spite of the potential loss of trust in US companies. Such maneuvers can carry a big long-term cost: I can imagine at least some governments switching software, hardware or services to more local providers. I mean, think about backdoors: you may trust the NSA, but if backdoors exist, it is only a matter of time before they get discovered by other parties. Countries now also have the ultimate argument for the balkanization of the Internet. This can't be good, and the whole distinction between US citizens and foreigners means nothing to foreigners.
The technological ones. I was expecting (and I still believe) that they're tapping into the Internet much more thoroughly than has been revealed, and that there are many more hardware backdoors out there.
> Countries now also have the ultimate argument for the balkanization of the Internet.
I agree. This is very bad. I didn't say I was underwhelmed by the outcome (though I am by the reaction of the public and of Internet companies), just by the revealed capabilities of the NSA.
I wouldn't be surprised if they talked about it but if it's rebranded it could be seen as "Yet Another Tool That IT Provides". People talk about their favourite consumer apps, but only ever complain about the shit they're forced to use at work. In this case, the tool is working fine so there's no need to sing its praises. If it was a shitty search engine, they'd be telling everyone "we collect all this data and for the life of me, we can't even search it! what's the point of violating the constitution at all?!"
When your livelihood is based entirely on your security clearance and your ability to keep working in the security industry you make it a very big point to not talk about this stuff outside of work. I had a friend who went to work for a 3-letter agency and started consistently avoiding bars/drinking because he was afraid he'd say something. This is exactly why whistleblowing is a problem in the intelligence community - they'll sue you to oblivion and/or imprison you for saying anything at all.
In that sense, we found that all container-based systems deliver near-native CPU, memory, disk and network performance. The main differences between them lie in the resource management implementation, resulting in poor isolation and security. [1]
In my experience the measure of good logging is the ability for someone, whether yourself or an operations engineer, to easily solve problems using only the logs. They shouldn't have to refer to the source code. Developers should use this as the yardstick when asking questions about their log output.
Shame about the lack of support for older Macs. I've upgraded my late 2006 MacBook with a new SSD and the thing runs great. Best upgrade I've ever done. Now Mountain Lion isn't supported. Looks like Apple trying to increase those profits even further.
Your Mac must be one of the following models:
iMac (Mid 2007 or newer)
MacBook (Late 2008 Aluminum, or Early 2009 or newer)
MacBook Pro (Mid/Late 2007 or newer)
MacBook Air (Late 2008 or newer)
Mac mini (Early 2009 or newer)
Mac Pro (Early 2008 or newer)
Xserve (Early 2009)
> Looks like Apple trying to increase those profits even further.
Actually, it's likely more a technical reason than a financial one. IIRC, the first generation of Core 2 Duo machines shipped with 32-bit EFI, even though the processor is 64-bit. The 64-bit kernel, which is now the default and only kernel in Mountain Lion, requires 64-bit firmware, which your machine doesn't have.
They probably could have engineered the 64-bit kernel to work with the 32-bit EFI, or they probably could have shipped an EFI update for old Core 2 Duo machines, but those machines are 6 years old. I'm sure they felt their limited resources could be better spent elsewhere.
Yes, because we all know that money is the only obstacle to creating a product. You definitely don't need a lot of uber-talented employees focused in certain areas or anything.
As the comment above implies, this is getting into Mythical Man-Month territory, where adding extra people may even slow the entire project down.
Apple seems to have incredible discipline around scope, and they have delivered a series of amazing products. It's something we can all learn from.
Sure, you can't just throw money at a problem and expect it to resolve quickly. At the same time, Apple chooses which problems it wants to solve, because it has the capacity to plan for the support and engineering time required.
It isn't like these support issues are a surprise to them, they just have different priorities than fixing them.
> It isn't like these support issues are a surprise to them, they just have different priorities than fixing them.
Exactly! Like I said, I'm sure they felt their limited resources could be better spent elsewhere. Supporting 6 year old hardware is a very, very low priority, so they didn't spend any resources on it. As SJ himself said, focus is about saying "no". This video discusses OpenDoc, but it could just as easily apply to supporting 6 year old hardware: http://www.youtube.com/watch?v=H8eP99neOVs
Apple is always pushing forward. Sometimes perhaps a bit too aggressively, but they're always pushing forward.
> The idea they couldn't afford or have the capacity to hire the talent they want is ridiculous.
This is true to some degree. I'm certain that if Apple decided they were willing to pay $800K/year each, plus signing bonuses of $3 million, for a team of 4-5 sharp OS engineers, they could, in fact, resolve this. (I base the $800K on what top-flight OS/Kernel consultants make - around $400/hour - plus the money you would need to pull them away from their stock/equity packages at whatever other company they were working at, sometimes known as a "Make Whole" package.)
The problem with that, though, is that the average Apple Software/OS engineer makes around $200K/year (fully loaded) - so you would end up with a $600K (plus signing bonus) disparity, which would be unfair to existing employees.
You can't increase the salaries of all similarly qualified engineers to $800K - that would mean increasing salaries of _all_ engineers - and Apple's R&D costs would skyrocket, and all their other salaries would be out of whack.
Sometimes, even though you can buy your way out of a situation, the peripheral costs of doing so are so great that what seems to be only a $10 million problem to solve might end up costing you hundreds of millions of dollars.
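To make that back-of-envelope math concrete, here is a sketch using the figures above. The 1,000-engineer headcount is an illustrative assumption I'm adding, not a real Apple number:

```python
# Hypothetical cost sketch; the 1,000-engineer headcount is an assumed,
# illustrative figure, not real Apple data.
team_size = 5
premium_salary = 800_000        # $/year for a top-flight OS engineer
signing_bonus = 3_000_000       # one-off "Make Whole" package
avg_salary = 200_000            # $/year, fully loaded, existing engineer

# The visible cost: just hire the small elite team.
direct_cost = team_size * (premium_salary + signing_bonus)

# The peripheral cost: pay parity forces the same raise for every
# similarly qualified engineer, every year.
knock_on_per_year = 1_000 * (premium_salary - avg_salary)

print(direct_cost)        # 19000000  (~$19M, the "small" problem)
print(knock_on_per_year)  # 600000000 (~$600M/year)
```

The point survives whatever exact headcount you plug in: the one-off team cost is dwarfed by the recurring parity cost.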
So - you suck it up, stay focussed, and do what you can with the people you have.
Of course, what _most_ companies do, is instead hire some average (or, often, below average) engineers to try and do the port - they muck it up, and they ship a crappy product.
Better to do fewer things better. For example - photo stream and iTunes Match have been working well for me - but I wonder what percentage of Apple Employees use iCloud.com as their principal mail client, as compared to google employees using gmail as their primary email system?
> The idea they couldn't afford or have the capacity to hire the talent they want is ridiculous.
Actually, it isn't. Hiring people is hard, and hiring bad people is expensive. It isn't even really a problem you can pay your way out of, either; judging who will do well within an organization is incredibly difficult.
All that aside, I would be surprised if Apple didn't have internal budgets for their various projects. The latest version of OS X probably gets a budget on par with that group's market value, so it only has the hiring capacity of its tiny slice of the richest company in the world.
Not trying to refute what you said, but am curious what makes you think they are the most desired to work for? Do many of your programmer friends want to work there?
Mostly the survey that got posted to HN a month or two back about Apple being the #1 reviewed place by employees, which is, admittedly, not the same as most desired place to work.
How much money they have is irrelevant. They're not a charity. I'm sure they have enough data from OSX Lion upgrades and periodic software update checks to know that the cost of supporting old 32-bit systems was more than they would make from upgrade purchases.
Well they didn't become the richest company in the world by being bloated, slow, bureaucratic and inefficient. It's arguably what nearly killed them in the first place.
Welcome to the club of end-of-life Apple device owners. The 15 million or so first-gen iPad owners will also be joining us in October, when Apple tries to force everyone to buy yet another iPad by releasing iOS6 and dropping support for the first one 18 months after they stopped selling it. There is no technical reason: the 4th-gen iPod Touch is supported and has almost identical specs. The 3rd gen iPod Touch is also being dropped despite being almost completely identical to the iPhone 3GS, which does get iOS6.
For a company that keeps boasting about its green credentials, they do generate an awful lot of waste.
(not to mention the extra work for app developers, who can either drop 15 million users' worth of addressable market, or make use of iOS6 features - the most attractive of which, such as autolayout, have no sensible fallback)
1) There IS a technical reason. The iPad 1 has a larger screen and as such requires more RAM than the iPod 4G. Also, as for the iPod 3G, "almost" identical makes a difference.
2) Apple releases far fewer products than the Android OEMs, so I'm not sure compared to whom Apple is wasteful.
3) Personally I fail to see how autolayout is a particularly attractive feature. And I haven't found it that hard to mix and match SDKs/targets on iOS.
Re 1: The iPad has 786432 pixels vs the iPod 4G's 614400. Not that big a difference, particularly as that mainly affects the framebuffer. If each pixel in the back, front and Z buffers uses 32 bits, that's a whopping 2MB RAM difference, or less than 1% of the total. So, no technical reason, end of story.
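For anyone who wants to check that arithmetic, here it is spelled out, using the same assumptions as above (32 bits per pixel, three full-resolution buffers):

```python
# RAM difference between the iPad 1 (1024x768) and iPod touch 4G (960x640)
# framebuffers, assuming 32 bits (4 bytes) per pixel and three buffers
# (front, back and Z), as stated above.
ipad_pixels = 1024 * 768        # 786432
ipod_pixels = 960 * 640         # 614400
bytes_per_pixel = 4
num_buffers = 3

diff_bytes = (ipad_pixels - ipod_pixels) * bytes_per_pixel * num_buffers
print(diff_bytes)                     # 2064384 bytes
print(round(diff_bytes / 2**20, 2))   # 1.97 MiB, i.e. roughly 2MB
```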
Re 2: The android makers don't seem to brag about being environmentally friendly. Apple can be held to higher standards as they're making that a supposed selling point.
Re 3: I guess it depends on the apps you're making. For autorotation and dynamically-sized content in particular it's extremely useful and saves quite a bit of manual work.
I'd venture to say you're in the minority for upgrading a 6-year-old MacBook... just sayin'.
It's actually due to the GPU capabilities of the older Macs. Thus, as well as your SSD'd 2006 MacBook runs, it still will not perform up to the standard they want on Mountain Lion.
I disagree that this would be done purely for a "profit increase".
Unfortunately the graphics card is simply not up to the task of running Mountain Lion. I don't necessarily think this is a play for profits; hell, the OS is being sold for $19.95 and can be installed on up to 5 computers that you own at no extra charge...
Unfortunately there is no way to upgrade the video card in a laptop.
Personally I think it is an excuse to get people to upgrade. I see absolutely nothing that would require a crazy new video card. (I've been running ML for months)
In other words: my terminal windows, browser, text editor and twitter client run just as well as on Lion.
> Personally I think it is an excuse to get people to upgrade. I see absolutely nothing that would require a crazy new video card. (I've been running ML for months)
The 32-bit Kernel no longer exists/is no longer maintained/was not upgraded for Mountain Lion.
The unsupported models have video cards that do not have 64-bit graphics drivers available for them.
The 64-bit Mountain Lion kernel is incapable of loading the old 32-bit graphics drivers for these cards.
Thus these video cards are literally incapable of supporting Mountain Lion.
>Personally I think it is an excuse to get people to upgrade.
Well of course it is, they're a hardware company. I don't think there's some conspiracy to deprecate 6-year-old computers when the shelf life of a computer in the industry is 1-3 years - in other words, I don't think they're being unreasonable.
I think they're just standardising on hardware that is capable of OpenGL 3.0 and above. Most of the older Macs that don't support ML are the ones with Intel chips that only support 2.1 if I remember correctly.
Oh, I've owned some Macs whose graphics cards were definitely not up to running the OS that they were running. Turning off the genie effect because it skipped frames? Yep.
I agree that that Mac is six years old, but what is so different between the graphics requirements for Lion and Mountain Lion? Seems like they didn't have the resources or inclination to upgrade the 32-bit drivers to 64-bit.
And yes - I'm aware that the upgrade costs $20. How much is a new MacBook?
I'm sure that will be easily hacked to enable; it's enabled on my entirely unsupported Mac Pro 1,1, so I imagine enabling it on models that ML actually supports will be pretty easy. $20 says you just have to alter a plist.
Actually they're just all machines that either use EFI32, or have a GMA950/X3100, or both. Writing up new firmware and drivers for swaths of hardware that hasn't been sold for 3-5 years would have been a nice gesture, but I hardly think it's fair to hold it against them if they choose not to.
There's nothing to re-enable. The 64-bit Kernel has only ever been able to load 64-bit drivers. To use 32-bit drivers, you had to be running the 32-bit Kernel.
So to enable these machines to run Mountain Lion, the enterprising dev would have to backport all of the changes from the 64-bit-only Mountain Lion Kernel to the unmaintained 32-bit Lion Kernel.
Right... should have thought of that. It's the same with other kernels.
Perhaps a simpler workaround would be to run Mountain Lion under VMware. I believe it supports graphics acceleration, and there should be no problem running a 64-bit OS on a 32-bit host with hardware virtualization. It might just work.
The simpler workaround is to use the Chameleon bootloader to turn your Mac into what is essentially a Hackintosh. This allows you to boot the 64-bit kernel via the BIOS emulator, bypassing EFI32 entirely.
The catch is that it requires two internal drives and a supported graphics card to get it to boot, and the GMA950/X3100 has been cut. This means it's the unavoidable end of the line for the MacBooks, iMacs and Mac minis cut off by this update. It would require someone to write a 64-bit graphics driver/kext, which is well beyond the scope of the hobbyist world.
However, for us few Mac Pro 1,1/1,2 owners, this means our machines can keep on chugging. EFI is a solvable limitation; no support for your GPU isn't. The problem with your solution is that any machine old enough to be cut off by this would be pretty crippled by the overhead of running two operating systems at the same time.
While VMware requires a 64-bit CPU to run 64-bit software, every Mac that supports Lion already has a 64-bit CPU, and is capable of running 64-bit software outside the kernel. VMware Fusion 4's OS X video driver is not hardware accelerated, however, so YMMV, especially on older hardware (not sure about the latest Fusion Tech Preview). You're right that it should boot, though.