If they are not smart enough to keep malware off of what should be the most secure systems around, perhaps they shouldn't be building the fricking FLYING REMOTE-CONTROL DEATH MACHINES for a while, until they can figure out the basics.
I am totally with you. If software is going to operate deadly weapons, it sure as hell better be secure.
But you are glossing over a LOT of detail here. The military doesn't work like Apple: they don't design, oversee, or directly control the construction of the hardware they use. And they shouldn't - the government is woefully inefficient at building products, that's what corporations are good at.
Here's the situation:
- The Air Force contracts General Atomics Aeronautical Systems to build UAVs. You can bet your ass the contract covers things like "protected from malware"
- General Atomics contracts out the different components of the UAV. No device worth $150M gets built by one company alone. The radar, the metal shell, the inside components, and each component of the software are all made by different companies.
- Each component is meticulously specified and rigorously tested. The maker of a component is contractually liable if they fuck up, giving them an incentive to do it slow & right. That's why it's so damn expensive.
- General Atomics puts the pieces together into the final product and delivers it to the Air Force after another round of rigorous testing.
- A team of guys in the Air Force is trained to operate the UAVs on missions.
=====================
So to say something like "Ugh, military, don't deploy UAVs if you can't keep them virus free!" is an oversimplification. These are extremely complex machines, with highly specialized embedded software, meant to deliver explodey things with extreme precision, while being operated from very far away. You can't just slap Norton on these things and call it a day.
teej, I assure you, I am under no illusion that they can "slap Norton on these things and call it a day". I am at least somewhat conversant with the realities of designing complex military systems. But if the systems really are so highly specialized, and I assume they are, that's still no excuse. At all. If they can't keep malware off them, they have no business flying them, at least for the time being. Which is all I was saying. I know it's not easy.
The systems are less specialized than we all would hope. Even with our massive budget the military is still 'forced' to use existing tech, which opens them up to situations like this.
And if you think this little press release means anything to actual national security, you have much to learn about our secret war against terrorism.
>they are just wined and dined by the contractor when they should be directly overseeing and controlling.
There are actually very strict controls on how much government personnel are allowed to accept from contractors. IIRC, the limit is something like $20-$50 per year in gifts. When contractors host large events with catered lunches, they put out bowls or some other sort of receptacle so that government personnel can pay for their lunch, otherwise it would count towards that annual limit.
Enforcement at the level of "you didn't pay for that six-inch sub and can of coke" is not really practical, but quite a few government personnel have gone to jail in recent memory for accepting more lavish gifts from contractors.
Now, if you send your lobbyists to buy expensive meals for legislators (you know, the ones who actually decide how the money gets spent) and write them big checks, that's generally perfectly legal.
But the computers controlling the Drones seem to be running some sort of Windows variant. There's no real need to control the drones directly if you can control the computer that controls the drones.
The UAV itself will have a computer running a commercial RTOS. The computer on the ground at which the operator sits to interact with the UAV is almost certainly a Windows box. And as someone else said, the military's way of securing Windows machines like those has traditionally been not to hook them up to a network in the first place, instead of installing anti-virus software. That actually worked really well until portable USB devices came along. The result is that the military is only now getting up to speed on securing these types of computers; it's not that they're dumb about computers, it's that in the past they dealt with the threat operationally rather than technically.
Unless policy has changed dramatically since I was in, USB drives can be used after they have been classified, properly marked, and scanned. That being said, policy and reality are very different beasts. While deployed we had exactly 0 instances of malware/virus on our unclassified NIPRNet devices and at least 2 dozen malware/virus outbreaks on our SIPRNet machines. Usually these came about from the fact that those on SIPRNet tend to be of higher ranks and "above the rules", just like in a corporate structure. The other common offenders were MI and Signal geeks who "knew" better and assumed that their stuff couldn't possibly be infected.
I was told recently by someone working with DoD equipment that although USB flash drives were banned, certain USB hard drives were still OK. He was telling me this because it was so hilarious and alarming.
I was talking to a guy who makes "encrypted" USB drives at the NSA TCC recently. It sounded scarily hand-wavy to me. I was asking, "but where is the key stored?" and he tells me with a straight face, "right on the drive".
My experience with these is that you must either use your PKI certificate or a password as the key to decrypt the drive. The default configuration is generally to use the PKI certificate on the chip embedded in your ID card. Since you have to have that card in your computer to be logged in to begin with, using it to access other stuff is essentially effortless.
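The PKI-certificate scheme described above boils down to key wrapping: the drive's symmetric data key is encrypted with the public key from the card's certificate, so only the card's private key (which never leaves the card) can recover it. Here's a minimal conceptual sketch using toy textbook-RSA numbers; this is purely illustrative, not real crypto, and none of the names come from any actual DoD product:

```python
# Conceptual sketch of PKI key wrapping (NOT real crypto).
# Toy textbook-RSA keypair standing in for the ID card's certificate;
# real cards use 2048-bit RSA or ECC inside tamper-resistant hardware.
p, q = 61, 53
n = p * q          # 3233: public modulus, published in the certificate
e = 17             # public exponent, also in the certificate
d = 2753           # private exponent: never leaves the card

data_key = 42      # the symmetric key that actually encrypts the drive

# Provisioning time: wrap the data key with the card's PUBLIC key.
# Anyone can do this; only the card can undo it.
wrapped = pow(data_key, e, n)

# Login time: the card unwraps it internally with the PRIVATE key.
unwrapped = pow(wrapped, d, n)
assert unwrapped == data_key
```

Since the card has to be in the reader for the user to be logged in at all, the unwrap step really is effortless from the user's point of view, which is presumably the whole appeal of the design.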
The hard drive has to be scanned by an administrator before you're allowed to use it (not sure what this process entails). It also has to be encrypted, and won't mount unless it is encrypted with the proper DOD-approved software.
As far as I know, SSDs are not allowed, only magnetic drives.
I'm pretty sure it won't mount that, either. The only external storage they'll mount are external hard drives that have been encrypted with their approved software.
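The mount rules described in the last few comments amount to a simple policy gate: refuse the drive unless it's magnetic media, encrypted with approved software, and already scanned by an admin. A hypothetical sketch of that check, with every field name and product name invented for illustration:

```python
# Hypothetical mount-policy gate for external drives. All field names
# and the "DODCrypt 2.1" product name are made up for illustration.
APPROVED_CRYPTO = {"DODCrypt 2.1"}

def may_mount(drive: dict) -> bool:
    """Allow mounting only if all three policy conditions hold."""
    return (
        drive.get("media") == "magnetic"              # no SSDs / flash
        and drive.get("encrypted_with") in APPROVED_CRYPTO
        and drive.get("admin_scanned", False)         # scanned by an admin
    )

ok = may_mount({"media": "magnetic", "encrypted_with": "DODCrypt 2.1",
                "admin_scanned": True})               # allowed
ssd = may_mount({"media": "ssd", "encrypted_with": "DODCrypt 2.1",
                 "admin_scanned": True})              # refused: flash media
```

Of course, as the thread points out, the policy and what people actually do in the field are very different beasts.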
And, have you seen all the computers necessary to carry out a drone operation? I guarantee you not all of them are running an RTOS. Probably not even all of them onboard the drone.
Military acquisitions take a long time. To give one example, I know for a fact that there are airplanes flying right now that use DEC Alphas to control their weapons systems. Those planes first came into use in the early 2000s. An older version of that plane is still in use, and will be for several more years; you don't even want to know what it's using.
Soft real-time systems aren't used for things like drones. Look at things like INTEGRITY from Green Hills for that sort of task: http://www.ghs.com/customers/bae_herti.html
It really shouldn't be so hard to put a TPM in autonomous killer robots and only let digitally signed code run. That should make it much harder for hackers.
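The "only run signed code" idea is essentially a verified-boot check: refuse to load any image whose signature doesn't verify against a key the hardware protects. A minimal sketch of that gate, using HMAC as a stand-in for what would really be an asymmetric signature checked against a TPM-sealed key; the key and module names are invented:

```python
# Sketch of "only run digitally signed code". HMAC stands in for a
# TPM-backed asymmetric signature; purely illustrative.
import hashlib
import hmac

SEALED_KEY = b"key-sealed-inside-the-tpm"   # hypothetical hardware-held key

def sign(firmware: bytes) -> bytes:
    """Vendor-side: produce a signature over the firmware image."""
    return hmac.new(SEALED_KEY, firmware, hashlib.sha256).digest()

def may_load(firmware: bytes, signature: bytes) -> bool:
    """Boot-side: refuse anything whose signature doesn't verify."""
    expected = hmac.new(SEALED_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

fw = b"navigation-module-v7"
good = may_load(fw, sign(fw))          # verifies: allowed to run
bad = may_load(b"malware", sign(fw))   # signature mismatch: refused
```

This only raises the bar, though; it stops unsigned code from loading, not a compromised ground station from issuing signed-but-malicious commands.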
>If they are not smart enough to keep malware off of what should be the most secure systems around, perhaps they shouldn't be building the fricking FLYING REMOTE-CONTROL DEATH MACHINES for a while, until they can figure out the basics.
Capisce, guys?