> I'm always astounded by the self-centeredness humans are capable of.
In this instance I'm sorry, but this is the wrong take. The fantastic article directly addresses that, in fact, and it jibes with what I was taught as part of first responder and mountain rescue training in the US, as well as what I've heard from EMTs and volunteer firefighters I know:
>"However, research has shown that when untrained civilians are unexpectedly placed into an emergency aboard an aircraft, many people’s brains revert to what they already know, which is to stand up, grab their bags, and walk to the exit, as though nothing is wrong. This behavioral tendency can be short-circuited if the flight attendants loudly and assertively order passengers to leave their bags behind and exit immediately. But on flight 1492, the order to leave bags behind was not heard by the majority of the passengers because the senior flight attendant forgot to press the PA button before making the announcement."
Again, this jibes with everything from military to emergency response of all sorts: in high-stress, fight-or-flight, rapid-response situations, humans tend to (a) revert to whatever "muscle memory" or drilled-in training they've got, if any, or else whatever basic instincts/patterns they've developed, (b) follow authoritative instructions, if available and simply/rapidly understandable, (c) panic, or (d) freeze up. Just as with everything else in safety, humans must be recognized as humans and treated as part of an overall systemic approach if we wish to improve outcomes as much as possible.
So if you're dealing with random untrained civilians who have no particular "muscle memory" to draw on beyond the typical, then crew procedures, aircraft design, etc. have to account for that. That's just part of the responsibility of running a civilian-facing service involving life and safety. Better training for the cabin crew might have helped here, just as better training could have prevented the situation from happening at all, and likewise better mechanical designs might have helped and would be worth considering in principle if this were frequent enough.

This could range from how PA systems work (perhaps when an emergency landing is triggered, the PA should automatically go to open mode and stay that way, or perhaps the evac warning, including "LEAVE ALL BAGS BEHIND, EVACUATE NOW OR DIE", should be fully automated and just start broadcasting once the emergency slides are deployed) to having overhead bins automatically seal and become impossible to open, so somebody could at most spend a few seconds trying before realizing they can't (this would require actual study and cost/benefit tradeoff investigation, of course).

But the takeaway in disasters should not be any sort of moral one-liner. These are massive systems with large numbers of people being forced to deal with a (literally, here) by-the-second lethal scenario. Safety is a systemic issue.
The scientific record is pretty clear that most humans struggle to do even basic things in high-pressure emergency situations, like being within arm's reach of a 600°C fire.
I remember the fire alarm going off at a hotel I was staying at. I rushed to the stairwell, and it was already jam-packed with people who had brought every single piece of luggage they owned with them.
Actually, the article (and a sibling comment) touches on this, and it's not necessarily so simple ("there but for the grace of God go I"):
> In one sense, this blame is constructive insofar as shame is an effective motivator for people who might otherwise try to get their luggage during a future evacuation. However, research has shown that when untrained civilians are unexpectedly placed into an emergency aboard an aircraft, many people’s brains revert to what they already know, which is to stand up, grab their bags, and walk to the exit, as though nothing is wrong. This behavioral tendency can be short-circuited if the flight attendants loudly and assertively order passengers to leave their bags behind and exit immediately. But on flight 1492, the order to leave bags behind was not heard by the majority of the passengers because the senior flight attendant forgot to press the PA button before making the announcement.
The privacy policy on the website specifically states that they collect PII and may use it to offer products or services, either themselves or via a "business partner".
It isn't paranoia when the threat is real.
> We may share Your information with Our business partners to offer You certain products, services or promotions.
> To provide You with news, special offers and general information about other goods, services and events which we offer that are similar to those that you have already purchased or enquired about unless You have opted not to receive such information.
You don't even need to do that, as a loophole has existed for a long time. By switching to a cheaper subscription plan, you won't incur any charges, and it will trigger a new "30-day grace period for cancellations." This allows you to cancel immediately without any fees.
It's a semantics debate all around, but the article's main point is that "show" suggests a mere demonstration, whereas "engage" implies active participation by the learner in the process of learning.
Doesn't seem terribly useful. I mean, it only obscures that it prints "ok". If you're looking at the logs, you've probably already figured out someone is attacking you, and if you didn't, seeing "echo ok" won't help you figure it out.
If the only thing the command does is "obscure what it does", then the only thing it obscures is "obscure what it does". I guess there's no requirement that whoever writes these scripts is a genius.
People writing malware generally don't want to deploy it on honeypots, because then they're handing their payload (and other tradecraft) directly to analysts.
So often the first stage is an attempt at honeypot detection, or more broadly, device fingerprinting.
A bad honeypot might not even run a real /bin/sh, and this detects that right off the bat.
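For illustration, here's a minimal sketch of what such a probe could look like, written in Python rather than whatever the actual first stage uses; the function name, the timeout, and the "expect exactly ok" check are my own assumptions extrapolated from the "echo ok" snippet discussed above:

    import subprocess

    def shell_probe_looks_real(timeout: float = 5.0) -> bool:
        """Hypothetical probe: run a trivial command through /bin/sh and
        check for the exact expected output. A low-interaction honeypot
        faking a shell may echo the whole command line back, error out,
        or hang."""
        try:
            result = subprocess.run(
                ["/bin/sh", "-c", "echo ok"],
                capture_output=True,
                text=True,
                timeout=timeout,
            )
        except (OSError, subprocess.TimeoutExpired):
            # No usable shell at all -- bail out before dropping any payload.
            return False
        # A genuine shell prints exactly "ok"; anything else suggests emulation.
        return result.returncode == 0 and result.stdout.strip() == "ok"

A real first stage would presumably stack several such fingerprinting checks, but even this trivial one filters out honeypots that merely log commands and parrot back canned responses.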
The part that baffles me is the cognitive dissonance between wasting time in this manner and subsequently complaining about not having enough time to do things.
Depends, really. I am the entire ops team for a non-profit volunteer org, and I host their services in the cloud specifically so that when something fails I don't have to do anything. I'd much rather have a cloud computing company's ops team working to fix the issue while we just pay our monthly dues.
The same can be said of many smaller companies, where uptime can be maintained more cheaply and more consistently by offloading ops work to a cloud computing company.