That's how negotiations work. The vendor makes an offer based on their understanding of your company's needs (usually from an RFP or initial discussion); your company pushes back on price (if its needs are met) or indicates that it has greater needs, and the vendor revises the offerings and ups the price. Sometimes they offer a low price to get a potential customer to bite, but if the customer doesn't accept immediately they withdraw it, because that customer is going to be the kind that demands/needs extra hand-holding. (My current $dayjob does that. If a customer isn't sophisticated enough to take a good price when it's offered, that means they're going to require a fair amount of extra work, so we withdraw the price and the next offer will be significantly higher.)
Also...it's still Red Hat. They're owned by IBM but they're still allowed to operate independently.
But back to the original point: you shouldn't be paying as much for OpenShift as you were for the equivalent VMWare offering. We used OpenShift at my last job and VMWare at the one before it; OpenShift was cheaper than VMWare was before the Broadcom acquisition.
You're leaving out bending the customer over the barrel come renewal time once services are migrated and there's lock-in.
And sorry bud, but the whole "operating independently" thing...I don't buy it. I've worked for too many companies that were owned by someone else and purported to operate independently. It's just a flat-out lie.
> I've worked for too many companies that were owned by someone else and purported to operate independently. It's just a flat-out lie.
I don't doubt it, given that this has been Broadcom's MO from the beginning. But IBM is not Broadcom, and while they've definitely messed things up, they've recognized the value in letting Red Hat remain independent.
> You're leaving out bending the customer over the barrel come renewal time once services are migrated and there's lock-in.
This is easily resolved by negotiating a longer contract, and planning for alternative vendors prior to the expiration of said contract. The amount of the potential increase at renewal is capped at the cost of switching (see, for example...all the VMWare customers switching off VMWare because it's significantly cheaper to take the one-time switching costs than to pay 1000x every year).
This is all part of basic Negotiating 101. It sounds like your company isn't any good at it, and they could save a lot of money by getting better negotiators. (Now you know why Legal gets paid $$$ to play solitaire most of the day.)
I guess we just don't see eye to eye on these things, which is okay. I freely admit that my employer isn't too sharp on its use of technology. But I also won't be convinced that folks like Red Hat and IBM aren't total bloodsuckers. :)
The plan is to go native with our IaaS since they offer a native solution that is comparable to what we had with OpenShift.
We have always been hybrid cloud and I don't see that changing in the future. Honestly, the future will probably be what was always predicted: a set of "core origin" servers that are on-prem, with a cloud membrane around them.
On-prem might still mean using a vendor for the actual care and feeding of the hardware; there's no money in us running our own datacenters.
> Americans surprised when their economic and political system worked exactly in line with historical trends.
But that's not accurate. The period from the end of WWII up until the mid-70s saw an explosion of middle-class earnings and relative wealth, and a large shrinking of wealth inequality in the US.
So we just need a nice all encompassing global conflict again that largely leaves the American industrial base alone and then when it is the only one standing there can be another growth in the American middle class.
In all of these cases real incomes grew enormously. Yes, a big part of that was starting from a low base after the destruction of WWII. But I'd argue it was also a strong consequence of the technology of the time: there was an explosion in consumer goods enabled by new tech, but companies still needed lots of employees to create these products. In the past ~25 years I believe tech has instead allowed more wealth to accrue to a smaller and smaller subset of people.
Yes, all fueled by ridiculously abundant/cheap oil. That's something that might not happen in Earth's history ever again, not to mention the climate change issues (which at least weren't clear until much later: the 80s, rather than the 50s for oil depletion issues).
Yeah, I'm personally of the opinion that the 50s-60s economic benefits are not generally sustainable. Similar to China's rise up until now, it's the result of a one-time boom, often as a zero-sum game with other parts of the world. Humans on the planet are definitely getting a more comfortable life over time, but I don't feel that any individual state, under our current political systems, ever sustains that 50s-60s level of purchasing power for long.
I'm not sure about that. Very little was digital back then. It was far easier to claim a lack of earnings back then than it is now, even with the high tax rates.
The political philosophy guided by neoclassical economics; the political economic philosophy that has governed most public discourse since the early 80s.
Investopedia seems to agree with you: "Neoclassical economics theories underlie modern-day economics, along with the tenets of Keynesian economics. Although the neoclassical approach is the most widely taught theory of economics, it has its detractors."
According to https://www.reddit.com/r/AskEconomics/comments/5wdup7/are_ec..., however: "The vast majority of all economists today work in the New neoclassical synthesis. This paradigm is essentially a combination of the best ideas from the New Keynesian and neoclassical trains of thought. The idea of separate schools isn't nearly as relevant as it used to be - these days ideas that work get added to the synthesis regardless of where they come from."
There's not much in so called "New Keynesianism" that Keynes would acknowledge as being consistent with his theories.
Be very wary of anything you read on /r/AskEconomics; contributors are regularly banned for asking challenging questions that point out deficiencies with the orthodoxy. It's essentially an echo chamber with the top level replies carefully vetted. What it highlights through that is the deep (and justified) insecurity that permeates the economic orthodoxy.
The Neoclassical school is laughably simplistic in its model of the world. To reduce an entire economy, with all its diversity and irrationality, to a handful of variables connected by simple relationships is frankly absurd [1], and really a reflection of economics' physics envy. They make strong claims about how the models are built robustly from micro-foundations whilst ignoring fundamentals such as irrational agents and missing variables, and blithely ignoring important problems such as the SMD result [2], which basically means only a single representative agent and a single representative commodity can be considered (I've seen models that claim much more than this, but practically the higher dimensions are immediately integrated out). That St. Louis Fed model is considered advanced because it has 2 representative agents!
Moreover, the models don't maintain important invariants, such as stock-flow consistency, that absolutely must be true as a matter of accounting.
One might make some allowances for all these theoretical problems if the models validated well and made reliable predictions, but when it comes to anything of importance, they're little better than a first-order Markov model. It's frankly absurd, and a testament to the power of rhetoric and vested interests, that we've built so much of our political economics on such shaky foundations.
There's basically no alternative being seriously entertained in mainstream politics, even on the left, to what amounts to an academic justification for inequality, and that is why we are in the mess we are in, Trump and all.
Kind of, yeah. I remember being a teenager in the 90's and it really felt like things were going to be radically different, and better. The cold war was over (well, we thought it was), anybody could talk to anybody else anywhere, anybody could publish anything, and surely this would mean regular people would be more empowered than ever before, right?
It's hard to explain how _cool_ Google was circa 2000-2010 or so. How they genuinely seemed a bit cyberpunk and they had figured out how to do cool amazing things and make money and not be evil.
Sadly, it was not to be. But maybe I was just a naive teenager.
Naive twenty-something (back then) here. The latter half of the 1900s changed so drastically that, yeah... a Star Trek-like utopia seemed plausible, if not inevitable.
It wasn't until the post-9/11 mobile revolution and normies embracing the internet (late 2000s) that things took a hard turn for the worse. I was honestly surprised (shouldn't have been), and now sorry I didn't do anything to reverse the trend.
We need a well-capitalized organization to keep general-purpose computing alive, along with privacy, security, and autonomy. There are lots of little organizations of course, but they are unfocused and operate like ants in a realm dominated by BigTech giants.
----
Reply to below, I can no longer post for the next hour:
----
Right. Unfortunately I don't have the capital, but would love to work on the problem... even for free/cheap in my spare time. And will.
For example, been testing the new Starlite tablet with Phosh... and it is soooo close! I'm about to start learning how to develop for it. But it would go faster if say... starlabs, purism, system76, pine, riscv companies, FLOSS peeps would collaborate more effectively. They do to some extent, but don't often push in the same direction.
One major problem is the quality of documentation of interfaces. One of those boring things most don't want to do without a paycheck. Despite decades of experience with Linux I don't (yet) know where to start with wayland, dbus, gstreamer, gtk, etc. A book that pulled all this together for developers would be a big enabler. Think it would need to be sponsored as it won't be sustainable on its own.
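For what it's worth, the kind of minimal starting point I wish the docs led with, for GTK at least, is something like this. It's just a sketch, assuming python3-gi and GTK 4 are installed; it's not taken from any official guide:

    # Minimal GTK 4 window via PyGObject -- illustrative sketch, assuming
    # python3-gi and GTK 4 are installed on the system.
    import gi
    gi.require_version("Gtk", "4.0")
    from gi.repository import Gtk

    def on_activate(app):
        # One window containing a single button that closes it.
        win = Gtk.ApplicationWindow(application=app, title="Hello")
        button = Gtk.Button(label="Quit")
        button.connect("clicked", lambda *_: win.close())
        win.set_child(button)
        win.present()

    app = Gtk.Application(application_id="org.example.Hello")
    app.connect("activate", on_activate)
    app.run(None)

Having that sort of thing, plus an explanation of how it hangs together with Wayland and dbus underneath, collected in one place would go a long way.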
"It wasn't until the post-9/11 mobile revolution and normies embracing the internet (late 2000s) that things took a hard turn for the worse"
I think this doesn't place enough blame on more technical people for embracing the same platforms as normies. After years of bulletin boards and forums where people built up small communities online, everyone migrated to behemoths that actively undercut any chance of that kind of community (examples include Facebook's restrictive interfaces and aggressive push to merge personal and online lives, Twitter's character limit, and Reddit's tree-based ranked discussion structure and its obliteration of any iota of personalisation on profiles).
Even with the current BlueSky boom it's wild that so many techies tried to persevere with Twitter in the last few years (the boosting of subscribers to the top of all replies should've been instant death).
The few forums I was on back then that actually survived that mass migration are _still_ around and some of the only fun parts of the internet.
Not an expert at all, but maybe if IPv6 were embraced it would lead people back to doing a lot more grassroots stuff, and just by being fun it'd massively challenge the grimness of the last 10 or so years of an increasingly restrictive online experience.
Ok, but I think the gravity of normal folks bends the industry whether we techies like it or not.
To further subdivide the techie contingent, lots of them have no problem using Windows even though Linux/FLOSS has been viable for a decade or two. So even most techies don't care about the problem.
I don't know how much the internet changed or how much I changed. Finding some niche forums on Prodigy (so not even the internet) and talking to a small group of people felt a lot different from just going to reddit and finding a forum for whatever random thing I'm looking up.
Did it? I personally feel the pervasive optimism lived on in the zeitgeist until about 2015 or 2016. And to be clear, I'm not saying that Trump being elected is what ended it; rather, I believe it was the hyper-polarization (already being talked about by then) of that election that really quashed it.
Everyone's different, but I think I felt like the increasing polish and commercialization of the web killed it slowly. And that makes sense to an extent: as there was money to be made, people would invest more (and have entire teams) in making ad-optimised content instead of just having one person cranking out homestarrunner or thebestpageintheuniverse or what have you.
Also, the closing of open systems. This whole idea of "whatsapp me or slack me or discord me" - that's ridiculous! It's _obvious_ that I should be able to use whatever client I like to talk to people, just like I did with gaim and AIM and MSN messenger and ICQ etc. etc. Now we're perilously close to the point where websites will just block you if you're not logged in (conveniently via Google using their browser, of course! Firefox users can get lost.) I can even get the need for it as AI makes bots increasingly good, but it sucks.
Edit: Also re: open systems - we went from default-open with desktop computers to default-closed on phones. Now you and your work exist at the pleasure of and for the purpose of enriching Apple and Google. Android SHOULD be something you can run and do with as you please, but of course you can't if you want to be able to do things like use your banking app.
Years before 2015, “the internet” for most people had been replaced by “social media”, and it was pretty well understood that big tech companies now had the means and motivation to monetize our most toxic traits.
The optimism about the internet’s influence peaked when things were highly decentralized with personal websites, mailing lists, web rings, etc. It was hard to imagine an entity big enough to manipulate “all of the internet”.
Eventually centralizing forces like google/yahoo/myspace made things much more usable, for a while, until their hacker ethos was overtaken by an MBA ethos.
I’m not sure that positivity died with 9/11, but I can look back and recall a large number of people struggling after the 2008 crisis, and whole economies never entirely recovering, and so optimism had taken some hard knocks well before 2015/2016.
Remember how one of the early episodes of Portlandia around 2012 waxed nostalgic about the 1990s as a sunnier time?
For me and my cohort (I'm in my early 50s now), yes, absolutely. Again, from my point of view, things got even worse post 2008, with the rise of mobile and social media. The tech world specifically evolved in directions dramatically opposed to the pro-human mood of the 90's towards a much more predatory stance.
Interesting, based on the replies here, maybe it's a somewhat generational thing. I was born in 89, so I remember 9/11 when it happened and the wars that followed, but it wasn't something that I put that much thought into. Similarly, I was barely out of high school and into college in 2008. My main concern was playing video games with my friends and trying to come up with excuses to skip my Minnesota History class.
I'm sure generational differences are a big part of it. I was already in the workforce by the mid 90's, and honestly, it was such an optimistic time, for everyone. Lots of hope for the future. I wish I could share what it was like. It was a remarkable, special time, and even though we thought we knew it, we didn't really appreciate what the actual special aspects were until long after the fact.
I don't think it did. The utopian optimism of tech changing the world for the better epitomized much of the 2000s and 2010s. Neither, it seems, does the author of the featured article, who refers to the current day as the death of tech utopianism. Though I don't think they argued this point well, considering they simply pointed to a selection of high-profile examples of right-wing figures in tech as evidence of the demise of utopian optimism overall.
90s scifi had some penetrating foresight tho... I feel like we're inching closer to Neuromancer's universe as the war festers on in Ukraine, Japan keeps on roboticizing and somehow a mixture of corporate and populist right wing captures the electorates worldwide.
>It's hard to explain how _cool_ Google was circa 2000-2010 or so. How they genuinely seemed a bit cyberpunk
I think that's very much an insider's view, the sentence is in particular funny because "cyberpunk" is not exactly a term of endearment. Mike Pondsmith and William Gibson are hardly disenchanted Zoomers or Millennials. I think Google still does seem a bit cyberpunk, and I don't mean it as a compliment.
I think the John Barlow cyberlibertarian school of thought always had more to do with what's later been dubbed the "Californian Ideology" than with technology per se. I don't think it was ever a mainstream view.
Well, maybe it's because I'm Californian. I don't think I'd call myself an insider, I never worked at Google and I'm from Sacramento (which felt painfully un-cool back then!). And the Google I'm talking about would be staffed by Gen X'ers/Xennials at the time mostly - Someone who's 25 in 2001 was born in 1976.
I don't think an embrace of cyberpunk ideas, whatever those are, was entirely mainstream, but the idea that the world was opening up, the internet interpreted censorship as damage and routed around it, and it would help bring down dictatorships, was definitely in the ether.
It definitely brought about a lot of positive changes. It's fair to be disappointed that some of the things we hoped for didn't materialize, and that a lot of negatives were even worse than expected.
The historical trend is for improvements followed by lulls. But we never can predict in advance how far the improvements will go. We do feel that there was a lot left on the table.
If you were alive back then: yeah, pretty much? The expectation was that merely getting a tech education would seamlessly and immediately roll you into a six figure job no matter which industry you were interested in, because much like AI today, tech was literally seen as the magic ingredient that had been missing all this time.
Hindsight's cynicism is the enemy of understanding history in this case. Obviously there was no golden age, but at the time the graph was going up, and the money, plus not just the promise of an easy life but constant stories of people making it big because of their skills (unlike, say, crypto), made a lot of people go "this time it'll be different". And because in a rare few cases it was, everyone bought into it.
In the year 2000, Google was fresh, the Internet was becoming a normal thing for people to use and it was supposed to get rid of the old power structures and bring about a new age of egalitarianism and meritocracy. Plus I was younger and much more idealistic. And to be fair, it has caused revolutions and caused new power structures to be established, and torn down old ones. But as the old adage goes, it turns out that power corrupts. So meet the new boss, same as the old boss. I'd like to pretend I'd do better with my money if I'd invented PageRank back in the 90's, but having seen how money corrupts people, I'm not convinced that I would.
Asking sincerely because I'm not well-versed on this topic - do we have actual proof that humans are causing global warming?
My understanding is that the climate will change independently of human activity. For example, we know that there was an Ice Age and it was not ended by human activity but rather natural processes. So the climate has been known to change historically without human involvement.
So here is where I'm looking for clarification: I thought the "controversy" over climate change was the degree to which human activity is accelerating a natural process of warming?
Said differently, the planet is warming by itself, but humans pumping hydrocarbons and other things into the atmosphere is speeding up that process. But the cause is not solely because of human activity.
Thanks in advance for thoughtful responses, I'm really just trying to learn here.
The rate of change in climate (due to greenhouse gas inputs from human activity) is much faster than has occurred naturally in the past. If you are genuinely curious, the US EPA's climate change indicators report would be a good place to start. https://www.epa.gov/climate-indicators/climate-change-indica...
Yea, the climate change narrative is highly overblown, but it will take a few more years before the Overton window shifts for larger swaths of people in power to accept it.
This article gave me some great laughs. It's clear that the author only studied Buddhism for four years (a very short period of time) and has some very fundamental misunderstandings of Buddhist teachings.
For example, this explanation of karma:
> Together, these tenets imply the existence of some cosmic judge who, like Santa Claus, tallies up our naughtiness and niceness before rewarding us with rebirth as a cockroach or as a saintly lama.
That is not at all how karma should be understood. If I go give out $10,000 to the homeless, I'm a damn fool if I think I'm going to get $100,000 back in the mail!
> It's clear that the author only studied Buddhism for four years (a very short period of time)
This kind of response always sounds like victim-blaming to me.
If your system of <whatever> is so complicated that someone can't even hope to get the basics right in four years of study, maybe the right thing to do is, in fact, to ditch it.
Pretty much any introductory guide to Buddhism from any of the major schools will explicitly say "karma is not a deterministic system of merit and demerit". I'm not sure how the author came to that understanding of karma, but it's radically at odds with what the vast majority of Buddhist traditions actually teach.
Up front, though, every other thing I've read says this is going to be hard, so that doesn't seem like a fair argument either. I don't tell people that the vi text editor is easy to learn, or even that it's superior to anything at all, only that it's worked for me and that I hope everyone finds something that works for them. It's not like we need any more barriers to helping people code and write, yet these tools all take a lot of complex practice to become proficient with, given we aren't wired by genetics or anything to do complex activities such as language and walking upright.
Bad analogy in my opinion. Learning basic vim can be done in a couple of hours, and I think knowing the basics would benefit literally every programmer, because it is so much nicer to navigate even a single line of code using the vi keys (w, b, f, %, etc.) than any other alternative I've heard of.
Belief systems can take years to fully adopt, since they attempt to rewire your way of thinking about the world. It's easier for the ones that dangle clear rewards and punishments, though some might call them shallow. For the ones more focused on introspection, there's no obvious short-term reward they can offer.
NO, not victim blaming. Where is the victim? And where is the blame?
But it does read like someone who has just gotten into Buddhism recently, has become lost in the 1000's of different sects and writings, and whose 'takeaways' after a brief exploration seem very wrong. He is arguing against Buddhism over things that aren't Buddhist, more misunderstandings of something he heard once, and ideas that sound Christian.
Since in Buddhism there is no central authority and every monk can write up some hot take on their views, it can be very difficult for newbies to wade through it all.
I’m interested to see where this goes. That said, I am now planning to delay upgrading my iPhone 11 Pro until iPhone 18 comes out.
IMHO Apple and everyone else is moving way too fast with adoption. The deployment surface of iPhone is huge - I’m interested to see how Apple handles their first really serious issue (like “diverse Nazis”).
Also - current AI programs are complete and total pigs. iPhone 16 offers 8GB of memory and 1TB storage. I know the programs need the memory and so forth but still. I get it but I’m also going to wait now for the vendors to figure out the new future.
In the meantime, I will watch and wait. Plus, if Apple's history is any indicator, the first 2-3 versions will be lame, but then around 4 or 5 it will take off.
Imho, one of the reasons Apple built out their hybrid security/privacy arch was so they could trade data transfer for cpu/mem, when it makes sense.
The user sees some additional latency, but Apple can deliver equivalent functionality across hardware generations, with some calling out to servers and some doing on-device processing.
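As a rough sketch of what I mean (not Apple's actual logic; the function names and the memory cutoff below are invented), the routing decision could look something like:

    # Hypothetical sketch of capability-based routing: run a model on-device
    # when the hardware can handle it, otherwise call out to a server.
    # All names and the memory threshold are made up for illustration.

    ON_DEVICE_MIN_GB = 8  # assumed cutoff, not a real Apple figure

    def run_local_model(prompt: str) -> str:
        return f"[on-device] {prompt}"   # stand-in for local inference

    def run_remote_model(prompt: str) -> str:
        return f"[server] {prompt}"      # stand-in for a server round trip

    def handle_request(prompt: str, device_memory_gb: int) -> str:
        # Same feature either way; only latency and where it runs differ.
        if device_memory_gb >= ON_DEVICE_MIN_GB:
            return run_local_model(prompt)
        return run_remote_model(prompt)

    print(handle_request("summarize my notes", device_memory_gb=6))  # takes the server path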
Honestly, I'm mostly impressed that Apple is aiming to deliver OS-level, works-in-all-apps functionality.
Imho, that's what users really want, but MS and Google's product org structures hamstring them from being able to deliver that quickly.
(I'm just a programmer so it's fascinating to me to consider how actual brain scientists model consciousness in their work.)