I run a side project called http://moodpanda.com . People rate their mood on a simple scale and other members give feedback. I think the feedback is what keeps people coming back to our app: not the rating part, but the feedback they get from others.
As a CTO you will hopefully ALWAYS have better developers than you working for you; that's the point. The CTO role is not to be the best tech guy in the room. It's the guy that sets the vision, talks to investors, sets high-level tech strategy (with a lot of team involvement), hires, fires, keeps the sales guys away from the devs, etc.
E.g., sort out everything that would distract your developers from building.
I'd hire the full-stack guys first, but spend a lot of time talking with them; the first hire is always the hardest but the most important. It's also hard not to hire people like yourself, so take it slow. Some of the best guys I have ever worked with are mid-30s with a wife and kids who just know how to get stuff done.
Sorry about that, I was using "guys" as a generic term, but it got me thinking that in 18 years of development, I have worked with over 200 Dev Guys and only 2 Dev Girls.
Most programmers are male? You can actually make that assumption factually.
How do you feel about the phrase "MIDWIFE's are in demand"? Is it appropriate, or do you feel that people should have to say "Midwifes, both the females that practice midwifery and the males in this profession, are in demand"? Can we just assume that the majority of midwifes are women and the majority of programmers are men without that being offensive?
> How do you feel about the phrase "MIDWIFE's are in demand"?
Well, for me, bad, but mostly for reasons unrelated to the main issue in the discussion -- because it uses an apostrophe improperly for "look out, here comes an 's'", and because the plural of "midwife" is "midwives". ("Midwifes", without the gratuitous apostrophe, is a word, but it's the third-person present of the verb "midwife", not the plural of the noun "midwife".)
Good idea. I'll likely up the prices or drop the board allotments. I wanted to launch with something a bit more generous as it seems easier to put prices up. Then I can see how people are using it to get a feel for where the bands should be.
It did seem that most people using 3 or fewer boards didn't have much need for a dashboard, which is why I pegged the free tier to that. Something like Corrello seemed to be more of a nice-to-have for them.
Nothing's wrong with spinning up more AWS boxes. If it costs $300 annually to solve a problem that would cost $5k in development to fix, I believe it's a wise choice.
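A quick back-of-envelope sketch of that trade-off. The figures are the hypothetical ones from the example above, not real AWS prices:

    # Hypothetical figures from the example above, not real AWS prices.
    instance_cost_per_year = 300.0  # extra box, USD/year (assumed)
    optimization_cost = 5000.0      # one-off dev time to fix the code, USD (assumed)

    breakeven_years = optimization_cost / instance_cost_per_year
    print(f"The extra box stays cheaper for ~{breakeven_years:.1f} years")
    # -> The extra box stays cheaper for ~16.7 years

So unless the service outlives the breakeven point, or the inefficiency grows with load, paying for the box wins.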
Yeah, down the line you will eventually have to optimize, but you will prioritize.
I wasn't advocating for badly written code to run on a whole datacenter. I was just pointing out the alternative, with the assumption that the code was somewhat healthy and that adding one new instance to cover the suboptimal parts wasn't a big deal.
Of course, if you have 10k users and it runs on 3 machines, you've got a problem which no amount of boxes can solve.
I'm hardly what most environmentalists would call an "environmentalist", but one cost here is the increase in carbon footprint. Of course, to the company the cost/benefit analysis errs on the side of just spinning up more boxes. But from a larger perspective, taking some extra time to make more efficient use of machines could have a drastic impact. Many optimizations don't require months to implement. Many of those are even avoidable with a bit of foresight.
In a certain, quite limited model of economics, that could indeed be called "wise".
Once a more holistic view is taken, once widespread total costs and benefits are taken into consideration, once costs are not defined only as money flowing out of my own pocket, and once not only an ethic of conviction ("Gesinnungsethik") but, more importantly, an ethic of responsibility ("Verantwortungsethik") gets applied, well,
in such a world we would probably wish that Amazon would change its pricing policy to:
- get the first 2-5 AWS instances almost for free
- and charge exorbitantly more for the next few hundred.
We all would benefit from the cultural, technological and social changes this would help to spark, I think.
But who am I to question current culturally entrenched "economic" thinking...
"social changes from the pricing policy changes of marginal EC2 instances" sounds ridiculous when given this context, right?
It changes when the context is not "marginal EC2 instances" but energy- and resource-burning machines, used (often) by ignorant software developers and their organizations, who are allowed, actually encouraged, and partly even actively driven into such purely self-beneficial behavior models.
For a definition of social: http://en.wikipedia.org/wiki/Social ... obviously, driving software development into a scarcity of computing power would have "social" consequences: within development teams, for example, interactions and priorities would need to change dramatically.
But maybe software development turned almost into a "commodity" because of the commoditization of computing power available to even the most ineffective mental artifact, a.k.a. the program.
And that, in turn, was possible to a large extent by off-loading the true costs of assembly, disassembly, and disposal, and the resources needed to build those machines, onto people in underdeveloped regions of the world...
Now, what social change could more expensive computing devices allow for in those regions?
What if people cared about "effectivity" not only via the detour of "ugh, I need to recharge my phone again?!", for example, but essentially because they would have to pay the true cost of their inefficient hardware and software setup?
It sounds ridiculous because Amazon doesn't really control anything. The only result of such a pricing change would be customers switching to a competitor.
> It changes when the context is not "marginal EC2 instances" but energy- and resource-burning machines, used (often) by ignorant software developers and their organizations, who are allowed, actually encouraged, and partly even actively driven into such purely self-beneficial behavior models.
Fair enough, and I agree with you that more reflection is needed on the ethics of our industry. No argument there.
That said, I think you're miscalculating the result of such a switch. The fact is that servers are pretty efficient.
Say one of the developers commutes to work, doing ~12 miles each way in a Prius. If they spend two days optimizing the code, the energy cost of their commute will be ~130 kWh.
With that same energy, you could run a PowerEdge R420 at full power (CPU benchmark) for almost 40 days! And remember that each of those servers would host a bunch of EC2 instances.
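A rough sanity check of that comparison. The ~130 kWh commute figure is the one from above; the full-load draw of the R420 is my own assumption, somewhere around 140 W:

    # Commute figure from the estimate above; server draw is an assumed ballpark.
    commute_kwh = 130.0   # two days of commuting (per the estimate above)
    server_watts = 140.0  # assumed full-load draw of a PowerEdge R420

    days = commute_kwh * 1000.0 / server_watts / 24.0
    print(f"~{days:.0f} days of full-load server time")
    # -> ~39 days of full-load server time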
The reason EC2 instances are cheap is because they're actually cheap, both in terms of energy and resources.
Obviously this is not about one provider of such easily affordable computing power.
Sure, the servers are more efficient than they used to be, and, as stated elsewhere, because they are shared they are used more efficiently than machines which are not.
"they're actually cheap, both in terms of energy and resources."
... well, I disagree with this: they are cheap to us because we don't pay adequately for them: not for the energy, not for the labor, and not for the rare-earth elements, for example. All of which, quite conveniently, is actually paid for in just a very few regions of the world: by the people there.
Sure, "the market" came up with these prices, but the same market simply ignores certain kinds of costs which are not visible to us. One name for those is externalities. One of them is the "total energy budget" needed to build and dispose of such a machine, plus the power needed to run the quite often inefficient apps. I'd add a whole bunch of political / sociological costs to that.
And yes, I know that the real "power benefits" of optimizing code just don't add up to a significant number today. I think this is due to the externalities we don't pay for. I can imagine a world where it would be economically justifiable to push for efficient code. Today only the very big fish actually feel the need to make some of their code efficient. The others just consume what is already prepared for them: the machines already running in the data centers.
As for the impact of the switch: I did NOT calculate it. So you are technically right (probably ;), but only as far as the boundaries of the model you chose (maybe not consciously) ;)
When, a few years ago, the power consumption of data centers started appearing in worldwide energy consumption overviews, I guess "we" realized that, yes, they are a big deal.
Sure, we don't pay for the full costs of running those machines.
But we also don't pay for the full costs of the stuff needed to have programmers optimize those programs.
A watt used to run an EC2 instance would become more expensive in the world you're suggesting, but so would the watt used to power the lights while the developer worked on optimizing the code, or the watt used to power their car, etc.
So unless there's any particular reason why the costs of running EC2 instances are particularly less "priced in" than other costs, I see no reason to think the balance between the two options would change significantly.
When I say they're cheap, I mean relatively. Whether they're cheap in absolute terms (which is what you're arguing) is irrelevant to my argument.
To make an analogy, a bowling ball still weighs more than a feather, even if you measure it on the Moon.
> Now, what social change could more expensive computing devices allow for in those regions?
The social change in those regions would be the factories closing down and moving to other countries, as the low prices would no longer be so relevant, and so the workers would return to the famine and poverty of the '60s and '70s that they are only just beginning to escape.