
Neither matters; what matters is profitability.

It doesn't matter if you can serve 10,000 requests per second if the cost of that request exceeds what you're paid for it.

For most sites Ruby is profitable; its value add is decreased cycle time between releases, which can increase the profitability of your site more than increasing pages per second can.

The vast majority of sites don't need to scale beyond a single server, and when you do need to scale I prefer things like drop-in support for S3, drop-in support for memcached, and a whole host of other performance-increasing techniques over raw pages per second.

The benchmark benchmarks the very simple case of a single PHP page, what real PHP app have you used that is a single PHP page? The request times for something like WordPress are insane because of the number of PHP files that need to be interpreted per request.

edit: To clarify, the gist is that low-margin activities, where increasing the speed of your page by 5x would have a great impact on your profitability, are not the areas where you should focus your efforts. Instead focus on high-margin areas where you could write your pages in SnailScript and you'd still be profitable.



1) AMZN and GOOG have proven that pageload speed does in fact directly relate to profitability.

2) Regarding "The vast majority of sites don't need to scale beyond a single server," you could make the same point that the vast majority of sites are not profitable either. It's a total non sequitur in either case.

3) Your point that it "costs less" to develop in Ruby, which can increase the profitability of your site, is an incorrect assertion. It may be true that developing in Ruby can decrease your R&D costs, but it does not affect your cost of goods sold and has no impact on your margins. If you just say "it costs me $100 less to develop in Ruby, thus I've made an extra $100," you need to take an accounting course.

More to the point, if you are building your business with your optimization being focused on decreasing development time, you have already lost sight of the goal.

PHP sucks, but there are a ton of profitable companies that use it because it is wicked wicked fast. In my mind, that makes it suck less. :-)

-David


On your point #3: actually, you are incorrect. Profit is revenue minus all costs. Whether or not it counts as COGS is only relevant for gross margin, a metric which isn't the best one for judging software companies. R&D is an expense, so it affects your profitability and your net margin. If your company spends $100 less, that's $100 more you have in the bank, $100 less that you need to sell people on.

http://www.investopedia.com/terms/g/grossmargin.asp http://www.investopedia.com/terms/n/net_margin.asp
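To make the accounting concrete, here's a toy P&L in Ruby (all numbers hypothetical) showing how a $100 cut in R&D leaves gross margin untouched but flows straight through to net margin:

```ruby
# Toy income statement, with made-up numbers: $1,000 revenue, $200 COGS, $300 R&D.
revenue = 1000.0
cogs    = 200.0
r_and_d = 300.0

gross_margin = (revenue - cogs) / revenue           # R&D doesn't appear here...
net_margin   = (revenue - cogs - r_and_d) / revenue # ...but it does here

puts "gross margin: #{(gross_margin * 100).round}%"  # 80%
puts "net margin:   #{(net_margin * 100).round}%"    # 50%

# Spend $100 less on R&D: gross margin is unchanged, net profit rises by $100.
net_margin_cheaper = (revenue - cogs - (r_and_d - 100)) / revenue
puts "net margin with $100 less R&D: #{(net_margin_cheaper * 100).round}%"  # 60%
```

So both commenters have a piece of it: cheaper development never shows up in gross margin, but it does land in net profit dollar for dollar.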


Gross margin is unquestionably the metric used to gauge software companies. That's why we are so scalable, especially SaaS: our COGS are typically so low that we can afford relatively large R&D budgets (compared to other industries).

Lots of software companies have gross margins north of 60% or even 80% (GOOG, afaik), which is unheard of in other industries.

If you focus your business on growing revenue and having a reasonable gross profit margin, you are focusing on the right knobs. If you are focusing on decreasing R&D as a means of maximizing net profit, you are focused on the wrong things.


> PHP sucks, but there are a ton of profitable companies that use it because it is wicked wicked fast. In my mind, that makes it suck less. :-)

I think in this mini benchmark, it's not so much that "PHP" is fast (as a language, it isn't, really), rather that Ruby + its various frameworks are pretty pokey.

In any case though, while I remain a Ruby fan, the grandparent sort of misses the point: by being more efficient, you have a wider range of things that can make you money. If you have to have a huge amount of resources to get some pages up, doing ads or some other low-margin activity might not even be feasible like it would with a faster solution.


You're actually on to my point, which is to stop doing low-margin activities and focus on high-margin activities. Low-margin activities are where writing a page in C/C++ might be a really good idea. The performance-critical parts of your site can always be rewritten in assembler if need be.

If your margin on a page request is 5%, your business is probably fucked anyway (unless you're serving billions of pageviews per month; there are only a few sites that work on this business model). I'd much rather have a business with orders of magnitude fewer requests and a profit margin on the order of tens of thousands of percent per page request.

What do you think 37signals' margin on a page request for Basecamp is? My guess is at least 10,000%.


"You're actually on to my point which is to stop doing low margin activities and focus on high margin activities."

That's something I would certainly endorse. I didn't get that out of your post, but I completely agree. Engineers who start companies often overlook this and focus on entirely the wrong set of problems when building their business.


I shouldn't have thrown in that line about PHP, that was a total tangent. And in any event, I agree with your response. :-)


My point is that out of the gate I'd rather have 2011 tools running on 1999 hardware than 1999 software running on 2011 hardware. A 2011 dev team will run circles around a 1999 dev team because their tools are better and allow them to iterate much more quickly. By the time your startup overwhelms a Pentium III server, you should have enough profit to buy new stuff.


Good luck getting a PIII to serve Ruby on Rails content fast enough to keep people interested and cheap enough to still make a profit from advertising.


I actually tried this recently. Page load speed isn't too bad at all, but the time it takes to start up the rails stack is awful. It's unusable for dev work.


OTOH, his shorter time to market also implies lower product cost. With that difference, it could make sense to invest in beefier servers.

You could end up having 2011 technology on 2014 servers competing with 1999 tech on 2011 servers.


  AMZN and GOOG have proven that pageload speed does in fact
  directly relate to profitability.
Let's not forget that's client-side speed, not server-side. The difference is important: the time to generate HTML on the server may be just 10-20 percent of total load time.

On the other side, client-side optimization does help your servers: imagine if you cut from 60 HTTP requests to just 6, and did that with a proper caching policy, so your servers won't be hammered on subsequent page loads just to answer "304 Not Modified".
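The conditional-GET mechanism behind that "304 Not Modified" can be sketched without any framework: a handler emits an ETag, and a request whose If-None-Match matches gets a bodyless 304 (the asset body and max-age below are made up for illustration):

```ruby
require "digest"

# Sketch of conditional-GET handling, no framework assumed.
# Returns [status, headers, body] given the request's If-None-Match header.
def serve_asset(if_none_match)
  body = "body { color: #333 }"                 # stand-in for a real stylesheet
  etag = %("#{Digest::MD5.hexdigest(body)}")
  headers = { "ETag" => etag, "Cache-Control" => "public, max-age=86400" }

  if if_none_match == etag
    [304, headers, ""]    # client's copy is current: skip the body entirely
  else
    [200, headers, body]
  end
end

status, headers, = serve_asset(nil)        # first visit: full response
puts status                                # 200
status, = serve_asset(headers["ETag"])     # repeat visit, cached ETag sent back
puts status                                # 304
```

The 304 path still costs a round trip, which is why cutting the request count itself matters even more than making each revalidation cheap.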

Ever since I got interested in client-side optimization (a bit over two years now) I am amazed how neglected this aspect is. I may be biased, of course.


What's stopping you from profiling your Rails app after you've finished it, before resorting to a home-built PHP solution? You always want to decrease development time; product out the door is what matters, as Windows and Linux/Unix prove: even with superior alternatives (Plan 9, Inferno) available, being first and having momentum is infinitely more valuable.


As per point #3, remember that you amortize the cost of R&D over some payback period. As such, it does affect your margins.

I do agree with your point though. Decreasing development time is the means to the end of making more money out of the product and making a greater return for your (money/time) investment.


Page load speed does relate to profitability, but your language choice most likely won't move the needle so far that you notice a difference. My startup's Rails-based homepage averages ~11ms render time.


Studies have shown that people convert better and view more pages if a site is faster [1]. And page load speed is a factor in your Google ranking [2].

[1] http://www.watchingwebsites.com/archives/proof-that-speeding... [2] http://googlewebmastercentral.blogspot.com/2010/04/using-sit...


That's all well and good, but honestly, how many currently unprofitable sites would become monstrously profitable if they could increase their requests per second by a factor of 5?

The study you link to shows a decrease in revenue of 5% from adding 2 seconds of load time vs. 50 ms. There are very few businesses whose costs are dominated by servers; most spend more on a single developer than on servers. Yes, Google, Twitter, and Facebook could probably do well by doubling their requests per second, but the average company's costs are dominated by employees. That's why it's called Ramen profitable and not EC2 Micro profitable.
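Back-of-the-envelope, with entirely assumed numbers, the servers-vs-employees point looks like this:

```ruby
# Hypothetical monthly costs for a small startup.
monthly_servers   = 2_000.0   # assumed hosting spend
monthly_developer = 12_000.0  # assumed fully loaded cost of one engineer

# A 5x throughput win means you need roughly 1/5 of the boxes.
server_savings = monthly_servers * (1 - 1.0 / 5)
puts server_savings                                       # 1600.0

# Even that best-case win is a modest slice of one salary:
puts (server_savings / monthly_developer * 100).round(1)  # 13.3 (percent)
```

Under these assumptions the entire 5x speedup buys back about 13% of a single developer's cost, which is the sense in which employee costs, not server costs, dominate.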


No need to put words in my mouth. There are no magic bullets to monstrous profitability.

But website speed is one of those things (among many others!) that has a measurable impact on your business. This isn't just a nerdy pissing contest.


IMHO, 99% of Ruby's slowness culture has nothing to do with optimizing for the programmer. Most of the time it's just a missing profiling step, or not having the right design to be fast. So many things can be improved without killing any abstraction opportunity.

On the other hand, profitable or not, users don't like waiting that extra 200 milliseconds because the code is not written in the right way.



