Hacker News | binaryfinery's comments

/world runs out to buy kinect just to be able to participate in class action suit.


Solid State Drives in everything.

Ok, perhaps not what you were asking, but they made a big difference for me. I have two in RAID 0 in my desktop, and a SandForce-based drive in my MBP. What a difference. Compiling, linking, copying: everything not just faster, but almost instantaneous. Yum.


Buying an SSD was a huge disappointment for me. Programs were not "almost instantaneous" but just 2-3 times faster to load, if there was a difference at all. So OpenOffice took 3 seconds instead of 7; not worth the huge price difference for me. Compiling is CPU-bound, and I noticed no mentionable difference there.

It might be worth it if you use a "bloaty" OS with virus scanners and indexers running, or have specific use cases where very fast access is needed, but for a lightweight system the difference is negligible. The only real difference I actually noticed without looking at numbers was that after startx the XFCE desktop was ready before the monitor finished its resolution switch. Lots of random reads in that process, I guess. From an HDD it takes a couple of seconds (once per day...).

I returned it. And before someone implies I'm dumb: yes, I had a fast SSD and my system was set up to use it well.


You need more CPUs then :-)

Also, there really is a difference between the SSDs available right now. My Kingston SSDs have block sizes of 120k (!). My OCZ Vertex 2 has a block size of 4k. Its 4k write speed is off the chart: something like 40x faster than some SSDs.

I'm on OSX (2 cores), Windows (8 cores) and CentOS (4 cores).


Oh yes, I regret having bought a 2 core CPU. Maybe I will upgrade to a 4 core some day.


Yup. And PCI-e SSDs are going to rock-n-roll next year. 160GB at 700MB/s w/ 10us latency for $300? Yes please.


"Mac iOS dev tools are beautiful and extremely slick"

And yet have features found in SlickEdit circa 1992.

Twenty fucking years later I'd expect my IDE to be a bit smarter. It might look beautiful (to you) but it's brain-dead.


Do you have some implementable ideas?


Absolutely:

http://blog.binaryfinery.com/monotouch-qa

This is written for potential clients rather than hackers, but I do go a bit deeper into why Objective-C sucks near the bottom.

Basically, use good tools. What hacker would create a website without analytics these days? Yet Apple wants us to settle for an editor that is dumb as bricks, when tools like Resharper or Eclipse can not only analyze our code but refactor it automatically. Xcode has 6 refactorings, and all of them will fail to do what you want, or just flat-out refuse. Resharper has, I think, 120?

Resharper analyzes my code as I type. I develop iPhone apps on my PC in C#, and even before I compile on the Mac, I know it's going to compile because Resharper has been analyzing it the whole time.

There are basically two problems with Xcode and Objective-C, and neither of them is solvable:

1) Writing tools for languages that have header files which can be included multiple times, and which require a class to be declared all over the place, is hard.

2) There are a shitload more developers for Java, C, C++, and C# than there are for Objective-C. This may change, enough to incentivize groups (companies/OSS) to write better tools, but they'll still be years behind groups like JetBrains or Eclipse.

Now, C and C++ may be 20+ years old, but they've been in constant use and development that whole time, and by a majority of programmers. Yet even these languages have poor tools (the whole header issue). Objective-C is 20 years old too, but with, what, a 15-year hiatus.

So the solution is to say "Why the fuck am I using a 20-year-old language that lacks 20 years of tool development?" and move on to something that's either a) newer or b) well established. I recommend MonoTouch.


The problem Feathers describes has a simple cause: the classes in Java, C#, etc. are a premature optimization. The rewrite described is caused, in my experience, by having to reclassify everything, which is annoying and error-prone when:

a) classes are part of the language and,

b) classes are defined by a stream of characters in a flat file on disk.

C++ makes it even more difficult to automate that reclassification by requiring that the definition of a class be stored in multiple such files.


YAGNI.


Oh, yeah, of course. You ARE going to need it. You're different. Mod me down for my lack of understanding how awesome you are.


That's why I use Google App Engine. It would have handled what you describe without even going over the daily free limit. Of course, you wouldn't be as cool as Rails.


You could be as cool as Clojure, though: https://github.com/gcv/appengine-magic


What's the start-up time? I.e., when Google spins up a new instance to handle traffic, how quickly can it come up?


In the free configuration, I've clocked it between 4 and 6 seconds. (Sometimes up to 10 seconds, but not recently.) With the 1.4.0 release of the App Engine SDK, the cold start time has become less important, though. First, you can pay $9/month to have three instances running all the time. Second, App Engine can send a "warmup" request to your application.
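If memory serves, on the Java runtime (which appengine-magic builds on) warmup requests circa SDK 1.4.0 were enabled in appengine-web.xml; a sketch of the relevant fragment, with a placeholder application id:

```xml
<!-- appengine-web.xml (fragment; "my-app" is a placeholder) -->
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <application>my-app</application>
  <version>1</version>
  <!-- Ask App Engine to send a request to /_ah/warmup before
       routing live traffic to a freshly started instance. -->
  <inbound-services>
    <service>warmup</service>
  </inbound-services>
</appengine-web-app>
```

With this set, your app can do its expensive initialization in the warmup handler instead of during a user-facing cold start.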


As you read these wonderful theories, remember that the economist who was chosen as the best in the country to run the US Federal Reserve said in 2005 that "now is the best time to get an ARM".

This genius suggested buying a house on an adjustable-rate mortgage, at the peak of the market, when interest rates were the lowest they'd ever been. Bravo. Apparently, even the best economist in the USA is unable to see a bubble when it's staring him in the face.

Fuck economists and their religion. Welcome to HN: where people have brains, can do math, and think critically.


Nice, neat, simple equations and math, at the Macro level.

To put this in hacker terms, these simple rules are about as simple as the HTTP protocol. Now imagine the software that implements this simple protocol. Think of all the different ways to exploit it. Now imagine all the complex systems that can be created using these simple rules. Now add hackers, DDoS, government censorship, WikiLeaks, peering disputes, etc.

Economists would have us believe that these simple rules are "how it is". Now imagine the millions of people it takes to actually implement them. Imagine what it takes to actually make wine in Portugal. Now add that these people want pensions. Now include that these simple rules assume currency equivalence, or something like the gold standard, to mediate production disparities. Now add the governments on top of this "protocol".

To believe this "proven economic theory" you have to live in a virtual machine.


Yes, quite right. We'll benefit in the same way that countries like Ecuador or Nigeria benefit from the USA's innovation, wealth, and power.

Which is to say, if we have something that a Chinese superpower wants, they'll take it by force, or by installing a puppet government; and if we don't, then good luck buying those medicines: they are 10000yn, with $10 to the yn.

The "economists" who take your statements as fact do so because they are paid to. The ones who advocate protectionism are dismissed as communists or Marxists.

The trade theory you talk about assumes a balance of payments, enforced by something like the gold standard, which simply doesn't exist now. This is economic war, and the USA is losing. I did some International Relations at Cambridge University. You should try thinking instead of just "accepting as fact".


"A (protectionist) wall may keep the outsiders out; but it also keeps the insiders in...."

As opposed to our policy, which is to give hundreds of billions of dollars to China every year. That's working out so well! Or are you one of these people who doesn't believe that empires fall, or that an empire can fall simply because it runs out of money?

