Hacker News | tomtomtom1's comments

I think the object-oriented paradigm is the most hated aspect:

- "Object-Oriented design" (class diagrams, use case diagrams, Abbott Textual Analysis... and all the bikeshedding fun of UML, the Rational Unified Process...), generally pushed by the likes of IBM and Oracle, and heavily taught in SE courses.

- A watered-down version of the above, where formalism is discarded but the first problem-solving step is still to decompose the system into classes. An infamous example is the chess-board interview. I think the author is criticizing this part.

I think the majority of people don't hate things like vector<int> or set<int>, despite them being classes.
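A trivial sketch of that distinction (a hypothetical example of mine, not from the article): the standard containers are classes, yet using them involves none of the up-front design ceremony described above.

    #include <cstdio>
    #include <set>
    #include <vector>

    int main() {
        // No class diagram needed: fill a vector, deduplicate through a set.
        std::vector<int> scores = {3, 1, 4, 1, 5};
        std::set<int> unique(scores.begin(), scores.end());
        std::printf("%zu distinct scores\n", unique.size());
    }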


A quote on this topic, from the early days of the war in Afghanistan, was: "CNN shows us where the missiles are launched and Al-Jazeera shows us where they land."

A more concrete example is "On February 8, 2018, it was reported that Qatari leaders had reassured the leaders of Jewish American organisations that Al Jazeera would not be airing its companion documentary series on the Israel lobby in the United States. According to Haaretz, the Qatari government had reportedly hired Republican Senator Ted Cruz's former aide Nicolas Muzin to open communications channels with Jewish American organisations. Earlier, the network had sent letters to several American pro-Israel organisations informing them that their employees would appear in the documentary. These letters generated speculation that the Qatari government had reneged on its earlier promise to block Al Jazeera from screening the controversial documentary which, like the earlier British series, had utilized clandestine footage and recordings of pro-Israel activists." (https://en.wikipedia.org/wiki/The_Lobby_(TV_series))


I don't understand why no one is blaming the education system.

How is it possible that students who are being considered for "advanced placement" are not familiar with file formats?

How horrible and backward must your education system be? This belongs to basic literacy in the modern world.

Note we're not speaking about older people who were exposed to files for the first time while they had to balance family life and other responsibilities.

We're speaking about people who were forced to spend more than half of their lives learning!!!


Because in a world where apps do everything for you, 80% of your use is on a phone, and the remaining 20% is a once-a-year use case, you don't need to worry about image formats.

Even as a software engineer, I haven't used a desktop or Windows in 5 years. I haven't used TheGimp (or any other non-web/app image manipulation software) in 3 years.


That's interesting. So you use an iPad for your day-to-day programming?

From my experience, converting PPT/Word to PDF is a common enough use case that I would've assumed the majority of non-tech people run into it.

LaTeX is also a common case where you run into file extensions.

Note I'm not expecting them to understand the differences between extensions, just what they mean and how to google how to convert between them.


They don't learn any critical thinking skills. Memorization != learning.


When I last bought a laptop, Notebookcheck's [1] CPU ranking was helpful.

[1] - https://www.notebookcheck.net/Mobile-Processors-Benchmark-Li...


How can you reconcile that with the fact that job-hopping every 2-5 years is proven to be a more effective way to raise your salary?

Being seen as "local" may have its benefits, but I doubt that being one is more beneficial.


That is why I said "At least for a while" at the end. You can take initiative and feel responsible during those 2-5 years.


"your job is not your job; your job is to find a better job."

Adams, Scott. How to Fail at Almost Everything and Still Win Big: Kind of the Story of My Life (p. 31).


I genuinely hate the software architect title. It's like when we were in school and the teacher told us to write an outline before writing. No problem! The majority wrote the text first and then the outline.

That's because people in general write to think.* The level of deep thought achieved through writing is harder to achieve beforehand.

In the same sense, the level of deep thought achieved through writing code is hard to achieve beforehand. It's actually worse, because your engineers, tangled in this web of classes, won't be able to think about the big picture for themselves.

An architect thinks in terms of what sounds good and what is beautiful in OOP-land, not in terms of what is easy to write and what's performant to implement.

One poignant example is the interviewer who wants you to implement chess pieces as classes, which sounds good in the architect's world but is blatantly insane if you think about the actual code and the actual challenges you're facing.
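A minimal sketch of what I mean (all names here are hypothetical): the "non-architect" representation is an enum on an 8x8 array, rather than a Piece base class with King/Queen/... subclasses and virtual move methods.

    #include <array>
    #include <cstdint>

    // Pieces as plain values instead of a class hierarchy.
    enum class Piece : std::uint8_t {
        Empty,
        WPawn, WKnight, WBishop, WRook, WQueen, WKing,
        BPawn, BKnight, BBishop, BRook, BQueen, BKing
    };

    // The whole board is one flat array; square index = rank * 8 + file.
    using Board = std::array<Piece, 64>;

    // Whole-board questions like this are awkward when every piece is an
    // object that only knows about its own moves.
    inline bool whiteHasQueen(const Board& b) {
        for (Piece p : b)
            if (p == Piece::WQueen) return true;
        return false;
    }

    int main() {
        Board board{};                      // value-initialized: all Empty
        board[0 * 8 + 3] = Piece::WQueen;   // put the white queen on d1
        return whiteHasQueen(board) ? 0 : 1;
    }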

* Writing to think and writing to be read are conflicting goals; that's why editors exist.


Would ios_base::sync_with_stdio(false); also help?


In my benchmark, the C++ version uses the C API (printf rather than cout), so it wouldn't make a difference.
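For reference, a minimal sketch (a hypothetical loop, not the benchmark in question) of where the flag does matter: it only affects the iostream path, so a printf-based program is unchanged by it.

    #include <iostream>

    int main() {
        std::ios_base::sync_with_stdio(false);  // unsynchronize iostreams from C stdio
        std::cin.tie(nullptr);                  // stop flushing cout before every cin read
        for (int i = 0; i < 1'000'000; ++i)
            std::cout << i << '\n';             // this path benefits from the two calls above
        // A printf loop here would be unaffected, which is the point made above.
    }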


> An infinite number of zeroes. . .and then a one. . .wait, you can't do that.

Why not? Why can't an infinitely small number exist?


It can, and infinitesimals are a part of so-called nonstandard analysis, but you cannot write an infinitesimal using decimal notation. "0.999…1" is simply meaningless, a contradiction. If you have a "…" it means there's no place where you could put a "last" digit. If "0.999…1" doesn't feel impossible enough, then what would "0.999…9" mean?


People in this thread seem to think that “9 repeating” is not an infinity of nines but is instead “write or think of nines until you get bored and then write something else”


Because if it has a 1 at the end, then this marks its end, thus making it finitely small.


Say we accept that infinitesimals exist, that 1/3 != 0.33..., that 1 != 0.9999..., and that the probability of possible events is never 0.

What properties would we lose?


If we say that infinitesimals exist, it still happens that 1 = 0.999…. It just happens that 0.999… ≠ 1 - 𝛚.

0.999… = 1 is a property of the way we write some rational numbers, not of the number system itself.


Wouldn't 0.999.. be equal to 1 - 10^(-w), since it's only a countably finite series of nines?


0.999...9 with a finite number of nines is clearly less than 1.

0.999... with infinite nines is equal to 1.
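In the standard reals that equality is just the geometric series; in LaTeX notation:

    0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
                \;=\; 9 \cdot \frac{1/10}{1 - 1/10} \;=\; 1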


Sorry, I typoed; I meant countably infinite. By w I meant the ordinal.


I have never seen a number system with infinitesimals where the addition wasn't updated to ignore smaller classes if they come with larger ones.

That is, for any number system I've seen, 1 = 1 + dx, and infinity = infinity + 100.


The obvious one seems terrible enough: division is no longer the inverse of multiplication, since (1/3)*3 != 1.


Nonstandard analysis exists (with infinitesimal and infinite numbers), but (1/3)*3 = 1 and 9/9 = 1 still hold there. The problem is that the numbers 0.333... and 0.999... don't really exist.


Completeness is one of the most important properties of real numbers. Basically, you will have to completely throw away real analysis.
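Concretely (my phrasing, not the parent's): completeness, the fact that every nonempty set bounded above has a least upper bound, is what lets the infinite decimal name a real number at all:

    0.999\ldots \;:=\; \sup\left\{\, 1 - 10^{-n} \;:\; n \in \mathbb{N} \,\right\} \;=\; 1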


1/3 does equal 0.333... though.


More like 1/3 != 0.3333...

As in, 1/3 does not have a decimal representation; you can only approximate it but never reach it.
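In symbols (my framing of the point above): every finite truncation falls short of 1/3, and standard analysis then defines the infinite decimal as the limit of those truncations:

    0.\underbrace{33\ldots3}_{n\ \text{threes}} \;=\; \frac{1}{3}\bigl(1 - 10^{-n}\bigr) \;<\; \frac{1}{3},
    \qquad \lim_{n \to \infty} \frac{1}{3}\bigl(1 - 10^{-n}\bigr) \;=\; \frac{1}{3}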

