rebootthesystem's comments

Looks like a great article. I'll have to defer reading it. During a quick scan I saw a mention of the SMA technology that allows dedicated outlets to be powered up in case of grid failure.

I built a 13 kW ground-mount system feeding a pair of SMA inverters. I have tested this feature by disconnecting from the grid and enabling the outlets (one per inverter). I didn't quite get to the 2,000 W rating SMA claims, but I got close, which means that with an array this size and two inverters I get somewhere between 3 kW and 4 kW of power to run various devices while the sun is up.

Considering that we might have a couple of power outages a year on average (if that), I felt this was a reasonable investment. Going with batteries is just too expensive and not justifiable at all given the reliability of the grid. One way to think about this is that the grid is your battery. A stretch, I know.

Funny that there's a picture of a gasoline generator towards the end of the article. My guess is that I am likely to invest in a 5 kW to 6 kW generator before I ever add batteries to this system. Again, it's a matter of ROI. Also, I would not go with a gasoline-powered generator at all. Gasoline degrades over time, which can make the system a nightmare to maintain with sporadic use. I think a propane-fueled generator might be a better idea. The fuel does not degrade, so as long as you don't have leaks it'll be there, ready to go, when you need it.

I know way too many people who have been mercilessly duped by these solar companies that come in, hook them on some kind of lease, install inadequate systems, and move on to the next victim. Lack of understanding on the part of consumers has created a situation where solar might as well be magic, and unscrupulous actors take advantage of that. That part is sad.


My guess is this won't be a popular post given the average age of HN participants.

There's nothing whatsoever wrong with C. The problem is programmers who grew up completely and utterly disconnected from the machine.

I am from that generation that actually did useful things with machine language. I said "machine language", not "assembler". Yes, I am one of those guys who actually programmed IMSAI-era machines using toggle switches. Thankfully not for long.

There is no such thing as an "array". That's a human construct. All you have is some registers and a pile of memory with addresses to go store and retrieve things from it. That's it. That is the entire reality of computing.

And so, you can choose to be a knowledgeable software developer who is keenly aware of what the words you type on your screen actually do, or you can live in ignorance of this and perennially think things are broken.

In C you are responsible for understanding that you are not typing magical words that solve all your problems. You are in charge. An array, as such, is just the address of the starting point of a bunch of numbers you are storing in a chunk of memory. Done. Period.
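
To make that concrete, here's a trivial C sketch (nothing special about the names, it's just an illustration):

    #include <stdio.h>

    int main(void) {
        int a[4] = {10, 20, 30, 40};   /* four ints laid out back to back */

        /* a[i] is defined as *(a + i): the start address plus an offset */
        printf("%d %d\n", a[2], *(a + 2));          /* prints: 30 30 */

        /* the "array" is nothing more than that starting address */
        printf("%p %p\n", (void *)a, (void *)&a[0]);
        return 0;
    }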

Past that, one can choose to understand and work with this or saddle a language with all kinds of additional code that removes the programmer from the responsibility of knowing what's going on at the expense of having to execute TONS of UNNECESSARY code every single time one wants to do anything at all. An array ceases to be a chunk-o-data and becomes that plus a bunch of other stuff in memory which, in turn, relies on a pile of code that wraps it into something that a programmer can use without much thought given.

This is how, for example, coding something like a Genetic Algorithm in Objective-C can be hundreds of times slower than re-coding it in C (or C++), where you actually have to mind what you are doing.

To me that's just laziness. Or lack of education. Or both. I have never, ever, had any issues with magical things happening in C because, well, I understand what it is and what it is not. Sure, yeah, I program and have programmed in dozens of languages far more advanced than C, from C++ to APL, LISP, Python, Objective-C and others. And I have found that C --or any language-- is never the problem; it's the programmer that's the problem.

I wonder how much energy the world wastes because of the overhead of "advanced" languages? There's a real cost to this in time, energy and resources.

This reminds me of something completely unrelated to programming. On a visit to windmills in The Netherlands we noted that there were no safety barriers to the spinning gears within the windmill. In the US you would likely have lexan shields protecting people and kids from sticking their hands into a gear. In other parts of the world people are expected to be intelligent and responsible enough to understand the danger, not do stupid things and teach their children the same. Only one of those is a formula for breeding people who will not do dumb things.

Stop trying to fix it. There's nothing wrong with it. Fix the software developer.


> There is no such thing as an "array". That's a human construct.

Oh yeah; social construct, I would say, like gender.

> I am from that generation that actually did useful things with machine language.

Unfortunately, most of them are undefined behavior in C.

> You are in charge.

Less so than you may imagine. You're in charge as long as you follow the ISO C standard to the letter, and deviate from it only in ways granted by the compiler documentation (or else, careful object code inspection and testing).


This is a typical misinterpretation of the reality of programming. There is no such thing as undefined behavior. Once you get down to bits and bytes in memory and instructions the processor does EXACTLY what it is designed to do and told to do by the programmer.

Despite what many might believe, the universe didn't come to a halt when all we had was C and other "primitive" languages. The world ran and runs on massive amounts of code written in C. And any issues were due to programmers, not the language.

In the end it all reduces down to data and code in memory. It doesn't matter what language it is created with. Languages that are closer to the metal require the programmer to be highly skilled and also carefully plan and understand the code down to the machine level.

Higher level languages --say, APL, which I used professionally for about ten years-- disconnect you from all of that. They pad the heck out of data structures and use costly (time and space) code to access these data structures.

Object oriented languages add yet another layer of code on top of it all.

In the end a programmer can do absolutely everything done with advanced OO languages in assembler, or more conveniently, C. The cost is in the initial planning and the fact that a much more knowledgeable and skilled programmer is required in order to get close to the machine.

As an example, someone who thinks of the machine as something that can evaluate list comprehensions in Python and use OO to access data elements has no clue whatsoever about what might be happening at the memory level with their creations, or how. Hence code bloat and slow code.

I am not, even for a second, proposing that the world must switch to pure C. There is justification for being lazy and using languages that operate at a much higher level of abstraction. Like I said above, I used APL for about ten years and it was fantastic.

My point is that blaming C for a lack of understanding or awareness of what happens at low levels isn't very honest at all. The processor does exactly what you, the programmer, tell it to do. Save for failures (whether by design or such things as radiation-triggered events) I don't know of any processor that creatively misinterprets or modifies instructions loaded from memory, instructions put there by a programmer through one method or another.

Stop blaming languages and become better software developers.


> This is a typical misinterpretation of the reality of programming. There is no such thing as undefined behavior. Once you get down to bits and bytes in memory and instructions the processor does EXACTLY what it is designed to do and told to do by the programmer.

Sure.

Only problem is, all you have to do is change some code generation option on the compiler command line and millions of lines of code now produce different instructions. Or, keep those options the same, but use a different version of that compiler: same thing.
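
To make that concrete, here is a minimal sketch (any recent gcc or clang will do; the exact results depend on your compiler and flags):

    #include <limits.h>
    #include <stdio.h>

    /* Signed overflow is undefined behavior in ISO C, so the compiler is
       allowed to assume x + 1 never wraps and fold this test to 0. */
    static int increment_overflows(int x) {
        return x + 1 < x;
    }

    int main(void) {
        /* Typically prints 1 with optimization off and 0 at -O2 --
           same source, same processor, different generated instructions. */
        printf("%d\n", increment_overflows(INT_MAX));
        return 0;
    }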

> The processor does exactly what you, the programmer, tell it to do.

Well, yes; and when you're doing that through C, you're telling the processor what to do via a sort of autistic middleman.

C is not the low level; you can understand your processor on a very detailed level and that expertise won't mean a thing if you don't understand the ways in which you can be screwed by the C language that have nothing to do with that processor.

I suspect that you don't know some important things about C if you think it's just a straightforward way to instruct the processor at the low level.

> Languages that are closer to the metal require the programmer to be highly skilled and also carefully plan and understand the code down to the machine level.

C isn't one of these languages. (At least not any more!) It's considerably far from the metal, and requires a somewhat different set of skills than what the assembly language coder brings to the table, yet without entirely rendering useless what that coder does bring to the table.


> all you have to do is change some code generation option on the compiler command line and millions of lines of code now produce different instructions.

It is the responsibility of a capable software engineer to KNOW these things and NOT break code in this manner.

You are trying to blame compilers and languages for the failure of modern software engineers to truly understand what they are doing and the machine they are doing it on.

If you truly understand the chosen language, the compiler, and the machine, and take the time to plan, guess what happens? You write excellent code that has few, if any, bugs, and everyone walks away happy.

And you sure as heck are not confused or challenged in any way by pointers. I mean, for Picard's sake, they are just memory addresses. I'll never understand why people get wrapped around an axle with the concept.

I wonder, when people program in, say Python, do they take the time to know --and I mean really know-- how various data types are stored, represented and managed in memory? My guess is that 99.999% of Python programmers have no clue. And I might be short by a few zeros.

We've reached a moment in software engineering where people call themselves "software engineers" and yet have no clue what the very technologies they are using might be doing under the hood. And then, when things go wrong, they blame the language, the compiler, the platform and the phase of the moon. They never stop to think that it is their professional duty to KNOW these things and KNOW how to use the tools correctly in the context of the hardware they might be addressing.

I've also been working with programmable logic and FPGAs, well, ever since the stuff was invented. Hardware is far less forgiving than software --and costly. It forces one to be far more aware of, quite literally, what every single bit is doing and how it is being handled. One has to understand what the funny words one types translate into at the hardware level. You have to think hardware as you type what looks like software. You see flip-flops and shift registers in your statements.

This is very much the way a skilled software developer used to function before people started to pull farther and farther away from the machine. It is undeniable that today's software is bloated and slow. Horribly so. And 100% of that is because we've gotten lazy. Not more productive, lazy.


> It is the responsibility of a capable software engineer

Nobody is saying that it's acceptable for an engineer to screw up and then blame it on the tools (compiler, slide rule, calculator, ...).

However, if something goes wrong in your work, it's foolish not to recognize the role of the tools, even though it's not acceptable to blame them as a public position.

As objective observers of a situation gone wrong in engineering, we do have the privilege of assigning blame between people and tools. Tools are the work of people also. The choice of tools is also susceptible to criticism. We have to be able to take an objective look at our own work.


I don't understand how anyone can spend a career in software development, and still have such a poor understanding of the process. Space and time are far from the only concerns.

> As an example, someone who thinks of the machine as something that can evaluate list comprehensions in Python and use OO to access data elements has no clue whatsoever about what might be happening at the memory level with their creations, or how. Hence code bloat and slow code.

Not having to care about details that aren't contextually important is a good thing. When someone is constrained more by development time than by computational resources, working in a high-level language means you're explicitly shunting aside low-level concerns so you can spend more time dealing with domain logic.

There are many situations where finishing something faster, which will run 10x slower and use more memory, is a worthwhile tradeoff.


Nowhere did I say that modern languages don't have their place and advantages. I use them all the time. In fact, I prefer them when they make sense for precisely the reasons you point out.

You might be reading far more into my comments than what they were intended to address. Namely that blaming languages for the failings of software engineers is dishonest. A true software engineer will know the chosen tools and languages and use them appropriately. Blaming C for pointer issues is dishonest and misguided. There's nothing wrong with the language if used correctly.


BTW, even physical machines have undefined behavior, when values exceed the specs and there's no telling what might happen ... I remember the days when people would destroy their monitors by giving them scan frequencies they couldn't handle. And there are CPU operations that have undefined behavior due to race conditions ... you can get one of several outcomes.

But there's no arguing with extreme ignorance coupled with extreme unwarranted arrogance.


> BTW, even physical machines have undefined behavior, when values exceed the specs and there's no telling what might happen

And if you (plural) are an ENGINEER, it is your JOB to KNOW these things and prevent them from happening.

I get the sense that the term "software engineer" has been extended so far that we grant it to absolute hacks who know nothing about what they are doing and what their responsibilities might be. Blaming a language, compiler and machine are perfect examples of this.

True engineering isn't about HOPING things will work. It is about KNOWING things will work. And testing to ensure success.

I've been involved in aerospace for quite some time. People can die. This isn't a game. And it requires real engineering, not "oh, shit!" engineering that finds problems by pure chance. Sadly, though, we are not perfect and things do happen. It isn't for lack of trying though.


> I've been involved in aerospace for quite some time.

That's nice; not all engineering is aerospace and not all aerospace processes are always appropriate everywhere else.

Even in aerospace, still I don't want to write code that depends on knowing exactly how the compiler works. I will write code mostly to the language spec. Then treat the compiler as a black box: obtain the object code, and verify that it implements the source code (whose own correctness is separately validated).

Safety is not treated the same way regardless of project. For instance, an electronic device that has a maximum potential difference of 12V inside the chassis is not designed the same way, from a safety point of view, as one that deals with 1200V.


rebootthesystem seems to be a chatbot that specializes in shouting cliches and non sequiturs. His responses to me indicate a complete failure to understand what I wrote. smh


Nice try at a weak ad hominem.

Your parent comment is utterly irrelevant. The conversation is about the C language and the perception some seem to have that it has problems. My only argument here is that a capable software engineer knows the language and tools he or she uses and has no such problems, particularly with a language as simple as C. Things like pointer "surprises" are 100% pilot error, not a deficiency of the language itself.


> There is no such thing as undefined behavior.

Read the C Standard. (Do you even understand that it defines an abstract machine? Do you have any idea what an abstraction is?)


You just proved my point. A programmer who truly knows (a) the machine they are working with and (b) the language they are using will know exactly how to use both in order to deliver intended results.

For example, reading the processor data book to understand the chip, its instruction set and how it works could be crucially important in certain contexts. I would not expect someone doing JavaScript to do this, but how many have studied their virtual machine in depth?

Don't confuse being lazy with problems with languages and compilers.



That's a great story, thanks!

I once worked on a project that needed specialized timing in relation to high-speed (well, 38.4k) RS-422 communications. I don't remember all of the details, it's been decades. I remember one of the engineers came up with a super clever way to trigger the time measurement and actually measure it. Rather than using a UART he bit-banged the communications and actually used the serial stream for timing (meaning the ones and zeros). It worked amazingly well. If I remember correctly that was a Z80 processor with limited resources.


"You just proved my point."

This is the least intelligent and least intellectually honest hackneyed phrase on the internet. In this case it's a complete non sequitur. It would tell me a lot about you if you hadn't already made it evident. Over and out, forever.


A shift from logic to ad hominem is always an indication that there's nothing further to discuss. Live long and prosper.


"There is no such thing as an "array". That's a human construct."

There's also no such thing as a computer, or memory, or operating systems ... they're all just a bunch of molecules.

I too am from the generation before people understood the power of abstraction ... but I'm intellectually honest and so I managed to learn.

> Fix the software developer.

Which one?


So you claim an array actually exists in a computer?

OK. Prove it. And you have to do it without laying out a set of rules and conventions that might allow us to interpret a list of bytes as an array.

An array is a fabrication by convention. At the simplest level it is a list of numbers in memory. Adding complexity, you can store additional numbers that indicate type, size and shape. Adding yet more complexity, you can extend that to be lists of memory addresses to other lists of numbers, thereby supporting the concept of each array element storing more than just a byte or a word. And, yet another layer removed, you can create a pile of subroutines that allow you to do a bunch of standard stuff with these data structures (sort, print, search, add, subtract, trim, reshape, etc.).
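
Just to illustrate the "by convention" part, this is roughly what that bookkeeping looks like in C (the names and layout here are mine, purely illustrative):

    #include <stddef.h>
    #include <stdio.h>

    /* None of this exists in hardware; it is all agreement between the
       code that writes these fields and the code that reads them. */
    typedef struct {
        size_t rank;         /* number of dimensions       */
        size_t shape[2];     /* extent of each dimension   */
        size_t elem_size;    /* bytes per element          */
        double *data;        /* the actual pile of numbers */
    } ArrayDescriptor;

    /* Row-major indexing for the 2-D case: pure convention again. */
    static double get2d(const ArrayDescriptor *a, size_t i, size_t j) {
        return a->data[i * a->shape[1] + j];
    }

    int main(void) {
        double raw[6] = {1, 2, 3, 4, 5, 6};        /* just six numbers     */
        ArrayDescriptor a = {2, {2, 3}, sizeof(double), raw};
        printf("%g\n", get2d(&a, 1, 2));           /* "6" -- by convention */
        return 0;
    }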

Nowhere in this description does an array exist. There were experimental architectures ages ago that actually defined the concept of arrays in hardware and attempted to build array processors. These lost out to simpler machines where multidimensional arrays could be represented and utilized via convention and software.

Arrays do not exist. If you land in the middle of a bunch of memory and read the data at that location without having access to the conventions used for that processor or language, nothing whatsoever tells you that byte or word is part of an n-dimensional array. The best you can say is "The number at location 1234 is 23". No clue about what that might mean at all.


No, it does not. I used APL professionally for about ten years back in the 80's. I love the language. It is incredibly powerful. Once you internalize it, it's like playing the piano: you don't think about the mechanics, you play music.

However, the language did not stand the test of time for far more important issues than the inconvenience of the character set and the keyboard.

And, no, J is not a successor to APL, even though Iverson created it. J is an abomination. He made a mistake. He thought that abandoning notation --which is incredibly powerful-- would solve the APL popularity problem. What he ended up creating was a royal mess of the first degree. It's garbage.

APL could be very useful today but someone with the time and context needs to organize an effort to evolve it into a modern language that retains the power of what got branded as a "tool for thought" while adding layers of functionality that are sorely missing. I wish I had the time to embark on this journey. I would love to do something like that, but I can't.

Again, the character set and keyboard are not the problem. I used to touch type APL. Didn't take that long to get there. People learn to drive vi/vim. It's a matter of having to have a reason to make the effort.

And the ecosystem. That's another huge issue.

This has two aspects:

Finding qualified programmers and having access to libraries so you don't reinvent the wheel.

Back in the day I used to do a lot of work with Forth as well. Great language for the right applications, but finding qualified Forth programmers was difficult even when the language was popular, and it became nearly impossible with the passage of time.

APL suffers from the same problem, a seriously limited talent pool.

I probably don't need to explain the value and power of having libraries to support a wide range of applications. Python is a good example of this today. You can find a library to do just about anything you might care to approach with Python, from desktop through embedded and web. In many ways the breadth and depth of available libraries can be far more important than language capabilities and sophistication. After all, if you had to write OpenCV from scratch there's no amount of APL magic that is going to make you more efficient and effective than a 15 year old kid with Python and OpenCV.

I see APL mentioned on HN with some frequency. I feel that some here are in love with the idea of APL rather than understanding the reality of APL. Again, I love the language, but there's a reason I stopped using it about 25 years ago.

What's interesting is that C, which I started using way before APL, is still around and a very solid language (with lots of libraries) for the right applications.


> Finding qualified programmers and having access to libraries so you don't reinvent the wheel.

Lots of niche languages have the same problem, notably lisp, but it doesn't do to say they aren't popular for those reasons. It's circular reasoning. Languages get those things by being popular. They get popular by having those things.

Every current "popular" language with good libraries and a large userbase started with no popularity, no libraries, and no users. They built these things over time.

The problem is these languages can't create a robust community. They are powerful, so people don't need large teams to do what they want. They are different, so it is a bigger investment to understand them. The combination means they attract the kind of elitists who are not willing to help newcomers or write basic libraries, the kind of people who are perfectly capable of reinventing every wheel and doing it better than last time.

No one teaches these languages. How popular could they get if companies and universities spent millions of hours collectively drilling even the most marginal programmer on how to use them like they do for C++ and Java?

They would never do it though. Large companies don't want more powerful languages. They will take the productivity loss for fungible employees. It's part ego. Middle managers look much more important if they have 20 programmers write 1,000,000 lines of code over 5 years than two programmers write 10,000 over six months even if functionality is equivalent. It's part bargaining and risk. If you only have a few programmers, the individual programmer is worth a lot more. It is also riskier to employ one because she could leave or get hit by a bus at any time.


> It's circular reasoning. Languages get those things by being popular.

Maybe it is but it's reality. Also, there's the other kind of reality: Languages don't matter. Solving problems is what matters.

I've programmed in everything from Machine Language (note I did not say "Assembler") to APL, passing through languages like Forth, C, C++, FORTRAN, Objective-C, Lisp, PHP, JS, Python, etc. At the end of the day the ONLY thing that matters --if it isn't a hobby-- is solving problems computationally. I have no cult adherence to any language whatsoever. They are tools, that's all.

My best example of this was making tons of money solving a problem using Visual Basic for Applications, which allowed me to use Excel to automate a time consuming task in a CAD program. It just so happened that this CAD program could be automated using VB. Put the two together and several months of work and we had a tool worth quite a bit of money.

APL still has lots of value...in the right circles. I believe it still sees professional usage in the finance industry.


> No one teaches these languages. How popular could they get if companies and universities spent millions of hours collectively drilling even the most marginal programmer on how to use them like they do for C++ and Java?

Schools and universities have been teaching Pascal/Lisp/Caml/Scheme for decades, yet (almost) nobody used those languages to produce actual software, neither as a job nor for free software side projects; and a majority of jobs implied the use of a member of the large C family (or VB at the time).


Weren't some APL-related languages, to keep it on topic, like A+ and K, created at or under contract with a large company?


I think most of the users here realize the reality (lack of users and libraries), but also recognize the power you described. I'd love a simple open-source, multi-platform interpreter with a built-in keyboard and package manager. Simplicity is key here. J basically has this, but I agree the syntax is difficult for me.



I find both APL and Forth fascinating. I would not try to promote them as a replacement for newer, more approachable languages, but I think learning them gives you different points of view, and they are worth learning for every programmer (same with FP, for example).


I still think Forth beats C as a language for microcontroller programming. I haven't tried it myself yet, but I think a built-in REPL, nearly 1-to-1 correspondence with ASM and programmable compilation (among other things) can sum up to a really nice programming environment.


Yes, APL and Forth are excellent in this regard. They give you a very different set of mental tools with which to solve problems computationally.


Could you go into more detail about your opinion of J? I have not seen any negative opinion about it before and I am very curious as to what issues you believe it to suffer from.


One of the most powerful aspects of APL is its notation. Ken Iverson himself wrote a paper titled "Notation as a Tool For Thought". Here it is:

http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...

I remember watching Iverson deliver a presentation in person about this very topic.

Anyone familiar with fields such as mathematics or music understands the power of notation. The integral symbol conveys information and allows you to think about the problem rather than the mechanics.

APL in early times suffered from a unique problem: You had to physically modify your computer and printer to be able to do APL. You had to remove and replace the character generator ROM from your graphics card (who remembers those cards?). You had to get a new keyboard or put stickers all over your standard keyboard. And you had to change the print wheel or print ball (IBM printers) to be able to see, type and print APL characters.

It was a pain in the ass. Only the most interested cult members endured that level of pain for an extended period of time.

Years later Iverson decided to transliterate APL symbols into combinations of standard ASCII characters. This was a knee-jerk reaction to the above-stated problem. What he did not have was the vision to recognize that technology would take care of this on its own. Not long after the introduction of J everyone could display and print graphics of any kind. The APL character set, the symbols, ceased to be a problem in that regard.

Iverson took the wrong road with J out of --conjecture on my part-- commercial interest rather than language interest. He violated something he personally talked about: The value of notation as a tool for thought.

J doesn't need to exist. If we are to evolve APL and move into a world where symbolic programming is a reality (something I think would be very powerful) we need to move away from typing ASCII characters into a keyboard and move into a paradigm where advanced software engineering has its own notation that can be used to describe problems and create solutions with the kind of expressive power we have not seen in mainstream computing in years.


I have been in physical product development and manufacturing for three decades. Doing business with US manufacturers has become more and more difficult over time. What you describe here is very true and only the tip of the iceberg. I have, for example, sent out 50 requests for quotes for machined components only to be utterly ignored by most of the shops I contacted. The same exercise with China results in an almost overwhelming number of quotes received almost instantly. They are open for business. Have been for a while. I, frankly, have no clue what game we are playing.


> Doing business with US manufacturers has become more and more difficult over time.

Any thoughts on why?


In my experience, it's a combo of issues.

US manufacturers were the first to automate, so many of them are stuck with 1st/2nd generation gear that is designed for high-throughput operation and not around setup time. These machines can take several hours to change what they make.

Also, manufacturers have become jaded with new customers, who are constantly asking them to move mountains for overseas pricing. Manufacturers typically have a big enough collection of frequent customers that they can use bad first-contact customer support as a way to filter out people who will go overseas anyway after they hear the quote the manufacturer spent a fair amount of time on. If a customer is willing to run the gauntlet of trying to contact you, they are much less likely to disappear after you give them a quote.


I tried to get some waterjet-cut parts about 5 years ago near Santa Barbara. The only shop I could find that could do it had no way to get the data into the 486 - yes, you read that right - other than hand-replicating my file in his ancient CAD program.

My experience overseas is more like fire off an email with an attachment and have perfect parts a few weeks later.

I believe this is changing though. There are some awesome short run PCB assembly services in the states now for instance.


Now that the old equipment is at the end of its useful mechanical life, people are switching.

The manufacturing operation I'm involved with recently got rid of its last machine that used 8080-era processors and character-only green CRTs.


That's not quite right. You can hold a short position for as long as you want. There is no contract saying you have to close the transaction in 30 days. In that sense the otherwise excellent analogy provided isn't accurate (which isn't a problem, it's meant to be a simplification).

What does happen if the stock goes up is that your brokerage company will ask you to put up the delta. In other words, if it goes from $100 to $110 you'll be required to deposit the equivalent of $10, the delta, times the number of shares you shorted. If you shorted 1,000 shares you'll have to deposit $10,000 for every $10 of upwards movement in the stock price. If you have long (traditional stock buying) positions in your account your broker might actually sell those automatically to cover this delta.

The other important point is that this is a loan, which means you will pay interest on the borrowed funds, in this hypothetical $100,000. The interest charged can vary. If, for the sake of an example, we assume 5% simple annual interest, this means $5,000 per year, or just over $400 per month.
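
For what it's worth, here's the same arithmetic in one place (a throwaway sketch using the numbers above; the rate and prices are just the example's assumptions):

    #include <stdio.h>

    int main(void) {
        double shares        = 1000;     /* shares sold short          */
        double entry_price   = 100.0;    /* shorted at $100            */
        double current_price = 110.0;    /* stock has risen to $110    */
        double annual_rate   = 0.05;     /* assumed 5% simple interest */

        double borrowed  = shares * entry_price;                    /* $100,000 */
        double delta_due = shares * (current_price - entry_price);  /* $10,000  */
        double monthly_interest = borrowed * annual_rate / 12.0;    /* ~$417    */

        printf("delta to deposit:   $%.0f\n", delta_due);
        printf("interest per month: $%.0f\n", monthly_interest);
        return 0;
    }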

I used to day trade (about 20 years ago) and would use shorting multiple times per day. I am not sure I would consider shorting for long term (> 1 day) positions. As many have said, the potential for loss is great.


I have to say, I've been running my own mail server without a single problem for years. OK, the way I do it is a stretch of the definition of "running my own mail server" but I have full control.

What do I do?

Get a nice VPS from a company like GoDaddy. Set up whatever domains you need. Set up as many email accounts as needed. And off you go. No problem. I don't even have to think about email.

I thought about rolling my own on one of our Linode servers, but every time I compare the no-brainer of doing the above to what it would take to run this ourselves on Linode, I can't justify the pain and aggravation.

What I don't like about the Gmail approach (other than it is Gmail and I do not trust Google to not shut down all of our accounts for some stupid reason) is the cost. I can spend a few bucks a month on a VPS and have a hundred email addresses. The same on Gmail would cost significantly more and you would be under their irrational thumb.

A few years ago I looked into running Zimbra on Linode. Back then it was so resource hungry it just didn't make any sense. I wonder if it has gotten any better over time? I really like the concept.


Amazon has the power to fix this. They don't seem to care. Simple logic and heuristics would be enough.

Amazon allows anyone to post a review. You don't even have to buy the product. That is fundamentally wrong.

That's the first easy step: If you did not buy the product on Amazon you cannot post a review.

Amazon allows people who receive deep discounts to post reviews. That is ripe for manipulation.

And so, the second filter is simple: If someone doesn't pay at least N%, say 50%, for a product, they don't get to post a review.

Amazon allows people to post a review at any time, even before the product ships. You can post a review for toothpaste before you actually use it.

The third filter would include a variable purchase-to-review period. The length of this period is different depending on the type of product. Maybe someone who buys a USB cable can post a review a couple of weeks after actually receiving it --but not sooner. Someone buying weight loss pills might need to wait 60 days.

In other words, introduce some common sense into a process that would only allow actual retail buyers of a product to experience the product for a reasonable amount of time before allowing them to post a review, positive or negative.
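
A rough sketch of the kind of gate I mean (the field names and thresholds are made up to illustrate the idea, nothing more):

    #include <stdbool.h>
    #include <stdio.h>
    #include <time.h>

    /* Illustrative only -- these are not Amazon's actual rules or fields. */
    typedef struct {
        bool   bought_on_amazon;   /* filter 1: verified purchase          */
        double price_paid;         /* filter 2: no deep-discount reviewers */
        double list_price;
        time_t delivered_at;       /* filter 3: category-specific wait     */
        double wait_days;          /* e.g. 14 for a cable, 60 for pills    */
    } Purchase;

    static bool may_post_review(const Purchase *p, time_t now) {
        if (!p->bought_on_amazon)                return false;
        if (p->price_paid < 0.5 * p->list_price) return false;
        if (difftime(now, p->delivered_at) < p->wait_days * 86400.0)
            return false;
        return true;
    }

    int main(void) {
        time_t now = time(NULL);
        Purchase cable = {true, 12.99, 19.99, now - 20 * 86400, 14};
        printf("may review: %d\n", may_post_review(&cable, now)); /* 1 */
        return 0;
    }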

It goes beyond that. Negative reviews need to be routed to the vendor before they appear publicly on Amazon. Why? Amazon needs to give vendors a first shot at solving the problem. The current system is moronic. People can give a USB cable a bad review because they don't like the color. The thing might work just fine but the person wanted a different shade of green and can give the product a 1 star review. This is nonsense.

In general terms, Amazon reviews, due to Amazon's own incompetence, are pretty much worthless these days.


The problem I have with this woman is that she actually demeans women. There is no way I would have my little girl watch her videos as a role model. The entire thing is plain stupid.

I would have exactly the same reaction if we were talking about a guy going around town in tight-fitting Speedos doing technology projects and interviews. Ridiculous. Stupid. Demeaning.

I suspect that because she is a woman I am not allowed to object to her walking around nearly naked, nipples showing through nearly see-through clothing, nearly all of her ass out there to behold, etc. I, personally, find it disgusting and as far from role model behavior as one can get. I am not a prude by any measure.

She does nothing for women other than to propose that to be accepted in technology you have to walk around showing your tits and ass in a ridiculous outfit that paints you as a sex toy and, yes, know your shit too. I find that repugnant. Just as I would for a man behaving similarly.

Raise your hand if this is what you want to teach your little girls. Exactly.


She's not supposed to be a role model for you to show to your little girl, or to look aesthetically comfortable to you, or to be an example of how women should or should not be accepted in technology.

She is not yours - nor even other women's - servant.


You're completely missing the point.

Not everyone has to be a role model for your little kid.

I don't want my kid to hear people swearing, but I'm not going to slander Linus Torvalds for doing so. I'm not stupid enough to believe that just because someone is crass they can't possibly be a good maker/developer/engineer/etc.


The part you guys are missing in my comment is that I said this is the problem "I" have with this woman.

I am not trying to pass judgement for all of humanity here. If you are OK watching videos of a nearly naked woman playing with tech and think that's great for women and humanity, so be it. I happen to think it's disgusting.

All I can speak for is myself. I see this person as a clown who demeans every single female engineer I have ever worked with or hired. She might as well be doing soft porn videos and make more money; she ain't too far from that.

Because of that I can't respect what she is doing.

Do we really want to portray women in tech this way? Really?

At one end of the spectrum people are up in arms about politicians objectifying women (and worse) and meanwhile, on the tech side of things, we are OK with a woman walking around with tits and ass hanging out there?

Is the tech community OK with women being portrayed this way? I am not. Definitely not. If you defend her you are defending the objectification of women. Which is shameful.

Sorry, this is disgusting to me.


I look at it this way. I wouldn't recommend it, because society still discriminates against you based on how you look and dress. I mean, if you show up at a job interview with 98% of your body tattooed you're also going to face bias.

Conformity in a way increases opportunity. On the other hand, she's a transhumanist, so she probably believes all of these rules and customs are pretty primitive and totally changeable. I mean, who says you ever have to wear clothes if you're in a warm climate? Who says our genitals shouldn't be seen? These are all developed customs and specific to each society. She's deliberately breaking the mold and refusing to be shamed.

I'm ok with people portraying themselves however they want to portray themselves. I can make a rational argument for "playing it safe" and being conservative, then again, when you're 70 years old and lived a life of conformity, maybe you'll regret you didn't live life openly and flamboyantly like she does.


> Wow, I had no idea this was happening!

This is, by far, my primary complaint about Kickstarter. They insist on sending me emails with "Things we like on Kickstarter". These are, generally speaking, full of absolutely irrelevant projects, given that my history on KS indicates I am only interested in technology projects.

It is truly infuriating because it always leads to exactly what you said: I had no idea project x was on KS.

I don't know what it is. Some kind of an internal cult. Or is it incompetence? They have my entire history on their site spanning years of supporting projects. Yet they think I might find campaigns about bow-ties and butterflies interesting? Unbelievable.

I wonder how much better campaigns could do if KS actually got their shit together?


Having worked in aerospace I find it very interesting that a nation like North Korea can manage to launch such rockets with regularity. There is no way they are able to do this on their own. That much might be obvious. The real question is: Who's helping them and what's their objective?


>Who's helping them and what's their objective?

Who's helping them is obvious - China.

What's their objective? Destabilizing and undermining American military hegemony in their sphere of influence and establishing themselves as Asia's sole superpower. I don't know how North Korea works into that, though. Maybe China just wants a proxy to harass the US with while having politically plausible deniability. Maybe they're hoping to step in when NK takes it too far and look like heroes, while making the US look feckless and weak.


It seems that if you say anything negative about China on HN, even if completely accurate, you get downvoted. Yes, China has assisted NK:

http://freebeacon.com/national-security/china-sold-trucks-us...


It's relatively common on most social media sites including HN, reddit, etc. It was particularly noticeable during the coverage of China's DDoS attack on github a few years back.


Alternatively they may have got engines from Ukraine https://www.nytimes.com/2017/08/14/world/asia/north-korea-mi...


> look like heroes

No one is gonna fall for that. Especially since most of its neighbors are aware of China's belligerence, and are hostile to China (India, Vietnam, Taiwan, Japan, South Korea, Indonesia) or are skeptics of China (Australia, New Zealand)


In the 60s, Russia and the US manufactured thousands of these rockets with 60s technology. North Korea has access to enough ore to make good-enough metals, and knowledge from history of what works.

The difficult part for them is propellants. Historically they’ve used kerosene and nitric acid but the newer rockets use hydrazine compounds and dinitrogen tetroxide oxidiser, which means either they’ve advanced their chemical industry a lot or they’ve found a supplier.


Even Hamas can launch rockets with pitiful infrastructure. What makes you think a nation like North Korea, which has granted itself the ability to use all the productivity and resources in the land, wouldn't be able to do this?


Hamas launches the equivalent of sugar rockets with an explosive attached and lacks any precision.

North Korea is tinkering with a delivery system capable of placing nuclear weapons in the exact position necessary for optimal damage.

These things aren't equivalent, it's like comparing a rudimentary wooden wagon to an F1 car in terms of technology.


Not that they're not deadly on a smaller scale.


Not sure what you're saying is obvious. What makes you think they can't do this on their own?


There was a great podcast on NYTimes' The Daily about new advancements in the rocket engines used – their conclusion is that, with Ukraine in flux, the source of Russia's missile engines (which Russia has recently stopped buying) is selling them off to North Korea, or an intermediary.


Do you listen to Arms Control Wonk? It's a podcast; they go deep, deep into the weeds of these missiles and what we know about them. In one podcast recently they talked about the cooperation between Iran and North Korea.


What makes you believe NK can't launch rockets alone?

It's not like it's rocket science. Er, ok, so it is rocket science but still. Other groups of people have done it, what makes it impossible for NK?


The impressive part is the pace. There are multiple launches a year, and constant incremental increase in capability. At this rate, they will have the ability to strike anywhere in the world in just a few years.


They're using the lean startup methodology. Iterate. Iterate. Iterate.


