Hacker News | ebassi's comments

If your theme engine is as simple as Raleigh, then the chances of crashing applications because of a bug in the theme are smaller. Sadly, most (if not all) theme engines were far, far more complex than Raleigh.

We had bug reports coming in, especially from people using the Oxygen theme, which had a tendency (against all recommendations) to poke at the widget structure behind the style contexts, because what could possibly go wrong.

Given the ability to inject random C code into applications that do not expect it, people will abuse it; hence the switch to a purely declarative language like CSS to describe the appearance of widgets. Your app may look like crap, but it won't crash because of it.
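To illustrate the point, a GTK3 theme is just a set of declarative CSS rules. The selectors and colours below are only a sketch (exact selector names vary between GTK versions):

```css
/* A hypothetical fragment of a GTK3 theme: purely declarative.
   Worst case, a mistake here makes a widget look wrong; it cannot
   execute code inside the application, so it cannot crash it. */
GtkButton {
    background-color: #3465a4;
    color: #ffffff;
    border-radius: 3px;
}
GtkButton:hover {
    background-color: #729fcf;
}
```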


I agree, though none of the theme engines I've used regularly crashed. There were two KDE theme engines whose GTK variants managed to crash, but those had more issues than that. No crashes whatsoever with all of gtk2-engines, nor with Murrine or Aurora.


> A cynic might say that they decided to scrap everything and start again

not only would this cynic be wrong, he would also be full of crap, because it's such a trivial thing to check in the public repositories, the public mailing list archives, and the public bug tracking system.

but, obviously, crapping on a bunch of volunteers trying to work on the project they like is easier.


That's certainly how it has looked from the outside.

Are you saying they didn't force incompatibility with Gnome 2? Or that they didn't just decide to change everything to exactly how they wanted the desktop, regardless of users?

Maybe they didn't scrap some of the underlying libraries, but the whole look, layout, interaction pattern/user flow and pretty much everything else was a total change.


no, it wasn't - mostly because of political bullcrap coming from the "I'm purer than thou" crowd. obviously, the biased perception that gnome was using C# exclusively was also due to the same crowd.

this also led to the C#/mono platform lagging behind, so that these days writing gnome apps in C# means using deprecated libraries, which is unfortunate.


Ah right, maybe that's where I got the impression. I used to follow Nat Friedman and Miguel de Icaza's work in the early/mid 2000s and picked up a 'C# is the future of GNOME' vibe.


the usual ridiculous trope that "if only we had a single $PROJECT all the people working on the current competitors would work on $PROJECT instead and everything would be great" is enough to discount this whole lot of drivel.


I use a wiki for strawman feature proposals and for the roadmap; each entry in the roadmap gets a bug in Bugzilla, and all milestones/releases get a tracking bug depending on the bugs that should go in. milestones/releases are every six months, so there's an inherent deadline in there.

it's an open source project, so this not only helps me have a clear idea of what I'm meant to be doing, but it also helps others contribute, by giving a precise idea of what can be worked on at any given time.


> I am somewhat of a misanthrope.

you use that word. I don't think it means what you think it means - or that it has the implications you think it has. I am a misanthrope, but that does not influence my reaction to the floor plan of the office I work in; if anything, I'd most likely go insane in a cubicle farm. there are ways to signal that you don't want to be disturbed in an open space.

given that you write:

> I don't trust most people. I don't expect them to trust me.

followed by:

> The social bullshit and paranoia are intolerable.

I do assume that you're referring to your own paranoia, here...

> [snip the rest of the workplace description]

... which leads me to think your problem is not the office planning, or your social issues (with or without medications): I think it's your current job. my entirely serious suggestion is to either change it, or ask to work remotely, if you still think that you'd enjoy it more (and if they don't allow remote work, I'd seriously consider changing jobs anyway).


I was exaggerating the anger and grumpiness.

That said, open-plan signals to me that the company doesn't actually value productivity so much as image and availability. This is something that I have learned with age not to take personally, but I still dislike it.


That's just what it signals to you. To others it signals a healthy environment. If you have a problem with it I seriously suggest you take it up with your management, but I very much doubt that their thought process was anywhere close to the one you describe. If it was they'd be shooting themselves in the foot deliberately.


a three-line program generates 95 lines of error messages; let's throw a template into it, so that it goes up to 112 lines - but look: at the beginning you'll see the error we added!

coupled with "To be fair, this doesn't entirely replace decltype. auto doesn't perfect forward. But it seems to work as expected, most of the time", it goes a long way towards instilling a sense of safety in me.

not.

that post should be taken as an example of why C++ has become a liability, and the day won't come too soon when this joke on the whole industry finally gets dumped like COBOL in the nearest ditch, along with the corpses of all the projects that made the mistake of actually using it.

if it were a living thing, I'd shoot C++ in the face to put it out of its misery.
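For what it's worth, the auto-vs-decltype caveat quoted above can be shown in a few lines. This is only a sketch with made-up names, and it relies on C++14 return type deduction, so it is slightly anachronistic for this thread:

```cpp
#include <type_traits>

// A plain `auto` return type deduces by value and drops references,
// while `decltype(auto)` keeps the exact type of the expression.
int value = 1;

auto by_value() { return (value); }                // deduces int: reference dropped
decltype(auto) by_reference() { return (value); }  // deduces int&: reference kept

static_assert(!std::is_reference<decltype(by_value())>::value,
              "auto strips the reference");
static_assert(std::is_reference<decltype(by_reference())>::value,
              "decltype(auto) preserves it");
```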


I don't get all this C++ hate from people who have never even used it.

Personally I want a better C and C++ satisfies most of what I want.

    1. Type level polymorphism (generics/templates/etc)
    2. Easier ways to manage memory (constructors/destructors, auto/shared pointers)
    3. A modules system (namespaces)
    4. Lambdas, etc to make life easier (C++11)
    5. A standard library with useful data structures
    6. Fast
It seems to me that all these haters still think C++ is just C with classes, but C++ is a fairly vast language and OOP is just a small part of it.

C++ is blazing fast, it has an elegant standard library and templates do make sense if you know what you are doing.

You are trying to use cutting-edge features that aren't properly standardised yet and complain that the error message is a bit long? Granted, template error messages aren't always the one-liners you are expecting, but they are readable.
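The wish-list above can be sketched in a few lines. This is just an illustrative toy (all names made up), touching templates (1), RAII and smart pointers (2), namespaces (3), lambdas (4), and the standard containers (5):

```cpp
#include <algorithm>
#include <memory>
#include <vector>

namespace demo {                        // (3) namespaces as a modules stand-in

template <typename T>                   // (1) type-level polymorphism
T sum(const std::vector<T>& values) {
    T total{};
    std::for_each(values.begin(), values.end(),
                  [&total](const T& v) { total += v; });  // (4) C++11 lambda
    return total;
}

// (2) RAII: the shared_ptr frees the vector when the last owner goes away,
//     so there is no manual new/delete anywhere.
inline std::shared_ptr<std::vector<int>> make_numbers() {
    return std::make_shared<std::vector<int>>(
        std::vector<int>{1, 2, 3});     // (5) a standard container
}

}  // namespace demo
```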


> I don't get all this C++ hate from people that has never even used it.

wrong. I use it every day, on a fairly large code base.

obviously, for portability reasons, nobody working on the project is allowed to even think about C++11, or about other headache-inducing features - and that includes a fair chunk of the standard library.


> a three lines program generates 95 lines of error messages; let's throw a template into it, so that it goes up to 112 lines, but look: at the beginning you'll see the error we added!

95 lines of error messages sounds terrible, no doubt. But which language is better in the real world? I am currently consuming a JSON web service written in Java, and I get no less than 30 KB of stack trace back when something blows up. Rails' stack traces are filtered by default, but sometimes you still have to jump in there.

And C++ has the advantage that much of this can happen at compile-time.

I agree that almost everything about C++ is broken, but the verbosity of error messages is the smallest of it IMHO.


I'm thinking most of the "C++ school" of languages is affected by these excesses; still, Java and C# are slightly better.

> but the verbosity of error messages is the smallest of it IMHO

It's merely a consequence of the verbosity/complexity of the language itself


I read that blog a bit and it seems to have the following pattern overall: elegant 5-line Haskell program. Then follows 100-line unreadable C++ implementation. My conclusion is that C++ is a very bad language to try to do Haskell :-)

This doesn't mean that C++ doesn't have its uses, but for your sanity's sake don't use it for high-level metaprogramming.


The issue you describe is more a compiler-specific issue than a language-specific one. Yes, the language allows a lot of flexibility, and therefore in some situations it's hard to nail down an error to a nice, specific error message (mostly when templates are involved), but it's not impossible, as clang++ shows.

Believe it or not, there are people (like me) who actually enjoy coding in C++.


It's a language-specific issue. Sure, clang++ has improved error reporting, and GCC is actually quickly catching up. But the problem is a language issue, because an error in a template confronts you with the implementation details of that template. There is nothing the compiler can do about it.


Or you can use Clang.

Feel free to shoot C++ in the face. Don't forget to rewrite most of compilers and operating systems in use in the process.


Except that clang++ is only marginally better. E.g. trying to std::copy a string vector into a size_t vector will give you this beauty:

http://pastebin.com/H68j6E50

Cascade a bunch of errors, and you'll soon be writing a script to analyse the error output ;).
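A sketch of the situation described above (the helper name is made up). The commented-out std::copy fails because std::string has no conversion to size_t, and the diagnostic points deep into the library internals rather than at this line; spelling out the conversion with std::transform compiles cleanly:

```cpp
#include <algorithm>
#include <string>
#include <vector>

inline std::vector<size_t> to_sizes(const std::vector<std::string>& in) {
    std::vector<size_t> out(in.size());

    // std::copy(in.begin(), in.end(), out.begin());
    // ^ this is the line that produces the error cascade in the pastebin

    // The working version makes the string -> size_t conversion explicit:
    std::transform(in.begin(), in.end(), out.begin(),
                   [](const std::string& s) {
                       return static_cast<size_t>(std::stoul(s));
                   });
    return out;
}
```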


Clang is not the solution. Yes it has some better error messages. Although GCC is quickly catching up.

But it can't solve the core problem. It's simply the language. The lack of Concepts (or something similar) means that error messages refer to the implementation internals of the template you are using - easily giving you a long cascade of error messages when the failure is inside some template used by that template.

All of a sudden you have to look at the implementation details of your standard library, or of something even worse such as Boost, just to figure out what the mistake was.

And there is nothing Clang or GCC can do about it.
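There is one common workaround in the absence of Concepts - a sketch of the general technique, not what any particular library actually does: put a static_assert at the top of the template, so the first diagnostic is your own message instead of a cascade from the internals.

```cpp
#include <type_traits>

template <typename T>
T twice(T x) {
    // Fails fast with a readable message instead of letting the error
    // surface somewhere inside the template's implementation.
    static_assert(std::is_arithmetic<T>::value,
                  "twice<T>: T must be an arithmetic type");
    return x + x;
}
```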


Linux doesn't use C++ and AFAIK the BSDs also don't.

So go ahead, I won't have a problem with a dead C++. :-)


Then prepare to drop GCC. Then drop whatever browser you are using, and most of the cross-platform GUI frameworks (GTK, Qt, etc.).

https://lwn.net/Articles/390016/


GTK is written in C, not C++.


Sorry was thinking of something else.


There is much more to the OS than the kernel. Lots of important software is written in C++.


that's not a problem in the long-term perspective as it doesn't force anyone else to use C++


What would the clang error message be?


It's roughly the same as the improved GCC error message ("index out of range"), but with only 29 extra lines of incomprehensible compiler meandering and stdlib internals!


projects that made the mistake of actually using it

You mean stuff like Unreal Engine... ?


the data collected from the history is not associated with the user account, only analysed; and the plugin/extension respects private/incognito mode, according to the privacy policy: https://lumi.do/about/privacy


So how are they supposed to personalise your content if they don't know who visited what?


the issue is not (only) having distasteful content; the real issue is censoring links to content that embarrasses redditors, like the Gawker outing of the reddit creep, in a shockingly hypocritical display of a double standard.

you cannot "uphold the principles of free speech" by keeping creepy content under the "it's not illegal" banner, and at the same time censor the outing of one of the moderators of said content - another perfectly legal action.


not every video frame is encoded in RGB. actually, very few transports use that.

the in-memory layout of a YUV-encoded frame is different from that of an RGB one. it needs conversion, which implies an intermediate copy, as well as traversing a buffer that can be up to 1080 rows tall.

most video players (and Flash on Windows and MacOS), these days, use shaders and multi-texturing to do that conversion on the GPU; it removes a copy, because you just push the YUV frame as it is to the GPU - and your CPU can go back to a low C state.

I think Flash on Linux does this as well, on nVidia: the blog post linked is from 2010. the way Adobe detects capabilities is, in itself, hilariously bad - instead of checking for the required GL extensions like any normal person, they check the GL vendor string. the Mesa guys pointed that out, but I doubt they ever got a response.
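For reference, the per-pixel work the CPU has to do when the GPU path isn't available looks roughly like this. The coefficients are the standard BT.601 full-range ones; the struct and function names are made up for illustration:

```cpp
#include <algorithm>

struct RGB { int r, g, b; };

// Clamp to the 0..255 range, then round to the nearest integer.
inline int clamp8(double v) {
    return static_cast<int>(std::min(255.0, std::max(0.0, v)) + 0.5);
}

// One pixel of BT.601 full-range YUV -> RGB. A software player has to
// run this over every pixel of every frame (up to 1920x1080 of them).
inline RGB yuv_to_rgb(int y, int u, int v) {
    const double c = y, d = u - 128.0, e = v - 128.0;
    return RGB{ clamp8(c + 1.402 * e),
                clamp8(c - 0.344136 * d - 0.714136 * e),
                clamp8(c + 1.772 * d) };
}
```

Doing the same thing on the GPU is a per-fragment multiply against the same matrix in a shader, which is why pushing the raw YUV planes as textures wins.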


My system isn't shader-capable, and still, playing video through the HTML5 video tag consumes only ~30% of CPU, while playing it in Flash consumes ~90%.

There is no excuse for that - I could understand higher CPU usage, but not by a factor of 3.


Are you on Linux? That is the only platform where I see higher CPU usage while playing video in Flash. Both Windows and OS X now play Flash with little CPU usage.


Yes, I'm on Linux - but it still doesn't explain why flash is consuming so much CPU (I'm using VESA drivers for graphics)


Why don't they convert the other page elements to YUV once, and then composite in YUV (so there's no need to convert every frame), instead of converting the video to RGB?


I _think_ you lose a _lot_ of color information going from RGB to YUV; also, you'd just be replacing one expensive color space conversion with another.

