There may be bugs, but not hallucinations. Bugs are at least reproducible, and the source code of the verification tool is much, much smaller than an LLM, so there is a much better chance of finding its finite number of bugs, whereas with an LLM it is probably impossible to remove all hallucinations.
To turn your question around: What if the compiler that compiles your LLM implementation “hallucinates”? That would be the closer parallel.
I think the idea is that you'd have two independently developed systems, one LLM decompiling the binary and the other LLM formally verifying. If the verifier disagrees with the decompiler you won't know which tool is right and which is wrong, but if they agree then you'll know the decompiled result is correct, since both tools are unlikely to hallucinate the same thing.
No, the idea is that the verifier is a human-written program, like the many formal-verification tools that already exist, not an LLM. There is zero reason to make this an LLM.
It makes sense to use LLMs for the decompilation and the proof generation, because both arguably require creativity, but a mere proof verifier requires zero creativity, only correctness.
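To make that contrast concrete, here is a toy sketch in Java of why proof checking is mechanical (the formula type, axiom set, and `check` method are invented for illustration, not taken from any real verifier): each proof line must be an axiom or follow from two earlier lines by modus ponens, and verifying that is pure bookkeeping, with no creativity involved.

```java
import java.util.List;
import java.util.Set;

public class ProofChecker {
    // Minimal formula type: an atom, or an implication A -> B.
    sealed interface Formula permits Atom, Implies {}
    record Atom(String name) implements Formula {}
    record Implies(Formula lhs, Formula rhs) implements Formula {}

    // A proof is a list of formulas. Each line must be an axiom or
    // follow by modus ponens (from A and A -> B, conclude B) from two
    // earlier lines. Checking that is pure bookkeeping.
    static boolean check(Set<Formula> axioms, List<Formula> proof) {
        for (int i = 0; i < proof.size(); i++) {
            Formula line = proof.get(i);
            if (axioms.contains(line)) continue;
            boolean justified = false;
            for (int j = 0; j < i && !justified; j++) {
                for (int k = 0; k < i && !justified; k++) {
                    if (proof.get(k) instanceof Implies imp
                            && imp.lhs().equals(proof.get(j))
                            && imp.rhs().equals(line)) {
                        justified = true;
                    }
                }
            }
            if (!justified) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        Formula p = new Atom("p"), q = new Atom("q");
        Set<Formula> axioms = Set.of(p, new Implies(p, q));
        // Valid: p, p -> q, then q by modus ponens.
        System.out.println(check(axioms, List.of(p, new Implies(p, q), q)));
        // Invalid: q appears with no justification.
        System.out.println(check(axioms, List.of(q)));
    }
}
```

Generating the proof is the hard, creative part; a checker like this only has to be correct, which is exactly why it's better kept as a small human-written program.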
These are most likely people with easily guessed passwords like “password”. The notification suggests the attackers purchased these email/password combos in bulk. That’s likely all this is.
It's kind of weird to see big corporations want to attract talent but then act surprised or salty when people don't like being treated like a line item in a budget.
It's nothing personal. I'm sure your company is great and you truly believe in your mission beyond just providing value to shareholders, but humans like stability and to be treated like people. You might be a Fortune 500 company, but you're not a god, just a job.
If you think these things aren't fair or suck, you should try treating people better or else deal with the organized labor that will make decisions for you.
Sorry, it's getting kind of annoying to see employers not getting this.
1. Don't conflate business interests with the interests of the population of a city. I don't doubt there is rising demand for private armed security from the 1% of Portlanders who own businesses, that does not reflect on everyone.
2. Portland has seen rising violent crime, but this is partly regression to the mean (Portland is the safest city of its size, even after the 2020 spike) and partly a larger trend (crime rose everywhere in the U.S. since 2020). The assertion that policing has broken down and the city is lawless is laughable.
1. They may only be the 1% of Portlanders who own businesses, but they own 100% of the businesses. The other 99% have to buy bread from somewhere, so it totally affects them. Further, it's happening because of how they voted.
2. Anyone who's ever been to Portland wouldn't or couldn't call it an average city. Even before the pandemic there were homeless people everywhere. I couldn't turn my head and not see a tent. It didn't look particularly fun either; Portland gets a lot of rain. Also, the article doesn't assert that policing has broken down; it asserts that the police are underfunded and take two hours to arrive.
> Further, it's happening because of how they voted.
[citation needed]
It's happening because business-backed city leadership decided to deprioritize affordable housing in an effort to uplevel the tax base and get rid of all the artist types, with the completely foreseeable result being an explosion in homelessness.
It’s interesting that Portland is seeing record homicides because it’s far more gentrified than ever. Portland had only 54 homicides in 1993, when crime was much higher nationally, and Portland specifically was much seedier. The last couple of years are close to double the 1993 peak, but population growth since then is only 50%.
No, this is incorrect. I know people who left Portland and Seattle following the "fiery but mostly peaceful protests". The fact of the matter is they gutted the police; the police who remain know that if they actually do their job they're likely to be the next national outrage, and they are being told by their bosses and the politicians to do nothing. As such, the criminals now know they can run wild with no prevention. It isn't gentrification; it is what happens when an entire city turns its back on any semblance of law and order. Portland will not recover for a long time, if ever.
My experience has been that "migrate on touch" is a reasonable strategy: if you have to make a change to a file, use the "Code > Convert Java to Kotlin" action (Ctrl-Alt-Shift-K).
A reasonable strategy if you never ever merge two branches. If there's a non-homeopathic chance that a parallel change to the file in question might eventually pop up, I'd limit "migrate on touch" to occasions when you do major rework, not just a minor touch. If it's possible to occasionally enforce a "branch singularity moment", I'd go with migrate-on-major-rework until a branch singularity opportunity comes up, and then do the bulk conversion to what I affectionately call "shit kotlin" (the endless procession of exclamation marks that faithfully recreate each and every opportunity where the Java could, in theory, achieve an NPE) in one go. And leave only the cleanup of that mess to "on touch". If it later comes to parallel cleanup, that won't be half as annoying to merge, not even remotely.
What I haven't tried is "migrate on touch" with a strict rule that there must be explicit commits just before and after the conversion (plus a third commit documenting the file rename separately, before or after). That could perhaps work out well - or not help much at all, I don't feel like I could even guess.
But other than that, the intermediate state of partial conversion is surprisingly acceptable to work with, I'm not disagreeing!
IIUC, there's already automated tooling that will do the conversion and not produce a giant mess like C/C++ to Rust does, so the cost is predominantly CPU and not SWE.
But then you have to use Kotlin, which isn't just Java with nullability types, but a language with quite a different design philosophy, and a language that is increasingly at odds with the evolution of the JDK (partly but not solely because it also targets other platforms, such as Android and JS). It appeals to some but certainly not to all (interestingly, it hasn't significantly affected the portion of Java platform developers using alternative languages, which has remained pretty much constant at about 10% for the past 15 years).
I tried Kotlin, and while I liked it, interop was still somewhat annoying, Java's lambdas are better, using the Java 8 stream API is ugly, and the code ends up being similar enough that I'd rather use Java and avoid tooling hassles.
The article actually addresses Kotlin. They'd love to switch to it but they just can't do it overnight because they have so much mission-critical Java code. So, this is a stopgap solution for legacy code. They published another article some time ago about how they are switching to Kotlin: https://engineering.fb.com/2022/10/24/android/android-java-k...
Migrating millions of lines of code is a non-trivial effort. They'll be stuck with bits of Java for quite some time. So, this helps make that less painful.
So you're in the exact same case as you were in Java, which was my third point. But the type is a special type to let you know what you're doing is unsafe.
Migrating from Java to Kotlin looks nice and easy on the surface (optionals!), but the lack of checked exceptions will absolutely bite you sooner or later if you are consuming Java code. Better carefully read the docs and source of all your transitive Java dependencies.
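To illustrate in plain Java (the `parsePort` helper is hypothetical, standing in for something deep in a transitive dependency): the Java compiler refuses to compile a call site that ignores the declared IOException, whereas the equivalent Kotlin call site compiles silently, leaving the failure path for you to discover at runtime.

```java
import java.io.IOException;

public class CheckedDemo {
    // Hypothetical stand-in for a transitive Java dependency whose
    // signature advertises a checked IOException.
    static int parsePort(String s) throws IOException {
        try {
            return Integer.parseInt(s.trim());
        } catch (NumberFormatException e) {
            throw new IOException("bad port: " + s, e);
        }
    }

    public static void main(String[] args) {
        // javac rejects this call unless the IOException is caught or
        // declared; the same call in Kotlin compiles with no warning,
        // so without reading the dependency's docs you may never add
        // the catch at all.
        int port;
        try {
            port = parsePort("not-a-number");
        } catch (IOException e) {
            port = 8080; // the compiler forces you to pick a fallback
        }
        System.out.println(port);
    }
}
```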
I can't think of any popular language that would take more than a few days to get acclimated to as an experienced developer, so that's not a very compelling argument.
It's always the (usually quite bad) tooling, learning about platform/SDK shittiness and pitfalls, and figuring out which parts of the open-source library ecosystem you want to engage with, that takes like 90+% of the time getting decent with a new language, in my experience. Getting comfortable with the language per se takes low tens of hours at most, as you wrote.
Kotlin (the base language) is really not that different from java. I went from 0 to standing up new backend services with limited friction. Coroutines and maybe frontends are a different story. Java doesn't yet have a coroutines equiv so that was a larger hurdle for me.
Most of the changes for me from 10/20+ hours in to now were more about identifying a style that works as effectively as I can. These types of behaviours are normal in all but the most idiomatic languages, so for anyone doing Java dev as their daily language, Kotlin feels very natural (though you really are limited to IntelliJ, since the IDE does a ton of lifting to make your life easy).
Well, C and Scala are some counter examples that immediately come to mind.
Kotlin is probably more similar to Java than any other mainstream language. There’s almost no learning curve there, while going from Java to other “easy” languages like Python requires significantly more time to get used to.
With Scala, it depends on how you want to use it. If you're going for full FP then sure, it can take a little bit longer, but you can also just use it like Java+ if you really want...
I think there's a difference here between getting acclimated to scala (for new code, presumably), which is reasonably easy, and getting acclimated to a scala codebase that was already written by someone else.
You can do the first one basically the same way you'd do kotlin, the second one can get pretty hairy if someone decided to bring in a bunch of macro heavy DSLs and syntax extensions.
Ooof, I think most devs can modify an existing code base in a few days. To learn all the idiomatic styles, the tradeoffs of the major libraries, and the different build systems takes months, maybe years, IMHO.
I can't think of many other languages that will compile into a Java codebase, and be interoperable in both directions, as well as Kotlin. It's a lot quicker to pick up than e.g. Scala IMHO.
You would be surprised how easy it is for people to get sucked into a crypto bubble vortex that makes you feel like you're missing out big time by not being invested in crypto.
I was pretty heavily involved in the personal finance community on Twitter, and there were two camps:
1) VTSAX and chill (basically dump money into an ETF and forget about it)
2) Moar passive income by side hustles and crypto
The latter became more and more common and ultimately drowned out the former. I believe it's because the market was doing so well that folks' risk meter just wasn't registering.
Probably the same reason why people choose to get into MLMs.
> well that folks' risk meter just wasn't registering.
That's because they were probably still in school back in 2008. I remember the days of late October 2008 like it was yesterday, and back then I was a no-name computer programmer working for an independent mortgage broker, not a big finance schmuck from Wall Street.
That one is easy... FOMO - Fear Of Missing Out. People see the 10,000%+ returns extreme early adopters obtained and think they need to get in before the good-getting is done. Most of the time they're wrong and just throwing money into dark pits...
"Gamified" trading apps like Robinhood have made it all too easy to feel like the risk is much lower than it is in reality, though.
For the rest, crypto can be part of a diversified investment strategy. Not all crypto is outright scams... but you do need to be able to handle the volatility.
I bought some (emphasis on the some, sadly) Bitcoin when it was $80. I’ll never get a return like that in my life. Other people are chasing that dragon. Unfortunately it leads them to burgeoning “shitcoins.”
It’s all fine if you view it like the lottery and put “fun money” into it. It’s not fine if it’s your primary investment vehicle. For what it’s worth I still think Bitcoin and Ethereum will be fine and bounce back up, eventually.
“they abolished the fundamental distinctions between investment and speculation… they ignored the price of a stock in determining whether or not it was a desirable purchase.”
Benjamin Graham & David L. Dodd, Security Analysis, 1934