Hacker News | throwaway31338's comments

Since the genie is out of the bottle, when it comes to ALPRs in the United States, I'd rather just have all the data publicly available. If the cops, data brokers, and insurance companies can see it I should be able to as well.

I should be able to see the comings and goings of law enforcement, elected officials, etc, if they can see mine.

Alternatively, lock it away behind judicial oversight. Make the cops get a warrant. Criminalize companies that collect the data offering it in any manner other than by the order of a judge.

I feel the same way about tracking cell phones, publicly-owned surveillance cameras, privately-owned surveillance cameras that are "voluntarily" offered to law enforcement, and, in general, any dragnet surveillance available to law enforcement. If it's available to law enforcement and not being conducted on an individual basis under a judicial order (or, heck, even just probable cause) I think it should be available to the public, too.

"But stalkers!"

Tough. That's the price we have to pay for keeping law enforcement in check. Either adapt or take this power away from law enforcement.


For that matter, it’s not unheard of for members of the US’ 18,000 law enforcement organizations [0] to engage in stalking behaviors themselves. Especially when there’s no oversight for a specific surveillance technology…

[0] https://en.m.wikipedia.org/wiki/Law_enforcement_in_the_Unite...


> Since the genie is out of the bottle, when it comes to ALPRs in the United States, I'd rather just have all the data publicly available. If the cops, data brokers, and insurance companies can see it I should be able to as well.

I try to help folks understand that locking public data up with a privacy law only blocks them from seeing it.

It is still trivially available to those who use personal data to negatively impact others.

ref: https://datarade.ai/data-categories/b2b-contact-data/provide...


  > "But stalkers!"
  > 
  > Tough. That's the price we have to pay
I wonder whether that would really work well in real life.

I remember reading that the difference between a citizen and a police officer is that a police officer can arrest people for misdemeanors (while a citizen can only make a citizen's arrest for felonies, I believe).

There's probably a good reason citizens shouldn't easily be able to prey on others or stir up trouble with no friction.

That said, surveillance by private companies should be regulated, and people should have access to data collected about themselves.


>"But stalkers!"

The stalkers are already in the house; this list isn't exhaustive by any means:

USA:

N.J. cop used police databases to stalk ex-girlfriend, investigators say https://www.nj.com/monmouth/2023/01/nj-cop-used-police-datab...

Officer Fired for Allegedly Using Police Database to Stalk, Harass Women https://www.newsweek.com/officer-fired-allegedly-using-polic...

Australia:

Former policeman accused of using force database to stalk ex-wife and girlfriend https://www.theage.com.au/national/victoria/former-policeman...

Former federal police officer faces new charges over stalking of ex-girlfriend https://www.canberratimes.com.au/story/6138318/former-federa...

(Note the two above articles are not the same person)

UK: Met police officer 'used CCTV cameras to stalk his ex-girlfriend after telling her to take up sex work to pay her bills' https://www.dailymail.co.uk/news/article-11868575/Met-police...

Creepy cop saw attractive woman on the road and 'looked up her license plate number so he could stalk her on Facebook' https://www.dailymail.co.uk/news/article-2178556/Officer-Jef...

Large-scale misuse in California alone:

https://www.eff.org/deeplinks/2025/01/california-police-misu...


I agree with what you're saying, but don't understand the point - if police officers in the US (about a million adult professionals) are abusing this data, wouldn't opening it up to ~300 million random people result in far more abuse than we're already seeing?


There are three main points here:

1. Don’t build systems that can be abused to start with, because they will be abused, but if we must build one then see point 2:

2. Put access to this sensitive information behind a judge’s signature. Because see point 3:

3. When it comes to this kind of data: There are no “good guys” and “bad guys” - we should assume that everyone is a potential bad guy.

Whenever you hear the “good guys” justification, immediately remind yourself of the ways the “good guys” have been found to be “bad guys” in sheep’s clothing.

Whenever you hear someone use the “nothing to hide” argument, remind yourself that none of the victims in these stories had anything to hide, nor had they done anything wrong. (Much like the thousands of women who die from partner abuse every year.)


> Tough. That's the price we have to pay for keeping law enforcement in check. Either adapt or take this power away from law enforcement.

This position sounds fair, until you think of the skewed demographic that is paying that price.


> "But stalkers!"

I think attributing the problem to “stalkers” minimizes the issues this arrangement of publicly searchable surveillance data creates. Imagine a website where you can type in anyone’s name and it shows you their last known location and their location history. You would have a system which supports universal spying for mundane and nefarious reasons alike. Not just criminal “stalkers” will take advantage of it.

Potentially this sort of arrangement would work if there were limits on the granularity, frequency, and history of the tracking data.


> privately-owned surveillance cameras that are "voluntarily" offered to law enforcement

You may find it interesting to know that when crimes are committed near or next to a business, especially an independent business, the priority is often to minimize risk to themselves and their business, so they do not provide any recording, or they use rolling footage which wipes every x days (often just one day). That way the footage is useful to them, but they take on no additional obligations.


This tracks. My sister had her car keyed (scratching the paint) in a Honda dealer parking lot while waiting for a tire repair. The dealer has cameras, but when they reviewed the footage they couldn't find any evidence of the vandalism. They wouldn't let anyone else review the footage, so we're not really sure what happened. However, I do know that finding evidence of a crime that happened on their property would just cause the dealer trouble and likely force them to pay for damages.

She filed a police report but that didn't really help at all. Not that we really expected it to, but she was just trying to be complete. In the end she had to pay to fix the damage and the dealer (and the criminal) had no repercussions at all.


Law enforcement is working to protect itself, as it always does. Unfortunately the general public is not as organized.

https://therecord.media/new-jersey-law-enforcement-sues-data...


> "But stalkers!"

The market would adapt and provide solutions to this concern. First to the rich and the famous, then hopefully to the masses.


yeah, "you're not spying on your spouse the way your insurance company spies on you, right?" i don't like it


>"But stalkers!"

You mean private individuals doing exactly what the government does to anyone who interests them enough?


Generally agree on the publicly available case. Several benefits, probably a couple downsides.

- People become more aware the data's available. Most people are probably only minimally aware of how much they're actually being recorded all the time. It comes up occasionally on the news, yet most people likely never consider it much unless they're the subject of constant camera monitoring.

- Anybody can check anybody, and with extremely open data archives, everybody also knows when other people check. "This many people clicked on your account" or something similar, just with surveillance footage. As usual, police / FBI / spooks / etc. would probably just write laws legalizing not telling you, and exempting the entitled from following those guidelines. The theory's nice, though.

- Data's already there and being used, yet, currently, only the police have the data, and you never know what it's being used for unless you ask. Even then you may get a wall of legal issues to ensure you're not allowed to find out. Or you have to lawyer up and pay expensive fees to fight the legal wall.

- Data can be used for other purposes. Anonymized statistics on usage of municipal resources, infrastructure, high traffic areas, crime area behaviors. Amazing what you can find even just cruising around on Google Street View in areas known for high crime. "Damn, just saw somebody pull a gun on the street car. Duck Google driver!"

- Adds to other sources like people's webcams, government satellites, sensor stations for things like weather updates and verification of events / conditions in other areas. Amazing how difficult it's become in the era of fake images to tell whether anything is "actually" happening in some distant location.

- Partially deals with the "Who watches the Watchmen?" issue: the other Watchmen. Crowdsourced observation and journalism has already shown on numerous occasions that it's often more responsive, and frequently fairer, than a lot of paid corporate journalism. Ukraine was a case where the crowdsourced journalism and data analysis was so much better than anything the news was showing, it was as if every Wiki editor were down on the ground following troop movements. You could barely get the mainline news to show anything other than stock footage.

Downsides:

- Obviously stalking. Although with notices about people checking your data, there's at least some pushback against the stalking.

- Profiling. However, this probably already gets done by the police anyway. Dark-skinned areas, "ethnic" areas, etc.

- The data's there, so somebody will probably build an "app" that does something miserable with it. I have too little faith in humanity, after LLMs and image gen, to believe they'll do anything else.


I anticipate some kind of open source / crowd sourced ALPR app will become the public's answer to these private data collection systems. Everyone has a dash cam and a cell phone in their car. It wouldn't take much to run local video analysis onboard, capture all the cars around you on the road, compute make, model, distance, velocity, heading, and archive it in a local DB. Then have a "that guy cut me off" button that pushes the record to a collaborative open data set like open street maps.
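The capture-and-push pipeline described above could be sketched in a few lines. This is a toy sketch, not anyone's actual implementation: all the field names are hypothetical, and the plate/make/model values that would come from onboard video analysis are stubbed out with constants here.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Observation:
    # Fields a local vision model might emit for a nearby vehicle.
    plate: str
    make: str
    model: str
    distance_m: float      # distance to the observed car, meters
    velocity_mps: float    # its speed, meters per second
    heading_deg: float     # its heading, compass degrees
    timestamp: float       # UNIX time of the observation

def build_record(obs: Observation, reporter_note: str = "") -> str:
    """Serialize one observation, plus an optional note such as
    'that guy cut me off', into a JSON record suitable for pushing
    to some shared data set."""
    record = asdict(obs)
    record["note"] = reporter_note
    return json.dumps(record, sort_keys=True)

# Stubbed detection standing in for real onboard video analysis.
obs = Observation("ABC1234", "Honda", "Civic", 12.5, 31.0, 270.0, time.time())
print(build_record(obs, "that guy cut me off"))
```

The local DB and the push to a collaborative data set are left out; the point is only how small the per-car record would be.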

I hate this idea. It's a shit idea. I expect it's coming anyway.


Until the passkey workflow goes sideways for "tech" people I don't think the risks will be acknowledged (if even then).

Those of us who don't want to let Google, Apple, or Microsoft manage our passkeys (i.e. pledging our fealty to our lords) will be seen as fringe lunatics.

I'll keep my workflow of always visiting sites by typing the URL myself, using a password manager, and TOTP 2FA w/ the secrets saved offline on paper. At least until I'm not allowed to do that anymore.


Same here, I don't like passkeys for many reasons. Another reason is that I can't see the key that I'm using. Therefore: What if Bitwarden doesn't pick up the passkey? Tough luck, I'm out of options. I cannot manually create a passkey entry in Bitwarden because it's all hidden magic. With a password, if I notice that the password manager didn't pick up a registration, I just add it myself. Not possible with passkeys.


Luckily Bitwarden supports passkeys. And you can self host it. And even vaultwarden/bitwarden-rs supports passkeys


The various "bailouts" over the years (2008 financial crisis, COVID stimulus, PPP loan forgiveness, student loan relief) have left me pretty cynical. I paid my bills and didn't get "bailed out". What can I do to profit from this?


No kidding, it is dispiriting. You sort of want there to be a little bit of a cost to making irresponsible decisions. In the end, people with access to the same information as you, but who made a more selfish and short-sighted decision, get bailed out—by you, as a matter of fact, as a taxpayer. There ought to be an updated parable of the grasshopper and the ants, where the grasshopper has a great time all summer, then gets bailed out in the winter, then has a great time the following summer, and so on indefinitely.


I feel the same way with student loans. I'm all for government sponsored state universities to compete with tuition at private universities, but to see people who are my age, in their 20s, complaining is so frustrating. I knew these people as teenagers, and I would ask them what their goal in life was. "To go to college" was a common answer. I'd ask them what about after that, and they didn't know, they just knew they HAD to go to college to get a degree and figure it out. They'd get upset with me if I pushed too hard on trying to ask them about specifics about anything, because they didn't care and didn't want to be bothered with it. They just wanted to hang out, have a good time, and not worry about the future.

My girlfriend in college had no idea how much her student loans were in total, how long they were financed, whether they were private or federal loans, etc. She just let her parents do all the financing work (they couldn't pay for her, but they did all the paperwork for her), and she just struggled with school while going out and drinking every night. Now she's graduated and we've long since parted ways, but she has a Bachelor's Degree yet she's currently working as a dog groomer and refusing to pay her student loans "because the government shouldn't force me to pay, I was only 18".

To see all of these people online now, complaining about how they were "groomed by the banks" to take out student loans drives me crazy. You made a bad decision. Your parents helped you make a bad decision. You need to live with that, and feel some form of "punishment" (in the form of not bailing them out) for not planning better for the future.


The majority of people are irresponsible and short-sighted. So the people who run this brothel have only two options on the table:

- try to save them from their own stupidity

- let society collapse

We would of course survive the collapse of society, but the majority of people, rich and poor, would die.

The rich and powerful want to keep the status quo of power.

So we stay in this limbo of people becoming dumber and the powerful bailing them out so as not to lose power.

Pretty simple stuff.


Some of these "bailouts" make sense. In 2008 these "bailouts" were loans that were paid back with interest. The COVID stimulus was given to everyone below a certain income threshold, so it was effectively a progressive tax cut. The student loan relief is an attempt to fix predatory loans (it's not a complete solution, but it does help), and while I paid back my loans fully, I don't wish those loans on anyone, especially not those without a high-paying job. The PPP loans were heavily abused, and while quite a few people are being prosecuted over it, most of that money is gone forever. The only answer I have for that is not to vote in the politicians who enabled it.


Sad that you're getting downvoted. OP's post is essentially "People who made different choices than me should be punished."


Don't want to "punish" anybody. Those bailouts are all over now. I'm just tired of coming out on the losing end and would like to make something from this crisis, versus just paying for it.


How sure are you that you were on the losing end? What even is the "losing end" of a bailout?

Rising tides lift all boats. Do you know any people who worked on Wall Street? Then they were helped by a bailout. Do you know any people who worked with the people who worked on Wall Street? Then they were helped by a bailout. Etc....

Maybe there's a favorite restaurant of yours that stayed open throughout COVID because of PPP. Or there are people who you know who kept their jobs because of PPP.

Maybe there are favorite products of yours that have been created or will be created by people who were able to start companies after their student loans were forgiven.

My point is that these bailouts aren't only helping the people/companies that were directly impacted - we're all connected so it's rare that helping so many people doesn't indirectly affect you in some way.


You missed the rent moratorium. My cousin got to live rent free for 18 months.


it's not so much about profit. your earnings (the taxes you pay) are paying for other people's debt. and if you save up a little cash then you're really screwed, because fiat is losing 5 to 10% per year: that's what pays for all that deficit spending.


You missed your chance; I fully funded six-figure salaries for my wife and me at our consulting firm for 2 years with PPP loans that were forgiven.


Buy a nice car from an auction?


One can look at the source code to a program, the libraries it uses, the compiler for the language, and the ISA spec for the machine language the compiler generates. You can know that there are no hidden unspecified quantities because programs can't work without being specified.

When you get down to the microcode of the CPU that implements the ISA you might have an issue if it's ill-specified. You might be talking about an ISA like RISC-V, though, specified at a level sufficient to go down to the gates. You might be talking about an ISA like 6502 where the gate-level implementations have been reverse-engineered.

You can take programming all the way down to boolean logic if you need to, and the tools are readily available. They don't rely on you "just knowing" something.


> One can look at the source code to a program, the libraries it uses, the compiler for the language, and the ISA spec for the machine language the compiler generates. You can know that there are no hidden unspecified quantities because programs can't work without being specified.

I doubt you actually can do that and understand it all. A computer can do it, but I doubt you, the human, can do that and get a perfect picture of any non-trivial program without making errors. Human math is a human language first and foremost; its grammar is human language, which is used to define things and symbols. This lets us write things that humans can actually read and understand the entirety of, unlike a million lines of code or CPU instructions.

Show me a program written by 10 programmers over 10 years and I doubt anyone really understands all of it. But we have mathematical fields that hundreds of mathematicians have written over centuries, and people still are able to understand it all perfectly. It is true that a computer can easily read a computer program, but since we are arguing about teaching humans you would need to show evidence that humans can actually read and understand complex code well.


> because programs can't work without being specified.

Someone hasn't read the C spec, with everything it specifies as undefined behavior.

Programs working on real systems is very different from those systems being formally specified. I suspect that if you only had access to the pile of documentation and no real computer system - if you were an alien trying to reconstruct it, for example - you'd hit serious problems.


Undefined behavior isn't a feature. A spec isn't an implementation, either.

All behavior in an implementation can be teased-out if given sufficient time.

> if you were an alien trying to reconstruct it, for example - you'd hit serious problems.

I can't speak to alien minds. Considering the feats of reverse-engineering I've seen in the IT world (software security, semiconductor reverse-engineering) or cryptography (the breaking of the Japanese Purple cipher in WWII, for example), I think it's safe to say humans are really, really good at reverse-engineering other human-created systems from close-to-nothing. Starting with documentation would be a step up.


Yes, humans are incredible at reverse engineering. My point was about specification, and what happens if you have only a specification and no implementation. Because that's more closely analogous to the mathematical situation, where you're manipulating the specification.

You said:

> because programs can't work without being specified.

.. what I think you may have meant was "can't work without being implemented", because your subsequent comments are all about implementation.

> Undefined behavior isn't a feature

Yes it is, it's a feature of the C specification.

This is where a whole load of pain and insecurity comes from: as you say, the implementations must do something when encountering undefined behavior, and people learn what usually happens; then an improvement is made to the optimizer which changes the implementation, and what people learned no longer holds.


> All behavior in an implementation can be teased-out if given sufficient time.

Can it? Given what? You would need to understand how the CPU is supposed to execute the compiled code to do that. In order to understand the CPU you would need to read the manual for its instruction set, which is written in human language and hence not any better defined than math. At best you get the same level of strictness as math.

If you assume you already have a perfect knowledge of the CPU workings, then I can just assume that you already have perfect knowledge of the relevant math topic and hence don't even need to read the paper to understand the paper. Human knowledge needs to come from somewhere. If you can read a programming language manual then you can read math. Every math paper is its own DSL in this context with its own small explanations for how it does things.


> Every math paper is its own DSL in this context with its own small explanations for how it does things.

That's really the point though: not every piece of software defines its own DSL, nor does it necessarily incorporate a DSL from some library or framework (which in turn may or may not borrow from other DSLs, etc.). It is also impossible to incorporate something from other software without actually referencing it explicitly.

Math, though, is more like prose in this respect – while any given novel probably has a lot of structure, terminology, and notation in common with other works in its genre, unless it is extremely derivative it almost certainly has a few quirks and innovations specific to the author or even unique to that particular work that you can absorb while reading or puzzle out due to context, as long as you accept that the context is quite a lot of other works in the genre (this is more true of some genres/subfields than others). Unlike novels, at least in math papers (but not necessarily books) you get explicit references to the other works that the author considered most relevant, but those references are not usually sufficient on their own, nor necessarily complete, and you have to do more spelunking or happen to have done it already.

Finally, like prose, with math you have to rely on other (subsequent) sources to point out deficiencies in the work, or figure them out on your own. Math papers, once published, don't usually get bug fixes and new releases, you're expected to be aware (from the context that has grown around the paper post-publication) what the problems are. Which means reading citations forward in time as well as backward for each referenced paper. The combinatorial explosion is ridiculous.

It would be great if there were something like tour guides published that just marked out the branching garden paths of concepts and notation borrowed and adapted between publications, but textbooks tend to focus on teaching one particular garden path.


> It is also impossible to incorporate something from other software without actually referencing it explicitly.

No, some programming languages just inject symbols based on context. You'd have to compile it with the right dependencies for it to work; from the source alone it is impossible to know what it is supposed to be.

And even if they reference some other file, that file might not even be present in the codebase, instead some framework says "fetch this file from some remote repository at this URL on the internet" and then it fetches some file from the node repository, which can be another file tomorrow for all we know. This sort of time variance is non-existent in math, so to me math is way more readable than most code.

And you have probably seen a programming tutorial or similar which uses library functions that no longer exists in modern versions, tells you to call a function but the function was found in a library the tutorial forgot to tell you about, or many of the other things that can go wrong.


> some programming languages just injects symbols based on context

Well, okay, yes, not all software projects deliver reproducible builds of their software. Some software is, in fact, complete garbage.

I'm also not using Gene Ray's TimeCube theory[0] as an example of a mathematical paper.

> This sort of time variance is non-existent in math

Not... entirely. You could cite a preprint that then changes in the final version.

> And you have probably seen a programming tutorial or similar which uses library functions that no longer exists in modern versions

Sure. And cited papers can be retracted entirely.

[0] https://web.archive.org/web/20070718050305/http://www.timecu...


> Well, okay, yes, not all software projects deliver reproducible builds of their software. Some software is, in fact, complete garbage.

And not all math papers are properly documented either. Some math papers are in fact complete garbage. Why are you complaining about an entire field just because some of it is garbage?


Why are you ignoring the fact that I specifically said I wasn't basing my criticism on the worst examples I could find?


All meaning of math notation can be teased out if given sufficient time.


Came here to say the same thing harshly and laced with profanity. I guess I can back off a bit from that now.

I was filled with crushing disappointment when I learned mathematical notation is "shorthand" and there isn't a formal grammar. Same goes for learning writers take "shortcuts" with the expectation the reader will "fill in the gaps". Ostensibly this is so the writer can do "less writing" and the reader can do "less reading".

There's so much "pure" and "universal" about math, but the humans who write about it are too lazy to write about it in a rigorous manner.

I can't write software w/ the expectation the computer "just knows" or that it will "fill in the gaps". Sure-- I can call libraries, write in a higher-level language to let the compiler make machine language for me, etc. I can inspect and understand the underlying implementations if I want to, though. Nothing relies on the machine "just knowing".

It feels like the same goddamn laziness that plagues every other human endeavor outside of programming. People can't be bothered to be exact about things because being exact is hard and people avoid hard work.

"We'll have a face-to-face to discuss this there's too much here to put in an email."


You seem to be complaining that math isn't programming, that it's something different, and you've discovered that you don't like how mathematicians do math.

Math notation is the way it is because it's what mathematicians have found useful for the purpose of doing and communicating math. If you are upset and disappointed that that's how it is then there's not a lot we can do about it. If there was a better way of doing it, people would be jumping on it. If a different way of doing it would let you achieve more, people would be doing it.

It's not laziness, and I think you very much have got the wrong idea of how it works, why it works, and why it is as it is. Your anger comes across very clearly, and I'm saddened that your experience has left you feeling that way.

Maths is very much about communicating what the results are and why they are true, then giving enough guidance to let someone else work through the details should they choose. Simply giving someone absolutely all the details is not really communicating why something is true.

I'm not good at this, but let me try an analogy. A computer doesn't have to understand why a program gives the result it does, it just has to have the exact algorithm to execute. On the other hand, if I want you to understand why when n is an integer greater than 1, { n divides (n-1)!+1 } if and only if { n is prime } then I can sketch the idea and let you work through it. Giving you all and every step of a proof using Peano axioms isn't going to help you understand.

Similarly, I can express in one of the computer proof assistants the proof that when p is an odd prime, { x^2=-1 has a solution mod p } if and only if { p = 4k+1 for some k }, but that doesn't give a sense of why it's true. But I can sketch a reason why it works, and you can then work out the details, and in that way I'm letting you develop a sense of why it works that way.
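Incidentally, both example statements quoted above (the first is Wilson's theorem, the second the criterion for −1 being a quadratic residue) can be checked by brute force in a few lines. This is only a numerical sanity check over small numbers, not a proof, which is rather the point being made about proof sketches versus exhaustive detail:

```python
def is_prime(n: int) -> bool:
    # Trial division; fine for small n.
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def factorial(n: int) -> int:
    out = 1
    for k in range(2, n + 1):
        out *= k
    return out

# Wilson's theorem: for integer n > 1, n divides (n-1)! + 1 iff n is prime.
for n in range(2, 200):
    assert ((factorial(n - 1) + 1) % n == 0) == is_prime(n)

# For an odd prime p, x^2 = -1 (mod p) has a solution iff p = 4k + 1.
for p in range(3, 200):
    if is_prime(p):
        has_root = any(x * x % p == p - 1 for x in range(1, p))
        assert has_root == (p % 4 == 1)

print("all checks pass")
```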

Math isn't computing, and complaining that the notation isn't like a computer program is expressing your disappointment (which I'm not trying to minimise, and is probably very real) but is missing the point.

Math isn't computing, and "Doing Math" is not "Writing Programs".


I really, really appreciate your reply and its tone. Thank you for that. You've given me some things to think about.

I often wish people were more like computers. It probably wouldn't make the world better but it would make it more comprehensible.


Thanks for the pingback ... I appreciate that. And thanks for acknowledging that I'm trying to help.

It might also help to think of "scope" in the computing sense. Often you have a paragraph in a math paper using symbols one way, then somewhere else the same symbols crop up with a different meaning. But the scope has changed, and when you practise, you can recognise the change of scope.

We reuse variable names in different scopes, and when something is introduced exactly here, only here, and only persists for a short time, sometimes it's not worth giving it a long, descriptive name. That's also similar to what happens in math. If I have a loop counting from 1 to 10, sometimes it's not worth doing more than:

    for x in [1..10] {
      /* five lines of code */
    }
If you want to know what "x" means then it's right there, and giving it a long descriptive name might very well hamper reading the code rather than making it clearer. That's a judgement call, but it brings the same issues to mind.

I hope that helps. You may still not like math, or the notation, but maybe if gives you a handle on what's going on.

PS: There are plenty of mathematicians who complain about some traditional notations too, but not generally the big stuff.


> We reuse variable names in different scopes

This example works against you. Scope shadowing is nearly universally considered bad practice, to the point that essentially every linter is pre-configured to warn about it, as are many languages themselves (e.g. Prolog, Erlang, C#, etc.)

To a programmer, you're saying "see, we do it just like the things you're taught to never ever do"

.

> You may still not like math, or the notation,

The notation is probably fine

What I personally don't like is mathematicians' refusal to provide easy reference material

Programmers want mathematicians to make one of these: https://matela.com.br/pub/cheat-sheets/haskell-cs-1.1.pdf

It doesn't have to be perfect. We don't need every possibility of what y-hat or vertical double bars mean. An 85% job would be huge.


> Programmers want mathematicians to make one of these: https://matela.com.br/pub/cheat-sheets/haskell-cs-1.1.pdf

There are lots of maths cheat sheets like that. Maths is big, like all-programming-languages big. Just like in programming, notations are re-used in different areas with different meanings, and different authors sometimes use different notation for the same meaning. A universal cheat sheet is impossible (just like a general programming cheat sheet is), but many cheat sheets or notation reference pages exist for particular contexts, one of which is "the basics", e.g. https://www.pinterest.nz/pin/734016439237543897/. Try searching or image searching for [math cheat sheet], [linear algebra cheat sheet], etc.

> mathematicians' refusal to provide easy reference material

This is an absurd claim. There is no such general refusal. On the contrary, many mathematicians provide their students with relevant easy reference material constantly. We sometimes spend entire semester-long courses providing easy reference material, and there are many books with exactly the kind of cheat-sheet you want inside the cover, or in an appendix or front matter (as well as the ones on the internet mentioned above).


> There are lots of maths cheat sheets like that.

I have never found one that gets me through the average undergraduate CS paper.

If you know one, I would greatly appreciate a tip. Unfortunately, I saw what you offered, and that's not it.

.

> A universal cheat sheet is impossible

I explicitly stated that this was a non-goal.

.

> one of which is "the basics", e.g. https://www.pinterest.nz/pin/734016439237543897/.

That one is far too basic. It doesn't even have things like average, or absolute value. It includes things nobody needs explained, like subtract, and things that aren't math, like logical operators.

This is why I said "yes, people have tried, but nobody has succeeded."

The explicit context was the average software paper. We're talking about programmers.

.

> > mathematicians' refusal to provide easy reference material

> This is an absurd claim.

Lots of people in here seem to agree with me. YMMV.

Feel free to provide me easy reference material.

.

> On the contrary, many mathematicians provide their students with

> We sometimes spend entire semester-long courses

The explicit context is "to people who aren't mathematicians or mathematics students."

Remember, we're talking about programmers who are appealing to the mathematics community for help.

If your response to "you guys won't give programmers a short easy two page PDF at the level we need" is to remind me that you give your own students semester long courses, then you've absolutely failed to understand what's being said.

.

> there are many books with exactly the kind of cheat-sheet you want inside the cover

Every time I ask for one, I get something with symbols meant for children learning arithmetic, like the one you gave. It explains plus and percent.

Again, 𝗧𝗵𝗲 𝗰𝗼𝗻𝘁𝗲𝘅𝘁 𝗶𝘀 𝗽𝗿𝗼𝗳𝗲𝘀𝘀𝗶𝗼𝗻𝗮𝗹 𝗽𝗿𝗼𝗴𝗿𝗮𝗺𝗺𝗲𝗿𝘀 𝘁𝗿𝘆𝗶𝗻𝗴 𝘁𝗼 𝗴𝗲𝘁 𝗵𝗲𝗹𝗽 𝗿𝗲𝗮𝗱𝗶𝗻𝗴 𝗖𝗦 𝗽𝗮𝗽𝗲𝗿𝘀 𝘄𝗿𝗶𝘁𝘁𝗲𝗻 𝗯𝘆 𝗺𝗮𝘁𝗵𝗲𝗺𝗮𝘁𝗶𝗰𝗶𝗮𝗻𝘀.

If you think we need plus explained to us, and that the next step is an insular semester long lecture course that isn't offered to us, then how can you possibly be surprised that we think you failed us?

.

> as well as the ones on the internet mentioned above

You only mentioned one. The other one is something I gave, from our community, trying to explain to you the kind of thing I want.

It covers topics like monads, pattern matching, infix, operator precedence, typeclasses, infinite lists, codata, higher order functors, special folds, tuples, numerics, modules, tracing, list comprehensions, and dealing with the compiler itself.

You patted me on the head and taught me that * means times.

I continue to feel that the mathematics community refuses to understand the needs of the programming community, or provide appropriate reference material.

It's either "this is arithmetic" or "let's do linear equations in Russian"

There's no practical middle ground and you seem resistant to even understanding that such a thing exists

Programmers aren't ignorant, despite what the other mathematicians in here have repeatedly said. You can't do our things any more than we can do yours. We've seen your code.

It's just that when you ask us for appropriately scoped reference material, we comply, and you do not even grok.

You really put up a thing that explained `less-than`, as if that was what you were being asked for.

It turns out that most programmers know what the equals sign means.

There is nothing of practical value in the actual domain space being talked about here.

Nothing here is beyond a highschool pre-calculus class. That is not the level that professional programmers need.

I don't mean to seem rude, but it feels a little bit like being talked down to, having it suggested that this is the level of help my occupation is asking for.

The statisticians can do it: http://web.mit.edu/~csvoss/Public/usabo/stats_handout.pdf

This isn't far off: https://en.wikipedia.org/wiki/Glossary_of_mathematical_symbo...


> I have never found one that gets me through the average undergraduate CS paper.

That cheat sheet would be written by CS folks, since every applied domain uses their own quirks in their notations. Mathematicians can't help you there. You can't blame mathematicians for the shortcomings of CS researchers.


Many textbooks have pages that explain the mathematical notation used. Here's an example from a linear algebra textbook: http://linear.ups.edu/html/notation.html

But it doesn't make sense to put lists of notation everywhere mathematical notation is used, like in a journal, because the audience is already expected to know it. If the author does something weird or non-standard it's typically explained; sometimes it's even explained if it's pretty standard.

Different branches of math, physics, statistics, etc. will redefine the same symbols to mean different things, but that's not much different from programming languages. & in C++ has a different meaning than & in R. Just as the first step of understanding someone else's code is to know what language you're looking at, it's important to understand the context of what you're reading. Look at previous cites, relevant textbooks, ask around, reread the paper again. I've read some papers a dozen times easy before they clicked.
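The point holds even within a single language. A quick Python sketch of one symbol, `*`, taking several context-dependent meanings, just as a math symbol does across subfields:

```python
# '*' in Python, like many math symbols, means different things by context:
product = 3 * 4            # multiplication
repeated = [0] * 3         # sequence repetition
first, *rest = [1, 2, 3]   # iterable unpacking (first=1, rest=[2, 3])

def f(*args, **kwargs):    # variadic positional and keyword parameters
    return args, kwargs
```

Nobody proposes retiring `*` over this; readers disambiguate from context, which is exactly what mathematicians do with overloaded notation.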


> This example works against you. Scope shadowing is nearly universally considered bad practice

So you never used the same variable name in two different scopes ever? Like, if a function takes argument "name", no other function you ever write again in any program can have a variable named "name" unless it is the same exact usage?

Or, as is commonly complained about in math, should every programmer in the world use the variable "name" only for that use case and otherwise come up with a new name for it?

Having different scopes doesn't imply shadowing; it just means that you define a variable, use it, and once the scope ends it no longer exists. No mathematician knows even close to every domain, so different domains of math use notation differently. It is like how different programmers program in different programming languages. It is such a waste to have so many programming languages, but people still do it for legacy reasons.


> So you never used the same variable name in two different scopes ever?

That's not what shadowing is.

.

> Having different scopes doesn't imply shadowing

I didn't say that it did.

.

> No mathematician knows even close to every domain

this is irrelevant to a lightweight two page cheat sheet for simple mathematical symbols

part of the problem is that if we ask you for a simple thing that isn't perfect or exhaustive, you lecture us on how no document could contain every concept

that's very clearly not what's being requested of you. the same is true of haskell. that sheet doesn't contain all of haskell. i doubt anybody knows all of haskell, which of course is far smaller than mathematics.

i'd like to stay in the practical world. it was clearly stated that an exhaustive solution was a non-goal.

let's try to do one thing that doesn't have a limit at infinity. (i'm sorry, i'm a programmer, math jokes are hard)

surely you've made food, right? did you learn the recipe from a cookbook? did it contain every ingredient and recipe that any cook ever knew? did it go over the chemistry of the protein denaturing, the physics of the water boiling, the ethics of importing the burner fuel, allergy responses, cultural backgrounds, molecular weights, how to make things in a duck press?

no?

was that because the cookbook was just good enough? it was just like "use this much chicken and this much onion, two tortillas, some cilantro and lime?"

nobody wants exhaustive anything. if you managed somehow to produce that (ie by just giving the manual page to the wolfram language) it would be rejected as the exact opposite of what was being requested.

the thing you're protesting is what i'm saying is the wrong job.

cool beans. the thing i'm actually asking for is straightforward.

someone already gave me one, but the difficulty level was aimed at children instead of professional programmers, the audience requested, while also calling me ignorant. it's a shame; that one was almost it.

but it should be things like `ŷ` and `||x||` and whatever. it should include sum, integral, and product for the juniors. For `|x|` it should say "Absolute value, magnitude, length, or cardinality."

it doesn't need `||x||` because we need someone to teach us what absolute value and cardinality and so on mean. it needs `||x||` because we forgot what double-bar says, and if we have that list of four things, we can figure out which one it is just like you can.

we know what magnitude is. we just don't know what `||foo||` says.

we just need our cracker jack decoder rings. we get the ideas. we don't get your letters.

It's not explaining anything. It's just a cheat sheet. You aren't solving education. C'mon.

This isn't too far off: https://en.wikipedia.org/wiki/Glossary_of_mathematical_symbo...
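as a sketch of what i mean (entries here are illustrative readings, not authoritative definitions), even something as dumb as a lookup table would do:

```python
# A minimal "decoder ring" of the kind being asked for: map a symbol to
# its common readings and let the reader disambiguate from context.
# Entries are illustrative, not exhaustive or authoritative.
DECODER = {
    "|x|":   ["absolute value", "magnitude", "cardinality", "determinant"],
    "||x||": ["norm (length/magnitude of a vector)"],
    "y^":    ["estimate or prediction of y (y-hat)"],
    "x_bar": ["mean of x", "complex conjugate"],
    "Sigma": ["sum over an index"],
    "Pi":    ["product over an index"],
}

def lookup(symbol):
    return DECODER.get(symbol, ["unknown; check the paper's own definitions"])
```

that's the whole feature request. four candidate meanings per squiggle, and we'll figure out which one applies, just like you do.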

.

> It is such a waste to have so many programming languages

if you say so. i rather like the several that i created.

things aren't a waste just because you don't know what they're for.

you might as well assert that it's a waste that there are so many tools in the shop.

The statisticians can do it: http://web.mit.edu/~csvoss/Public/usabo/stats_handout.pdf

.

> every programmer in the world then use the variable "name" only for that usecase and otherwise comes up with a new name for it?

this isn't terribly uncommon, primarily because our culture is to use long descriptive names, which have a far lower natural collision rate

i do get that symbols can overlap. that's okay! nobody's complaining about that. it's fine for `||x||` to mean four things. we're just as bad as you are about that. there's half a dozen meanings for stack, another five for heap, seven for map, five for vector, i don't even want to get into what a mess "array" is, et cetera.

but if i was writing a cheat sheet for you, I could write `< > generally means a generic type, a tuple, greater than less than, an HTML/XML/SGML tag, an email inclusion, or an IRC handle`

Is that exhaustive? Naw. I can name another dozen off the top of my head.

But that's generally going to be good enough.

All I want is generally good enough. just please write down what they are?


> That's not what shadowing is.

But that is what mathematicians do. Why did you bring up shadowing if it isn't relevant?


> Math notation is the way it is because it's what mathematicians have found useful for the purpose of doing and communicating math.

That's only really a good description for the most well-trod areas, where people have bothered to iterate. I think a more realistic statement would be:

"Math notation is the way it is because some mathematician found it sufficient to do and communicate math, and others found it tolerable enough to not bother to change."

Personally, though, my problem has always been where publications use letters and symbols to mean things that are just "known" in some subfield that isn't directly referenced. It's not a problem for direct back and forth communication during development, true, but it dramatically increases the burden on someone who wants to jump in.


I mostly agree with you.

That all said, it would still be quite nice if it was somehow more accessible. A lot of papers containing material that's probably actually quite standardizable remain opaque to me, and the notation invariably falls by the wayside if there's a code or language description available.

Many times, math notations have been thought minimal, or as clear as possible, only to fall by the wayside

Whereas this notation serves domain specialists well, it still leaves people like me somewhat confused

A cheat sheet - even to the practical norms - would go a long way


This is a pretty good defense. Well done.


Here's a take from a mathematician-in-training, and it's biased toward research-level math, or at least math from the last hundred years:

Math is difficult, and a lot of what we have is the result of the sharpest minds doing their best to eke out whatever better understanding of something they can manage. Getting any sort of explanation for something is hard enough, but to get a clear theory with good notation takes an order of magnitude more effort and insight. This can take decades more of collective work.

Imagine complaining about cartographers from a thousand years ago having sketchy maps in "unexplored" regions. Maps are supposed to be precise, you say, there's actual earth there that the map represents! But it takes an extraordinary amount of effort to actually send people to these places to map it out -- it's hardly laziness. Mathematics can be the same way, where areas that are seemingly unrigorous are the sketches of what some explorers have seen (and they check that their accounts line up), then others hopefully come along and map it all in detail.

When reading papers, there's a fine balance of how much detail I want to see. For unfamiliar arguments and notation, it's great to have it explained right there, but I've found having too much detail frustrating sometimes, since after slogging through a page of it you realize "oh, this is the standard argument for such-and-such, I wish they had just said so." You tend to figure that something is being explained because there is some difference that's being pointed out.

I've been doing some formalization in Lean/mathlib, and it is truly an enormous amount of work to make things fully rigorous, even making it so that all notation has a formal grammar. It relies on Lean to fill in unstated details, and figuring out ways to get it to do that properly and efficiently, since otherwise the notation gets completely unworkable.


> There's so much "pure" and "universal" about math, but the humans who write about it are too lazy to write about it in a rigorous manner.

Are you sure it's laziness? Maybe it's a result of there not actually being any universal notation (not even within subfields), or the exactness you refer to really isn't necessary. This doesn't mean that unclear exposition is a good thing. Mathematical writing (as with all writing) should strive toward clarity. But clarity doesn't require the minutely, perfectly consistent notation a computer would demand, because humans are better than computers at handling exactly those kinds of situations.

> People can't be bothered to be exact about things because being exact is hard and people avoid hard work.

I think you have it wrong. People can't be bothered to be as exact because they don't need to be. People can understand things even if they are inexact. So can mathematicians. Honestly, this is a feature. If computers would just intuitively understand what I tell them to do like a human assistant would, that would be a step up, not a step down, in human-computer interfaces.


> But clarity doesn't require some sort of minutely perfectly consistently notation which would be required by a computer

I made this point in another comment, but I think it bears repeating and elaboration: Consistency isn't required (at least outside any single paper), but explicitness would be a tremendous boon.

Software incorporates outside context all the time, but it pretty much always does it explicitly (though the explicitness may be transitive, ie. dependencies of dependencies). Math papers often assume context that is not explicitly noted in the citations, nor those papers' citations, etc.

Instead, some of the context might only be found in other papers that cite the same papers you are tracking down. You sometimes need to follow citations both backward and forward from every link in the chain. And unlike following citations backward (ie. the ones each author considered most relevant), the forward links aren't curated and many (perhaps most) will be blind alleys (there may also be cycles in the citation graph, but these are relatively rare). But somehow you have to collect knowledge of (or at least passing familiarity with) an encyclopedic corpus in order to recognize and place the context left implicit in any one paper and understand it.

It's maddening.


I totally agree. I think that many mathematical papers aren't explained as well as they could be. My advisor was pretty adamant that papers should not be written in some proof-chasing style like you describe and that the author should clearly include the arguments they need (citing those authors they might have learned them from) unless those arguments are truly standard. No "using a method similar to [author] in Lemma 5 of [some paper]" and instead just including it in your paper and making sure it fits in well.

That is just an example of bad exposition in my opinion. It's also not technically "unclear" in any notational sense, so it's a bit of an aside from this argument. But I agree with you 100% that it is bad bad bad. This is a perfect example of why arguments like "does this proof make Coq happy" totally miss the point.


> That is just an example of bad exposition in my opinion [and] a perfect example of why arguments like "does this proof make Coq happy" totally miss the point.

In theory, some kind of checker could validate the semantics of a paper to just tell you whether the arguments made are complete. Not whether there is a correct formal proof, just pointing out any obscured leaps of faith[0]. A rough analogue to test suite coverage for code (which is also not any sort of guarantee of correctness, just basic reassurance that all (or most of) the code is tested and isn't broken in any obvious way, especially while making changes).

I'm trying to think of an equivalent for prose, and am coming up with examples like detecting conflicting descriptions of locations or named characters, or whether the author lost track of which character said which lines in the dialog.

> It's also not technically "unclear" in any notational sense

Perhaps not necessarily, but unfamiliar/borrowed/idiosyncratic notation is a perfect (and common) place for insufficient exposition to be hiding.

[0] https://imgur.com/gallery/ApzhVFj


People can also understand each other through combinations of obscure slang, garbled audio, thick accents, and drunken slurring. It's still an unpleasant way to communicate.

Shall we be satisfied with the same low standards in a technical field, because it is how it is?

Hands-on users of math notation are complaining that it sucks. I'm not sure why a dismissive "works for me" is so often the default response.


> Hands-on users of math notation are complaining that it sucks. I'm not sure why a dismissive "works for me" is so often the default response.

It is really easy to complain. People also complain about every popular programming language, but it is really hard to make something that is actually better. It is easy to make something that you yourself think is better, but it is hard to make something that is better in practice.


Personally, I stopped complaining about Java, Perl, and PHP when I didn't have to read or write them anymore for work.


> Hands-on users of math notation are complaining that it sucks. I'm not sure why a dismissive "works for me" is so often the default response.

Are you sure this is because the notation is unclear/imprecise, or because you just don't like it? I like certain programming languages and certain programming styles and really don't like others. But in every case (those I like and those I don't) they are 100% "clear". The code compiles and executes, after all, so there really isn't much of an argument that it's somehow underspecified.

The same thing exists in mathematics. There are certain fields of math whose traditional notation/style/approach/etc. are totally incomprehensible to me. There are also many mathematicians who would say the same about my preferences as well.

So my point is that all people are _different_. Some people like certain things and some people like others. How can you hope to please everyone simultaneously? In my experience, there is no field at all that is as precise as mathematics. Sure "code" is precise, but (imo) professional programmers are nowhere near as precise in any general design or conversation than mathematicians. So I find the attack on supposedly bad mathematical notation a bit odd.

Mathematicians constantly try to come up with better methods of explaining things. They put more effort into it than basically any field in my experience. The problems are really that we as humans don't all think the same and that mathematics is just plain hard. We've improved mathematical communication immensely throughout history and we will continue to do so. But we'll never reach some sort of perfect communication style because no single such style could ever exist.


There are formal grammars. The formal grammars are really hard to understand, in my humble opinion. The best examples, I think, are Coq (see e.g. https://en.wikipedia.org/wiki/Coq) and Lean (see e.g. https://en.wikipedia.org/wiki/Lean_(proof_assistant) ).

Yes, we are too lazy to be 100% formal and many times we are too lazy to be mostly formal. This is mostly because we target our writing to other mathematicians who have no need to see every small step and including every step makes the proofs long. On the other hand, I do feel that generally speaking mathematicians should show more of their work and skip fewer steps.

I find your statement "People can't be bothered to be exact about things because being exact is hard and people avoid hard work." to be very true. Being precise is difficult.


It doesn't feel like it will be long until we're getting CAPTCHAs like: "Click the pictures of humans brandishing firearms."


As long as I get to look at cat photos afterwards I'm good


"Tag the insurgents"


The current "system" is very hostile to small business. I don't think the average person realizes how bad it is.

I abandoned a financially successful independent contracting business of 15 years and went back to being an employee (ugh) because the Republican efforts to repeal the ACA freaked me out. I'm old enough that my wife and I both have some pre-existing non-chronic conditions, and I have a young daughter. Paying ~$20K of premiums annually for the privilege of having a $13K deductible was grudgingly acceptable to me. The uncertainty was not. The stress of wondering if insurance was going to be available year-to-year (in the final year I operated the business my County had exactly one marketplace plan available) was extreme.

I happened to get a job offer from a Customer. I decided that, though I swore I'd never be anybody's employee again, it was irresponsible in light of the efforts by the GOP to repeal the ACA to continue gambling that insurance would be available.

I had a taste of being uninsured in 2014 when, on pre-ACA grandfathered insurance, I had to pay out-of-pocket for treatment related to a pre-existing condition that wasn't covered. The idea of just how close we all are to a medically-induced bankruptcy really hit me at that point.

My quality of life is much worse now. I used to bill in 20 hours what I now have to report to the office "40 hours" to earn. I used to be able to spend more time with my family. I used to have the flexibility to make decisions about how and when I wanted to work, take time off, etc.

My Customers, some of whom I'd worked with for 10+ years, ended up stuck going with drastically more expensive (and less attentive and skilled) "managed services" companies to service their IT needs. The emails and calls that I get regularly asking for my help, along with the out-right statements to the fact, tell me that the situation for them isn't better either.

It's a racket.


I have genuine questions about how I am supposed to conduct myself in a culturally sensitive way about this issue. I have a daughter who knows nothing of sex yet. Eventually there are conversations that I expect will need to occur.

I sincerely don't want to "blame the victim". I also don't understand where that line falls. I've felt that accusing a speaker of "victim blaming" is sometimes used as a way to shout them down in a discussion (or worse, target them for being "canceled" in their life outside of Internet discourse-- hence my use of a throwaway account here).

I do not culturally accept sexually predatory behavior. It is wrong. It should be prosecuted. I do not have a "men will be men" attitude. I also accept that I can't change the culture of institutions myself, no matter how fiercely I believe what I believe.

- Am I "victim blaming" if I talk to my daughter about the existence of sexual predators and her vulnerability?

- Am I in the wrong if I caution her about institutions or settings that have historically adopted a "men will be men" attitude?

- Am I in the wrong if I explain the realpolitik that individuals' rejection of "men will be men" doesn't automatically change the entrenched attitudes of organizations that might systematically act to shield predators and ignore victims?

- I've already had the conversation about how people are generally good, but that we have to be cautious because there are people with criminal motivations, or mental illness, that might cause them to want to hurt us in a general sense (non-sexual violence). Am I "victim blaming" if I talk to her about general situational awareness, being cautious, and not putting yourself into situations where general violence might occur?


I abandoned my business of 15 years last year because of worry about availability of health insurance. It looks like I was in a similar situation to you.

My business was going well. I was making equivalent income of a bit more than a high-end salary for my skill set in my area. I was fine (grudgingly) paying the premiums.

The problem was that there was absolutely no assurance that insurance that would actually cover anything would be available in the future. We had "marketplace" plans for a number of years. Eventually there were no "silver" marketplace plans available in my locale (and "bronze", with 20% "co-insurance", means I'm going bankrupt if I have any significant events anyway, so I might as well just have no coverage).

I gave up and took a job. I couldn't expose my family to the risk of not having any insurance available. It was crushing.

I took the route of not having employees and was unable to qualify for "business" plans. I guess I wasn't "successful" after all, since I didn't aspire to grow the business beyond what would support my family.


Stories like these are reflective of the fact that there are higher rates of entrepreneurship in countries with universal healthcare.


it's insane to me that the democrats haven't latched onto this as a talking point... it's a pro-small business move that would spark entrepreneurship that both sides love to talk about. the big businesses know that it will kneecap their ability to retain employees with the threat of losing their subsidized insurance.


Does the "bronze" plan have "out of pocket maximum"? It supposed to be 20% up to the OOM and total coverage after, right?


Using gigantic, overly complex software written in a non-memory-safe language for a system that gives commands to the flight control system?

This is how you get the Cylons pwning your spaceships. >sigh<


We get it, you like Go. Meanwhile they are in space tho..


Isn't mission critical software overwhelmingly C++?


A lot of Ada as well.


Adama wouldn't have worried so much about networked ships if the people who programmed this ship programmed Battlestars. They still used COBOL.


They still used COBOL.

But they were experts in using it, some even say Lords

