My theory is that what is latent in the terrain of human psychology is a propensity to view misconceptions about which problem it is as evidence that there isn't one. When I stop to think about it, I realize that the misconceptions are evidence that there is.
With that in mind: in what way should we terraform the latent terrain of human psychology to fix this problem? Because breaking up the tech companies seems like a way, but you seem to have a different one in mind.
I tend to agree with your general intuition that presupposing modern understandings creates all kinds of problems for understanding ancient texts, some of which we only read in translation. But in the specific case of "apokalypsis" it is kind of the opposite.
There is quite an extensive record of first/second-century apocalyptic literature (some of it even became the NT). It preserves remarkably detailed pictures of what apocalyptic thinking looked like long before Nicaea.
The main thing to say is that the concept of apocalypse was much broader than today's. To your question, it did mean a revelation of relationship with god, but it also carried an end-of-times sense, it also served as political commentary, it was also conspiracy theory, etc. People did not distinguish which modern, narrower concept they meant, because they actually meant the broad concept.
I suppose a different claim strikes me as false. "should have been deemed classified at the time they were sent" is one thing, "there was, in fact, confidential stuff in there" is a different thing.
I think a decent case can be made that one rounds up to the other, but I guess that case seems more like an argument to be made than a fact to be corrected.
Just because something isn't labeled classified doesn't make it not classified. If you work anywhere with classified information, you are expected to know certain information is classified, or may become classified later. You may not always know the latter, but you should know the former.
I'm not going to defend classifying embarrassing information because it's, well, embarrassing. But the established trend is to classify information "just to be safe" and let someone else make the declassifying decisions, particularly someone who's not you.
There was a weird issue with Wikileaks in that publicly released information was still considered classified, and the documents still had to be treated as such.
Was that silly? Yes. It led to a weird situation where journalists and members of the public had more access to certain classified documents than people holding clearances.
To me, it is apparent that the data cannot support any clean division between two "sides"; it tells a more complicated story in which there was sometimes apostolic authorship, sometimes not, and sometimes we don't really know.
I would suggest that the real academic consensus is that we can confidently rule out the us-vs-them preoccupation that is common in lay discussion.
"No sides in science" is a silly idea. Of course scholars have biases. They're human. Humans like to group up and gang up against each other.
Specific to Bible scholarship, I wager the two big sides are scholars who have faith (i.e., the Nicene Creed) and scholars who have little: Bruce Metzger, who had some faith, versus Bart Ehrman, who has none; the RSV/ESV, which say Jesus is the "Son of God" in Mark 1, versus the NRSVue, which deletes "Son of God" from Mark 1.
There are plenty of YouTube videos that go into the subject thoroughly. I couldn't find the one I watched recently, which argued that the notion that the gospels could ever have been totally anonymous is absurd: nobody would have taken you seriously, and reputation was everything in the ancient world. The people of the time knew exactly who wrote what, even if there weren't any direct titles on the actual manuscripts.
So then who wrote Hebrews? It wasn't Paul's writing style, and it doesn't name its author. Matthew and Luke don't name the Q source material they have in common. Take gMark: someone composes it around 70 AD somewhere. It gets copied and sent to other communities elsewhere. Decades later it's attributed to Mark.
Reputation has never been everything, and as crazy conspiracy theories like QAnon and antivax prove, some sizable fraction of the population will find a way to believe whatever they want to.
Sure. They had crazy conspiracy theories back then too. Anyone can believe what they want. But reputation means something today just as it did back then; it's just that today we outsource that function to the academic system.
Consider an alien which subsists on photons, which is a form of life that exists today. We know from Heisenberg that the sensing of this food "here" or "there" is nonphysical. Presumably our creature's civilization would require no Heisenberg to discover what anyone can see from their own photosensor.
Rather it is the concept of objects remaining in a single place that would require some real mathematical innovation to a creature with no experience of such an idea. And so this distinction of entirely separate logical states, far from being basic or inescapable, is our very human invention. It is useful for creatures like us, who perceive things in one place when they are not really so, who do their computing with sand in a region where it's bountiful, and who encode abstractions as software because doing so in dedicated hardware is more costly.
While it is certainly possible that all intelligent life would have these constraints, there is no particular reason to expect it. What we can expect is that humans will expect others to be too much like ourselves; it's a well-known cognitive defect in our species.
> Consider an alien which subsists on photons, which is a form of life that exists today.
Plants don't just subsist on photons; there are many other ingredients.
> We know from Heisenberg that the sensing of this food "here" or "there" is nonphysical.
I don't know what this means. How do you "non-physically" sense photons?
> Rather it is the concept of objects remaining in a single place that would require some real mathematical innovation to a creature with no experience of such an idea. And so this distinction of entirely separate logical states, far from being basic or inescapable, is our very human invention.
Assuming you're talking about some alien made of bosons that aren't subject to the Pauli exclusion principle, you'll note that bosons still interact with fermions in which that principle does apply, so I don't think your argument follows. I admit I don't really understand your premises though so I have no idea what you really meant.
I am also writing a game engine, also because of the gameobject issues, also doing tens of thousands of simulated AI agents. In my case they are GPU-accelerated and I'd guess you are doing something basically similar even if it's CPU-based.
Would be interested to discuss architecture as there aren't many of us out there and not much of a shared knowledge base. I looked for contact info in your profile but didn't immediately turn it up.
The main problem with the interpretation that the GPL sets conditions for use of software is that it specifically claims it does not:
> Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted,
Now, there are some ways to try to do an end-run around this clause, and it does raise some questions about how you get the right to run the program without "this License". But if we take the text seriously, it disclaims any restrictions on use.
This is the part where the derivative work standard comes in: if the GPL doesn't set conditions on use, your use of the software doesn't require complying with that license. The GPL would only enter in if you are going beyond mere use, e.g. if you are making a derivative work in the legal sense (not the GPL's definition).
I find it difficult to imagine a situation in which you could use the library without copying the library. How would you obtain the code to execute it? Someone must distribute it to you, and/or you must copy it. Those are the governed actions.
EDIT: note that the GPL broadly doesn't impose restrictions on interacting with a program over a network ("using" a server), which is why the AGPL exists.
So, one answer (not necessarily the right one!) to how you can use software without copying it is that "the license says you can". In other words, the license definitely views "running" and "copying" as distinct, regardless of what you, I, or federal law might say.
The other, maybe more familiar idea to lawyers, and maybe more plausible to you, is the one in MAI v. Peak, that running is copying by definition. (The argument is based on the idea that running a program copies it into RAM, so we don't even need to talk about how you obtained the software.)
The way this shakes out is as follows:
a. If running the program is "not covered by this License", then we can stop reading the license and return to copyright law. But copyright law says we need a license to run (that is, to copy, MAI v. Peak) the program, so where do we get it, if not from "this license"? Bit of a puzzler.
b. If "running the Program is not restricted", maybe that sentence is by itself some kind of license to "run" the program, even though that contradicts the "not covered" part? If so, we need to understand what the license means by "run" which is evidently something different than "copy".
Very probably, what this clause originally meant was that people who think like MAI v. Peak are wrong and nobody should need a license to run software. If so, it's pretty challenging to turn around and argue "just kidding, they do".
Congress has changed the law since MAI v. Peak specifically to allow users to create copies and adaptations of software programs as an essential step in running or utilizing the program in question on a machine, or for limited archival purposes (i.e. backups).
"Adaptation" is another name for "derivative work". So it would appear that it is not a copyright violation for an end user to load or dynamically link modules with incompatible licenses into RAM, or statically link them together, or make binary modifications, as long as what they are doing is necessary to use the program.
It might still be a license violation of course, but it is probably a fruitless exercise to go after users who are exercising what are ordinarily considered to be well-established rights necessary to use the software - the very thing that made MAI v. Peak an unfortunate ruling that Congress had to fix. Surely that sort of thing - even by third-party technical support - should have been considered fair use from the beginning.
> Congress has changed the law since MAI v. Peak specifically to allow users to create copies and adaptations of software programs as an essential step in running or utilizing the program in question on a machine, or for limited archival purposes (i.e. backups).
That's not entirely true, as far as I understand. The rule that a user may copy the work to RAM as a fair-use exemption already existed at the time of MAI v Peak. However, it was found not to apply, since the person that loaded the program into memory was not MAI's client, but a Peak employee, who was fulfilling a separate contract with MAI's client, and who had never legally obtained a copy of the work from MAI (which would have entitled them to load it in memory themselves).
Basically, the court at the time found that if I have a copy of Windows from MS, I am allowed to copy it into memory and run it. But, a repairman I hire is not allowed to load my copy into memory and run it themselves, unless they also legally own a copy of Windows.
However, Congress did amend copyright law in light of MAI v Peak, to extend the existing fair use exemption for copying into RAM to people acting as service/repair contractors.
Thanks for the clarification on that. Either way, MAI v. Peak was kind of a disastrous decision that presumably could have easily been resolved under fair use for the service contractor. Congress included Section 107 for a reason, and no doubt part of the logic is that courts should use it instead of creating a make-work program for Congress on the subject.
That would seem to go double for whichever court decided that ephemeral copies in RAM are "fixed" in a "tangible medium of expression", in direct contradiction of the plain meaning of the term.
> The other, maybe more familiar idea to lawyers, and maybe more plausible to you, is the one in MAI v. Peak, that running is copying by definition.
That is a bit over-simplified. The important part in MAI v. Peak was that Peak was a third party to the license between MAI and their client. The court recognized that MAI's client had the right to load/copy the program that they had acquired from MAI into memory without any additional license from MAI - there was already an explicit exemption in copyright law for this. However, they decided that this right can't be extended to a third party (Peak, who was acting as a support technician), even if on the same machines.
In its very basic form, a license has two distinct purposes:
a) limit whatever rights are granted by copyright law
b) grant new rights that go beyond copyright law
> Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted
I don't see a contradiction here.
The first part states that anything being granted by this license (purpose b) only applies to copying, distribution, and modification.
The second part simply states that no new restrictions (in the sense of purpose a) are made.
To be more explicit, the paragraph should be read as:
> The act of running the Program is not additionally restricted
If true, then this means that before you even download the program, you must agree to its license. Only then are you allowed to download it, which in turn copies the program to RAM and then to a filesystem.
It's ridiculous to expect that you can go after individual users for making single copies while not following your terms. You're going to sue someone for zero damages? For bruising your ego? Good luck with that.
For the same reason, an AGPL item could be enforceable, but only if the violator is juicy enough.
The OED gives the following usages, the first of which suggests users predate computers:
1950 Science 112 732/1 Analog machines..are enthusiastically supported by their users.
1959 E. M. McCormick Digital Computer Primer x. 139 The number of instructions which can be executed by a computer represents a compromise between the designer's and user's requirements.
Thanks for a good reference! It makes sense that "users of a service" or "user of a machine" slipped its way into computer jargon; there was no need for it to be coined or anything.
I'm tempted to pick up a copy of Digital Computer Primer on eBay; I love the old explanations of what these machines even are.
In general, a retained object is deallocated on last release. However, ownership of some objects somewhere may have been given to an autoreleasepool, in which case "the last release" for those objects will come from the pool. To what extent this happens is implementation-defined.
Swift and ObjC implementations have levers which discourage objects from being sent to the pool in common cases. It is possible to pull them from other languages, but it is not easy.
That discussion is a bit strange. In that example, Foo is implicitly #[repr(Rust)], meaning it has an undefined layout. In particular, the ordering of a and b is unspecified, and even if you don't care which order you get, there is the question of padding (a reasonable compiler will tightly pack a and b, but that's not required).
For this reason, we never reach the question of alignment, because no type (neither i32, u8, nor anything else) is guaranteed to have the same layout as Foo (its layout being undefined).
It is certainly true that unsafe Rust is veeeeery unsafe, as perhaps evidenced by me being the first person to point out the repr issue. On the other hand this scheme has a lot of advantages for writing safe Rust.
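For concreteness, here is a minimal sketch (the Foo below and its i32/u8 fields are hypothetical stand-ins for the struct in that discussion): with the default representation the compiler may reorder and pad fields however it likes, while #[repr(C)] pins down declaration order and C-style padding, which is the kind of guarantee you would want before reinterpreting bytes.

```rust
use std::mem::{align_of, offset_of, size_of};

// Hypothetical stand-in for the Foo discussed above.
// Implicit #[repr(Rust)]: field order and padding are unspecified,
// so nothing may assume how `a` and `b` are laid out in memory.
#[allow(dead_code)]
struct FooDefault {
    a: i32,
    b: u8,
}

// #[repr(C)]: fields are laid out in declaration order with C-style
// padding, so offsets and size are actually guaranteed.
#[allow(dead_code)]
#[repr(C)]
struct FooC {
    a: i32,
    b: u8,
}

fn main() {
    // For FooC these values are guaranteed: a at offset 0, b at offset 4,
    // size rounded up to the alignment of 4, i.e. 8 bytes.
    println!(
        "FooC:       size={} align={} offset(a)={} offset(b)={}",
        size_of::<FooC>(),
        align_of::<FooC>(),
        offset_of!(FooC, a),
        offset_of!(FooC, b)
    );
    // For FooDefault the same numbers may well print today, but the
    // language makes no promise, so code must not rely on them.
    println!(
        "FooDefault: size={} align={} offset(a)={} offset(b)={}",
        size_of::<FooDefault>(),
        align_of::<FooDefault>(),
        offset_of!(FooDefault, a),
        offset_of!(FooDefault, b)
    );
}
```

Whether #[repr(C)] is the right fix for the scheme in question is a separate matter; this only illustrates the difference in layout guarantees.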