The issue is not so much that macros are more than merely syntactic transformations, but rather that s-expressions alone aren’t sufficiently expressive to represent the syntax of any Lisp with a package system. The Scheme macro system solves that by having macros return syntax objects, where identifiers used in a macro can carry around a reference to the original context where the macro was defined. So referencing a function in a macro definition will look up the function in the context where the macro is defined, and return a syntax object that refers to the correct function. To my knowledge, most modern macro systems work this way (Scheme, Dylan, etc.)
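A minimal Scheme sketch of that behaviour (hypothetical names): the scale referenced in the macro template resolves in the context where the macro was defined, so a same-named binding at the use site can't capture it.

    ;; scale is defined next to the macro; the template's reference to it
    ;; carries that definition context along.
    (define (scale x) (* 10 x))

    (define-syntax ten-times
      (syntax-rules ()
        ((ten-times e) (scale e))))

    ;; A local scale at the use site does not capture the reference:
    (let ((scale (lambda (x) 'captured)))
      (ten-times 4))   ; => 40, not captured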
Most of the need for hygienic macros goes away if you have a compiler (or macro expander) that warns about shadowing.
Code has to go out of its way to do something silly, like referencing a variable that it does not define, which by coincidence happens to be bound by a macro used in the same scope (so that the mistake simultaneously evades both the unbound-variable and the shadowing diagnostics).
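Concretely, the coincidence looks something like this Common Lisp sketch (hypothetical names, deliberately unhygienic macro):

    ;; DO-TWICE silently introduces a binding named I (no gensym).
    (defmacro do-twice (&body body)
      `(let ((i 0))
         ,@body
         (incf i)
         ,@body))

    ;; The caller references an I it never defined.  There is no
    ;; unbound-variable warning (the macro's LET binds it) and no
    ;; shadowing warning (nothing is being shadowed).
    (defun show-first-two (xs)
      (do-twice
        (print (elt xs i))))

    (show-first-two '(a b c))   ; prints A then B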
Hygienic macros do not solve accidental name capture in code that contains no macros. Manually written code can still misreference a variable because of shadowing, creating a bug.
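For instance, a macro-free shadowing bug might look like this sketch (hypothetical names); the kind of shadowing warning mentioned upthread is exactly what would flag it:

    ;; No macros involved: the LET accidentally reuses the parameter name
    ;; SCALE, so the multiplication uses the wrong binding.
    (defun rescale (x scale)
      (let ((offset 10)
            (scale 2))              ; accidental reuse of SCALE
        (* (+ x offset) scale)))    ; meant the SCALE parameter

    (rescale 5 100)                 ; => 30, not the intended 1500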
Kind of. Passing around extra context in order to resolve the symbol in the right environment does fix the problem, but it's hard to argue with unquoting the symbol as the better fix: embed the function foo in the macro itself, instead of the symbol foo plus information on how to turn that symbol into the function later.
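For comparison, a rough Common Lisp analogue of that trick (works when forms are evaluated in a running image; COMPILE-FILE can't dump literal function objects):

    (defun shout (s) (format nil "~a!" s))

    ;; ,#'SHOUT is evaluated at macro-expansion time, so the expansion
    ;; carries the function object itself rather than the name SHOUT.
    (defmacro exclaim (x)
      `(funcall ,#'shout ,x))

    ;; A local function named SHOUT at the use site can't capture it:
    (flet ((shout (s) (declare (ignore s)) "captured"))
      (exclaim "hi"))     ; => "hi!"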
I agree that Janet's approach is nicer, but here's an argument against it: if you unquote a function in a macro and then later redefine the function (like, while doing interactive development in the same running process), the macro expansion will still have the "old" definition. Whereas if you look up the function by (lexical context +) name every time you run the code, you'll pick up re-definitions automatically.
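Sticking with the Common Lisp approximation above (same in-image caveat), the difference looks roughly like this, assuming the caller is compiled, and therefore macroexpanded, before the redefinition:

    (defun greet (name) (format nil "hi ~a" name))

    (defmacro greet-by-value (n) `(funcall ,#'greet ,n))   ; embeds the object
    (defmacro greet-by-name  (n) `(greet ,n))              ; embeds the name

    (defun demo (n)
      (list (greet-by-value n) (greet-by-name n)))

    (defun greet (name) (format nil "hello ~a" name))      ; redefine GREET

    (demo "pat")   ; => ("hi pat" "hello pat"): the embedded object is stale,
                   ;    while the name-based expansion sees the new definition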
Looking up the function at runtime, every time, was exactly the thing the designers of these Lisp systems wanted to avoid. Macros were designed such that a compiled application does no additional macro expansion at runtime and does not need to resolve functions by name. A compiled application could also be one that did not include a development environment and/or was compiled to static code. Computers were slow and the applications were ambitious; the competition was using C/C++. Such a graphics example would have been written in the context of a CAD system or graphic design software. There was little need to look up the mostly-same functions all the time, and much more need for very fast code.
Janet's approach seems a little strange to me in that it's opt-in rather than opt-out. In CL or Clojure, you just normally can't have this issue due to the package or namespace system (e.g. you have to go out of your way to define an anaphoric macro that will work anywhere). Clojure doesn't have a separate function namespace either, but since backquote will automatically qualify symbols with their ns unless you opt out, you don't need to use any strange (at least to me) unquoting approach.
The "separate package" issue is really a non-issue, especially in Clojure where you have a different namespace for every file. Some people prefer a separate package for every file in CL, but even with one-package-per-project, you only have to worry about your own code. Anyone redefining functions outside your package isn't going to cause any problems with your macros.
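In Common Lisp terms, the package point looks something like this sketch (hypothetical package names), which is roughly the same effect Clojure gets from syntax-quote qualifying symbols:

    (defpackage :my-lib (:use :cl) (:export :twice))
    (in-package :my-lib)

    (defun helper (x) (* 2 x))
    (defmacro twice (x) `(helper ,x))    ; HELPER is read as MY-LIB::HELPER

    (defpackage :app (:use :cl :my-lib))
    (in-package :app)

    ;; APP::HELPER is a different symbol entirely, so it can't capture the
    ;; macro's reference, and redefining it never affects TWICE.
    (defun helper (x) (+ x 1000))
    (twice 10)                           ; => 20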
Funny you mention that: fexprs are definitely better behaved when they bind the function itself instead of a name by which the function might be found later.