There is in fact no photograph of treetops glowing.
There is a digital UV-wavelength video of the corona, and a visible-wavelength video of the trees.
The paper [1] contains a single picture with tiny circles indicating where the UV video detected corona events, overlaid on a frame of the visible-wavelength video.
The paper also includes a video [2] that overlays a somewhat processed version of the UV video on the visible-wavelength video, with UV photon events indicated by decaying red dots.
Do you think they are not photographs of the Sun because they are not what I'd see if I looked at the Sun with my eyes? (In which case I'd see pure white, then permanent black, I assume.)
Sure, a photo taken in a non-visible spectrum is still a photo. And stacking photos taken with different wavelength filters or sensors can also be considered a photo. For example, the headline image of the spruce tips taken in a lab is a photo. And based on the description of the UV camera in the paper, they did generate UV video of the treetops.
However, the linked article and associated paper don't have any such photos (or video) of the corona in the treetops. Instead, the UV video was processed with a detection algorithm, and then the visible-light photos and video were annotated with plotted dots showing where detections occurred. Those dots aren't a photo of the corona by any reasonable definition.
While reading I thought this is basically visual tinnitus and then the author used exactly that term. As someone with tinnitus, I can definitely understand the longing for "absolute darkness".
They're the same as looking at the sun with your eyes. You won't go blind from looking directly at it for a short time; it's just best not to stare for long.
At work, some guy has been pushing a 2-day feature into its 5th week now, with questions like "what do you mean by (database) table?" "Is <not_a_database_table> a database table?"
Etc...
We have to fill in RFDs to answer those kinds of questions, so the process is massively slow and st...(expunged due to HN guidelines).
So yeah, some people really love their semantics and are willing to do whatever it takes to keep it that way.
[You can take a guess at where this startup will be in 2-3 years ...]
Sorry, in what way is this not a photograph? Are you saying that a video is not a sequence of photographs, that UV photons captured by a sensor don’t count because human retina sensitivity is low in that range, or some hopefully-less-semantic argument?
The headline suggests that people have seen treetops glowing and it just hasn't been captured on video before. The actual pictures and video are of something that nobody could have seen with their eyes.
This reminds me of a chat room interaction I had maybe 25 years ago. The other person was adamant that humans can't see the infrared from TV remotes, and I was adamant that I could. It was a pretty widespread belief (even in school science books) at that time that humans couldn't see infrared. Since then, more science was done to prove that, in fact, some humans can see some infrared under some conditions.
I share that mainly to state that humans are amazing and have a wide and inconsistent range of capabilities (and sometimes even mutating into new capabilities!) Personally, I will always hesitate to say "nobody" and I lean towards "no typical human" instead. :)
The faint red glow is actual red light, as many IR LEDs (especially the ones used in cameras for night illumination) are close to the visible spectrum and emit some visible light.
You can absolutely see corona discharge like that with your eyes.
If you come to my day job, and we shut off all the lights in the test room, after your eyes adjust in the dark for a minute, you'll see the soft purple glow coming from the edge of our 160 kV test rig.
Definitely emits UV, but there is enough visible to see it for sure. It comes from the electrons exciting nitrogen in the air.[1]
> (1: If you come to my day job), and (2: we shut off all the lights) (3: in the test room), (4: after your eyes adjust in the dark for a minute), you'll see the soft purple glow coming (5: from the edge of our 160 kV test rig).
So, five different conditions required to see the glow, none of which involve treetops. The parent poster wanted to see glowing treetops in a forest, where our eyes might not be dark-adapted for a minute.
You can also see such corona discharge from benchtop Tesla coils, even in a lit room, but those are not trees in a forest glowing during a storm.
Even a smallish Tesla coil easily produces voltages north of 160 kV. I built one using 4" PVC for the secondary, with maybe ~2 feet of winding. From memory of the calculations I did at the time, I think it was around 350 kV peak? Might have been higher. It threw 24-inch sparks quite easily.
I don't really blame the researchers here, but this is yet another article happy to run a clickbait headline, which any reasonable reader is going to assume means there's a picture of "treetops glowing".
Personally, I scanned the article for it and found only the picture at the top, which I was frustrated to learn is just a lab photo. I came here wondering where the actual field image is, so I found the OP's comment helpful: the suggestion that there's a beautiful picture of a glowing canopy somewhere is basically the result of editorializing.
It's true that the image isn't fiction or a purely fabricated "artists rendering" from data. But it's also true that "filmed" and "glowing" are unusual ways to refer to what happened.
You don't usually say "filmed" when talking about recording UV or microwaves, etc. You technically could, and back when film was actually how UV was recorded, a few people working in the field probably did say it. But almost no one else does, and hasn't for decades, which means the author of the title is the one out of step, not the people reading it.
They actually recorded something, and this title is misleading. Both things are true.
When I worked in a lab that took videos with a UV camera, I still called them videos, and I would absolutely have said that I took a video of the subject (a methanol flame in this case).
Essentially every color photograph you have ever seen is a composite of a red photograph, a green photograph, and a blue photograph.
I've taken the "captured on film" out of the title above and used representative language from the article. If someone can suggest a better (more accurate and neutral) title, we can change it again. (But the subject is interesting whether on film or not, let alone "for the first time".)
Half of the comments are in this subthread which derailed the discussion on this submission before it even started.
Here the damage is done, but please consider refraining from doing this elsewhere.
I am just getting into the packet scene in the Boston area.
APRS aside, as far as I've found, there are about a half dozen Winlink nodes in the area and one BBS. And one lovely node in Cambridge (KZ2X-1 [1]) which provides connectivity to a bevy of ancient (though virtualized) OSes.
I don't know how much AMPRnet activity there is. There are only 7 allocations in the area (mine included). I'd love to be able to e.g. log in to my home network from a few radio hops away but I don't think there's any infrastructure in place for that (such as Mobile IP).
Hah! I was literally about to open up my nascent userland AX.25 stack. Is yours open-source, or would you mind sharing? (My e-mail is in my profile.) I want to get something running on an ESP32-S3. My goal is to turn a Cardputer into a companion TNC console for my Kenwood TH-D74.
It technically is! I sent you an email. It's very POSIX-y, unfortunately, but if you were to write some sort of shim for file descriptors (big ask, heh), you'd be able to use it just fine.
I have nostalgia for Wario Land because I played it for 5 minutes in a Toys'R'Us, and it's a good game that I never got to play in full until decades later. But I never owned one, so everything else you said rings true to me.
Never used a Virtual Boy, but I'm somehow nostalgic just for the development tool -- grey metal boxes with vents, LEDs, and rocker switches transport me into an optimistic future of the past.
I am pretty sure (though I haven't vetted it) that triangulation of LoRa is possible even at very low SNR.
The trick is understanding LoRa's trick, which is simply to "skew" the signal across time (via chirps), modulo a window of the configured bandwidth around the center frequency. The key is that the skew rate is purely a function of the spreading factor, bandwidth, and IQ polarity (= pol × BW² / 2^SF), so there's a smallish, finite number of skew rates. You can just mix the raw IQ data with carriers at each of these skew rates until you find one that turns the signal into a bunch of carrier waves hopping around discretely at about twice the symbol rate, looking like an FSK signal. You can then bin this by a factor of, say, 2^(SF-2) to correlate the signal and raise it above the noise floor, at which point any standard triangulation technique applies.
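For the curious, the "de-skewing" step above can be sketched with a toy example (parameters here are my own assumptions; this processes one idealized symbol, not a real capture, and is not a full LoRa demodulator):

```python
import numpy as np

SF, BW = 7, 125_000          # assumed spreading factor and bandwidth (Hz)
N = 2 ** SF                  # samples per symbol when sampling at fs = BW
fs = BW
t = np.arange(N) / fs
k = BW ** 2 / 2 ** SF        # the "skew" (chirp) rate in Hz/s, as above

# Reference up-chirp sweeping -BW/2 .. +BW/2 over one symbol period
ref = np.exp(2j * np.pi * (-BW / 2 * t + 0.5 * k * t ** 2))

# In LoRa, a symbol is the base chirp cyclically shifted by the symbol value
sym = 42
rx = np.roll(ref, -sym)

# De-skew: multiplying by the conjugate reference chirp collapses the chirped
# symbol into a constant tone, whose FFT bin is the symbol value
tone = rx * np.conj(ref)
bin_est = int(np.argmax(np.abs(np.fft.fft(tone))))
print(bin_est)  # 42
```

With noise added, you'd see that peak still poke above the FFT's noise floor well below 0 dB SNR, which is the processing gain that makes the triangulation idea plausible.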
I'll try vetting this soon and reply to this post with results.
Sounds like you know what you’re talking about. I’d be very interested in learning more. I’m not a radio or even software expert. I’m a data scientist with a math background. Interested in communicating across mesh networks anonymously.
I know only basics about mesh networks per se. I'm speaking here from a background in signal theory / amateur radio / research on the LoRa protocol. If you have specific questions on those aspects feel free to contact me, my e-mail is in my profile.
Is your goal nondetection? If so, know that triangulating a radio signal is fairly straightforward, even in a dragnet fashion. The exception I think is something like cryptographic DSSS (found in military use) wherein not only is the signal far below the noise floor, but the spreading function is not predictable by an adversary (LoRa being very much predictable). Even then, you're limited by physics to only be able to transmit so far without the signal being detectable near the transmitter.
A gentle FYI for casuals interested in Meshtastic / MeshCore in the USA: the default radio settings promoted by both of these projects are actually outside the parameters permitted by the FCC for unlicensed spread-spectrum operation, which require that such signals be spread over at least 500 kHz of spectrum [1]. Meshtastic "LongFast" spreads over only 250 kHz, and MeshCore "USA Recommended" over only 125 kHz.
(500 kHz bandwidth is indeed a valid setting for the underlying LoRa protocol, and is used when the radios get certified.)
Unfortunately the community meshes in most/all US metros have coalesced around these settings, meaning one is forced to choose between linking with "the" mesh, or operating legally.
Agreed – and MeshCore follows a similar "security on the radio" design.
With the "cell phone + companion radio" setup which is currently very popular, it would seem the correct solution is to perform encryption on the phone – using the Signal protocol – and use the companion radio only to send/receive these blobs.
This has the added benefit that you can pair with _any_ arbitrary companion radio, rather than your identity being tied to one specific radio you own.