Hacker News | acherion's comments

The way these kinds of fonts work is that you don't host the font, they do. You link the font licence you purchased through your HTML code (or CSS, depending on how the foundry recommends you to apply the font) with a specific font URL that they provide you, which will contain unique identifiers. Then they can track how often the font gets loaded.
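In practice a hosted setup like the one described above tends to look something like this (the domain, path, and token below are hypothetical placeholders, not any real foundry's API):

```html
<!-- In the page <head>: load the foundry-hosted stylesheet.
     The long token is the unique licence identifier the foundry
     issues you; requests are counted against it. -->
<link rel="stylesheet" href="https://cloud.example-foundry.com/css/abc123def456.css">

<style>
  /* The linked stylesheet contains the @font-face rules; your own
     CSS just references the family name the foundry gives you. */
  body {
    font-family: "Example Grotesk", sans-serif;
  }
</style>
```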

If your site really takes off and you exceed the monthly visits they track on their end, they either move you to a higher pricing tier, cut off loading of your font, or send you stern emails.

There is no expectation that you share your analytics with a type foundry.


That’s not true. I’ve bought fonts on Future Fonts and I received a download link to get the files. I think it’s fundamentally an honor system.


My bad, I assumed Future Fonts did something similar to other type foundries. Thanks for letting me know!


When there's a license you're either violating the license agreement or you're not. That's not an honor system.


No, "honor system" is very frequently used and understood to refer to a system where there are explicit rules but where the rules are not enforced via active surveillance.


It sounds like you want to make a judgement call: "they're too small to enforce this license agreement," so you get to pretend it's an honor system and not a license agreement.


The context was whether there is automatic enforcement, not whether you need to abide by the license.


Who's going to verify whether or not you're violating the license?


God


Not to take away from your fantastic explanation but I should note that’s not universal. There are foundries that operate on an honor basis and let you self host the font too.


Noted, I thought Future Fonts did the same system as many other type foundries out there, evidently not. Thanks for letting me know.


> You link the font licence you purchased through your HTML code

Ugh, hard pass for me. It's a nice font, though.


What you describe is how Google Fonts handles this if you choose to use the fonts directly from Google's servers. This is a violation of GDPR. You can also download them and host them yourself, to comply with data protection laws.

https://cookie-script.com/blog/google-fonts-and-gdpr
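Self-hosting amounts to serving the font files from your own origin so visitor IPs never reach Google. A minimal sketch (file names and paths are illustrative):

```html
<style>
  /* Download the .woff2 files (e.g. via the google-webfonts-helper
     tool) and serve them from your own server. */
  @font-face {
    font-family: "Roboto";
    src: url("/fonts/roboto-regular.woff2") format("woff2");
    font-weight: 400;
    font-style: normal;
    font-display: swap;
  }
  body {
    font-family: "Roboto", sans-serif;
  }
</style>
```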


This is a good thing to point people at when they claim that GDPR is simple to implement. This legal interpretation is totally reasonable but it’s probably not what most developers would expect.


The law itself is clear and concise, so it is straightforward to see that this is not only a reasonable interpretation but right there in the law.


I would not describe 88 pages as concise.

Regardless, my point is just that there are implications of the GDPR that a lot of engineers are probably not aware of. It makes sense that sending your traffic to Google for fonts violates GDPR. But as an engineer, this is just a CDN. I would not have considered this a violation of GDPR without seeing someone else point it out.


A pretty interesting proposition if you are into cinematography with your mirrorless cameras.

According to Nikon, the key takeaway specs are:

- The full-frame ZR can record up to 6K/60p (59.94p) 12-bit RAW video and incorporates the new R3D NE RAW video file format with RED color science based on RED’s popular R3D RAW codec. It uses Log3G10 and REDWideGamutRGB, the same color space and log curve found in RED cameras.

- Multiple internal RAW formats with high dynamic range: Choose RED’s R3D NE with 15+ stops of dynamic range, or Nikon’s N-RAW, or Apple ProRes RAW HQ.

- The impressively huge 4", DCI-P3 touchscreen LCD is bright enough to be used even in direct sunlight.

- World’s first 32-bit float audio from built-in and external microphones.

- OZO directional audio with 5 pick-up patterns.

- Up to 7.5 stops of built-in image stabilization.

- Straight-out-of-camera Cinematic Video Mode preset with RED-curated color for expressive, filmic tone.

- Stunning autofocus with Nikon’s deep learning-based AI technology: Smart detection of nine object types, including people and vehicles.

- Fan-less, durable body: Heat-dissipating structure with lightweight magnesium alloy, offering the same dust- and drip-resistance as the Z6III.

- On-screen LUT previews, stores up to 10 LUTs.

- New digital accessory shoe.

- Price: $2,199.95 SRP (body only), available late October.


> An amiga museum with all the games, artwork, coding technology, music technology etc. Perhaps an AI can be tasked to produce all of this soon. Youtube videos might be an engaging delivery mechanism. A physical museum too can be considered, perhaps as part of Computer History Museum and similar.

Sounds like you haven't been in touch with the Amiga scene in quite a while, if you think the above is something new. Perhaps Amiga / retro museums haven't been set up in your location, but there are heaps of them in Europe, for example. Youtube videos are a dime a dozen, just search 'amiga' on youtube and you will find literally hundreds of channels dedicated to the Amiga and/or Commodore in general. I subscribe to many of them already, and they all provide excellent in depth content for the Amiga, from hardware, to software, to games, to demos.

> AI coding might unlock mass creation of new software, games, demos, music etc. What was once conceived impossible will be very possible and likely abundant soon

Why would game writing / music creation / demos / software be "once conceived impossible"? Kids were doing those very things in their bedrooms in the 80s and 90s, without AI. What would AI bring to the table nowadays that couldn't be done in the 80s/90s when the Amiga was popular?

People developing for the Amiga were putting their heart and soul into their creations. AI can't replicate that, and it definitely can't improve it, in any sense of the word.


seriously, has Starchild3001 never looked at the modern indie game scene? Half of it is flooded with people choosing restrictions based on old machines. More consoles than computers; games trying to look like an NES or a PSX are a dime a dozen.


Most of these "new retro" games are built with modern tools and engines; they only approximate the look and feel of retro games (their code is thoroughly modern and much easier to work with), and the restrictions get broken as soon as they become too inconvenient. They can be great games, look nice, and be congruent with their inspirations, of course. But the ones built under real restrictions, whether actually made to run on real retro hardware, targeting a fictional VM as in UFO 50, or recreating a graphics system like the NES PPU's tiling and pixel restrictions as in Shovel Knight, are much rarer and take far more effort to make.


I mostly follow Amiga and C64 (a little bit). I don't follow the platforms you're talking about.


Re: Online content.

I'm well aware of what's available out there as online content (it's no farther than a Google or youtube search).

Do you think what's out there as online content is what's truly possible if we had a million more Amiga enthusiasts?

That's my vision of what's to come in, say, 10-20 yrs. Imagine every Amiga game played and recorded by many (AI) users from start to finish. Every tactic explored, and cool strategies figured out. I for one would watch this.

Imagine vibe coding becoming more and more possible with 68k assembly. And having 1000x Amiga (AI) developers producing cool demo, intro and game material. New material. Novel and cutting edge material. At massive scale.

I believe this is the future we're headed toward. I for one am very excited about it.

----------

Re: A physical museum.

No, an Amiga or Commodore focus cannot be found anywhere in Silicon Valley, or in the United States. Even the Computer History Museum (CHM) in Silicon Valley has very little Commodore content.

I live <1 mile away from the original Amiga offices in Los Gatos. It's a bit of a shame that there's so little Amiga or Commodore material in the CHM.


The joy of the demoscene is inextricable from the human and physical nature of it.

Yes, you can have AI tools vibe code up "new" 68k assembly for old machines, but you're never going to see it find genuinely new techniques for pushing the limits of the hardware until you give it access to actual hardware. The demoscene pushes the limits so hard that emulators have to be updated after demos are published. That makes it prohibitively expensive and difficult to employ AI to do this work in the manner you describe.

Don't mistake productivity for progress. There is joy in solving hard problems yourself, especially when you're the one who chose the limitations... And remember to sit back and enjoy yourself once in a while.

Speaking of, here's a demo you can sit back and enjoy: https://youtu.be/3aJzSySfCZM


Re: AI. I believe this will still be a human operation, as far as I can see.

Awesome demo! It's a little bit of a mid-life crisis :), but superbly done! Thank you.


I'm not sure why you are testing on iOS 9.3.6 (iOS 9 was released in 2015) when the documentation says Safari is supported from version 16 and above.


Well, as I said, I thought this should work everywhere because it's just a textarea, so I didn't read the docs and rushed to test my hypothesis on an old and widely unsupported device.


Why can't you be both? I am an amateur photographer, but it doesn't mean that I carry my camera with me everywhere that I go. I see photography as a hobby, so when I feel like I want to do "hobby things" I bring a camera with me. I prepare myself to do so. It doesn't mean that I don't use my phone camera at all (in fact I upgraded my phone purely for the "better camera").

If you are just taking snapshots to share with friends, then it makes sense to not bring the camera. But if it's your hobby, where you sit down and take time and care to take a photo, then it's a different game altogether.

I don't often print my photos out and put them on a wall, but I do have my own photography blog where I post the photos I take (with a camera). I think the article is still relevant to that kind of scenario too.

I think the purpose of this kind of page is to outline differences between taking a snapshot and taking a photo. This is to argue back at people who think that taking a photo with an iPhone is just as good _in any situation_ and think that _anyone_ with a camera is wasting their time. It also attempts to combat the prevalent myth that more megapixels = better photos. Yes that myth still exists in 2025.


yeah agree. I decided I wasn't a photographer, though I'm still interested in it.

> This is to argue back at people who think that taking a photo with an iPhone is just as good _in any situation_ and think that _anyone_ with a camera is wasting their time.

"Never argue with idiots. They drag you down to their level and beat you with experience". Seriously, are there people who think that iPhones are just as good as dedicated cameras, and can still tie their own shoelaces?


I'm Aussie, so I'm automatically wrong, but here in Australia it's typically pronounced "caysh". It's a uniquely Australian thing, saying cache this way outside of this country will get you looks.


Don't worry, you're not alone - Kiwis do the same.


Just a heads up that this site has an animated SVG favicon. I had added it to my Firefox bookmarks bar and the favicon kept animating its color changes. Looked pretty neat, but then I realised after a day or so that Firefox was chewing up 50-60% CPU on my MacBook Air. It turned out to be down to this bookmark: after I removed it, the CPU levels were much, much lower.

I'm not sure if it's the same deal if I were to put the bookmark in a folder (so it's not in view all the time) but just something to note.
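For reference, an animated SVG favicon of the kind described is just an SVG file with a SMIL animation, referenced like any other icon. A minimal, hypothetical sketch:

```html
<!-- In the page <head>: -->
<link rel="icon" href="/favicon.svg" type="image/svg+xml">

<!-- favicon.svg: a circle whose fill cycles forever. A browser that
     renders it in a bookmarks bar keeps repainting it, which is the
     likely source of the CPU usage. -->
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16">
  <circle cx="8" cy="8" r="7" fill="#e33">
    <animate attributeName="fill" values="#e33;#33e;#e33"
             dur="3s" repeatCount="indefinite"/>
  </circle>
</svg>
```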


Oh wow that's interesting! Thanks for the note, I'll probably remove it.


While what you said is true (Mac OS was really unstable in those tumultuous years), this link specifically celebrates the user interface of the later versions of non-OSX Mac OS. That doesn't really bear on the stability of the OS.


> this link specifically celebrates the user interface of the later versions of non-OSX MacOS. This doesn't really have a relation to the stability of the OS.

"Very pretty but can't do much" was a general take on the Mac OS cube of the day.

The lack of a fan or any decent cooling, the "lack of a floppy disk" (for those of us who didn't use Zip drives), it was pretty to look at but hard to work with.

We had one to run FrameMaker on, but beyond typesetting (& fonts), it was a shiny thing which was treated like a Sunday sports car.

Where I was, the TeX user group is what eventually materialized into a Linux user group, and there was simultaneous love for the Mac's screen, rendering and fonts, but near hatred at having to use it to professionally typeset things.

Math publications quickly jumped ship out of Adobe due to OS 9, but very few came back to the OS X versions until years later when Apple started making really good laptops with fast hardware.


> The lack of a fan or any decent cooling, the "lack of a floppy disk" (for those of us who didn't use Zip drives), it was pretty to look at but hard to work with.

The G4 Cube had an (empty) standard mount and power connector for an optional fan.


Neat little app, but it made bluetoothd go bananas on my CPU, chewing up to 40% (M2 MBA here)


That is explained in the FAQ: apparently the Bluetooth module is inefficient, but you can disable it.


I disabled the module but it still chews up a lot of CPU.


A mastodon instance where bots were welcomed.

