Hacker News | natural219's comments

Hey, sounds great. How do I contact you?


Location: Kansas City, MO, USA -- but 100% enthusiastic about relocation

Remote: Sure, but hybrid/in-office roles are fine

Willing to relocate: Yes!

Technologies: React, TypeScript, Ruby/Rails, Node.js/Express -- any modern web application framework is easily within my wheelhouse, especially in the LLM era.

Resume: https://cjohnson.io/assets/files/resume.pdf

Email: chris@cjohnson.io


Hey. Love this posting man. I sent in an application :)


Thanks! Prophet Town's 2nd-in-command (Ian) is helping me comb responses and he's out camping through Saturday; we're going to start going through responses this weekend.


Hey, do you hire Canadians by any chance?


We have in isolated cases, in the past. Our needs at the moment mostly require US-only, but you're free to put in a submission via the form.


"A glorified subreddit for HN readers..."

I wonder if `dang` would support this at some point? It seems like a good idea for more topic-specific, higher-volume news items.


This is incredible.

119 million posts × ~10 MB per picture is like 1,200 petabytes? 1.2 exabytes?

Am I missing something here... this seems very impressive.


I believe Dan said yesterday that it costs around $4k a month. Unsure if he's talking CAD or USD. Honestly, I'm more impressed it's only that much.


More like 1 petabyte, no?

    iex(5)> mb = 119000000 * 10
    1190000000
    iex(6)> mb / 1000 / 1000
    1190.0


I’m making an assumption, but I would guess most of the images being shared are about a tenth of that.


Images and videos, though to be fair, the limit for both is 15 MB.

And I think it's a safe assumption there's some ffmpeg / imagemagick running somewhere that reduces the size even further before serving it to others.


Each instance can set its own size limits.


The correct result of 119 million × 10 MB is 1.19 PB.

Also, the images on the first page of the author's own account average only 157 kB, bringing your estimate down to 18.6 TB...

pxscdn.com resolves to Digital Ocean; if he's using their S3-compatible storage, that's $372/mo + $0.01 per GB downloaded.
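
A quick sanity check of those numbers (a sketch in Ruby; the Digital Ocean Spaces pricing used below -- $5/mo including 250 GB, then $0.02/GB stored -- is an assumption on my part, not stated in the thread):

```ruby
posts        = 119_000_000   # post count from the article
avg_image_kb = 157           # average image size observed on one page

total_gb = posts * avg_image_kb / 1_000_000.0  # kB -> GB
total_tb = total_gb / 1_000.0
puts format("%.1f TB", total_tb)               # ~18.7 TB at rest

included_gb = 250                              # assumed included quota
extra_rate  = 0.02                             # assumed $/GB beyond it
monthly = 5 + [total_gb - included_gb, 0].max * extra_rate
puts format("$%.0f/mo", monthly)               # ~$374/mo, near the $372 figure
```

With those assumed rates the storage cost lands within a couple of dollars of the $372/mo quoted above, so the estimate hangs together.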


You are off by some orders of magnitude.


More like 1MB per JPEG picture


Not even. Instagram photos are roughly 50-200 kB for JPEGs.


If you compress it properly for a fast web experience, you'll probably do better: 10-100 KB.


Those are transfer sizes, not sizes of the data at rest. Even transferred, the `srcset` for this image I picked at random includes a 178 KB version; there's certainly a higher-quality version stored -- if not the original upload, then something closer to it.

https://www.instagram.com/pet/p/C327eIguGRL/?img_index=1


Wow. This is amazing. Extremely practical to use; I'm glad I checked HN yesterday.


Horribly, lol. I think 6-7 out of 10? I have no idea what these classifications mean though.


AI moderators too would be an enormous boon if they could get that right.


It would be good, but the cost per moderation is still too high for it to be practical.


Funny. I just tried to post my yearly attempt to communicate with people on Reddit, which got taken down immediately by the auto-moderator.

https://www.reddit.com/r/Austin/comments/1c0nuy6/it_is_impos...

It's infinitely sad that there's no place to just connect with people on the internet anymore. My post got 6 comments and a DM within the first minute, before the post got taken down. These people could have been new friends.

I've been through this cycle so many times that I long ago gave up on trying to post on the internet: logging on to find people and share thoughts, only to be met with this massive wall of context and janitorial standards. I gave up like five years ago.

This is to say that framing it as a debate between "social media causes anxiety" and "our current landscape of social media causes anxiety" is way too coarse. Getting on the internet between 2005-2012 felt happy, free, and was just a wellspring of community and connection. Post-2013 it's been a nightmarish hellscape on every platform.


I mean, you kinda answered the question there yourself. The whole REST paradigm (spearheaded by the Ruby on Rails community) was strongly influenced by and modeled after the HATEOAS paper and ideas. There are some talks and papers about it that I'd have to dig up, but basically the working reference implementation that shaped the web for years is here: https://guides.rubyonrails.org/routing.html
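
For anyone curious what that Rails convention looks like in practice: a single `resources :photos` declaration in the routing DSL generates seven conventional routes. A plain-Ruby sketch of that mapping (an illustration of the convention, not actual Rails code):

```ruby
# The seven conventional routes Rails generates for `resources :photos`,
# laid out as [HTTP verb, path, controller#action] triples.
RESOURCE_ROUTES = [
  ["GET",    "/photos",          "photos#index"],
  ["GET",    "/photos/new",      "photos#new"],
  ["POST",   "/photos",          "photos#create"],
  ["GET",    "/photos/:id",      "photos#show"],
  ["GET",    "/photos/:id/edit", "photos#edit"],
  ["PATCH",  "/photos/:id",      "photos#update"],
  ["DELETE", "/photos/:id",      "photos#destroy"],
]

RESOURCE_ROUTES.each do |verb, path, action|
  puts format("%-6s %-18s %s", verb, path, action)
end
```

The point of the convention is exactly the HATEOAS-flavored one: given a resource name, the verb+path pairs are predictable, so clients and developers can navigate the API without bespoke documentation for every endpoint.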

