Hacker News

The protocol is from the era where each protocol used a dedicated TCP port rather than "it's all JSON over HTTPS" like now.


Yeah. Amen to that

There were some bad ideas in the beginning; FTP is an example of several of them.


FTP splitting the data and control channels onto separate ports was a smart design. Each connection could be tuned for its job (the data connection for throughput), and the control channel stayed responsive even while a long response was streaming over the data connection.
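As a rough illustration of that split (command names per RFC 959; addresses and reply text are made up, and exact reply codes may vary by server), a control-channel session might look like this, with the file bytes never touching port 21:

```
# Control connection (port 21): commands and replies only
> USER anonymous
< 331 User name okay, need password.
> PASS guest@example.com
< 230 User logged in.
> PORT 192,168,0,10,19,137        # data connection goes to 192.168.0.10:5001
< 200 PORT command successful.
> RETR big.iso
< 150 Opening data connection.
  ... file bytes stream over the separate data connection ...
> STAT                            # control channel still usable mid-transfer
< 213 Status: 10485760 of 734003200 bytes transferred.
< 226 Transfer complete.
```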


I haven't tested it, but I imagine it allows you to trick an FTP server into making an HTTP request (without TLS).
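The reason this trick is plausible: PORT encodes an arbitrary IP and port as six decimal bytes, and nothing in the syntax restricts the target to the client's own address or to a high-numbered data port. A small sketch of the encoding (helper names are mine, not from any library):

```python
def encode_port_args(host: str, port: int) -> str:
    """Encode host/port as the six-byte argument of FTP's PORT command (RFC 959)."""
    h1, h2, h3, h4 = host.split(".")
    p1, p2 = divmod(port, 256)  # port = p1*256 + p2
    return f"PORT {h1},{h2},{h3},{h4},{p1},{p2}"

def decode_port_args(args: str) -> tuple:
    """Inverse: recover (host, port) from a PORT command string."""
    parts = args.replace("PORT ", "").split(",")
    host = ".".join(parts[:4])
    return host, int(parts[4]) * 256 + int(parts[5])

# Nothing stops a client from pointing the server at some third host's
# port 80 and then issuing STOR, so the "file" bytes arrive at that host
# looking like a raw HTTP request (the classic FTP bounce attack):
print(encode_port_args("203.0.113.5", 80))
```

Many modern servers reject PORT targets that don't match the control connection's peer address for exactly this reason.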

It probably seemed like a good idea at the time, and there was no way to know the problems without trying it.

It also allows you to send a command on the control connection (ABOR) to cancel an ongoing transfer.


Well... RFC 959 specifically mentions using PORT to send a file to a line printer, so that seems intentional. [1]

For stream-mode STORe/RETRieve the data connection closes at EOF, so you could send the request but the response would be lost.

[1]

    It is possible for the user to specify an alternate data port by
    use of the PORT command.  The user may want a file dumped on a TAC
    line printer or retrieved from a third party host.


It also allowed for FXP (direct server-to-server transfers), which was a godsend in early pirating/warez days ;-)
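FXP works by putting one server in passive mode, then handing that server's address to a second server via PORT, so the data connection runs directly between the two servers while the client only drives the control channels. A sketch of the address plumbing (function names are mine; addresses made up):

```python
import re

def parse_pasv_reply(reply: str) -> tuple:
    """Extract (host, port) from a 227 reply like
    '227 Entering Passive Mode (198,51,100,7,39,16)'."""
    nums = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", reply).group(1).split(",")
    host = ".".join(nums[:4])
    return host, int(nums[4]) * 256 + int(nums[5])

def fxp_port_command(host: str, port: int) -> str:
    """Build the PORT command that points the second server at the first."""
    p1, p2 = divmod(port, 256)
    return "PORT " + ",".join(host.split(".") + [str(p1), str(p2)])

# FXP choreography (all over the two CONTROL connections; no data
# ever flows through the client):
#   client -> server A: PASV              -> A replies 227 (h1,...,p2)
#   client -> server B: PORT <A's addr>   -> B will connect to A for data
#   client -> server A: RETR file.bin
#   client -> server B: STOR file.bin
host, port = parse_pasv_reply("227 Entering Passive Mode (198,51,100,7,39,16)")
print(fxp_port_command(host, port))
```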

...or so I've heard


This could have been better accomplished by multiple connections to the same port.


A lot harder to implement QoS on that.


QoS didn’t exist when FTP was invented. The protocol was designed badly.



