
FTP Security


Infatuas

Recommended Posts

Hello All,

I use FTP quite frequently and am also familiar with SFTP through FileZilla. What is wrong with using a regular FTP server? I understand that FTP over SSL offers encryption where plain FTP does not, but what if the files you are transferring are not important? Is that the only problem? Or does opening port 21 suddenly make your operating system insecure and prone to more vulnerabilities compared to SFTP?

If someone has the answer to this it would be interesting to hear. BTW, I use inbound, outbound, and application-based firewall rules to secure connectivity to my systems from the WAN. This is mostly a curiosity question...

Thanks~


So let's take FileZilla for example, which has a self-contained username/password store. Assuming someone is actually able to sniff out the credentials, it certainly won't be from my LAN, which is to say they would have to get them from either my ISP or an ISP I am connected to remotely (more likely), then log in to my FTP server, which is on an open/close schedule based on my working hours. Could they traverse the file system using the FileZilla credentials, or drop in a self-executing virus/worm?


Or does opening port 21 suddenly make your operating system insecure and prone to more vulnerabilities compared to SFTP?

In theory, assuming there are exploitable bugs in all software, then by increasing the number of open services you are increasing your chances of being exploited. Of course this can be offset by using well-maintained mainstream packages (e.g. OpenSSH), which should have fewer bugs than exotic, poorly maintained packages.

From a security point of view, when using FTP:

  • Don't use the same credentials that you use for other systems.
  • If possible, use Kerberos for authentication rather than a username and password.
  • Limit the directories that can be accessed through FTP to only those you require.
  • Chroot the FTP daemon so that if someone does get access, they are limited in what they can access/do.
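The "limit the directories" point above is, at its core, a path-confinement check like the one a chrooted daemon enforces. A minimal sketch of the idea in Python (the `/srv/ftp` root is a hypothetical example, not from the post):

```python
import os.path

def is_within_root(root, requested):
    """Return True only if `requested` resolves to a path inside `root`.

    Guards against directory traversal (e.g. "../../etc/passwd"), the same
    confinement a chrooted FTP daemon gives a logged-in user.
    """
    root = os.path.realpath(root)
    target = os.path.realpath(os.path.join(root, requested))
    return target == root or target.startswith(root + os.sep)

# Hypothetical FTP root, purely for illustration.
FTP_ROOT = "/srv/ftp"

print(is_within_root(FTP_ROOT, "uploads/report.txt"))  # True: stays inside
print(is_within_root(FTP_ROOT, "../../etc/passwd"))    # False: escapes the root
```

A real chroot does this at the kernel level, so even a compromised daemon can't see outside its jail; the check above only illustrates the boundary being enforced.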


Or does opening port 21 suddenly make your operating system unsecure and prone to more vulnerabilities compared to SFTP?

It won't necessarily make your system insecure, but if a bot or a script kiddie finds port 21 open, they will try to brute-force their way in.

Changing the default port to something unused will slow their attempts down, but it won't prevent a future attack from occurring.
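Since moving the port only slows bots down, rate-limiting failed logins (fail2ban-style) is the usual complement. A minimal sketch of the idea; the threshold and lockout window are made-up values, not from any particular FTP daemon:

```python
import time
from collections import defaultdict

MAX_ATTEMPTS = 5        # made-up threshold for illustration
LOCKOUT_SECONDS = 600   # made-up lockout window

failed = defaultdict(list)  # ip -> timestamps of recent failed logins

def register_failure(ip, now=None):
    """Record one failed login attempt from `ip`."""
    failed[ip].append(now if now is not None else time.time())

def is_locked_out(ip, now=None):
    """True if `ip` has MAX_ATTEMPTS failures within the lockout window."""
    now = now if now is not None else time.time()
    recent = [t for t in failed[ip] if now - t < LOCKOUT_SECONDS]
    failed[ip] = recent  # prune stale entries as a side effect
    return len(recent) >= MAX_ATTEMPTS

for _ in range(5):
    register_failure("203.0.113.9", now=1000.0)
print(is_locked_out("203.0.113.9", now=1001.0))  # True: locked out after 5 failures
```

A real deployment would do this at the firewall (fail2ban watches the daemon's log and inserts drop rules), but the counting logic is essentially the above.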

Always remember to disable anonymous login and, if possible, use other means of authentication (e.g. Kerberos).

Keeping your software updated is also important; that will minimize the chances of someone exploiting your server.

If you are the only person using the FTP server, I would consider using OpenVPN to secure the connection rather than exposing the FTP server itself, which could be more vulnerable to attacks.


Personally, there are a few times that I like to use FTP over SFTP. For most file transfers I will use SSH (either sftp or scp). The few times I prefer FTP are when I have a lot of files being made available to the general public, or when I need to transfer large files to a low-powered system.

The making-files-available-to-the-public scenario more often falls under a web server's role now than an FTP server's, but if you do use an FTP server, chroot it into its own environment and really lock down which directories can be seen.

For transferring large files to a low-powered system, set up a write-only account and encrypt your backup files before transferring. That way, someone may be able to sniff your login credentials but they can't read back any of the files, and if they sniff the whole file while it is being transferred they would still have to break the encryption before they have anything. As always, really lock down the access that FTP user has so that they can't change to other directories and cause mischief if they do sniff your login details.

The transferring-large-files scenario is commonly seen when transferring backups from a server to a NAS box. The server has plenty of power for its side of a copy over SSH, but the NAS box doesn't have enough power to do the decryption on its side. I have seen a server only able to copy a file to a NAS box at about 2MB/s over sftp, while over ftp it could hit 10MB/s. When transferring a large backup file regularly, it made sense to encrypt on the local machine and then use FTP to transfer the file.
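To put those rates in perspective, a quick back-of-the-envelope calculation (the 20 GB backup size is a made-up figure; the 2 and 10 MB/s rates are from the post above):

```python
# Transfer rates reported in the post above.
SFTP_RATE_MBS = 2    # MB/s over sftp on the low-powered NAS
FTP_RATE_MBS = 10    # MB/s over plain ftp on the same box

BACKUP_MB = 20 * 1024  # hypothetical 20 GB nightly backup

sftp_hours = BACKUP_MB / SFTP_RATE_MBS / 3600
ftp_hours = BACKUP_MB / FTP_RATE_MBS / 3600

print(f"sftp: {sftp_hours:.1f} h, ftp: {ftp_hours:.1f} h")  # sftp: 2.8 h, ftp: 0.6 h
```

The encrypt-locally-then-FTP approach spends the CPU cost where there is power to spare (the server), while the NAS just writes bytes to disk.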

Edited by Jason Cooper

While FTP services are generally good for transferring both larger and smaller files, one could always use a web server to upload or download files via a web-based interface. A lot of free web hosting services provide this capability, and it's convenient because you don't have to set up or install any third-party software to interact with it.

If the web server allows it, you can access/upload/download all your information via HTTPS; this will prevent someone from sniffing your traffic and stealing your information or even your logon credentials.

Though HTTPS is good for encrypting the connection, you shouldn't rely on it entirely. You should always encrypt your information before it leaves your computer.
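A minimal sketch of that "HTTPS for transport, your own encryption at rest" combination, using only the standard library. The upload URL is hypothetical, and the local encryption step is only indicated in a comment (in practice something like `gpg --symmetric` would produce the payload):

```python
import urllib.request

# Assume the file was already encrypted locally, e.g. with
# `gpg --symmetric backup.tar`, and we are uploading the resulting blob.
payload = b"...already-encrypted bytes..."

req = urllib.request.Request(
    "https://files.example.com/upload",   # hypothetical endpoint
    data=payload,
    method="POST",
    headers={"Content-Type": "application/octet-stream"},
)

# The scheme is what guarantees the transport leg is encrypted:
print(req.full_url.startswith("https://"))  # True
# urllib.request.urlopen(req) would perform the actual upload.
```

Even if the TLS session were ever intercepted or terminated early (e.g. by a proxy), the payload itself stays unreadable because it was encrypted before it left the machine.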


Open Wireshark, then connect to your FTP server. You will see your username and password in plaintext. Use SFTP or SCP when transferring files to and from your web servers, and if possible, use public and private SSH keys with a compatible SCP client instead of just password-based SFTP/SCP, which helps add another layer of complexity. They would need your SSH keys to connect; without them, they could do nothing over SSH.
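What Wireshark shows is the FTP control channel, which is a line-based plaintext protocol: the client literally sends `USER` and `PASS` lines in the clear. A sketch of the exact bytes a sniffer would capture during login (the credentials are made up):

```python
# FTP authentication as it crosses the wire (RFC 959 control channel).
username, password = "alice", "s3cret"   # made-up credentials

login_exchange = (
    f"USER {username}\r\n"
    f"PASS {password}\r\n"
).encode("ascii")

# Anyone on the path sees the password verbatim -- no decoding needed.
print(b"PASS s3cret" in login_exchange)  # True
```

With SFTP/SCP the equivalent exchange happens inside the SSH transport, so a capture shows only ciphertext.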

Edited by digip

Thanks for all the info, guys; it makes much more sense now. The only discrepancy I have noticed is as follows.

Someone stated above the obvious logic that the more ports you open up, the more vulnerabilities you expose yourself to. That being said, when I configure a generic FTP server on port 21, I am usually going to use a static NAT with port mapping (e.g. (Outside TCP-2121) > (Inside TCP-21)) and that would be it. One port to open, granted there is no encryption. On the flip side, if I configure FileZilla for the recommended passive mode (even in a Windows environment), I would specify a passive port range of 50000-50100, but in doing so I am not only opening the ingress port for FTPS (port 990), I am also needing to open ports 50000-50100. Aren't I counteracting the balance?
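For context on why that whole range has to be open: in passive mode the server picks a data port from the configured range and advertises it in the `227` reply, encoded as two bytes (port = p1*256 + p2), so the client may connect to any port in the range. A quick sketch of decoding one such reply (the IP and port values are made up):

```python
import re

def parse_pasv(reply):
    """Decode a '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)' reply
    into (ip, port), per the RFC 959 passive-mode encoding."""
    nums = [int(n) for n in re.findall(r"\d+", reply)[1:]]  # skip the 227 code
    h1, h2, h3, h4, p1, p2 = nums
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

# Made-up example: port 50000 encodes as (195, 80) since 195*256 + 80 = 50000.
print(parse_pasv("227 Entering Passive Mode (192,0,2,10,195,80)"))
```

So each transfer only ever uses one data port at a time, but the firewall can't predict which one, which is why the full 50000-50100 range must be reachable.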



In order for FTP to work it needs both ports 20 and 21, though I'm not sure whether it will be hindered if you only forward port 21.


I know the ports... LOL, I don't think you read it all. No worries though; I have been convinced, and I configured passive FTP over TLS, which works great. The only thing that isn't too great is having to use a client from a Windows box to connect to it.



I was going to suggest the built-in command-line (DOS-style) FTP client if you are comfortable with a CLI, but I realized it does not support SSL/TLS. So it will have to be done via third-party software.


FTP daemons have vulns (as mentioned by Jason Cooper) and are sometimes configured improperly. The lack of encryption is also an obvious problem. My personal grief with FTP is that the default in many setups is to transfer in ASCII mode, which appends 0x0a after any 0x0d byte in a file. This behavior corrupts any file transferred without BINARY mode -- possibly the dumbest thing ever.
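The ASCII-mode corruption described above is easy to reproduce in miniature. This sketch applies the "insert 0x0a after each 0x0d" rewrite the post describes to a small arbitrary binary blob (real ASCII mode does full line-ending translation, but the damage to binary data is the same kind):

```python
def ascii_mode_transfer(data):
    """Simulate the FTP ASCII-mode rewrite described in the post above:
    a 0x0a byte is inserted after every 0x0d byte."""
    return data.replace(b"\r", b"\r\n")

# Arbitrary "binary" bytes that happen to contain 0x0d (CR).
blob = bytes([0x89, 0x50, 0x0D, 0x4E, 0x47, 0x0D, 0x0A])

print(ascii_mode_transfer(blob) == blob)  # False: the file is corrupted
print(len(ascii_mode_transfer(blob)))     # 9: grew from 7 bytes
```

Any format that treats 0x0d as data rather than a line ending (images, archives, compiled binaries) is silently mangled, which is why BINARY (IMAGE) mode must be selected for such files.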

