The term data compression refers to reducing the number of bits needed to store or transmit information. Compression can be lossless or lossy, depending on whether the data removed during compression is redundant or merely unnecessary. When the data is uncompressed afterwards, in the first case the information and its quality are identical to the original, while in the second case the quality is lower. Different compression algorithms are more efficient for different types of data. Compressing and uncompressing data often takes a lot of processing time, which means that the server performing the operation needs ample resources in order to process the data quickly enough. One example of how information can be compressed is to store how many consecutive positions in the binary code contain 1 and how many contain 0, instead of storing the actual 1s and 0s.
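The run-length idea described above can be sketched in a few lines of Python. This is a minimal illustration, not a production encoder; the function names are made up for this example:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Store each bit together with how many times it repeats consecutively."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as before: extend the current run instead of storing it again.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the stored counts."""
    return "".join(bit * count for bit, count in runs)

data = "1111100000001111"
encoded = rle_encode(data)
print(encoded)  # three runs instead of sixteen individual bits
# Lossless: decoding restores the original exactly.
assert rle_decode(encoded) == data
```

Because the counts are restored exactly, this scheme is lossless; it pays off when the data contains long runs of identical values and can actually inflate data that alternates rapidly.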
Data Compression in Hosting
The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. It is considerably faster than most other algorithms in use today, especially for compressing and uncompressing non-binary data such as web content. LZ4 even uncompresses data faster than it can be read from a hard disk, which improves the performance of the websites hosted on our ZFS-based platform. Because the algorithm compresses data very well and does so very quickly, we are able to generate several daily backups of all the content stored in the hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the servers where your content is kept.
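For reference, on a ZFS system LZ4 compression is enabled per dataset through the standard `zfs` administration commands; the pool and dataset names below are hypothetical examples:

```shell
# Enable LZ4 compression on a dataset (tank/webhosting is an example name)
zfs set compression=lz4 tank/webhosting

# Verify the property and check the compression ratio achieved so far
zfs get compression tank/webhosting
zfs get compressratio tank/webhosting
```

The `compressratio` property reports how much space compression is saving on data already written, which is useful for judging how well LZ4 suits a given workload.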