Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data takes up considerably less disk space than the original, so much more content can be kept in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so there is no loss of quality once the data is uncompressed, while lossy algorithms discard bits deemed unneeded, so uncompressing the data afterward yields lower quality than the original. Compressing and uncompressing content requires a significant amount of system resources, in particular CPU time, so any web hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of how data can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. to record how many consecutive 1s or 0s there are instead of storing the entire sequence.
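
The 111111 → 6x1 substitution described above is the idea behind run-length encoding. A minimal sketch in Python (illustrative only; the function names are our own, and real compressors such as LZ4 use far more sophisticated schemes):

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[int, str]]:
    """Collapse runs of identical characters into (count, char) pairs."""
    return [(len(list(group)), char) for char, group in groupby(bits)]

def rle_decode(pairs: list[tuple[int, str]]) -> str:
    """Expand (count, char) pairs back into the original string - lossless."""
    return "".join(char * count for count, char in pairs)

encoded = rle_encode("1111110000")
print(encoded)                               # [(6, '1'), (4, '0')]
print(rle_decode(encoded) == "1111110000")   # True
```

Because decoding reproduces the input exactly, no quality is lost; the saving comes purely from representing long runs compactly.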

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than most other algorithms, particularly at compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the servers where your content is stored.
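
For illustration, LZ4 compression on a ZFS dataset is controlled by a single property; the commands below are a sketch using a hypothetical pool/dataset name (`tank/www`), and newly set compression applies only to data written from that point on:

```shell
# Enable LZ4 compression on a dataset (hypothetical name tank/www)
zfs set compression=lz4 tank/www

# Inspect the setting and the achieved compression ratio
zfs get compression,compressratio tank/www
```

The `compressratio` property reports how much space the transparent compression is actually saving for the data stored in that dataset.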