Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data takes up considerably less disk space than the original, so additional content can be stored using the same amount of space. There are various compression algorithms that work in different ways: with many of them, only redundant bits are removed, so once the information is uncompressed, there is no loss of quality (lossless compression); others discard bits deemed unnecessary, so the uncompressed data is lower in quality than the original (lossy compression). Compressing and uncompressing content consumes significant system resources, particularly CPU processing time, so any Internet hosting platform that uses compression in real time must have ample power to support this feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 is substituted with 6x1, i.e. the number of consecutive 1s or 0s is "remembered" instead of the actual sequence being kept.
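To make the run-length idea above concrete, here is a minimal sketch in Python; the function names and the "6x1"-style text format are hypothetical, chosen purely for illustration:

    import itertools

    def rle_encode(bits: str) -> str:
        # Replace each run of identical characters with "<count>x<char>",
        # e.g. "111111" becomes "6x1".
        runs = []
        for char, group in itertools.groupby(bits):
            runs.append(f"{len(list(group))}x{char}")
        return ",".join(runs)

    def rle_decode(encoded: str) -> str:
        # Reverse the encoding: expand each "<count>x<char>" token.
        parts = []
        for token in encoded.split(","):
            count, char = token.split("x")
            parts.append(char * int(count))
        return "".join(parts)

    original = "111111000011"
    encoded = rle_encode(original)          # "6x1,4x0,2x1"
    assert rle_decode(encoded) == original  # lossless: round-trips exactly

Because no bits are discarded, this is a lossless scheme: the decoded output is always identical to the input, which is why such methods suit web content where exact reconstruction matters.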
Data Compression in Shared Web Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster than most alternative algorithms, particularly when compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data quicker than it can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backup copies of all the content kept in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the web servers where your content is stored.
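For anyone who would like to see LZ4's lossless round-trip behaviour first-hand, here is a small sketch in Python; it assumes the third-party python-lz4 package (installed with pip install lz4), which is an illustration of the algorithm in general and not part of our platform:

    import lz4.frame

    # Highly repetitive text, similar to typical HTML/CSS web content,
    # compresses very well with a lossless algorithm such as LZ4.
    original = b"<p>Hello, world!</p>\n" * 1000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original  # lossless: the data round-trips exactly
    print(len(original), "->", len(compressed), "bytes")

Running a snippet like this on repetitive text typically shows a large reduction in size, which is the same property that lets both your content and its daily backups occupy less space on our servers.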