Data compression reduces the number of bits needed to store or transmit information. The compressed data takes up considerably less disk space than the original, so far more content can fit in the same amount of storage. Many different compression algorithms exist, and they work in different ways: lossless algorithms remove only redundant bits, so when the data is uncompressed there is no loss of quality, while lossy algorithms discard bits deemed unnecessary, so uncompressing the data yields lower quality than the original. Compressing and uncompressing content takes a significant amount of system resources, particularly CPU time, so any web hosting platform that uses real-time compression must have enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. recording how many consecutive 1s or 0s there are instead of storing the entire sequence.
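The run-length idea at the end of the paragraph can be sketched in a few lines of Python. This is only an illustration of the concept, not any real codec; the function names `rle_encode` and `rle_decode` are made up for the example:

```python
import re

def rle_encode(bits: str) -> str:
    """Collapse runs of repeated characters into count-x-character pairs,
    e.g. "111111" -> "6x1"."""
    out = []
    run_char, run_len = bits[0], 1
    for ch in bits[1:]:
        if ch == run_char:
            run_len += 1
        else:
            out.append(f"{run_len}x{run_char}")
            run_char, run_len = ch, 1
    out.append(f"{run_len}x{run_char}")
    return "".join(out)

def rle_decode(encoded: str) -> str:
    """Expand count-x-character pairs back into the original sequence."""
    return "".join(ch * int(n) for n, ch in re.findall(r"(\d+)x(.)", encoded))
```

Because only redundant repetition is removed, decoding restores the input exactly; this is what makes the scheme lossless.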

Data Compression in Shared Website Hosting

The compression algorithm used by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can improve the performance of any site hosted in a shared website hosting account with us, since it not only compresses data more efficiently than the algorithms other file systems use, but also uncompresses data faster than a hard drive can read it. LZ4 achieves this by using a considerable amount of CPU time, which is not a problem for our platform because it runs on clusters of powerful servers working together. Another advantage of LZ4 is that it lets us generate backups faster and store them in less disk space, so we keep multiple daily backups of your files and databases, and creating them does not affect server performance. That way, we can always restore any content that you may have deleted by accident.
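The lossless round trip described above can be demonstrated in a few lines. LZ4 itself requires a third-party package in Python, so this stdlib-only sketch uses `zlib` instead; the point it illustrates, that compressed data shrinks substantially yet decompresses back byte for byte, applies to LZ4 as well:

```python
import zlib

# Repetitive data compresses well, much like typical site files and database dumps.
original = b"shared website hosting " * 1000

compressed = zlib.compress(original, level=6)
restored = zlib.decompress(compressed)

assert restored == original       # lossless: the data survives the round trip exactly
assert len(compressed) < len(original)  # the backup copy occupies far less disk space
```

The same property is what allows daily backups to be both smaller and faster to create than uncompressed copies.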