Data compression is the reduction in size of information by decreasing the number of bits that are stored or transmitted. Compressed data therefore occupies considerably less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so when the information is uncompressed there is no loss of quality (lossless compression). Others discard bits deemed unneeded, and uncompressing the data at a later time yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes considerable system resources, in particular CPU processing time, so any hosting platform that uses real-time compression must have sufficient power to support that feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the system "remembers" how many consecutive 1s or 0s there are instead of storing the entire sequence.
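To make the run-length idea from the paragraph above concrete, here is a minimal sketch in Python. The function names rle_encode and rle_decode are hypothetical helpers chosen for illustration; real compression algorithms are far more sophisticated, but the principle of replacing a run of identical bits with a count is the same.

```python
# Minimal run-length encoding sketch: "remember" how many consecutive
# identical bits appear instead of storing each bit individually.
from itertools import groupby

def rle_encode(bits: str) -> str:
    """Compress a bit string such as '111111' into '6x1'."""
    return ",".join(f"{len(list(run))}x{bit}" for bit, run in groupby(bits))

def rle_decode(encoded: str) -> str:
    """Restore the original bit string from its run-length form."""
    return "".join(bit * int(count)
                   for count, bit in (pair.split("x") for pair in encoded.split(",")))

print(rle_encode("111111"))         # prints: 6x1
print(rle_encode("1111110000011"))  # prints: 6x1,5x0,2x1

# Lossless: decoding returns exactly the original bits.
assert rle_decode(rle_encode("1111110000011")) == "1111110000011"
```

Note that this only pays off when the data actually contains long runs; for bits that alternate frequently, the "count x bit" pairs would take more space than the raw sequence, which is why practical algorithms combine several techniques.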

Data Compression in Shared Hosting

The compression algorithm employed by the ZFS file system, which runs on our cloud web hosting platform, is called LZ4. It can boost the performance of any website hosted in a shared hosting account on our end, since not only does it compress data more effectively than the algorithms used by other file systems, but it also uncompresses data faster than a hard drive can read it. This comes at the cost of a lot of CPU processing time, which is not a problem for our platform because it uses clusters of powerful servers working together. A further advantage of LZ4 is that it enables us to create backups faster and on less disk space, so we keep a couple of daily backups of your databases and files, and generating them does not affect the performance of the servers. That way, we can always restore any content that you may have erased by mistake.
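As an illustration of the lossless round trip described above, here is a short Python sketch using the third-party lz4 package (an assumption: it must be installed first, e.g. with pip install lz4). It is not our platform's ZFS integration, only a demonstration that LZ4 compresses repetitive data well and restores it bit for bit.

```python
# Sketch of an LZ4 compress/decompress round trip, assuming the
# third-party "lz4" package is available (pip install lz4).
import lz4.frame

# Repetitive content, like much web data, compresses very well.
original = b"All work and no play makes Jack a dull boy. " * 1000

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

assert restored == original  # lossless: no quality loss after uncompressing
print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(original) / len(compressed):.1f}x")
```

The same trade-off mentioned earlier applies here: the speed of the round trip depends on available CPU time, which is why LZ4 suits a platform with ample processing power.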