Data compression is the process of encoding information using fewer bits than the original representation, so that it can be stored or transmitted more efficiently. Compressed data occupies less disk space than the original, which means that more content can fit in the same amount of storage. There are many compression algorithms that work in different ways. With lossless algorithms, only redundant bits are removed, so when the data is decompressed there is no loss of quality. Lossy algorithms discard bits deemed unnecessary, so decompressing the data afterwards yields lower quality than the original. Compressing and decompressing content consumes considerable system resources, particularly CPU time, so any web hosting platform that employs real-time compression needs ample processing power to support the feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
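The 111111-to-6x1 idea above is known as run-length encoding. A minimal sketch in Python might look like this (the function names are illustrative, not from any particular library):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a binary string: '111111' -> '6x1', '0011' -> '2x0,2x1'."""
    if not bits:
        return ""
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1          # extend the current run
        else:
            runs.append(f"{count}x{current}")
            current, count = b, 1
    runs.append(f"{count}x{current}")  # flush the final run
    return ",".join(runs)


def rle_decode(encoded: str) -> str:
    """Reverse the encoding: '6x1' -> '111111' (lossless round trip)."""
    if not encoded:
        return ""
    return "".join(
        bit * int(count)
        for count, bit in (part.split("x") for part in encoded.split(","))
    )
```

Because nothing is discarded, decoding always reproduces the exact original sequence, which is what makes this a lossless scheme.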
Data Compression in Cloud Hosting
The compression algorithm we use on the cloud hosting platform where your new cloud hosting account will be created is called LZ4, and it is employed by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms used by other file systems: its compression ratio is higher and it processes data much faster. The speed advantage is most noticeable during decompression, which can happen faster than data can be read from a hard disk drive. As a result, LZ4 improves the performance of every site hosted on a server that uses this algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate multiple daily backups of the entire content of all accounts and keep them for thirty days. Not only do the backups take up less space, but generating them doesn't slow the servers down, as often happens with other file systems.
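The lossless round trip and the space savings described above can be illustrated with a short sketch. LZ4 itself requires a third-party binding in Python, so this example uses the standard-library zlib module as a stand-in; the principle (compress, store fewer bytes, decompress back to the identical original) is the same one ZFS applies transparently with LZ4:

```python
import zlib

# Highly repetitive data compresses well, much like typical website content.
original = b"The quick brown fox jumps over the lazy dog. " * 100

compressed = zlib.compress(original, level=6)
restored = zlib.decompress(compressed)

# Lossless: the restored data is bit-for-bit identical to the original.
assert restored == original

ratio = len(original) / len(compressed)
print(f"original: {len(original)} bytes, "
      f"compressed: {len(compressed)} bytes, ratio: {ratio:.1f}x")
```

On a ZFS system, this happens automatically at the file-system level, so neither sites nor backup jobs need to do anything to benefit from it.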