Data compression is the encoding of information using fewer bits than the original representation, whether the data is stored or transmitted. Compressed content therefore occupies less disk space than the original, so more content can fit in the same amount of storage. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so there is no loss of quality when the data is uncompressed; this is known as lossless compression. Others discard bits considered less important, and uncompressing such data later yields lower quality than the original; this is lossy compression. Compressing and uncompressing content consumes considerable system resources, CPU processing time in particular, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is run-length encoding: a binary sequence such as 111111 is replaced with 6x1, i.e. the encoder records how many consecutive 1s or 0s occur instead of storing the bits themselves, as the sketch below shows.
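As a minimal sketch of that run-length idea, the Python snippet below encodes a bit string into count/value pairs and decodes it back. The function names and the comma-separated output format are illustrative choices for this example, not part of any standard:

    def rle_encode(bits):
        # Collapse each run of identical bits into "<count>x<bit>", e.g. "111111" -> "6x1".
        out = []
        i = 0
        while i < len(bits):
            j = i
            while j < len(bits) and bits[j] == bits[i]:
                j += 1
            out.append(f"{j - i}x{bits[i]}")
            i = j
        return ",".join(out)

    def rle_decode(encoded):
        # Reverse the mapping: "6x1,3x0" -> "111111000".
        return "".join(int(count) * bit
                       for count, bit in (pair.split("x") for pair in encoded.split(",")))

    print(rle_encode("1111110001"))   # 6x1,3x0,1x1
    print(rle_decode("6x1,3x0,1x1"))  # 1111110001

Because decoding restores every bit exactly, this is a lossless scheme: it pays off on data with long runs of repeated values and saves nothing on data without them.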

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm known as LZ4. LZ4 is considerably faster than most comparable algorithms at similar compression ratios, particularly when compressing and uncompressing non-binary data such as web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backup copies of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, generating the backups does not affect the performance of the hosting servers where your content is kept.
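As a rough illustration of what LZ4 does (outside the file system, where it runs transparently), the sketch below compresses and restores a block of repetitive web-like content. It uses the third-party lz4 Python package, which is an assumption made for this example and is not how the platform itself applies compression:

    import lz4.frame  # third-party package, installable with "pip install lz4"

    # Repetitive, text-like content such as HTML compresses especially well.
    original = b"<p>Hello, world!</p>\n" * 1000

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original  # lossless: the round trip is bit-for-bit identical
    print(f"{len(original):,} bytes -> {len(compressed):,} bytes")

    # On ZFS itself, compression is enabled per dataset, for example:
    #   zfs set compression=lz4 <pool>/<dataset>

The trade-off LZ4 makes is exactly the one that matters for real-time hosting workloads: it favors very high compression and decompression speed over squeezing out the smallest possible output.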