Real-Time Compression: Why Is It Important?

Real-time compression matters because it lets an organization understand the data it is collecting as it arrives and make decisions immediately.

The volume of task-essential data that businesses must be able to access instantly is enormous and growing daily. The Web itself is becoming real time: real-time analytics, real-time ad serving, real-time everything.


The rapid growth of data is the problem. Even during 2009's "Great Recession", the amount of digital information grew 62% over 2008, to 800 billion gigabytes. What is critical to realize is that 35% more digital information is created today than there is capacity to store it, and that gap is projected to exceed 60% over the next several years.

Why is Real-Time Compression Important? [Infographic]


Design by Infographic World


  • David Floyer

    Real-time data compression should be considered a standard requirement for storage equipment.

  • Chris DUrso

    In my work on algorithms for information retrieval, I try to avoid the word "compression" and frequently use "packing" instead, because compression implies that the data must be decompressed before it is useful, which distracts from my most common purpose for creating highly dense data: speed. People naturally think "slow" when they think of compression, but when properly applied, with the appropriate algorithms, "densified data" can be orders of magnitude faster than equivalent algorithms working on the identical data in its uncompressed state. This applies not only to loading from permanent storage or the network (think of 5:1 compression meaning 1/5 the network or disk I/O saturation); careful structure also allows greater utilization of L1/L2 caching, giving you greatly reduced latency (potentially 500:1) and lower memory-bus saturation.

    Considering algorithm-specific packing also breaks the mental hurdle of conventional word boundaries and structures (at 32/64/128 bits), frequently allowing co-mingling of relational data, which in turn allows frequent same- or adjacent-cache-line access (data that is already loaded in L1/L2), improving overall performance. Over the past decade, for many of the problems I work on, memory has gotten extremely cheap, but with it the number of CPU cores has kept increasing, so packing becomes a relatively larger part of the game because of the diminishing share of the memory bus each core must compete for.

    One little mind-bender: one of the regular tools I use for algorithm development and packing is simple old LZW. As the indexed and packed data approaches, meets, or beats the density of the LZ-compressed form of the raw data, I might be approaching the optimal form and algorithm. In other words, "dense data is dense data": a transformation made for speed of access generally need not change the data's size. Indexing is usually thought to take up space, but that is not necessarily true.
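The "density check" described above can be sketched as follows. Python's standard library has no LZW codec, so zlib's DEFLATE (an LZ77-based relative) stands in for LZW here; this is an illustration of the idea, not the commenter's actual tooling.

```python
# Density check: if a purpose-built packed layout is about as small as a
# general LZ codec's output on the raw form, the layout is probably close
# to the data's information density ("dense data is dense data").
import random
import zlib

random.seed(0)
# Raw form: one 4-bit value stored per whole byte (wasteful layout).
raw = bytes(random.randrange(16) for _ in range(4096))
# Packed form: two 4-bit values per byte, half the size, still random-access.
packed = bytes((raw[i] << 4) | raw[i + 1] for i in range(0, len(raw), 2))

lz_raw = zlib.compress(raw, 9)        # LZ codec applied to the raw layout
lz_packed = zlib.compress(packed, 9)  # LZ gains little on already-dense data

print(len(raw), len(packed), len(lz_raw), len(lz_packed))
```

On this synthetic data the packed form lands near the size the LZ codec achieves on the raw form, while compressing the packed form yields almost nothing further, which is the signal that the layout is already near optimal density.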
