The sense of touch is often crucial for humans performing manipulation tasks, and providing tactile feedback during teleoperation or for users of prosthetic devices would therefore be beneficial. However, representing tactile information poses a major technical challenge, since the numerous and possibly multimodal sensor readings are massive compared to what available tactile display technology can render. We introduce an algorithm that applies two stages of K-means clustering, along and across the tactile image frames that render the tactile sensor information at each time instant. In this manner, the massive tactile information is adaptively compressed in real time while preserving its physical meaning, and thus remains intuitive and direct. We experimentally verify and examine the characteristics of our algorithm by evaluating the original and compressed tactile data. The data was gathered during the active tactile exploration of several objects of daily living by an Allegro robot hand covered with 15 uSkin sensor modules, providing 240 three-axis force vector measurements at each time instant. Our novel algorithm is straightforward enough to be implemented in tactile feedback systems. Finally, it allows for the direct feedback of massive tactile sensor data for a broad variety of tactile sensors and tactile displays, thereby enabling a compressed yet intuitive representation of massive tactile sensor information in real-time applications.
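The two-stage clustering idea can be illustrated with a minimal sketch. This is not the authors' implementation; the cluster counts (`k_space`, `k_time`), the frame shape (240 taxels × 3 force axes), and the plain K-means routine are illustrative assumptions. Stage one clusters the force vectors within each frame into a few spatial centroids; stage two clusters the resulting per-frame summaries across time.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain K-means on the rows of X; returns (labels, centroids).
    A toy stand-in for any K-means implementation."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster is empty.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def compress_frames(frames, k_space=8, k_time=4):
    """Hypothetical two-stage compression: spatial K-means within each
    frame, then temporal K-means across the per-frame summaries."""
    summaries = []
    for frame in frames:                  # frame: (taxels, 3) force vectors
        _, cent = kmeans(frame, k_space)  # stage 1: within-frame clusters
        summaries.append(cent.ravel())    # summary: k_space * 3 values
    S = np.asarray(summaries)
    labels, reps = kmeans(S, k_time)      # stage 2: across-frame clusters
    return labels, reps.reshape(k_time, k_space, 3)

# Illustrative data: 50 frames of 240 three-axis force readings.
frames = np.random.default_rng(1).normal(size=(50, 240, 3))
labels, reps = compress_frames(frames)
print(labels.shape, reps.shape)  # → (50,) (4, 8, 3)
```

Each time step is then represented by one of `k_time` representative frames of `k_space` force vectors, a fixed small budget a tactile display can render regardless of the raw sensor count.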