The Smart Data Logger: GBs to KBs
Updated: Jun 24, 2021
In today’s era of the internet, ‘the cloud’ is no alien term. Much of the population that uses the internet depends on the cloud to keep its data safe. With technology advancing in a way where everything has become intelligent and relies on the internet to function properly, the automotive industry has a lot to add as well.
Vehicles have evolved to become smarter and more sophisticated, and with that, they support very advanced data collection. This extensive collection of data helps to improve the functionality and performance of your vehicle. The collected data is streamed to the cloud in real time, to be later pulled up and analysed to find the causes behind a particular error or behaviour. The whole automotive industry follows this practice, and especially since EVs started to hit the market, the process has become truly real-time.
EVs are brilliant vehicles capable of relaying data from each ECU to the cloud in real time. Data like the SOC status, the availability of the nearest charging station, and much more can be easily monitored. Thus, every vehicle sends out very high-frequency data, which is then directed to the concerned department for processing. Finally, the actual data is analysed, and the needed data string (the one which carries the error information) is extracted.
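The idea of extracting only the error-carrying string from a high-frequency stream can be sketched as a simple filter. This is a minimal illustration, not a real vehicle protocol: the message fields, the `"dtc"` signal name, and the fault-code convention are all assumptions made for the example.

```python
# Hypothetical sketch: filter a high-frequency ECU stream so that only
# frames carrying error information (diagnostic trouble codes) survive.
# Field names and the DTC convention are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EcuMessage:
    ecu_id: str    # which ECU produced the frame
    signal: str    # decoded signal name, e.g. "soc" or "dtc"
    value: float   # decoded signal value

def extract_error_frames(messages):
    """Keep only DTC frames whose code is non-zero (i.e. an actual fault)."""
    return [m for m in messages if m.signal == "dtc" and m.value != 0]

stream = [
    EcuMessage("battery", "soc", 81.5),
    EcuMessage("battery", "dtc", 0),        # no fault: not worth uploading
    EcuMessage("inverter", "dtc", 0x4201),  # fault code: upload this one
    EcuMessage("battery", "soc", 81.4),
]
errors = extract_error_frames(stream)
print(len(errors))  # only the faulty frame survives the filter
```

In practice this filtering happens server-side after the full stream has been uploaded, which is exactly the inefficiency the rest of the article questions.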
So what is the concern, one may ask? While this process might seem simple, the data collection and analysis process, unfortunately, is not that easy, and probably not even real-time, as claimed. What if there were no, or only a poor, internet connection? Where would that data go? Modern studies suggest that 100s of KBs of data are generated per second, which adds up to 10s of GBs per month, per vehicle. Imagine the amount of data all the vehicles are collectively sending out to the cloud. This is already a topic of concern for many.
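A back-of-the-envelope check shows how those per-second figures add up to the monthly total. The two hours of logging per day is our own assumption for the sake of the arithmetic, not a figure from the article.

```python
# Rough arithmetic: how "100s of KBs/sec" becomes "10s of GBs per month".
# The daily logging duration is an assumed usage pattern.
KB_PER_SEC = 100        # take 100 KB/s as the low end of "100s of KBs/sec"
HOURS_PER_DAY = 2       # assumed driving/logging time per day
DAYS_PER_MONTH = 30

kb_per_month = KB_PER_SEC * 3600 * HOURS_PER_DAY * DAYS_PER_MONTH
gb_per_month = kb_per_month / 1024 / 1024
print(round(gb_per_month, 1))  # ≈ 20.6 GB per month, per vehicle
```

Even at this conservative low end, a fleet of just a thousand vehicles would push tens of terabytes to the cloud every month.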
Though the cloud is like an ocean that can store vast amounts of data, it is a huge technological challenge to store and maintain these data centres globally. Considering the amount of data everyone is dumping onto it, the limit seems near. Not to mention the bandwidth dependency and the time that goes into it. And what about the expenses involved in executing this entire process? Data is not cheap, and neither is storage space.
What could be a possible alternative?
A device, or data logger, intelligent enough to avoid all this. At Influx, we dare to question the existence of such a technology that may save the cloud from a space crisis. As a solution, what if a device could work even in the most remote areas and still collect all the data, process it locally onboard in real time, and send only the required data string to the cloud, and that too with minimal data usage?
How would it help?
By considerably reducing the size of the analysed data transferred to the cloud every second and making analysis more accessible. It enables decisions to be taken instantly instead of waiting for data upload, processing and analysis. GBs of raw data can be reduced to a few KBs of meaningful data and successfully transferred to the cloud even with the minimum bandwidth available.
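The reduction described above can be sketched as local summarisation: instead of uploading every raw sample, the logger summarises a window onboard and uploads only the summary plus any out-of-range readings. The signal name, the window size, and the anomaly threshold below are illustrative assumptions.

```python
# Minimal sketch of onboard data reduction, assuming a battery-temperature
# signal, a 1000-sample window, and a 33 °C anomaly threshold (all
# illustrative). Only the compact summary would be uploaded.
import json

# 1000 raw samples with a single temperature spike at t = 500.
raw_window = [{"t": i, "batt_temp_c": 30 + (5 if i == 500 else 0)}
              for i in range(1000)]

summary = {
    "n": len(raw_window),
    "min": min(s["batt_temp_c"] for s in raw_window),
    "max": max(s["batt_temp_c"] for s in raw_window),
    # Keep full detail only for readings that cross the threshold.
    "anomalies": [s for s in raw_window if s["batt_temp_c"] > 33],
}

raw_bytes = len(json.dumps(raw_window).encode())
summary_bytes = len(json.dumps(summary).encode())
print(raw_bytes, summary_bytes)  # the summary is a small fraction of the raw size
```

The same principle, applied across hundreds of signals and much longer windows, is what turns GBs of raw logging into KBs of meaningful uploads.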