Could AI Be Used for Extreme File Compression?
I probably have a super limited understanding of AI, but here’s a thought:
From what I know, training an AI model works roughly like this:
• You feed in a massive dataset.
• An algorithm processes it and builds a neural network that can later produce outputs similar to the training data.
Isn’t that basically a form of compression?
For example, training a model might require hundreds of terabytes of data, but the final trained model could be just a few hundred gigabytes. So could the same concept be applied to normal file compression?
Let’s say I have a 1GB file. Could an AI “compress” it into a tiny neural network and later reconstruct it perfectly when needed? Would that work for general files, or are there limits to how much AI can compress data without loss?
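To make it concrete, here's a toy sketch of what I'm imagining (assuming PyTorch; the "file" is just random stand-in bytes, and the network size is made up for illustration). Train a tiny network to map byte position → byte value, so the weights themselves become the "compressed" file:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
data = torch.randint(0, 256, (4096,))                 # stand-in for a real file's bytes
positions = torch.linspace(-1, 1, len(data)).unsqueeze(1)
targets = data.float().unsqueeze(1) / 255.0           # normalize bytes to [0, 1]

model = nn.Sequential(                                # the weights would be the "archive"
    nn.Linear(1, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),
)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):                              # "compression" = training
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(positions), targets)
    loss.backward()
    opt.step()

# "Decompression": query the network at every position, round back to bytes.
with torch.no_grad():
    recon = (model(positions) * 255).round().clamp(0, 255).long().squeeze(1)
print("fraction of bytes recovered exactly:", (recon == data).float().mean().item())
```

On random bytes like these the reconstruction won't be exact, and the weights here actually take up more space than the 4096-byte "file", so maybe that already hints at a limit? I'd expect it to do better on real files that contain patterns.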
Even if it can’t be perfectly lossless, could AI still compress a file in a lossy way?
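Here's the lossy version I'm picturing, again just a rough sketch with made-up toy data: an autoencoder that squeezes the input through a narrow bottleneck, so you'd only store the small code (plus the decoder) and accept an approximate reconstruction:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.rand(256, 32)                          # 256 toy "files" of 32 values each

encoder = nn.Linear(32, 4)                       # squeeze 32 numbers into a 4-number code
decoder = nn.Linear(4, 32)

opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(x)), x)   # reconstruction error
    loss.backward()
    opt.step()

code = encoder(x)                                # what you'd store: 8x fewer numbers
print("avg reconstruction error:", loss.item())  # nonzero: lossy by design
```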
Thank you in advance