Highest Compression Ratio help
Posted 10 July 2012 - 12:32 PM
First off let me start by asking this.
Have any of you ever downloaded a zip or rar file that was like 5 MB, and then you uncompressed it and it unpacked to like 200 MB? I have, I've seen it.
Personally, the highest compression I have ever seen in my life was a 740 KB file that uncompressed to 538 MB. I've never understood what tools were used to do this.
So my question is: what are people using to compress files down to a few MB when, unpacked, they are hundreds of times their compressed size?
I've tried 7-Zip with all kinds of different settings and I've never been able to recreate such unbelievable compression.
The best I have ever achieved was 6.28 MB from a file that was originally 27.4 MB.
Any advice on programs or methods to compress files to exaggerated sizes would be greatly appreciated.
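For what it's worth, those extreme ratios usually come from the data itself rather than from a secret tool. A quick sketch with Python's standard `lzma` module (the same LZMA algorithm family 7-Zip uses) shows that highly redundant data, such as a buffer of zeros, shrinks by a factor of thousands:

```python
import lzma

# 64 MB of zeros, standing in for an "empty" disk image or other
# highly redundant data
data = b"\x00" * (64 * 1024 * 1024)

packed = lzma.compress(data)
print(f"{len(data)} -> {len(packed)} bytes, "
      f"ratio about {len(data) // len(packed)}:1")
```

A 740 KB archive unpacking to 538 MB is entirely plausible for data like this; the same settings applied to ordinary files will never come close.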
Posted 10 July 2012 - 12:38 PM
Posted 10 July 2012 - 12:56 PM
There has to be some set of rules or parameters that are used to compress files to roughly 1000 times smaller than the source. The file I saw that unpacked to 5XX MB was made with 7-Zip; the extension was .7z. I've always used 7-Zip, I find it has the best compression ratio, but I've never been able to recreate these massive compressions that I have seen.
I've looked around and tried changing the dictionary size, switching to LZMA2, and so on, and I've never gotten anything under 50% of the original size. There has to be something I'm missing.
You see it so often with ISO files: 700 MB compressed down to 50, 60, 70, 100 MB. There has to be some method being used to super-compress these files.
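For files with repeats that are far apart, the dictionary size really does matter: the compressor can only match against data still inside its window. A rough sketch with Python's `lzma` module (the file sizes here are arbitrary, chosen just for the demo):

```python
import lzma
import os

block = os.urandom(256 * 1024)   # 256 KB of random bytes
data = block * 16                # 4 MB: the same block repeated 16 times

def pack(dict_size):
    # One LZMA2 filter with an explicit dictionary (window) size
    filters = [{"id": lzma.FILTER_LZMA2, "dict_size": dict_size}]
    return lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)

small = pack(64 * 1024)          # 64 KB window: the repeats are out of reach
large = pack(8 * 1024 * 1024)    # 8 MB window: every repeat is found
print(len(small), len(large))    # the large-window result is far smaller
```

The "Dictionary size" setting in 7-Zip's GUI is the same knob; on big ISOs with duplicate files scattered throughout, a larger dictionary can make a dramatic difference.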
Posted 10 July 2012 - 01:28 PM
Use 7-Zip if you want the highest level of compression.
Or a "tighter" compressor (the PAQ8 family, WinRK, or NanoZip).
No, there is NO magic app that does miracles; there are good compressors, better compressors and worse compressors, but that's all.
There is a trade-off between compression time, decompression time and compression ratio (and memory used/needed). Most common compressors (and decompressors) tend to be more or less "symmetrical" and "efficient enough"; if you have "infinite time" to compress (and also to decompress), you can get "tighter" compression.
From time to time someone comes out with a "miracle" compressor that, as soon as it is tested, is found to be a joke.
If you have exceptional results, it means that your source is a very compressible one, like - as -X- pointed out - something with lots of 00's or otherwise repeated data; a good example is empty disk images.
And take some time studying the basics.
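The flip side is easy to check: data that is already random (or already compressed) cannot be squeezed at all, whatever the settings. A minimal sketch:

```python
import lzma
import os

random_data = os.urandom(1024 * 1024)          # 1 MB of pure noise
packed = lzma.compress(random_data, preset=9)  # even the slowest preset
print(len(random_data), len(packed))           # typically slightly larger
```

This is why a "miracle" compressor claiming to shrink arbitrary data is always a joke: for every input it shrinks, some other input must grow.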
Posted 10 July 2012 - 02:21 PM
It is more common now, but if you uncompress the video stream from a movie, you get roughly RESx * RESy * 3 (bytes per pixel at the usual 24-bit color depth) * ips (images per second) * duration (in seconds) bytes.
A 90-minute 1080p video from a Blu-ray at 30 fps would take, uncompressed, about 1920 * 1080 * 3 * 30 * 5400 = roughly 1 TB (a little too much for most people).
That proves that an appropriate compressor can work wonders on some files but achieve nothing great on others.
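Working that raw-size arithmetic through (24-bit color is 3 bytes per pixel, and 90 minutes is 5400 seconds):

```python
# Uncompressed size of a 90-minute 1080p movie at 30 fps, 24-bit color
width, height = 1920, 1080
bytes_per_pixel = 3        # 24 bits of color = 3 bytes
fps = 30
seconds = 90 * 60          # 90 minutes = 5400 seconds

total = width * height * bytes_per_pixel * fps * seconds
print(f"{total / 1e12:.2f} TB")  # about 1 TB
```

A Blu-ray squeezes that into a few tens of GB, which is why video codecs (lossy, domain-specific compressors) beat any general-purpose archiver on video.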
Some intelligent file formats allow more than that: vector image formats are usually a lot smaller than any lossy or lossless raster format, and a lot better, but your computer needs to compute each point to render the image every time you want to look at it. Imagine you wanted to watch Star Wars with its digitized scenes, but your player/computer had to render each picture before displaying it. You would need enormous computing power to see it at the right frame rate.
Edit reason: added the following:
I played a lot with WinPE and BartPE at one time and created many ISOs that I would store to keep a history. One year ago I had free time and decided to clean up the 17 GB of PE ISOs. I first tried to compress them all with 7-Zip and got a 5 GB .7z, which was too big to suit my needs.
So I extracted all the files properly, created a folder for each ISO, and wrote a batch file to recreate each ISO afterwards. Then I created a single ISO using the option to store duplicate files only once, and got a 1.5 GB file that I could compress with 7-Zip down to 717 MB.
Of course the process won't apply to all files, but it shows how important it is to use the right method to compress/store data efficiently.
Perhaps some day developers will find or use tweaks and tricks like this to create very small .exe and .dll files.
Edited by allen2, 10 July 2012 - 02:55 PM.