Highest Compression Ratio help


anthonyaudi

Recommended Posts

Hello guys, so I've been reading around, and it seems like this question has been asked a million times without ever getting a definitive answer.

First off let me start by asking this.

Have any of you ever downloaded a zip or rar file that was about 5 MB, and when you uncompressed it, it unpacked to around 200 MB? I have, I've seen it.

Personally, the highest compression I have ever seen in my life was a 740 KB file that uncompressed to 538 MB. I've never understood what tools were used to do this.

So my question is: what are people using to compress files down to a few MB when, unpacked, they are hundreds of times their compressed size?

I've tried 7-Zip with all kinds of different setting combinations, and I've never been able to recreate that kind of unbelievable compression.

The best I have achieved was 6.28 MB from a file that was originally 27.4 MB.

Any advice on programs or methods to achieve these extreme compression ratios would be greatly appreciated.

Thanks!



It's the source file that allows that level of compression. Lots of "white space" or something. Use 7-Zip if you want the highest level of compression.
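That point is easy to demonstrate: the achievable ratio is a property of the data, not the tool. A minimal sketch using Python's built-in zlib (the exact output sizes are illustrative, but the contrast is real):

```python
import os
import zlib

# 1 MB of zeros (like an empty disk image) versus 1 MB of random bytes.
zeros = b"\x00" * 1_000_000
noise = os.urandom(1_000_000)

packed_zeros = zlib.compress(zeros, level=9)
packed_noise = zlib.compress(noise, level=9)

print(len(packed_zeros))  # around 1 KB: roughly a 1000:1 ratio
print(len(packed_noise))  # slightly larger than the input: no gain at all
```

Random data is incompressible by definition, while long runs of identical bytes collapse almost to nothing, which is exactly the "white space" effect described above.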


But what I don't understand is how, with the same 50 MB file, some people can compress it to 1 MB, some to 5, and others to 30. Sometimes the compression is almost NIL and the compressed file is nearly the same size as the original.

There has to be some set of rules or parameters used to compress files to 1000 times smaller than the source. The file I saw that unpacked to 5XX MB was made with 7-Zip; the extension was .7z. I've always used 7-Zip and find it has the best compression ratio, but I've never been able to recreate these massive compressions that I have seen.

I've looked around and tried changing the dictionary size, switching to LZMA2, and so on, but I've never been able to get below 50% of the original size. There has to be something I'm missing.

You see it so often with ISO files: 700 MB compressed down to 50, 60, 70, or 100 MB. There has to be some method being used to super-compress these files.


Use 7-Zip if you want the highest level of compression.

Or a "tighter" compressor. :whistle:

(PAQ8 family or WinRK or nanozip)

@anthonyaudi

NO, there is NO magic app that does miracles :no: ; there are good compressors, better compressors and worse compressors, but that's all.

There is a trade-off between compression time, decompression time and compression ratio (and the memory used/needed). Most common compressors (and decompressors) tend to be more or less "symmetrical" and "efficient enough"; if you have "infinite time" to compress (and also to decompress), you can get "tighter" compression.
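The time/ratio trade-off is visible even with zlib's nine levels (7-Zip/LZMA behaves the same way, just at a larger scale); a quick sketch with made-up sample data:

```python
import time
import zlib

# Mildly repetitive sample data; real files fall somewhere in between
# fully redundant and fully random.
data = b"some mildly repetitive sample text, " * 30_000

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(data, level=level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(packed):>7} bytes in {elapsed:.4f} s")
```

Higher levels spend more CPU time searching for matches in exchange for a smaller output; which point on that curve is "best" depends entirely on how often you compress versus decompress.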

From time to time someone comes out with a "miracle" compressor that, as soon as it is tested, is found to be a joke.

If you get exceptional results, it means that your source is a very compressible one, i.e. - as -X- pointed out - something with lots of 00's or otherwise repeated data; a good example is empty disk images.

Go here:

http://www.maximumcompression.com/index.html

and take some time studying the basics :rolleyes: .

jaclaz


There are lossy compressors used for multimedia files. In its prime, MP3 for audio was miraculous: it could compress 700 MB (a CD) to 50 MB or less, but with a loss of quality (that most people could not hear).

Now it is more common, but if you uncompress the video from a movie, you're likely to get something on the order of RESx * RESy * 3 (bytes for the usual 24-bit colour depth) * duration (in seconds) * fps (frames per second) bytes.

A 90-minute video from a Blu-ray, uncompressed, would take about 1920 * 1080 * 3 * 30 * 90 * 60 = roughly 1 TB (a little too much for most people).
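The raw-size formula can be checked in a few lines of Python (note that 24 bits of colour is 3 bytes per pixel, and 90 minutes is 5,400 seconds):

```python
# Raw size of a 90-minute 1080p video at 30 fps with 24-bit colour.
width, height = 1920, 1080
bytes_per_pixel = 24 // 8      # 24 bits of colour = 3 bytes per pixel
fps = 30
duration_s = 90 * 60           # 90 minutes = 5,400 seconds

total_bytes = width * height * bytes_per_pixel * fps * duration_s
print(total_bytes)             # 1_007_769_600_000, i.e. about 1 TB
```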

That proves that an appropriate compressor can work wonders on some files but achieve nothing great on others.

Some intelligent file formats allow even more than that: vector image formats are usually far smaller than any lossy or lossless raster format, and often look better, but your computer has to compute each point to render the image every time you want to look at it. Imagine you wanted to watch Star Wars with its digitized scenes, but your player/computer had to render each picture before displaying it. You would need enormous computing power to see it at the right frame rate.

Edit reason: added the following:

I played a lot with WinPE and BartPE at one time and created many ISOs that I stored to keep a history. A year ago I had some free time and decided to clean up the 17 GB of PE ISOs. I first tried to compress them all with 7-Zip and got a 5 GB .7z file, which was too big to suit my needs.

So I extracted all the files, created a folder for each ISO, and wrote a batch file to recreate each ISO afterwards. Then I created a single ISO using the option to store duplicate files only once, and got a 1.5 GB file that I could compress with 7-Zip down to 717 MB.

Of course the process won't apply to all files, but it shows how important it is to use the right method to compress/store data efficiently.
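The idea behind that "store duplicate files once" option can be sketched in a few lines. This is a hypothetical illustration (the `deduplicate` function and the in-memory dict of files are made up for the example), not how the ISO tools actually implement it:

```python
import hashlib

# Hypothetical sketch: index file contents by hash, keep one copy per hash.
def deduplicate(files):
    """files: dict of name -> bytes. Returns (store, index):
    store maps content hash -> bytes (each unique blob kept once),
    index maps name -> hash (so every "file" can be rebuilt later)."""
    store = {}
    index = {}
    for name, content in files.items():
        digest = hashlib.sha256(content).hexdigest()
        store[digest] = content
        index[name] = digest
    return store, index

# Three PE builds sharing the same 1 MB payload: storage drops to one copy.
payload = b"\x90" * 1_000_000
isos = {"pe_v1.iso": payload, "pe_v2.iso": payload, "pe_v3.iso": payload}
store, index = deduplicate(isos)
print(len(store))                           # 1 unique blob instead of 3
print(sum(len(c) for c in store.values()))  # 1_000_000 bytes, not 3_000_000
```

Deduplication before compression helps because general-purpose compressors can miss duplicates that sit further apart than their dictionary window.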

Perhaps someday developers will find/use tweaks and tricks like this to create very small .exe and .dll files.

Edited by allen2
