Need to losslessly optimize large pngs. (5MB and bigger)
01-15-2014, 07:31 AM
Post: #1
Need to losslessly optimize large pngs. (5MB and bigger)
OK, I need suggestions for tools that can losslessly compress large images.

I have a collection of .png and .jpg files that are 5MB and larger: about 441 images exceed 5MB, some are in the 10-30MB range, and about 3 are 60MB and up.

So if anyone can suggest tools to losslessly compress these, that would be great. I'd rather not deal with lossy compression if possible, as these are game maps and smearing or distortion of any kind isn't acceptable.

I've tried PNGGauntlet and PNGOptimizer so far, and both are too slow.
01-15-2014, 08:11 AM
Post: #2
RE: Need to losslessly optimize large pngs. (5MB and bigger)
Are you using a Mac or PC?
01-15-2014, 08:16 AM
Post: #3
RE: Need to losslessly optimize large pngs. (5MB and bigger)
I'm using a PC; if the tools have a GUI, all the better :P
01-15-2014, 10:42 PM (This post was last modified: 01-15-2014 10:43 PM by robzilla.)
Post: #4
RE: Need to losslessly optimize large pngs. (5MB and bigger)
The problem is that image compression tools are generally bound to a single CPU core, so I'm afraid you won't be able to speed them up much, and the difference between the various tools is probably small. With file sizes of 5MB and up, I think you need to ask yourself whether compression is worth it. What are you expecting to save, and to what extent will that affect the user experience?
01-16-2014, 02:21 AM (This post was last modified: 01-16-2014 02:55 AM by avalanch.)
Post: #5
RE: Need to losslessly optimize large pngs. (5MB and bigger)
Lossless means there's no loss in quality at all, so it won't negatively affect the user experience. What am I expecting to save? Some of the time it shaves off a minuscule 3-4 percent, but most of the time it averages around 50-70 percent savings.

Instead of discouraging me, please suggest some tools. And yes, I'm willing to check out paid applications/services.
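To put those percentages in perspective: a conservative back-of-envelope, assuming every one of the 441 images were only at the 5MB lower bound (a deliberately low, hypothetical figure; the thread says many are much larger):

```shell
# Conservative estimate: 441 images at the 5MB lower bound.
total=$((441 * 5))                        # total size in MB
echo "total: ${total} MB"
echo "at 50% savings: $((total / 2)) MB freed"
echo "at 70% savings: $((total * 7 / 10)) MB freed"
```

Even at this lower bound the savings run into gigabytes, which is why lossless optimization is worth the CPU time here.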
01-16-2014, 05:31 AM (This post was last modified: 01-16-2014 05:31 AM by robzilla.)
Post: #6
RE: Need to losslessly optimize large pngs. (5MB and bigger)
Sorry, I missed the intended use of these images on reading your post the first time.

I'm not discouraging you; I'm just saying you need to realize that compression is a time-consuming task, especially with images that are multiple megabytes in size, so there's no getting around it feeling "too slow". Thankfully, as of version 3.1, PNGGauntlet does support parallel processing of images (one image per thread), which will reduce total compression time if you have a few threads to spare.

All APIs and web tools I'm aware of (kraken.io, PunyPNG, TinyPNG, smush.it, etc.) are either lossy or have file size limitations that won't work for you. If you want a GUI on Windows, I'd say PNGGauntlet with its parallel processing is probably best for the job.

Personally, I'd probably fire up a multi-core cloud server somewhere and have it crunch the images with the help of a wrapper that assigns a compression job to each core. That way, at least this won't bother you when you're using your PC for other tasks.
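The "wrapper that assigns a compression job to each core" idea can be sketched in a few lines of shell. This is a minimal sketch, not a finished tool: echo stands in for the real compressor, and the filenames are made up.

```shell
#!/bin/sh
# Run at most 4 jobs at a time: background each job, then wait for the
# current batch to drain before starting the next one. Swap the echo
# for the real compressor, e.g. optipng "$f".
i=0
for f in map1.png map2.png map3.png map4.png map5.png map6.png; do
    echo "compressing $f" &
    i=$((i + 1))
    if [ $((i % 4)) -eq 0 ]; then
        wait    # throttle: let the batch of 4 finish
    fi
done
wait            # wait for any remaining jobs
```

A find | xargs pipeline (as shown later in the thread) achieves the same fan-out with less code; this loop just makes the per-core batching explicit.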
01-16-2014, 06:46 AM (This post was last modified: 01-16-2014 06:49 AM by avalanch.)
Post: #7
RE: Need to losslessly optimize large pngs. (5MB and bigger)
I have a 4-core VPS (CentOS 6 with ZPanel installed) that I can use for that, but how would I go about setting up such a compression job?
01-16-2014, 08:57 AM (This post was last modified: 01-16-2014 09:08 AM by robzilla.)
Post: #8
RE: Need to losslessly optimize large pngs. (5MB and bigger)
An example with OptiPNG: http://stackoverflow.com/questions/30024...lize-tasks

If you want to use OptiPNG, you'll have to build it from source first, since it's not in any of the main repositories.

Code:
yum -y install gcc make
cd /tmp
wget -O optipng-0.7.4.tar.gz "http://prdownloads.sourceforge.net/optipng/optipng-0.7.4.tar.gz?download"
tar -xzvf optipng-0.7.4.tar.gz
cd optipng-0.7.4
./configure
make
make install

If you'd rather use PNGout, just issue:

Code:
yum -y install pngout

That's assuming you have the EPEL repository. If not:

Code:
rpm -Uvh http://dl.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm
yum install optipng --enablerepo=epel

If you didn't have the EPEL repo, you may want to disable it again afterwards by setting "enabled=0" in /etc/yum.repos.d/epel.repo.
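For reference, the relevant stanza of that repo file looks roughly like this (field values are illustrative; the only line that matters for enabling/disabling is the enabled flag):

```ini
# /etc/yum.repos.d/epel.repo (excerpt, illustrative)
[epel]
name=Extra Packages for Enterprise Linux 6 - $basearch
mirrorlist=https://mirrors.fedoraproject.org/metalink?repo=epel-6&arch=$basearch
enabled=0
gpgcheck=1
```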

Anyway, assuming you've installed OptiPNG and your PNGs are in a directory like "/tmp/png":

Code:
cd /tmp/png
find . -iname "*png" -print0 | xargs -0 --max-procs=4 -n 1 optipng -dir /tmp/png-opt/ &

The first part locates all PNG files in the current directory; the xargs bit takes that input and creates a queue of files to pass on to OptiPNG, where --max-procs is the number of processes to start (i.e. cores to use). The -dir option tells OptiPNG to save the compressed images to a /tmp/png-opt directory. You can pass other options, like the optimization level, if you like. The ampersand at the end puts the pipeline in the background so you get your shell back; note that on its own it won't keep the jobs running after you close your SSH session — for that you'd also want nohup or screen.
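If you want to see the fan-out behaviour before pointing it at real images, the same pattern can be tried with echo standing in for optipng (the directory name here is made up for the demo):

```shell
# Same find | xargs fan-out, with echo in place of optipng.
mkdir -p /tmp/xargs-demo
cd /tmp/xargs-demo
touch a.png b.png c.png d.png
find . -iname '*.png' -type f -print0 \
    | xargs -0 --max-procs=2 -n 1 echo would-compress \
    | sort
# prints one "would-compress ./<file>.png" line per file
```

Once the output looks right, swap the echo back for optipng and drop the sort.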

No guarantees, of course, but I've just tested it on an 8-thread VPS (CentOS 6.5) and it seems to be working rather well.

Hope that helps.
01-16-2014, 01:01 PM
Post: #9
RE: Need to losslessly optimize large pngs. (5MB and bigger)
Thanks rob, I'll give that a shot tomorrow :)
01-16-2014, 01:08 PM
Post: #10
RE: Need to losslessly optimize large pngs. (5MB and bigger)
I'd refine that slightly...

Code:
cd /tmp/png
nohup sh -c 'find . -iname "*png" -type f -print0 | xargs -0 --max-procs=4 -n 1 optipng -dir /tmp/png-opt/' &

Quote the pattern (single or double quotes both work), otherwise the shell will expand *png itself if any matching files happen to be in the current directory, and you'll only get part of what you expect.
-type f ignores any directories, sockets, etc. that end in png.
nohup, with sh -c wrapping the whole pipeline (nohup takes a command, not a bare subshell), means you can set it off when you leave the office, log out, and it'll be finished in the morning.
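If you do leave it running overnight, here's a quick way to check on it the next morning (a sketch using the paths from this thread; nohup appends the job's output to nohup.out in the directory it was started from by default):

```shell
# Rough progress check: count the optimized files and peek at the job log.
mkdir -p /tmp/png-opt                  # ensure the path exists for this demo
count=$(ls /tmp/png-opt | wc -l)
echo "$count files optimized so far"
if [ -f nohup.out ]; then
    tail -n 5 nohup.out                # last few lines of the job's output
fi
```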