Need to losslessly optimize large PNGs (5MB and bigger)
01-16-2014, 08:57 AM
(This post was last modified: 01-16-2014 09:08 AM by robzilla.)
Post: #8
RE: Need to losslessly optimize large PNGs (5MB and bigger)
An example with OptiPNG: http://stackoverflow.com/questions/30024...lize-tasks
If you want to use OptiPNG, you'll have to build it from source first, since it's not in any of the main repositories; start by installing a compiler:

Code:
yum -y install gcc

If you'd rather use PNGout, just issue:

Code:
yum -y install pngout

That's assuming you have the EPEL repository enabled. If not:

Code:
rpm -Uvh http://dl.fedoraproject.org/pub/epel/6/i386/epel-release-6-8.noarch.rpm

If you didn't have the EPEL repo before, you may want to disable it again afterwards by setting "enabled=0" in /etc/yum.repos.d/epel.repo.

Anyway, assuming you've installed OptiPNG and your PNGs are in a directory like "/tmp/png":

Code:
cd /tmp/png

The first part locates all PNG files in the current directory; the xargs bit takes that input and builds a queue of files to pass on to OptiPNG, where --max-procs is the number of processes to start (i.e. cores to use). The -dir directive tells OptiPNG to save the compressed images to a /tmp/png-opt directory. You can pass other options, such as the optimization level, if you like. The ampersand at the end puts the job in the background; run it under nohup (or disown it) if you want it to keep going after you close your SSH session. (The full find/xargs pipeline is sketched at the bottom of this post.)

No guarantees, of course, but I've just tested it on an 8-thread VPS (CentOS 6.5) and it seems to be working rather well. Hope that helps.
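To flesh out the build-from-source step: on a bare CentOS box the whole thing could look roughly like the sketch below. The version number and download URL are only examples; grab the current release from the OptiPNG site.

Code:
# build tools (the yum line from above, plus make)
yum -y install gcc make

# download and unpack the source -- version and URL are illustrative,
# check the OptiPNG site for the current release
cd /usr/local/src
wget http://downloads.sourceforge.net/optipng/optipng-0.7.5.tar.gz
tar xzf optipng-0.7.5.tar.gz
cd optipng-0.7.5

# configure, compile and install (binary ends up in /usr/local/bin)
./configure
make
make install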
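And for reference, a find/xargs pipeline matching the description above could look something like this. The core count (--max-procs=8), the -o2 optimization level and the log file path are assumptions on my part, so adjust them for your box.

Code:
# create the output directory and move to the source PNGs
mkdir -p /tmp/png-opt
cd /tmp/png

# find feeds the PNG file names to xargs, which fans them out over 8 parallel
# optipng processes; nohup plus the trailing & keep the job running after the
# SSH session is closed
nohup sh -c "find . -name '*.png' -print0 | xargs -0 -n 1 --max-procs=8 optipng -o2 -dir /tmp/png-opt" > /tmp/optipng.log 2>&1 &

Once it's running, a quick "tail -f /tmp/optipng.log" (or "ps aux | grep optipng") will show the worker processes chewing through the queue.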