Friday, August 14, 2009

Black is black (II)

The second black thing has nothing to do with black point compensation, but is still black-fashioned: the black-preserving intents. No, these do not belong to the normal ICC workflow. The ICC has tried to address this need, but there is still nothing in the spec.

Let's see what the issue is. Suppose we work in press. Presses are very tied to standards: US presses use SWOP, European folks lean more toward FOGRA, and Japanese people use other standards, TOYO for example. Each standard is very well detailed, and presses are set up to faithfully emulate any of these standards.

Ok, let's imagine you got an image ad like the one shown here. This is a very usual flier; now just imagine that instead of getting it in PDF, you get it as a raster file. Say a CMYK TIFF, ready for a SWOP press. And you want to print it on a FOGRA27!!

Maybe you see no issue here, but take a look at this:


LittleCMS ColorSpace conversion calculator - 4.0 [LittleCMS 2.00]

Enter values, 'q' to quit
C (0..100)? 0
M (0..100)? 0
Y (0..100)? 0
K (0..100)? 50

C=48.21 M=38.71 Y=34.53 K=1.09


That means that if I convert from SWOP to FOGRA27, the ICC profiles totally mess up the K channel, so a portion of the picture that originally used only black ink gets, after the conversion, cyan, magenta, yellow and, well, a little bit of black as well. Now just imagine what happens to all the text in the flier.


Ugly. I guess the vendor who pays the bill will not be very pleased, right?

So I've added two different modes to deal with that: black-ink-only preservation and black-plane preservation. The first is simple and effective: do all the colorimetric transforms, but keep only K (preserving L*) where the source image uses only black. The second mode is far more complex and tries to preserve the WHOLE K plane. I'm still testing this latter one. If you want to give it a try, please download the new snapshot and build the TIFFICC utility. It already implements those new intents.
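
For the API-minded, here is a minimal sketch of how such a transform could be set up in C. The intent constants are the names the final lcms2 release ended up using; the snapshot may spell them differently, and the profile file names are just placeholders.

#include "lcms2.h"

/* A sketch only: placeholder profile names, and intent names as in
   the final lcms2 release. */
cmsHTRANSFORM CreateKPlanePreservingTransform(void)
{
    cmsHPROFILE hSWOP  = cmsOpenProfileFromFile("USWebCoatedSWOP.icc", "r");
    cmsHPROFILE hFOGRA = cmsOpenProfileFromFile("Fogra27.icc", "r");

    /* CMYK8 -> CMYK8, keeping the K plane, relative colorimetric flavor.
       INTENT_PRESERVE_K_ONLY_RELATIVE_COLORIMETRIC would give the
       simpler black-ink-only mode instead. */
    cmsHTRANSFORM xform = cmsCreateTransform(hSWOP, TYPE_CMYK_8,
                                             hFOGRA, TYPE_CMYK_8,
                                             INTENT_PRESERVE_K_PLANE_RELATIVE_COLORIMETRIC,
                                             0);
    cmsCloseProfile(hSWOP);
    cmsCloseProfile(hFOGRA);
    return xform;   /* feed the TIFF scanlines through cmsDoTransform() */
}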

Wednesday, August 12, 2009

Back in black

I've been working hard lately on two black thingies, namely black point compensation and black-preserving intents. Black-preserving intents are worth a post of their own, and there are still some subtle bugs, so let's go for black point compensation first. BPC has been implemented in lcms for years. It works. So if it ain't broken, don't fix it. Well, not really. It seemed to me lcms2 would be a good chance to implement Adobe's BPC... oh, that's a nice story.

BPC is a sort of "poor man's" gamut mapping. It basically adjusts the contrast of images so that the darkest tone of the source device gets mapped to the darkest tone of the destination device. If you have an image that is adjusted to be displayed on a monitor, and want to print it on a large format printer, you should realize the printer can render black significantly darker than the screen. So BPC can do the adjustment for you. It only makes sense on the relative colorimetric intent; perceptual and saturation already have an implicit BPC.

It works like magic, but it is quite simple: just a plain linear scaling in XYZ space.
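
In code, the scaling is nothing more than this (a sketch of the idea, not the actual lcms2 source): each XYZ channel is mapped linearly so that the source black point lands on the destination black point while D50 white stays put.

typedef struct { double X, Y, Z; } XYZNumber;

static const XYZNumber D50 = { 0.9642, 1.0000, 0.8249 };  /* PCS white */

/* Map [bp_src .. white] linearly onto [bp_dst .. white]:
   source black goes to destination black, white is left untouched. */
static double ScaleChannel(double v, double white, double bp_src, double bp_dst)
{
    return (v - bp_src) * (white - bp_dst) / (white - bp_src) + bp_dst;
}

XYZNumber ApplyBPC(XYZNumber in, XYZNumber bp_src, XYZNumber bp_dst)
{
    XYZNumber out;
    out.X = ScaleChannel(in.X, D50.X, bp_src.X, bp_dst.X);
    out.Y = ScaleChannel(in.Y, D50.Y, bp_src.Y, bp_dst.Y);
    out.Z = ScaleChannel(in.Z, D50.Z, bp_src.Z, bp_dst.Z);
    return out;
}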

And here is where the real problem turns out: it is easy to implement BPC as long as you can figure out what the black point is. The black point tag doesn't help. There are so many bogus profiles out there that the ICC has deprecated the tag in a recent addendum. It is up to the poor CMM to figure out what the black point is.

And it is a hard task, though. Ok, you may argue, just use relative colorimetric in the input direction, feed in RGB=(0, 0, 0) and... wait, what about CMYK? Ok, CMYK=(255, 255, 255, 255)... Hum... CMYK devices are ink-limited, that is, the maximum amount of ink that a given paper can accept is limited (else the ink will spread all over), so this doesn't work either. What lcms does is use the perceptual intent to convert Lab of zero to CMYK, and then this CMYK back to Lab using relative colorimetric. This works fine on all but some broken profiles. Well, Adobe's paper targets those profiles.
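
For the curious, here is roughly what that round trip looks like with the lcms2 API. This is a sketch assuming a CMYK profile for simplicity; the real detection code has to handle any color space.

#include "lcms2.h"

/* Estimate the black point: push Lab (0,0,0) into the device through
   the perceptual table, then bring the resulting CMYK back to Lab
   through relative colorimetric. Sketch only. */
cmsCIELab EstimateBlackPoint(cmsHPROFILE hProfile)
{
    cmsHPROFILE hLab = cmsCreateLab4Profile(NULL);
    cmsCIELab black = { 0, 0, 0 };          /* Lab of zero */
    cmsUInt16Number cmyk[4];
    cmsCIELab out;

    cmsHTRANSFORM toDev = cmsCreateTransform(hLab, TYPE_Lab_DBL,
                                             hProfile, TYPE_CMYK_16,
                                             INTENT_PERCEPTUAL, 0);
    cmsHTRANSFORM toLab = cmsCreateTransform(hProfile, TYPE_CMYK_16,
                                             hLab, TYPE_Lab_DBL,
                                             INTENT_RELATIVE_COLORIMETRIC, 0);

    cmsDoTransform(toDev, &black, cmyk, 1);  /* Lab zero -> device ink */
    cmsDoTransform(toLab, cmyk, &out, 1);    /* device ink -> real Lab */

    cmsDeleteTransform(toDev);
    cmsDeleteTransform(toLab);
    cmsCloseProfile(hLab);
    return out;   /* the darkest Lab the profile can actually reach */
}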

The funny part is the complexity of Adobe's algorithm. The perceptual trick is used as a seed on certain profiles, then the black axis of a round trip on L* is examined. If the difference is greater than 4, a least-squares fit of the curve to a quadratic should be done. The black point is the vertex.
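
That last step, under my reading of the paper, boils down to an ordinary quadratic least-squares fit; something like this sketch, which solves the normal equations by Cramer's rule (it bears no relation to the actual lcms2 source):

#include <math.h>

/* Fit y = a*x^2 + b*x + c to n round-trip samples (x = L* in,
   y = L* out) and return the vertex x = -b/(2a), which is taken
   as the black point L*. */
double FitQuadraticVertex(const double x[], const double y[], int n)
{
    double sx=0, sx2=0, sx3=0, sx4=0, sy=0, sxy=0, sx2y=0;
    int i;

    for (i = 0; i < n; i++) {
        double xi = x[i], xi2 = xi*xi;
        sx   += xi;     sx2 += xi2;
        sx3  += xi2*xi; sx4 += xi2*xi2;
        sy   += y[i];   sxy += xi*y[i]; sx2y += xi2*y[i];
    }

    /* Normal equations:
       | sx4 sx3 sx2 | |a|   | sx2y |
       | sx3 sx2 sx  | |b| = | sxy  |
       | sx2 sx  n   | |c|   | sy   |                              */
    double d  = sx4*(sx2*n - sx*sx) - sx3*(sx3*n - sx*sx2) + sx2*(sx3*sx - sx2*sx2);
    double da = sx2y*(sx2*n - sx*sx) - sx3*(sxy*n - sy*sx) + sx2*(sxy*sx - sy*sx2);
    double db = sx4*(sxy*n - sy*sx) - sx2y*(sx3*n - sx*sx2) + sx2*(sx3*sy - sx2*sxy);

    double a = da / d;
    double b = db / d;

    if (fabs(a) < 1e-10) return 0.0;   /* degenerate: curve is a line */
    return -b / (2*a);                 /* vertex of the parabola */
}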

In fact it is more complex, as I'm omitting constants depending on the intent and other gory details. It is that complex. After the 6th time you read the paper, it begins to make sense.

But unfortunately it doesn't work for me. And unfortunately, it doesn't work for the ICC reference implementation CMM either. We both get the same bad guess of L* = 625 for a given profile. And guess what? I've tried Photoshop CS4, and the reported black point exactly matches my humble perceptual-to-relative-colorimetric algorithm, so I will keep this one for a while.

Tuesday, August 4, 2009

New drop available

The August drop is here. Working features are CMYK, black point compensation, all profile I/O ...
Enjoy!

Saturday, August 1, 2009

Sleep your bugs

It hurts. It's a sort of burning... when you figure out where the bug is, you experience an urgent necessity to fix it, NOW! Big mistake.

It is not clear whether it was Knuth or Hoare who coined the phrase:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.


It is indeed very true. I would like to add a humble corollary:

premature bugfixing is lame too.


So, here is my advice: Sleep your bugs.

I am not talking about typos. You know, mixing up 'l' and '1', for example (nobody should use 'le0' as a variable name; '1e0' is still a valid number), or failing to include a range check (the last time I forgot to place an 'if', I got a Pwnie nomination).
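
To make the 'le0' thing concrete, here is a purely illustrative snippet:

double le0 = 0.5;   /* a variable unwisely named 'le0' (L-E-zero) */
double x   = le0;   /* reads the variable: x is 0.5 */
double y   = 1e0;   /* a literal: one-E-zero, i.e. 1.0 */
/* In most fonts the two are nearly indistinguishable, so a single
   mistyped character silently swaps a variable for a constant. */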

I am talking about logic bugs. Something wrong with the algorithm or the true logic of the program. Adding inconsistent features is another example.

Sleep on those kinds of bugs. Take good note of them, and wait a day or two before fixing. Probably you will find a neat way to deal with the issue without changing those hundreds of lines. Maybe not, but even in that case the solution is worth the wait.

In my case the bug was in black point detection. lcms2 is now an unbounded CMM that operates on floats, so the black point detection code was using floating point to do the calculations. Yep, it seems a nice feature: far more precision, etc. On the other hand, inks in floating point are represented as percentages, so the effective range is 0..100%; that is how Photoshop and others work, and I thought it made sense to keep that convention in lcms2 as well.

Put both features together and you have a nice logic bug: extra code is needed in BP detection to deal with the different ranges.
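
In a nutshell, the clash looks something like this (illustrative values only, not actual lcms2 code):

/* Float CMYK buffers carry ink percentages, 0..100 per channel... */
cmsFloat32Number cmyk[4] = { 0.0f, 0.0f, 0.0f, 100.0f };   /* solid K */

/* ...so detection code written for 0..1 normalized values would either
   reject 100.0 as out of range or read full black as 1% ink. The fix
   is an explicit rescale at the boundary, e.g.: */
double normalized = cmyk[3] / 100.0;    /* 0..100% -> 0..1 */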

If you happen to be a professional programmer, you surely realize that my advice of postponing bug fixes goes against the schedule, the program manager and the planner. Sure, but that's lcms2, my pet project, which has an unlimited budget of time. That would be a dream for a commercial project, but this project is not commercial and therefore is not subject to schedule tyranny.

By the way, a new drop of lcms2 is on the way and will be available in a day or two.