I have a question about choosing the "right" gamma value, and I am interested in your point of view.

When it comes to color management and calibration, official and unofficial sources offer different opinions: some say the native display gamma can be left "as is" (2.43 as measured in my case), preserving the color depth, while others say the gamma should be set to a specific value (e.g. 2.2 or 1.8), which will give more accurate color rendition.

As far as I've read, creating a 1D LUT for the video card (8-bit to 8-bit) that forces the display to 2.2, or to any value other than its native gamma, will cut the colour depth (please see the chapter "Limitations of monitor calibration").

On the other hand, considering that I'll use the monitor to create and view material intended for print, or to display content on other sRGB-calibrated monitors, I guess I should set the gamma to 2.2 (sRGB).

I have to mention that my LCD is an 8-bit/channel P-MVA panel and has no interactive gamma adjustment or internal 1D LUT; the 1D LUT will be implemented on the video card, using a Spyder4PRO or ColorMunki Display device for calibration.

How can I approach this? Which is the "right" seen color: the one at the monitor's native gamma value, or at some other value? (Leaving aside all the other variables such as room lighting and color temperature, and considering them the same in all cases.)

I use a Spyder5PRO for monitor profiling and it doesn't have a gamma adjustment; you have to go to the top-of-the-line Spyder5ELITE to get one. This suggests to me that for the average user, adjusting your monitor gamma is not terribly important. If you are viewing an image with an embedded ICC profile using colour-managed software, then the colour management should look after the gamma conversion if there is a difference between the gamma in the profile and that of the monitor. In a non-colour-managed situation, you are probably best off having your monitor gamma close to sRGB. With my Spyder5PRO, gamma is set to 2.2, which is a reasonable approximation of the sRGB gamma that Ted describes.

About the color "depth": where did you read that? As far as I know, neither color "depth" nor color accuracy has anything to do with gamma directly.

Regarding the sRGB gamma: it is not a pure 2.2 curve. It is 2.4 with a linear portion with a slope of 12.92.

Unfortunately, there are limits to how accurately you can calibrate your display. With a digital display, the more you have to change your monitor from its native state, the more you will decrease the number of colors/shades that it can display.

Thank you both for your replies. Regarding the sRGB gamma standard being given as 2.2 in different places: like here (please read the first line), here (please read the sRGB chapter), here, and so on. That's why I am confused, because indeed I have also read about 2.4 being the sRGB gamma. What I wrote earlier was referring to the display hardware's capability to adjust the gamma value / the capability to apply a 1D LUT, not using the video card but directly on the monitor. Anyway, I can use the ColorMunki Display with dispcalGUI, there is no problem.

I will agree with your contention that different computer screens do have different display characteristics, even after calibration and profiling. So far as I know, the information pertaining to Macs was correct until the release of Mac OS X 10.6 towards the end of 2009, when Apple joined the fold and adopted a gamma of 2.2 as well. Part of the issue is that you are looking at 10+ year old books.
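To make the "2.4 with a linear portion" remark concrete, here is a minimal sketch of the sRGB transfer curve as specified in IEC 61966-2-1. The breakpoint constants (0.0031308 on the linear side, 0.04045 on the encoded side) come from that specification; the function names are just illustrative.

```python
def srgb_encode(linear):
    """Linear light -> sRGB-encoded value, both in [0, 1].

    Near black the curve is a straight line with slope 12.92;
    above the breakpoint it is a 1/2.4 power law, offset so the
    two segments meet. Overall it approximates a ~2.2 gamma.
    """
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055


def srgb_decode(encoded):
    """Inverse of srgb_encode: sRGB-encoded value -> linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4


# At mid-grey the piecewise curve and a pure 2.2 power law nearly agree,
# which is why "sRGB gamma" is often quoted simply as 2.2:
print(srgb_encode(0.5))   # ~0.735
print(0.5 ** (1 / 2.2))   # ~0.730
```

This is why both camps in the thread are right: the sRGB exponent really is 2.4, but the curve as a whole is well approximated by a 2.2 power law.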
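Next, a deliberately simplified sketch of the gamma conversion that colour-managed software performs when the image profile's tone curve differs from the monitor profile's. Real ICC transforms also remap primaries and white point; the pure power laws and the convert_tone helper are assumptions for illustration, with the 2.43 figure taken from the question.

```python
def convert_tone(value, image_gamma, monitor_gamma):
    """Re-encode a [0, 1] tone value from the image profile's power-law
    curve to the monitor profile's, going through linear light."""
    linear = value ** image_gamma            # decode with the image's gamma
    return linear ** (1.0 / monitor_gamma)   # re-encode for the monitor


# An image tagged with a 1.8-gamma profile, shown on a display whose
# profile reports a native response of 2.43:
print(convert_tone(0.5, 1.8, 2.43))
```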
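Finally, a sketch of why an 8-bit-to-8-bit video-card LUT "cuts the colour depth", as debated above. Remapping the measured native gamma (2.43, from the question) to a 2.2 target and rounding back to 8 bits makes some adjacent input codes collapse onto one output code; the collapsed codes are shades the chain can no longer reproduce. The script is illustrative, not part of any calibration tool.

```python
import numpy as np

native_gamma = 2.43   # measured native response, from the question
target_gamma = 2.20   # calibration target, roughly sRGB

codes = np.arange(256)
x = codes / 255.0

# Correction curve: after the LUT, the display's native response
# should behave like the target, i.e. lut(x) ** native == x ** target,
# so lut(x) = x ** (target / native). Rounding back to 8 bits is
# where shades are lost.
lut = np.rint(255.0 * x ** (target_gamma / native_gamma)).astype(int)

unique_levels = np.unique(lut).size
print(f"distinct output levels: {unique_levels} of 256")
print(f"shades lost to 8-bit quantisation: {256 - unique_levels}")
```

The closer the target gamma is to the native response, the fewer codes collapse, which is the argument for calibrating to "native" gamma on panels without an internal LUT.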