Recently a few people on dyxum have stuck their heads above the parapet and had a go at producing comparisons of cameras, lenses, flashes and so on. Many of these testers have been roundly abused and their efforts trashed. The common theme from the self-anointed test experts has been that the people conducting these tests wasted their time by trying to use their cameras as they normally would, instead of in a precisely controlled and identical way.
I have been uneasy about the way I have seen people testing and reviewing cameras for some time, and since there is nothing on TV right now, I figured I'd spend a few minutes putting my thoughts on paper (well, on disk, to be more accurate).
I am relatively new to photography, but I have been involved in testing complex systems in one form or another for most of my career. Over the years I have helped test cars, trucks, plant, aircraft, and surveillance and control systems, and my experience tells me that the accepted wisdom on how to test cameras is completely wrong.
Over and over I read people dictating how a test is to be conducted: use exactly the same shutter speeds, exactly the same ISO values, exactly the same apertures, develop using the same software, the same sharpness settings, and so on. Sorry, but that simply isn't a meaningful test.
Imagine you were testing cars over a standing quarter mile. Would you specify that both cars had to be launched at precisely the same engine speed, or that you had to change gears at the same revs? Of course not. You would conduct a few test runs to determine the ideal launch and gear-shift points for each car, then run each car in its optimum configuration. After each car had given up the best time it could achieve, you would truly know which was the faster. That is a meaningful test.
If I were testing radars, the characteristics of the target (its position, velocity vector, and RCS) would all be identical, but I would configure each radar in the manner that best suited it. If one radar performed best at higher frequencies and another at lower frequencies, and I would use each differently were I to purchase them, what would be the point of forcing each to operate at an artificial common frequency? All I would be ensuring is that my testing was a waste of time.
Cameras are precisely the same. When testing cameras we should design the tests to ensure each camera produces the best image it can in a given situation. All that needs to be kept constant is "the shot", i.e. the scene and the lighting. Just about everything else should be up for grabs. One camera might have great inherent noise characteristics, meaning you can ramp up the ISO and capture a shot relatively easily. Another might not have such great noise characteristics, but you might be able to get around that by "over"-exposing and then winding back the exposure in processing. That one camera required a little more work in a given situation is surely worthy of comment, but that is all; ultimately, any meaningful test must be designed to ensure each competitor is configured and used to yield the best possible results. To do otherwise is to pretend that there is only one way of taking a shot, only one way to develop it, only one way to present it.
We know this is not true, so why would we persist in running tests that, by definition, have no relevance to how we would actually use our cameras?
I think we need to stop assuming that camera tests must follow the same old tired formula of "here is a scene, and here are comparisons at ISO x, y, and z". That kind of test really does not help me at all; it leaves too many questions unanswered. Instead, a good test should say: here is a scene, this is the best shot we could get from camera X and this is how we got it, and this is the best we could get from camera Y and this is how we got it. That is a real-world test with real-world application.
What do others think?
I know a lot of people on dyxum put great stock in the so-called "scientific" method of testing. I'm really interested in why you would test that way. Do you really restrict your use of your equipment to the highly restrictive regimes you advocate for your testing?