Percent Flicker vs. Flicker Index: Understanding the Difference
Let’s look at light flicker and the difference between two of the metrics used to describe it: percent flicker and flicker index. The Department of Energy (DOE) defines flicker as the “variation of light output over time,” and all light sources exhibit it to some degree. While often imperceptible, flicker can contribute to health issues such as migraines, headaches, fatigue, and eye strain, and it can impair visual performance. For some people, flickering light is simply annoying and distracting.
Figure 1: Visual representation of a single light output cycle.
Flicker often arises from issues with the LED driver, the power supply, the loads connected to the LED or LED array, or environmental factors.
Percent Flicker Explained
Percent flicker, also known as percentage flicker, is measured on a scale from 0 to 100%. It is a widely recognized and commonly used parameter, often referred to as peak-to-peak contrast or Michelson contrast, and is calculated using the formula shown in Equation 1.
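Equation 1 is not reproduced here, but the peak-to-peak (Michelson) contrast it refers to is conventionally written in terms of the maximum and minimum light output over one cycle:

\[
\text{Percent Flicker} = 100\% \times \frac{L_{\max} - L_{\min}}{L_{\max} + L_{\min}}
\]

where \(L_{\max}\) and \(L_{\min}\) are the maximum and minimum light output within a single cycle. A perfectly steady source gives 0%, while a source whose output drops fully to zero each cycle gives 100%.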
Flicker Index Explained
The flicker index, on the other hand, is measured on a scale from 0 to 1.0 and is calculated using the formula shown in Equation 2.
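Equation 2 is likewise not reproduced here; the flicker index is conventionally defined from the shape of the light output curve over one cycle, as the area above the average light level divided by the total area under the curve:

\[
\text{Flicker Index} = \frac{A_1}{A_1 + A_2}
\]

where \(A_1\) is the area of the light output curve above the average light level and \(A_2\) is the area below it. A perfectly steady source has a flicker index of 0.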
To measure these values accurately, specialized instruments known as light flicker analyzers are used. These devices capture the light output waveform and report both the percent flicker and the flicker index.
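To make the two calculations concrete, below is a minimal sketch of how both metrics could be computed from a uniformly sampled light output waveform covering one full cycle. The function names and the example waveform are illustrative assumptions, not part of any particular analyzer or standard library.

```python
import numpy as np

def percent_flicker(samples: np.ndarray) -> float:
    """Peak-to-peak (Michelson) contrast of one cycle of light output, 0-100%."""
    l_max, l_min = samples.max(), samples.min()
    return 100.0 * (l_max - l_min) / (l_max + l_min)

def flicker_index(samples: np.ndarray) -> float:
    """Area above the average light level divided by the total area, 0-1.0.

    Assumes the samples are uniformly spaced over exactly one cycle, so
    simple sums stand in for the areas under the curve.
    """
    mean_level = samples.mean()
    area_above_mean = np.clip(samples - mean_level, 0.0, None).sum()
    total_area = samples.sum()
    return float(area_above_mean / total_area)

# Hypothetical example: light output with rectified-sine ripple over one cycle.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
light = 0.8 + 0.2 * np.abs(np.sin(np.pi * t))

print(f"Percent flicker: {percent_flicker(light):.1f}%")
print(f"Flicker index:   {flicker_index(light):.3f}")
```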
Key Differences
It’s important to note that neither percent flicker nor flicker index takes frequency into account. They simply measure the magnitude of light fluctuation, not how often it occurs.
In summary, both parameters describe the magnitude of light flicker, but they differ in scale and in what they capture: percent flicker expresses only the peak-to-peak swing as an intuitive percentage, while flicker index also accounts for the shape of the light output waveform and yields a normalized value between 0 and 1.