Published on

January 22, 2026

Article

VR Field of View (FOV): How Lens Edges Affect Immersion


Imbar Bentolila

Marketing Manager


The Physics of Immersion and the FOV Hierarchy

Field of View (FOV) is the single most marketed specification in the Virtual Reality industry. From the 90° of the early Oculus Rift to the 210° of the StarVR, the number promises “Immersion.” However, for the optical engineer, a single number is meaningless. FOV is not a scalar quantity; it is a complex geometric relationship between the eye, the lens, and the display panel.

True immersion-the psychological state of “Presence”-is not governed by how wide the image is, but by how consistent the image is at the periphery. The human eye is an evolved motion detector. While our foveal (central) vision provides high resolution, our peripheral vision provides spatial context and motion cues. If the “Lens Edge” fails to deliver accurate spatial data, the illusion breaks, and nausea (VR sickness) ensues.

This section deconstructs the hierarchy of FOV and the optical physics that define the visible limit.

Defining FOV: Mechanical vs. Optical vs. Perceived

To understand the edge, we must define the center.

  1. Monocular FOV

The field visible to a single eye.

Formula: Theoretical Monocular FOV

FOV = 2 · arctan( d / (2 · f) )

Where:

  • d is the active display dimension (horizontal or diagonal).
  • f is the Effective Focal Length (EFL) of the lens.

The Engineering Trap: This formula assumes a simple thin lens. In complex VR stacks (Pancake), the Eye Relief (ER)-the distance from the cornea to the first lens surface-dramatically alters the effective FOV. As the eye moves further away (increasing ER), the lens aperture clips the viewing angle (Vignetting).
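As a quick numeric check, the thin-lens formula above can be evaluated directly. This is a minimal sketch; the 50 mm display and 25 mm EFL are illustrative values, not the specifications of any real headset:

```python
import math

def monocular_fov_deg(display_mm, focal_mm):
    """Theoretical monocular FOV of a thin lens: FOV = 2 * arctan(d / (2f))."""
    return math.degrees(2 * math.atan(display_mm / (2 * focal_mm)))

# Illustrative numbers: a 50 mm active display behind a 25 mm EFL lens.
print(monocular_fov_deg(50, 25))  # 90.0 degrees
```

Note that this simple model ignores eye relief entirely, which is exactly the trap described above.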

  2. Binocular Overlap

The critical zone where both eyes see the same object, enabling Stereopsis (3D depth perception).

  • High Overlap: Better depth perception, but narrower total FOV.

  • Low Overlap: Wider total FOV, but creates “Binocular Rivalry” at the edges (nose effect).

  • The Immersion Factor: The lens edge often defines the boundary of this overlap. If the edge quality (MTF) drops in the overlap zone, the user cannot fuse the 3D image, causing eye strain.

  3. Rendered FOV vs. Visible FOV

The “Lens Edge” is the physical aperture stop. Often, the display panel extends beyond what the lens can transmit.

  • Stencil Mesh: VR compositors use a “Hidden Area Mesh” to stop rendering pixels that fall outside the lens’s circular view.

  • The Gap: Immersion is lost when the user sees the black unrendered border. The goal of “Edge-to-Edge” clarity is to push the optical drop-off point beyond the mechanical frame of the headset.

The Human Visual System: Fovea vs. Periphery

Why do lens edges matter if we only look at the center?

Because the retina is not uniform.

  • Fovea (0° – 5°): Peak acuity. Resolves fine detail (text). Requires high MTF.
  • Para-Fovea (5° – 30°): Shape recognition.
  • Periphery (30° – 100°+): Motion detection and flicker sensitivity.

The Edge Sensitivity:

The peripheral retina is extremely sensitive to contrast changes and flicker.

If the edge of the VR lens suffers from Vignetting (darkening) or Chromatic Aberration (color shifting), the peripheral retina detects this as “motion” or “anomaly.” The brain, detecting an anomaly in the periphery, reflexively triggers a saccade (eye movement) to look at it.

  • The Loop: The user looks at the edge -> The edge is blurry (low MTF) -> The eye cannot accommodate -> Eye strain occurs.

Geometric Stability (Distortion)

The lens creates a virtual image at infinity (or ~2m). Ideally, straight lines in the virtual world should appear straight.

However, high-power VR lenses introduce massive Barrel Distortion.

To fix this, the software applies Pincushion Distortion to the image sent to the display.

Ideally: Lens Distortion + Software Correction = Zero.

The Edge Failure:

At the edge of the lens (high field angles), the distortion becomes non-linear (e.g., Mustache Distortion). The software warp map often fails to perfectly match the physical lens characteristics at 50° off-axis.

  • Result: Objects at the edge of the screen appear to warp or “breathe” as the user turns their head. This instability at the lens edge is a primary trigger for motion sickness.
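The pre-correction step can be sketched as a polynomial radial warp. This is a simplified model, not actual compositor code; real headsets use per-unit calibrated warp meshes, and the coefficients below are invented:

```python
def predistort_radius(r, k1=0.22, k2=0.08):
    """Pincushion pre-warp applied in software: r' = r * (1 + k1*r^2 + k2*r^4).
    Positive coefficients push pixels outward so the lens's barrel
    compression pulls them back toward their intended positions."""
    return r * (1 + k1 * r**2 + k2 * r**4)

# Normalized radii from center (0.0) to edge (1.0):
for r in (0.0, 0.5, 1.0):
    print(r, round(predistort_radius(r), 3))  # 0.0, 0.53, 1.3
```

Because a fixed polynomial cannot track the non-linear (mustache) behavior at high field angles, the residual mismatch near r = 1.0 is what produces the edge warping described above.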

Light Fall-Off (Relative Illumination)

No lens is perfectly uniform. The brightness at the edge naturally drops according to the Cosine Fourth Law:

I(θ) = I(0) · cos⁴(θ)

Where θ is the field angle.

At 50° (a wide FOV), cos⁴(50°) ≈ 0.17.

This means that, theoretically, the edge is 83% darker than the center.

  • Engineering Mitigation: VR lenses use rigorous pupil matching and retro-focus designs to combat this, but the “dark tunnel” effect remains a major limiter of immersion.
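The fall-off curve is easy to tabulate from the law above (a straightforward sketch of the cos⁴ relation):

```python
import math

def relative_illumination(theta_deg):
    """Cosine Fourth Law: I(theta) / I(0) = cos^4(theta)."""
    return math.cos(math.radians(theta_deg)) ** 4

for angle in (0, 25, 50):
    print(angle, round(relative_illumination(angle), 2))  # 1.0, 0.67, 0.17
```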

The “Lens Edge” is not just a mechanical boundary; it is the psychological boundary of the virtual world.

While the center of the lens delivers the content (the game, the movie), the edge of the lens delivers the context (space, motion, scale).

In the next section, we will explore the specific optical aberrations that plague the periphery-why the edge is blurry, colored, and stretched-and the physics behind them.

The Aberration Abyss – Why Optics Fail at High Angles

In Part 1, we defined the importance of the periphery. In Part 2, we dive into the Seidel Aberrations.

Designing a lens that is sharp at the center (0°) is easy. Designing a lens that remains sharp at 50° (the edge) is a battle against physics.

In VR, the “Sweet Spot” is the area of the lens where the image is clear. Outside this spot, the image degrades. Expanding this sweet spot to the very edge is the Holy Grail of VR optics.

Lateral Chromatic Aberration (LCA)

The most noticeable defect at the lens edge is Color Fringing.

  • The Physics: Refractive index depends on wavelength (Dispersion). Blue light bends more than Red light.
  • On-Axis (Center): The colors align.
  • Off-Axis (Edge): The separation creates a rainbow gap.
Equation: Transverse Chromatic Aberration

TCA ≈ y / V_d

Where:

  • y is the image height.
  • V_d is the Abbe number of the lens material.

The Immersion Killer:

At the edge of the FOV, high-contrast objects (like white text) split into Red and Blue ghosts.

  • Software Fix: VR compositors separate the R, G, and B color channels and warp them independently (pre-distortion).
  • The Limit: This only works if the LCA is linear. In complex aspheric lenses, the chromatic shift varies non-linearly, leaving residual “purple haze” at the far edge that cannot be fixed digitally.
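A heavily simplified version of the per-channel correction looks like three slightly different radial scales. The scale factors here are invented for illustration, and their sign and magnitude depend on the specific lens; real compositors use full per-channel warp meshes:

```python
def chromatic_predistort(r, channel):
    """Pre-scale each color channel's radius so that, after the lens's
    dispersion, R, G, and B land on the same apparent position.
    A purely linear model: it cannot remove non-linear residual fringing."""
    scale = {"R": 1.006, "G": 1.000, "B": 0.994}  # illustrative values only
    return r * scale[channel]

# Near the edge (r = 1.0), the channels are deliberately mis-registered:
print([round(chromatic_predistort(1.0, c), 3) for c in "RGB"])  # [1.006, 1.0, 0.994]
```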

Astigmatism and Field Curvature

If you look at text at the edge of a VR headset, it often looks “smeared” in one direction.

  • Astigmatism: The lens has two different focal lengths for rays in the Tangential plane vs. the Sagittal plane.

  • Field Curvature (Petzval): The focal plane is a curved bowl, but the OLED display is flat.

The Conflict:

The optical designer can flatten the field (fix Field Curvature) but often at the cost of increasing Astigmatism.

  • VR Impact: As the eye looks toward the edge, it tries to accommodate (focus). If the edge focus is 1 diopter different from the center focus, the ciliary muscles of the eye are forced to work rapidly as the user scans the scene. This leads to Eye Strain (Asthenopia).

Coma and “Smearing”

Coma is an off-axis aberration that makes a point of light look like a comet with a tail.

  • Cause: Variation in magnification across the pupil.

  • VR Context: In Fresnel lenses, Coma at the edge interacts with the grooves to create “spikes.” In Pancake lenses, Coma often manifests as a “glow” or reduction in contrast on one side of bright objects.

  • Tolerance: Coma is asymmetric. The human brain finds asymmetric blur much more annoying than symmetric blur (defocus). Therefore, lens edges must be rigorously corrected for Coma.

The “Pupil Swim” Phenomenon

This is the most critical dynamic artifact at the lens edge.

Pupil Swim occurs when the geometric distortion of the lens changes as the eye rotates in the eye box.

  • Static Scenario: If the eye is perfectly centered, the distortion correction works.
  • Dynamic Scenario: The user looks 20° to the left. The pupil is now in a different position relative to the optical axis. The distortion profile at this new pupil position is different.
  • Visual Result: The world appears to “swim” or stretch. A static room feels like it is made of jelly.

The Physics of Swim:

It is driven by the rate of change of distortion (slope of the distortion curve).

Engineers minimize this by designing lenses with F-Theta properties or by using Eye Tracking to dynamically adjust the distortion shader in real-time (Dynamic Distortion Correction).
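Since swim is driven by the slope of the distortion curve, a simple numerical estimate of that slope makes a useful screening metric. The quadratic sample profile below is synthetic:

```python
def distortion_slope(distortion_fn, theta_deg, h=0.5):
    """Central-difference estimate of d(distortion)/d(theta), the quantity
    that drives Pupil Swim as the pupil moves off-axis."""
    return (distortion_fn(theta_deg + h) - distortion_fn(theta_deg - h)) / (2 * h)

toy_profile = lambda t: 0.002 * t ** 2  # synthetic distortion (%) vs. field angle
print(round(distortion_slope(toy_profile, 50), 3))  # 0.2 (%/degree)
```

A flat (F-Theta-like) mapping keeps this slope small across the eye box, which is why such designs exhibit less swim.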

MTF Degradation (Blur)

Ultimately, all these aberrations reduce the Modulation Transfer Function (MTF).

  • Center MTF: Typically > 0.6 at 20 cycles/mm.
  • Edge MTF: Often drops to < 0.2 at 20 cycles/mm.

The “Blurry Tunnel”:

When the edge MTF is low, the peripheral vision is “foggy.”

While the fovea (center) doesn’t look there often, the brain uses the peripheral texture for Optical Flow (calculating self-movement).

  • Immersion Impact: If the optical flow cues are blurry, the vestibular system (balance) gets confused. A blurry lens edge is not just ugly; it is a vector for simulation sickness.

The edge of the lens is where the battle for optical quality is lost.

While spherical aberration affects the center, the edge is attacked by Astigmatism, Chromatic Aberration, and Coma simultaneously.

Designing a lens that maintains “Edge-to-Edge Clarity” increases the complexity (and cost) exponentially. In the next section, we compare how the two dominant architectures-Fresnel and Pancake-handle this edge challenge.

Architecture Wars – Fresnel vs. Pancake Edges

Not all lens edges are created equal. The physical architecture of the lens dictates how the image degrades at the periphery.

The industry is currently transitioning from Fresnel lenses (the legacy standard) to Pancake lenses (folded optics). This transition is largely driven by the desire to improve edge performance and eliminate the artifacts that break immersion.

The Fresnel Edge: Grooves and God Rays

Fresnel lenses achieve a wide FOV and short focal length by collapsing the curvature into concentric rings (grooves).

  • The Geometry: The center grooves are shallow. The edge grooves are deep and steep.

The “Draft Face” Problem:

At the edge of the lens, the “Draft Angle” (the non-optical return face of the groove) becomes significant.

  • Artifact: Light hitting the draft face scatters.
  • God Rays: High-contrast objects at the edge create streaks of light pointing toward the center.
  • Immersion Breaker: This artifact is inherently linked to the position of the object. As the user moves their head, the God Rays rotate. This “dynamic artifact” constantly reminds the user they are wearing a headset.

Moiré and Aliasing at the Edge:

At the periphery, the pixel grid of the display interacts with the groove pitch of the Fresnel lens. This can create Moiré patterns-ripples of interference that are fixed to the screen space.

The Pancake Edge: Vignetting and Birefringence

Pancake lenses use polarization folding to compact the optical path. They are smooth (no grooves), so God Rays are eliminated. However, they introduce new edge problems.

  1. Hard Vignetting (The “Binocular” Look)

Pancake lenses often have a smaller physical aperture than Fresnel lenses.

  • Ray Clipping: Due to the folded path, rays entering at steep angles (high FOV) often miss the internal mirrors or are blocked by the housing.
  • Result: The cut-off at the edge is sharp and abrupt (Hard Vignetting), unlike the soft fade of Fresnel. This creates a “scuba mask” feeling that limits the perceived FOV.

  2. Birefringence Leakage (Ghosting)

As discussed in our analysis of the Mura Effect, Pancake lenses rely on Quarter Wave Plates (QWP).

  • Angle Dependence: Retarders are angle-sensitive. A QWP provides exactly λ/4 retardation at 0°. At 50° (the edge), the retardation shifts (e.g., to λ/3.8).
  • The Leak: This shift means the polarization state is not perfectly rotated. The blocking polarizer fails to block the back-reflection.
  • Result: Ghost Images appear specifically at the periphery. Bright objects at the edge of the FOV appear double.

Geometric Distortion Comparison

  • Fresnel: Typically has moderate distortion. The challenge is the “kink” in distortion caused by the zones.
  • Pancake: Typically has massive distortion. To achieve the compact size, the optical power is extreme.
    • The Consequence: The software must apply a heavy warp. This results in the “Pixel Density Drop-off.”
    • Pixel Stretching: At the edge, the image is stretched so much to compensate for the lens compression that the effective resolution (Pixels Per Degree – PPD) drops significantly. The edge looks pixelated not because of the lens blur, but because the software has stretched the pixels.
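The drop-off can be quantified by differentiating the lens mapping: local PPD is how many panel pixels cover one degree of view. The barrel-type mapping and its constants below are synthetic, for illustration only:

```python
def local_ppd(pixel_of_angle, theta_deg, h=0.25):
    """Pixels Per Degree at a field angle: d(panel position) / d(angle)."""
    return (pixel_of_angle(theta_deg + h) - pixel_of_angle(theta_deg - h)) / (2 * h)

def panel_px(theta_deg, f_px=18.0, k=-0.00006):
    # Barrel-type mapping: panel position grows sub-linearly with angle,
    # so fewer panel pixels cover each peripheral degree.
    return f_px * theta_deg * (1 + k * theta_deg ** 2)

print(round(local_ppd(panel_px, 0), 1), round(local_ppd(panel_px, 50), 1))  # 18.0 9.9
```

In this toy model the effective resolution falls from ~18 PPD on-axis to ~10 PPD at 50°, mirroring the pixel-stretching effect described above.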

Eye Relief and the “Keyhole Effect”

The FOV of a Pancake lens is highly sensitive to Eye Relief (how close the eye is to the lens).

  • Fresnel: Forgiving. You can be 15mm away and still see most of the image.
  • Pancake: Unforgiving. Due to the complex pupil matching, if the user moves their eye 5mm back, the FOV collapses rapidly.
  • Immersion Impact: Users with glasses or deep eye sockets often experience a significantly reduced FOV (“Keyhole Effect”) in Pancake headsets compared to Fresnel, damaging immersion.

Edge Performance Comparison

Feature | Fresnel Lens Edge | Pancake Lens Edge
Sharpness (MTF) | Low (scattering/diffraction) | High (refractive purity)
Artifacts | God Rays, ring glare | Ghosting, color shift
Distortion | Moderate | High (requires heavy warp)
Vignetting | Soft roll-off | Hard clipping
Chromatic Aberration | High (diffractive) | Low (corrected)
Eye Box Sensitivity | Low (forgiving) | High (strict placement)

Conclusion of Part 3

The industry shift to Pancake lenses is a trade. We trade the “God Rays” and blur of the Fresnel edge for the sharpness of the Pancake edge, but we pay the price in Light Efficiency and Ghosting.

However, the biggest gain in Pancake optics is “Edge-to-Edge Clarity.” Even if the total FOV is slightly smaller, the fact that the text is readable at the very edge is a massive boost to immersion, allowing the user to scan with their eyes (natural behavior) rather than moving their head (unnatural behavior).

Metrology – Measuring the Limit

We have established that the lens edge is the critical failure point for immersion. But how do we measure it?

Standard optical metrology is designed for cameras, which have flat sensors and moderate field angles. VR lenses require measuring fields up to 100° with a “virtual pupil” that sits inside the lens barrel.

This final section explores the advanced metrology techniques required to validate VR FOV.

The Challenge of High-Angle Metrology

Measuring the center (0°) is easy. Measuring at 50° is a geometric nightmare.

  • Mechanical Collision: The metrology camera must rotate around the “Entrance Pupil” of the lens. At 50°, the camera housing often hits the lens mount.

  • Sensor Acceptance Angle: Standard CCD sensors have micro-lenses that accept light up to ~15-20°. Rays coming in at 50° are rejected by the sensor’s own optics, creating artificial vignetting.

The Solution:

Advanced VR test stations (like those from Rotlex or Gamma Scientific) use Wide-Angle Entrance Pupil Objectives. These are “reverse eye” lenses that can accept light from ±60° and funnel it onto a standard sensor.

Measuring MTF at the Edge

A single MTF number is useless. We need the Field MTF Curve.

  • X-Axis: Field Angle (0° to max).
  • Y-Axis: MTF Value (at a specific frequency, e.g., 20 lp/mm).

The Immersion Threshold:

  • Pass: MTF > 0.2 at the max FOV.
  • Fail: If MTF drops to zero before the hard mechanical aperture stop. This indicates the “Optical FOV” is smaller than the “Mechanical FOV,” meaning the user sees a blurry mess at the edge before they see the black border.
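The pass/fail criterion above is trivial to automate over a measured field curve (a sketch; the sample curve values are invented):

```python
def field_mtf_pass(field_mtf, max_fov_deg, threshold=0.2):
    """Pass if MTF stays above the threshold at every measured field angle
    up to the maximum FOV (the immersion criterion above)."""
    return all(mtf > threshold
               for angle, mtf in field_mtf.items() if angle <= max_fov_deg)

# Synthetic field MTF curve at 20 lp/mm: {field angle (deg): MTF}
curve = {0: 0.65, 20: 0.55, 35: 0.40, 50: 0.25}
print(field_mtf_pass(curve, max_fov_deg=50))  # True
```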

Distortion Mapping (The Warp Mesh)

To fix the massive distortion of VR lenses, manufacturers must generate a calibration file (warp mesh) for every single headset unit.

  • The Process: The metrology camera looks through the lens at a precise dot grid.

  • Measurement: It compares the observed position of the dot at the edge vs. the ideal grid position.

  • Output: A vector map (dx, dy) for every pixel.
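The per-dot measurement reduces to subtracting observed from ideal grid positions (a sketch; the coordinates are illustrative):

```python
def warp_vectors(observed, ideal):
    """Correction vector (dx, dy) = ideal - observed, per calibration dot."""
    return [(ix - ox, iy - oy)
            for (ox, oy), (ix, iy) in zip(observed, ideal)]

# One edge dot, illustrative pixel coordinates:
vecs = warp_vectors([(948.2, 501.7)], [(950.0, 500.0)])
print([(round(dx, 1), round(dy, 1)) for dx, dy in vecs])  # [(1.8, -1.7)]
```

The full warp mesh is this operation repeated across the dot grid, then interpolated to every display pixel.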

Why Edge Metrology Matters Here:

If the metrology system is inaccurate at the edge (due to lens distortion in the test camera), the warp map will be wrong.

  • Result: Pupil Swim. If the warp map is off by even 0.5% at the edge, the virtual world will warp when the user moves their head.

Measuring “Eye Box” Volume

FOV is not static; it changes as the eye moves.

High-end metrology scans the Eye Box Volume.

  1. Place the camera at the nominal eye position. Measure FOV.
  2. Move the camera 5mm Left. Measure FOV.
  3. Move 5mm Up. Measure FOV.

The Stability Metric:

We quantify “Immersion Stability” by calculating the variance of the FOV and MTF across this box.

  • Good Lens: FOV stays constant (100°) as the eye moves.
  • Bad Lens: FOV shrinks to 80° when the eye is 5mm off-center (Vignetting).
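The variance-based stability score can be written directly (a sketch; the FOV samples are invented):

```python
from statistics import pstdev

def fov_stability_spread(fov_samples):
    """Spread of FOV across eye-box positions; lower means more stable."""
    return pstdev(fov_samples)

good_lens = [100, 100, 99, 100, 99]  # FOV (deg) at center and ±5 mm offsets
bad_lens = [100, 85, 82, 88, 80]
print(fov_stability_spread(good_lens) < fov_stability_spread(bad_lens))  # True
```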

Stray Light and Ghost Analysis

To quantify the edge artifacts (God Rays / Ghosts), we use Stray Light Source Measurement.

  • Setup: A super-bright light source is positioned just outside the FOV (e.g., at 60°).
  • Measurement: The camera looks at the black screen.
  • Analysis: Any light detected is “Veiling Glare” or scattering.
  • Immersion Impact: This simulates a user standing in a dark room with a bright light to their side. If the lens edge scatters this light, the entire image washes out (contrast loss).

The Future: Foveated Metrology

As Eye Tracking becomes standard, lenses are being designed with Dynamic Characteristics.

Future metrology will not just measure the “Static Lens”; it will measure the “System Loop.”

  • Concept: The test station talks to the headset (“I am looking at 30°”), the headset renders high resolution at 30°, and the camera measures the MTF at 30°.

This closes the loop between Optical Hardware, Rendering Software, and Perceptual Physics.

Frequently Asked Questions

What is the difference between “Monocular FOV” and “Binocular FOV,” and which matters more for immersion?

Monocular FOV is the field visible to one eye, typically around 90°-100° in modern headsets. Binocular FOV is the combined field of both eyes, which can reach 110°-120°. For immersion, the Binocular Overlap (the central area where both fields overlap) is actually the most critical. High overlap (~80°-90°) enables strong 3D depth perception (stereopsis). If the overlap is too small, users feel like they are looking through binoculars with a divider in the middle, which destroys the sense of presence despite a wide total FOV.

Why do VR lens edges often look darker than the center (Vignetting)?

This is governed by the Cosine Fourth Law of Illumination (I ∝ cos⁴θ). As the viewing angle increases, the brightness naturally drops because the light hits the lens aperture at a steep angle, effectively reducing the accessible area of the pupil. In VR, this is compounded by mechanical blocking (lens rings) and the “folded” path of Pancake lenses, which clip light rays at high angles. Engineers use software to brighten the edges (Vignetting Correction), but this raises the black levels and noise.

What is “Pupil Swim,” and why does it cause motion sickness?

Pupil Swim is a dynamic distortion artifact. It happens when the geometric distortion of the lens varies as your eye rotates to look at different parts of the screen. If you look at a straight door frame in the virtual world and turn your head, the door frame might appear to bend or wobble. This discrepancy between the visual cue (wobbling world) and the vestibular cue (steady head movement) triggers the brain’s poison response mechanism, causing nausea.

Why do text and white lines separate into red and blue colors at the edge of the lens?

This is Lateral Chromatic Aberration (LCA). The lens acts like a prism, bending blue light more than red light. At the center (optical axis), the colors align. At the edge, the difference in refraction angle is maximized, causing the colors to separate. VR software tries to fix this by pre-distorting the Red, Green, and Blue channels of the image separately, but it cannot perfectly correct high-order, non-linear chromatic aberrations in complex aspheric lenses.

How do Pancake lenses improve edge sharpness compared to Fresnel lenses?

Fresnel lenses have physical grooves. At the edge, these grooves are deep and steep, causing scattering and diffraction that blur the image. Pancake lenses are smooth refractive elements (usually glass or high-grade plastic). They eliminate the groove structure, allowing for continuous, sharp refraction even at high angles. This results in “Edge-to-Edge Clarity,” where text remains readable even in the periphery, allowing users to scan with their eyes instead of their heads.

What is the “Sweet Spot” in a VR lens?

The Sweet Spot is the area of the lens (usually the center) where the MTF (sharpness) is high and aberrations are low. In older Fresnel headsets, the sweet spot was small; if the headset shifted slightly on your face, the image became blurry. Modern Pancake lenses have a much larger “Optical Sweet Spot,” meaning the image remains sharp even if the eye is not perfectly centered, or if the user looks toward the edges.

Can we just make the lenses bigger to increase FOV?

Ideally yes, but Weight and Distortion prevent this. Larger lenses require more glass/plastic, making the headset heavy and front-heavy (bad ergonomics). Furthermore, as diameter increases, the optical aberrations (especially distortion and field curvature) grow exponentially, requiring more corrective elements, which adds even more weight. The trend is toward smaller but closer lenses (Pancake) to balance FOV with form factor.

Why do I see “God Rays” or streaks of light at the edge of Fresnel lenses?

God Rays are caused by light scattering off the Draft Face (the vertical riser) of the Fresnel grooves. When a bright object (like white text) is on a dark background near the edge of the FOV, light hits these risers and scatters radially across the lens. This is a contrast-killing artifact unique to the discontinuous topology of Fresnel optics and is largely eliminated in Pancake designs.

How is the “Mura Effect” related to the lens edge?

While Mura (clouding/grain) comes from the display or lamination, the lens edge magnifies it. Angle-Dependent Mura is a specific type where the display looks uniform when viewed from the center, but as the user looks through the edge of the lens, the changing angle of light through the polarization layers (in Pancake stacks) reveals non-uniformities, color shifts, or brightness blobs that were invisible on-axis.

How is VR Field of View measured accurately?

You cannot use a standard camera. Metrology engineers use a specialized Wide-Angle Entrance Pupil Camera. The camera lens mimics the human eye (with the aperture at the front) and rotates around the lens’s “Eye Point” to capture the full 100°+ field. This allows measuring the MTF and distortion exactly as the user’s retina would receive it, rather than measuring a flat projection.

Conclusion

Field of View is the canvas of Virtual Reality.

While marketing teams push for higher numbers (120°, 150°, 200°), the optical engineer knows that Quality > Quantity.

A 100° FOV that is sharp, chromatically correct, and geometrically stable from edge to edge provides vastly superior immersion to a 130° FOV that is blurry and swimming at the periphery.

Understanding the physics of the lens edge-the interplay of Vignetting, Distortion, and Chromatic Aberration-is the key to unlocking true “Presence.” The lens edge is the horizon of the virtual world; if the horizon is broken, the reality crumbles.
