What Is Distortion in Lenses? Why It Happens, and How to Fix It
- Vadzo Imaging

- Sep 17, 2024
- 14 min read
You might have noticed buildings appear to bulge outward in wide-angle shots, and a grid of straight tiles starts to look slightly curved when you look through the camera. This is due to distortion in lenses. It occurs more often than many of us think.

Lens distortion is something you cannot ignore, whatever your objective. You could be integrating cameras into a robotics platform, developing a machine vision system for industrial use, or designing a medical imaging device. During a medical procedure, for example, distorted imagery could throw off depth perception. Similarly, on a factory inspection line, a distorted image could cause good parts to be flagged as defective. That is why you cannot afford to overlook lens distortion.
In this guide, we walk you through exactly what lens distortion is, the different forms it takes, including the notorious camera barrel distortion, what causes it, and how lens distortion correction actually works in practice.
What Is Lens Distortion?
So, what is lens distortion? Fundamentally, lens distortion is a type of optical aberration, specifically a geometric aberration: it shifts features in an image away from their ideal positions.
In a perfect lens, light from each point in the scene would land exactly on its matching point on the image sensor, and straight lines in the real world would appear as straight lines in the image. When distortion occurs, this accuracy breaks down, and points are displaced from where they are supposed to be. This is what causes straight lines to appear curved and geometric shapes to look warped.
In embedded vision, this distinction matters because distortion is recoverable: engineers can choose wide-angle lenses for maximum coverage and then rectify the distorted image in software, without compromising the underlying image quality.
How Does Distortion in Lenses Actually Happen?
First, you have to understand refraction if you want to understand distortion in lenses. Refraction is the bending of light as it passes from one medium to another. When light crosses from air into glass, it slows down and changes direction. How sharply it bends is determined by the glass's refractive index.
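The bending described above is governed by Snell's law, n1·sin(θ1) = n2·sin(θ2). As a quick illustration (the refractive index of 1.517 for crown glass is a typical textbook value, not tied to any particular lens):

```python
import math

def refraction_angle(theta_incident_deg, n1=1.000, n2=1.517):
    """Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
    Default indices: air (~1.000) into crown glass (~1.517)."""
    theta1 = math.radians(theta_incident_deg)
    sin_theta2 = n1 * math.sin(theta1) / n2
    return math.degrees(math.asin(sin_theta2))

# A ray entering the glass at 30 degrees bends to roughly 19.2 degrees.
print(round(refraction_angle(30.0), 1))  # 19.2
```

The higher the refractive index, the more the ray bends toward the surface normal, which is exactly the effect a lens designer exploits at every curved glass surface.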
Now here is where it gets interesting. A lens is not a flat piece of glass. It is curved, so different light rays strike it at different distances from the optical center and at different angles. Because of that:

Light rays that pass through the center of the lens undergo the least refraction and converge most accurately.
Light rays that hit the lens's outer edges are bent more sharply, and the interaction between the lens's curvature and their angle of incidence introduces small but measurable displacement errors.
This is why distortion is most visible at the corners and edges of an image, while the center usually looks accurate. As you move outward from the optical axis, the lens's geometry departs further from that of a perfect lens, and displacement errors accumulate.
Distortion does not increase evenly as you move toward the corners. It grows roughly with the cube of the distance from the center, so even a small increase in field of view can cause a large jump in edge distortion.
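That cubic relationship can be sketched with the lowest-order radial distortion term; the coefficient k1 below is purely illustrative, not taken from any real lens:

```python
# Minimal sketch of third-order radial distortion: the displacement of an
# image point grows with the cube of its distance r from the optical center.
# With distorted radius r_d = r * (1 + k1 * r**2), the displacement is
# r_d - r = k1 * r**3.
def radial_displacement(r, k1=-0.15):
    return k1 * r**3

# Doubling the distance from the center multiplies the displacement by 2**3 = 8.
d_half = abs(radial_displacement(0.5))
d_full = abs(radial_displacement(1.0))
print(d_full / d_half)  # 8.0
```

Real lenses add higher-order terms on top of this, but the cubic term dominates for moderate fields of view, which is why edge distortion grows so quickly.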
Distortion vs. Other Aberrations: What Makes It Different?
Not all imaging artifacts are created equally. Distortion sits in a unique category when compared to other common aberrations:
Distortion: The data is still there; it is just in the wrong place. Because no information is destroyed, it can be mathematically corrected.
Chromatic Aberration: Different wavelengths of light focus at slightly different points, causing color fringing around high-contrast edges. Once the pixel data is mixed, it cannot be cleanly separated after the fact.
Spherical Aberration: Light rays at different distances from the center focus at different depths, creating an overall blur. Details get blended, and information is lost across pixels.
For example, in chromatic aberration, color information from one wavelength is misplaced and ends up overlapping the correct pixel data from another wavelength. Because both signals are mixed into the same pixel, there is no way to determine which part of the recorded intensity belongs to each. The original data is effectively destroyed.
Distortion does not do this. When a point is geometrically displaced by distortion, it still lands on its own dedicated pixel, just not the pixel it would have landed on without the aberration. This means you can apply a corrective mapping that shifts each pixel back to where it originally belonged.
When Distortion Does Destroy Information
There is an important exception to the idea that distortion can always be recovered, especially for wide-angle and fisheye lenses with very large fields of view.
When distortion gets strong, a single pixel on the sensor could end up covering a very large area of the real world. It might be so large that it could capture both dark and bright details at the same time. Instead of recording one tone accurately, it averages them together, producing an intermediate grey value that represents neither feature correctly.
In that situation, the information genuinely is lost. No amount of post-processing can tell you how much of that pixel's value came from the bright feature versus the dark one. This is why extremely wide-angle lenses, while offering impressive coverage, can introduce image quality trade-offs that software alone cannot fully resolve.
Does Wavelength Affect Distortion in Lenses?
Many engineers treat lens distortion as a monochromatic issue, meaning it is analyzed without treating the color of light as a factor. In reality, it is a little more complex than that.
Because distortion is caused by refraction, and refraction varies with wavelength, distortion changes depending on the wavelength of light passing through the lens, much as a prism splits white light into a rainbow. Blue light refracts differently from red light, so each color channel has its own distortion profile.
In practice, for most visible-light imaging applications, this wavelength-dependent variation in distortion is small enough to ignore. However, in near-infrared (NIR) imaging, which is common in machine vision, biometric scanning, and night-vision surveillance, the wavelength shift from visible light can cause a meaningful change in the distortion profile. Calibration performed with visible light may not perfectly correct a distorted image captured in NIR.
Vadzo Imaging offers several NIR-capable and monochrome cameras specifically designed for these applications, such as the AR2020 – 20MP Monochrome MIPI Camera, which is well-suited for applications where wavelength-accurate calibration matters.
How Field of View Amplifies Distortion
One of the most consistent patterns in optical engineering is this: the wider the field of view, the more pronounced the distortion.
This is not a defect in manufacturing. It is just the fundamental nature of how geometry works.
A camera lens is made up of many elements that work together to converge incoming light rays onto a focal point on the sensor. As the field of view gets wider, light enters at steeper angles, so the outer portions of the lens must bend it more. And as we have seen, the more the light bends, the more opportunity there is for displacement error.

The relationship is not linear. Because distortion grows with the cube of the radial field distance, moving from a 60-degree FOV lens to a 120-degree FOV lens does not double the distortion; it multiplies it many times over in the outer frame regions. This is why a fisheye lens with a 180-degree or greater FOV produces dramatically curved horizon lines, while a standard 50mm prime lens shows almost none.
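As a rough back-of-envelope check, assuming an ideal rectilinear projection (edge radius proportional to the tangent of the half-FOV) and a purely cubic distortion term — both simplifications that real lenses deviate from:

```python
import math

# How FOV amplifies edge distortion, under two idealizing assumptions:
# (1) rectilinear projection: the image-plane radius of the frame edge is
#     proportional to tan(half-FOV);
# (2) displacement grows with the cube of that radius.
def edge_radius(fov_deg):
    return math.tan(math.radians(fov_deg / 2))

r60 = edge_radius(60)    # tan(30 deg) ~ 0.577
r120 = edge_radius(120)  # tan(60 deg) ~ 1.732

# Going from 60 to 120 degrees triples the edge radius, so the cubic
# displacement term grows by roughly 3**3 = 27x, not 2x.
ratio = (r120 / r60) ** 3
print(round(ratio, 1))  # 27.0
```

Treat these numbers as illustrative only; the point is that the growth is far steeper than linear.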
For applications requiring wide spatial coverage without severe distortion, such as traffic monitoring, retail analytics, or autonomous vehicle perception, Vadzo's AR1335 – 13MP Autofocus USB 3.0 Camera and Innova-662CRS Ultra Low Light 1080p GigE Camera are engineered with optimized lens assemblies to balance coverage and geometric accuracy.
Types of Distortion in Lenses
There are four primary types of distortion you will encounter in optical imaging systems. Each has a distinct visual signature and tends to appear in specific lens types:

1. Barrel Distortion (Camera Barrel Distortion)
Camera barrel distortion is the most commonly encountered form of distortion in everyday photography and embedded vision. In barrel distortion, image magnification decreases as you move away from the optical center, so the outer parts of the scene are compressed relative to the center. This makes the image look as if it were wrapped around the outside of a barrel. If you capture a square grid with a heavily barrel-distorted lens, the edges will curve outward instead of staying straight.
You’ll notice barrel distortion the most in:
Wide-angle and ultra-wide-angle lenses
Fisheye lenses
Zoom lenses used at their widest setting
High field-of-view camera modules built for surveillance or automotive systems
For applications where some barrel distortion is acceptable but geometric accuracy still matters, like warehouse logistics or smart retail, Vadzo's AR1335 – 13MP Fixed Focus 4K USB Camera provides a wide FOV with manageable distortion characteristics.
2. Pincushion Distortion
Pincushion distortion is the geometric inverse of barrel distortion. Here, image magnification increases as you move away from the optical center, meaning the outer parts of the scene are stretched outward relative to the center.
The result is that straight lines appear to bow inward, toward the optical axis, resembling the sides of a pincushion. If you capture a rectangular grid with strong pincushion distortion, the outer edges will curve inward.
Pincushion distortion is most common in:
Telephoto and super-telephoto lenses
Zoom lenses used at their longest focal length (the telephoto end)
Complex lens systems whose many elements over-compensate for barrel distortion
3. Keystone Distortion
Unlike barrel and pincushion distortion, which are caused by the lens itself, keystone distortion is caused mainly by the relative positions of the camera and the subject.
When the camera is not exactly parallel to what you are capturing, straight lines start to appear tilted or converging. A common example is capturing a tall building from the ground: the sides seem to lean inward toward the top, like a keystone.
Keystone distortion is most common in:
Architectural photography where the camera is tilted upward or downward
Document scanning where the page is not flat
Industrial inspection where the camera is not aligned perpendicular to the inspection surface
Projector systems where the projector is not centered on the screen
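Because keystone distortion is a perspective effect rather than a lens defect, it can be corrected with a projective transform (a homography) that maps the tilted, trapezoid-shaped subject back to a true rectangle. A minimal sketch, with made-up corner coordinates:

```python
import numpy as np

def find_homography(src, dst):
    """Solve for the 3x3 homography H (with h33 = 1) that maps four source
    points to four destination points, via the standard 8x8 linear system."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    """Apply H to a point in homogeneous coordinates and dehomogenize."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# A building shot from ground level: the top edge appears narrower than the
# bottom. These corner coordinates are illustrative only.
trapezoid = [(120, 40), (520, 40), (600, 440), (40, 440)]
rectangle = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = find_homography(trapezoid, rectangle)

# Each trapezoid corner lands (up to floating-point error) on its rectangle
# corner; a full correction would resample every pixel through H.
print(apply_h(H, (120, 40)))
```

This is the same math a projector's keystone setting or a document scanner's dewarping step performs; in practice libraries such as OpenCV provide equivalent routines.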
Eliminating keystone distortion often requires careful physical mounting of the camera. Vadzo's AR0234 – 2MP Global Shutter MIPI Camera is a popular choice for document capture and inspection systems, where precise perpendicular mounting is combined with a global shutter for distortion-free image capture of static scenes.
4. Mustache (Wave) Distortion
Mustache distortion, also called wave distortion or complex distortion, is a hybrid type that combines barrel and pincushion effects within the same image. Specifically, the center of the image exhibits barrel-like outward bowing, while the outer regions of the image exhibit pincushion-like inward bowing (or vice versa).
This creates a wave-like curve in what should be a straight horizontal line, resembling the curves of a mustache, which gives the distortion its name.
Mustache distortion is most common in:
Complex multi-element lens designs, particularly certain wide-angle zoom lenses
Lenses that have been optically corrected for simple barrel distortion, where the correction itself is not perfectly uniform across the field
Compact camera systems with aggressive lens compression
Of the four types, mustache distortion is the most difficult to correct because it cannot be fully described by simple radial polynomial models; more complex correction algorithms are required.
Lens Distortion Correction: Digital Method vs Optical
Whatever the type, lens distortion can be corrected in two main ways: optical correction, and digital (software) correction.
Optical Correction: Fixing It in the Lens
Optical correction addresses the issue at its source, inside the lens itself, by adding extra elements with opposite distortion characteristics. This way, the light is corrected before it even reaches the sensor. For example, a pincushion-distorting element can partially cancel a barrel-distorting element.
The trade-offs are real:
Additional lens elements add weight, length, and cost to the optical system.
More glass surfaces mean more internal reflections, which can introduce lens flare into the image, an artifact that is harder to remove than distortion itself.
Optical correction is never perfectly uniform across all focal lengths in a zoom lens.
It adds complexity to the manufacturing and quality control process.
For many embedded vision applications, especially compact, cost-sensitive designs like IoT cameras, endoscopes, or wearable devices, optical correction alone is not practical. Digital correction helps a lot here.
Digital Correction (Software Fix)
Instead of changing the lens, this method corrects distortion after the image is captured. It works by first measuring how the lens distorts the image and then applying the opposite transformation. A commonly used model for this is the Brown–Conrady model, which mathematically describes both radial and tangential distortion. The process has three steps:
Camera Calibration: The camera captures a known pattern, and the software compares the captured pattern to its expected appearance. The differences are used to calculate the lens's distortion coefficients.
Coefficient Mapping: Those coefficients are used to build a map that tells the software exactly where each pixel should be placed once the distortion is removed.
Interpolate and Remap: The image is rebuilt using that map, and wherever a pixel does not land on an exact grid position, interpolation fills in the color values smoothly.
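The Brown–Conrady model mentioned above can be sketched directly in normalized image coordinates. The coefficients below (k1, k2 radial; p1, p2 tangential) are placeholders, not values from a real calibration:

```python
import numpy as np

def brown_conrady(x, y, k1=-0.2, k2=0.05, p1=0.001, p2=-0.001):
    """Map ideal (undistorted) normalized coordinates to their distorted
    positions, using radial terms (k1, k2) and tangential terms (p1, p2)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# Undistortion builds a lookup map: for every output (ideal) pixel, compute
# which distorted source location its value should be pulled from, then
# interpolate. Here we build that map for a small normalized grid.
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
map_x, map_y = brown_conrady(xs, ys)

# Sanity checks: the optical center is unmoved, and with a negative k1
# (barrel distortion) a corner point is pulled inward toward the center.
print(brown_conrady(0.0, 0.0))  # (0.0, 0.0)
print(abs(map_x[0, 0]) < 1.0)   # True
```

In production, calibration toolchains (OpenCV's being the most common) estimate these coefficients from checkerboard captures and perform the remap step for you; this sketch only shows the model they fit.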
Several Vadzo cameras include built-in support for lens calibration integration and are compatible with standard ISP correction pipelines. The Innova-678CRS IMX678 Sony Starvis2 HDR 4K GigE Camera, for example, is used in automotive and industrial inspection platforms where in-system distortion correction is applied as part of the image processing pipeline.
ISP-Level Lens Distortion Correction
Many modern image signal processors (ISPs) include dedicated lens shading and geometric correction blocks that can apply distortion correction on the fly, without any CPU overhead. This is particularly valuable in latency-sensitive applications like ADAS (Advanced Driver Assistance Systems), robotic guidance, or live medical imaging.
ISP-based lens distortion correction requires that the correction coefficients be loaded into the ISP during initialization, typically from a calibration file stored in non-volatile memory. Once loaded, the correction runs automatically in the background, so the video output appears undistorted to everything downstream.
Vadzo Imaging's cameras, such as the Innova-662CRS IMX662 Ultra Low Light 1080P GigE Camera and Innova-678CRS IMX678 Sony Starvis2 HDR 4K GigE Camera, are designed for integration with ISP pipelines that include geometric correction, making them excellent choices for high-performance, distortion-corrected embedded vision deployments.
How Vadzo Imaging Cameras Help Tackle Distortion
At Vadzo Imaging, we design and manufacture embedded cameras for demanding real-world applications, and distortion management is a core part of how we approach product engineering.
Here are some of the ways our camera lineup directly addresses distortion-related challenges:
1. Low-Distortion Wide-FOV Cameras for Surveillance & Automotive
The Innova-662CRS IMX662 Ultra Low Light 1080P GigE Camera is specifically engineered for wide field-of-view surveillance and automotive applications where geometric accuracy is critical. It features a built-in distortion correction engine along with an on-board dewarping engine, enabling real-time correction of lens distortion directly at the hardware level. This ensures that the output stream is already geometrically accurate before reaching the host system. By eliminating the need for software-based correction pipelines, the camera reduces CPU load and latency, making it highly suitable for ADAS, perimeter surveillance, and surround-view systems.
Key Benefit: Real-time hardware-level distortion correction and dewarping with zero CPU overhead or post-processing requirements.
2. Global Shutter Cameras for High-Speed, Distortion-Free Capture
The Vajra-AR0235 – 2.3MP Global Shutter USB 3.2 Gen 2×2 Camera is designed for environments where motion accuracy is essential. Unlike rolling shutter cameras that introduce temporal distortion due to sequential pixel capture, this camera captures all pixels simultaneously. This eliminates motion-induced distortion and ensures that the captured image accurately represents the spatial structure of the scene. It is particularly valuable in high-speed industrial and inspection applications.
Key Benefit: Simultaneous pixel capture removes rolling shutter distortion, ensuring accurate imaging of fast-moving objects.
3. High-Resolution Cameras for Precision Measurement Applications
The Falcon-AR2020 – 20MP Monochrome USB 3.0 Camera is ideal for metrology and dimensional inspection tasks where precision is paramount. Its high-resolution sensor allows for sub-pixel measurement accuracy when combined with proper calibration and distortion correction techniques. Additionally, Vadzo Imaging enhances performance through in-house optics customization, developing lens assemblies that achieve near-zero distortion even at wide fields of view (100° HFOV and above). This enables accurate measurements across larger inspection areas without the typical distortion issues of standard lenses.
Key Benefit: High resolution combined with custom optics enables near-zero distortion and sub-pixel measurement accuracy.
4. HDR Cameras for Challenging Lighting Conditions
The Innova-678CRS IMX678 Sony Starvis2 HDR 4K GigE Camera integrates high dynamic range imaging with advanced distortion correction capabilities. It includes an on-board distortion correction engine and a dedicated dewarping engine that processes images in real time at the hardware level. This ensures geometrically accurate output even before data reaches the host system. In high-contrast environments, such as outdoor and automotive settings, HDR ensures proper exposure across the entire frame, including edge regions that are typically prone to distortion. By addressing both lighting and distortion challenges simultaneously, the camera delivers reliable, high-quality imaging without requiring additional processing.
Key Benefit: Combines HDR and hardware-level distortion correction to ensure accurate, well-exposed images in complex lighting conditions.
Frequently Asked Questions (FAQs)
What is the difference between barrel distortion and pincushion distortion?
Barrel distortion makes straight lines bow outward toward the frame edges. It is the most common form of distortion in lenses and appears almost every time you use a wide-angle lens. Pincushion distortion is the opposite, where lines curve inward toward the optical center, and is typical of telephoto lenses. For wide FOV surveillance and automotive cameras, the Innova-662CRS tackles this directly. Its built-in distortion correction engine and on-board dewarping engine correct the geometry inside the camera itself, so you receive a clean, accurate frame at the output without any host-side processing.
Can lens distortion correction completely remove all distortion from an image?
In most cases, yes. A proper calibration process will restore straight lines and bring the image back to geometric accuracy. Where it gets difficult is with extreme wide-angle lenses, where outer zones can blend multiple features into a single pixel, and that information cannot be recovered. This is why Vadzo offers optics customization to achieve close to zero distortion at 100 degrees HFOV and above, reducing the correction burden before it starts. Paired with the AR2020 20MP Monochrome Camera, the residual distortion after correction becomes negligible even for precision measurement applications.
Does the camera interface affect how much distortion appears in the image?
The interface does not cause distortion. Barrel distortion and pincushion distortion originate in the lens optics, not in whether the camera uses USB, GigE, or MIPI. What does affect distortion is the sensor readout method. A rolling shutter sensor captures rows at different moments in time, which creates spatial warping on top of lens distortion. The AR0235 2.3MP Global Shutter USB 3.2 Gen 2x2 Camera eliminates this completely by capturing every pixel simultaneously, giving you a clean, undistorted baseline before any lens correction is even applied.
How do I pick the right Vadzo camera for a distortion-critical application?
It depends on your specific challenge. For wide-area surveillance or automotive use, the Innova-662CRS has an on-board distortion correction engine and dewarping engine that deliver corrected frames directly from the camera. For high-speed imaging, the AR0235 Global Shutter Camera removes rolling shutter spatial distortion at capture. For precision measurement, the AR2020 20MP Monochrome Camera gives you the resolution margin to make residual distortion negligible. For HDR outdoor environments, the Innova-678CRS handles both lighting and geometric correction on board, with no host-side overhead.
Is keystone distortion the same as lens distortion?
Not exactly. Lens distortion, like barrel or pincushion, comes from how light bends through the lens optics and appears in every image that lens produces. Keystone distortion is a perspective issue that occurs when the camera and subject planes are not parallel, causing parallel lines to appear to converge. Fixing them requires separate correction operations. The Innova-662CRS and Innova-678CRS address this with on-board dewarping engines that handle multiple distortion types at the hardware level, so what reaches your system is already spatially accurate and ready to use.
Final Thoughts on Lens Distortion Correction in Embedded Vision Systems
Distortion in lenses is one of the most fundamental imaging challenges in any camera-based system, and it is also one of the most manageable once you understand what you are dealing with.
Whether you are contending with camera barrel distortion from a wide-angle surveillance lens, pincushion distortion at the telephoto end of a zoom, or the more complex wave patterns of mustache distortion, the underlying physics is always the same: uneven refraction of light rays as they pass through curved glass.
The good news is that a distorted image is recoverable, at least in most practical scenarios, through a combination of smart lens selection, careful system calibration, and proven lens distortion correction techniques at the software or ISP level.
If you are building an embedded vision system and want to start with a camera that gives you a strong foundation for minimal distortion and excellent image quality right out of the box, explore Vadzo Imaging's full range of USB cameras, MIPI cameras, GigE cameras, and SerDes cameras. Our team of application engineers is ready to help you find the right optical solution for your specific requirements.



