Camera phones have changed rapidly over the years since the first camera phone was introduced in 2000 with a 0.35-megapixel sensor. Fast-forward 19 years to the Nokia 9 PureView, a phone with five 12-megapixel sensors.
Two years later came the Samsung Galaxy S21 Ultra 5G, a quad-camera phone with a 108MP main camera and 100 times “Space Zoom”. At the moment, the triple-camera iPhone 12 Pro Max, with its three 12MP sensors, sits at the top in smartphone camera performance, with the aforementioned Samsung Galaxy S21 Ultra 5G at number four.
In the old days it was easier to differentiate between phones and their respective results. But now that every manufacturer marches to its own beat, it’s harder to judge which is best. With so much jargon to contend with, it can feel overwhelming to even begin talking about the camera experience.
Why does more megapixels or more sensors not mean better camera performance? What even is a megapixel or an aperture? What else matters when considering a phone? Let’s explore.
What is a megapixel?
A megapixel describes how much information a photo contains, which also means that higher-megapixel images come with larger file sizes.
A megapixel literally means one million pixels. For example, a 12-megapixel camera can produce images with 12 million total pixels. That’s a lot of dots, right?
Photos with more megapixels will certainly have more detail, but they also come with a larger file size. This can take longer to process or share, and it could eat up your phone’s storage rather quickly.
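To make the math concrete, here’s a quick Python sketch. The 4000 × 3000 resolution is just a common example of a 12MP sensor, and the three-bytes-per-pixel figure assumes uncompressed RGB; real photos are compressed well below this.

```python
def megapixels(width: int, height: int) -> float:
    """Total pixels in millions: a 'megapixel' is one million pixels."""
    return width * height / 1_000_000

def raw_size_mb(width: int, height: int, bytes_per_pixel: int = 3) -> float:
    """Uncompressed size in megabytes, assuming 3 bytes (RGB) per pixel."""
    return width * height * bytes_per_pixel / 1_000_000

# A typical 12MP sensor outputs 4000 x 3000 pixels:
print(megapixels(4000, 3000))    # → 12.0
print(raw_size_mb(4000, 3000))   # → 36.0 (MB, before compression)
```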
What is aperture?
Aperture is an opening in your camera lens that allows light to pass through, affecting the brightness of the photo as well as the depth of field.
Aperture is measured in fractions known as f-stops, with a value of f/4 representing a larger aperture than f/16. For reference, the iPhone 12 Pro Max’s cameras range between f/2.4 and f/1.6.
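The f-number is the lens’s focal length divided by the diameter of the opening, and the light gathered scales with the area of that opening. A small Python sketch (the 26mm focal length below is a made-up example, not a spec):

```python
def aperture_diameter(focal_length_mm: float, f_number: float) -> float:
    """The f-number is focal length divided by the aperture's diameter."""
    return focal_length_mm / f_number

def relative_light(f_a: float, f_b: float) -> float:
    """How much more light f/f_a gathers than f/f_b (area scales with 1/N^2)."""
    return (f_b / f_a) ** 2

# A hypothetical 26mm lens at f/1.6 has a roughly 16mm opening:
print(aperture_diameter(26, 1.6))
# Lower f-number = wider opening = more light; f/1.6 gathers ~2.25x f/2.4:
print(relative_light(1.6, 2.4))
```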
Types of lenses
If you’re looking to buy a modern smartphone with multiple lenses, you’ll probably encounter three types of lenses:
- a telephoto lens with a very large focal length (the distance between the lens and the point where light converges to form a sharp image)
- an ultra-wide lens with a small focal length
- a wide lens that falls in between these two focal ranges
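As a rough sketch, these lens types map to focal-length ranges. The cutoffs below (in 35mm-equivalent millimeters) are illustrative, not industry standards:

```python
def lens_type(focal_length_mm: float) -> str:
    """Bucket a 35mm-equivalent focal length; cutoffs here are illustrative."""
    if focal_length_mm < 24:
        return "ultra-wide"    # small focal length, very wide view
    elif focal_length_mm <= 50:
        return "wide"          # the 'normal' main camera range
    else:
        return "telephoto"     # large focal length, magnified view

print(lens_type(13))   # → ultra-wide
print(lens_type(26))   # → wide
print(lens_type(65))   # → telephoto
```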
A telephoto lens stacks lens elements to increase magnification. Periscope cameras take this a step further, using angled mirrors to fold a long light path into the phone body, increasing magnification while keeping the camera system flat.
Somewhere in the middle: Acronym Avenue
OIS, EIS, HDR, PDAF, Laser AF. These are all acronyms you might see when you’re buying a phone and reading about its cameras, and it’s very likely that, just like me, you’ll find these acronyms mean absolutely nothing to you. So let’s learn together.
Image Stabilization: OIS vs EIS
If you’re like me and can’t hold your phone still when taking a picture and are then surprised when your picture isn’t a blurry mess, you have image stabilization to thank for this. Image stabilization can be accomplished one of two ways, optically or electronically.
Optical Image Stabilization
Optical Image Stabilization (OIS) is a solution baked into the hardware of the camera and works rather simply: a gyroscope in the camera detects movements and shifts the camera in the opposite direction of the movement.
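The idea can be sketched in a few lines: whatever movement the gyroscope reports, the lens is shifted the opposite way. The 1:1 gain here is a simplification; real systems are tuned far more carefully:

```python
def ois_correction(gyro_dx: float, gyro_dy: float, gain: float = 1.0):
    """Shift the lens opposite to the detected shake (simplified model)."""
    return (-gain * gyro_dx, -gain * gyro_dy)

# The phone shakes 0.2mm right and 0.1mm up; the lens moves the other way:
print(ois_correction(0.2, 0.1))   # → (-0.2, -0.1)
```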
Electronic Image Stabilization
Electronic Image Stabilization (EIS) is a software solution that uses your phone’s accelerometer to detect movements and, in modern implementations, tries to keep the camera’s focus locked on a specific point while processing the image. One notable example is the “Super Steady” mode on the Samsung Galaxy S21 Ultra.
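One common way EIS works is to record a slightly larger frame than it shows and slide the visible crop window against the detected motion. A simplified sketch (the frame and crop sizes are made-up numbers):

```python
def eis_crop_origin(motion_x: int, motion_y: int,
                    frame_w: int, frame_h: int,
                    crop_w: int, crop_h: int):
    """EIS records a larger frame than it shows, then slides the visible
    crop window opposite to the detected motion (clamped to the frame)."""
    # Start centered, then counter-shift by the detected motion.
    x = (frame_w - crop_w) // 2 - motion_x
    y = (frame_h - crop_h) // 2 - motion_y
    # Keep the crop inside the recorded frame.
    x = max(0, min(x, frame_w - crop_w))
    y = max(0, min(y, frame_h - crop_h))
    return x, y

# 4000x3000 recorded frame, 3600x2700 visible crop, shake of (+40, -25):
print(eis_crop_origin(40, -25, 4000, 3000, 3600, 2700))  # → (160, 175)
```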
These two methods can be, and often are, combined to produce the most stable image possible, a combination called Hybrid Image Stabilization.
Autofocus: Lasers and Phase(rs)
Autofocusing is a method of adjusting the lens in your camera to produce a sharp image. There are two methods commonly used today, and both can make your photos focused AF.
Laser Autofocus (Laser AF)
You might think of lasers as the exclusive property of evil villains, but in Laser Autofocus (Laser AF), they’re actually something helpful. Essentially, an infrared emitter is built into your camera system. When you take a picture, a pulse of infrared light is emitted, and the time it takes for the beam to bounce back from your subject is used to calculate the distance to the subject; the lens is then adjusted accordingly. This is beneficial because it works quickly and in any lighting conditions.
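The distance math is simple time-of-flight: the beam travels to the subject and back, so distance is the speed of light times the round-trip time, divided by two. A sketch:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance from a time-of-flight reading: the beam travels out
    and back, so divide the round trip by two."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after ~6.67 nanoseconds hit something about 1m away:
print(tof_distance_m(6.67e-9))
```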
Phase Detection Autofocus (PDAF)
Phase Detection? While you may think this has something to do with a parent analyzing your Hot Topic receipts, this article is still about cameras. Phase Detection Autofocus is a method that compares light arriving from opposite sides of the lens and adjusts the lens until those two beams align on the sensor, which is when the image is in focus.
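A toy version of the phase comparison: treat the two half-images as 1D signals and find the shift that best aligns them. A nonzero offset tells the camera which way, and roughly how far, to move the lens. The signals below are invented for illustration:

```python
def phase_offset(left, right, max_shift=3):
    """Shift that best aligns the two half-pupil signals via naive
    cross-correlation; an offset of zero means the subject is in focus."""
    best_shift, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(left[i] * right[i + s]
                    for i in range(len(left))
                    if 0 <= i + s < len(right))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# The right half-image lags the left by two samples: out of focus.
left  = [0, 0, 1, 5, 1, 0, 0, 0]
right = [0, 0, 0, 0, 1, 5, 1, 0]
print(phase_offset(left, right))  # → 2
```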
High Dynamic Range (HDR)
High Dynamic Range on smartphones is accomplished by combining multiple photos (either from multiple sensors capturing the same scene or from a rapid burst of shots). Combining these exposures produces a photo with more balanced lighting between its brightest and darkest areas.
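A naive sketch of the merge step: average each pixel across the bracketed shots, skipping values that blew out to pure white. Real HDR pipelines are far more sophisticated, with per-pixel weighting and tone mapping:

```python
def merge_hdr(exposures, clip=255):
    """Naive HDR merge: average each pixel across bracketed shots,
    ignoring values that clipped to pure white (blown highlights)."""
    merged = []
    for pixel_stack in zip(*exposures):
        usable = [v for v in pixel_stack if v < clip] or list(pixel_stack)
        merged.append(sum(usable) // len(usable))
    return merged

# Three brackets of the same 4-pixel row: dark, normal, bright (clipped):
dark   = [10,  40,  80, 120]
normal = [40, 120, 200, 255]
bright = [90, 230, 255, 255]
print(merge_hdr([dark, normal, bright]))  # → [46, 130, 140, 120]
```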
Zoom
No, no, come back, it’s not time for a video conference. Remember, cameras. When you zoom in, you’re essentially trying to increase the focal length of the camera. Traditional cameras have lenses that can extend or retract to accomplish optical zoom (physically increasing the focal length to magnify the subject), but smartphones would be slightly less portable if they did the same.
One trick smartphones use to increase magnification optically is the periscope lens mentioned above, which uses angled mirrors to fold a long focal length sideways into the phone body.
Other than that, smartphones resort to digital zoom, which is the same method that you use when you zoom in on a picture in your gallery. It just decreases the visible area of the photo so the subject appears closer.
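Digital zoom really is just cropping. A sketch with a tiny 4×4 “photo” of brightness values:

```python
def digital_zoom(image, factor):
    """Digital zoom keeps only the central 1/factor of the frame,
    so the subject fills more of the (now smaller) image."""
    h, w = len(image), len(image[0])
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in image[top:top + ch]]

# A tiny 4x4 "photo"; 2x zoom keeps the middle 2x2 block:
photo = [[ 1,  2,  3,  4],
         [ 5,  6,  7,  8],
         [ 9, 10, 11, 12],
         [13, 14, 15, 16]]
print(digital_zoom(photo, 2))   # → [[6, 7], [10, 11]]
```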
While all of this is important, there’s probably one question you have in mind: how does a phone with three 12MP cameras become the best smartphone camera in a world of 108MP cameras? The answer: software.
Once your sensors hand your phone the image, it has to be processed by your camera software, and this processing is done by your ISP. No, no, not those people who overcharge you for absolutely terrible internet speeds. Your Image Signal Processor. The ISP uses AI to fix up the image with color correction, noise reduction, and more, as well as compressing the photo so it’s easier to store or share. The final image is then spit out into your gallery.
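You can picture the ISP as a chain of stages, each handing its output to the next. This toy version does a crude brightness correction and a crude noise reduction over a single row of brightness values; the 1.1 gain and 3-pixel averaging are arbitrary stand-ins for the real algorithms:

```python
def clamp(v: int) -> int:
    """Keep a brightness value inside the valid 0-255 range."""
    return max(0, min(255, v))

def color_correct(pixels):
    """Toy correction stage: brighten everything by an assumed 1.1 gain."""
    return [clamp(int(v * 1.1)) for v in pixels]

def denoise(pixels):
    """Toy noise reduction: average each pixel with its neighbors."""
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - 1):i + 2]
        out.append(sum(window) // len(window))
    return out

def isp(pixels):
    """Run the stages in order, like an image signal processor chain."""
    for stage in (color_correct, denoise):
        pixels = stage(pixels)
    return pixels

print(isp([100, 102, 250, 101]))  # the 250 spike gets clamped and smoothed
```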
Portrait mode is a camera feature that has become very popular recently, producing beautiful portraits. It works by your smartphone detecting the foreground and the background, then blurring out the background so the subject appears sharper and more in focus.
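In sketch form: given a mask that marks which pixels belong to the subject, leave those pixels alone and blur the rest. Here that’s a crude 3-pixel average over one row of brightness values; real portrait modes use full depth maps and much nicer blurs:

```python
def portrait_blur(row, is_subject):
    """Portrait-mode sketch: leave subject pixels sharp, and replace each
    background pixel with the average of its 3-pixel neighborhood."""
    out = []
    for i, value in enumerate(row):
        if is_subject[i]:
            out.append(value)            # subject stays sharp
        else:
            window = row[max(0, i - 1):i + 2]
            out.append(sum(window) // len(window))  # background softened
    return out

# Middle two pixels are the detected subject; the edges get blurred:
row  = [90, 10, 200, 210, 20, 80]
mask = [False, False, True, True, False, False]
print(portrait_blur(row, mask))  # → [50, 100, 200, 210, 103, 50]
```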
Your final question might be why, no matter how great your smartphone camera is, your pictures or videos look like hot garbage in apps such as Snapchat or TikTok. This mostly comes down to the fact that developing a camera app that works across phones, camera types, and operating system versions is incredibly difficult.
To solve this problem, developers have resorted to a fairly simple and universal method: they access your camera and take a screenshot of the camera output.
Google and Samsung are taking steps toward giving third-party developers access to camera software through the Android CameraX library, which is why the Samsung Galaxy S21 camera looks a lot better in Snapchat than Galaxy phones of the past. Hopefully more manufacturers will enable this support, and people who use iPhones will stop making fun of us Android users for sending them bad photos.