Are Google phones good for photos? Pixels versus the rest of the world

Now that Google Pixel smartphones are officially available in Poland, it's time to put their legendary photo quality to the test. For the occasion, we have prepared something special for you.

Blind vote

Some of you will probably be able to identify the phones the samples below come from just by photo quality. Two of them are the Pixel 8 and 8 Pro, and the numbers assigned to the phones are fixed – all photos with the number 1 in the corner come from the same phone, and so on. At this stage, however, we will not yet tell you which model has which number, so that the votes are collected fairly.

The photos have been downscaled to 3000 px wide, and the only modification is the number added in the lower right corner – there was no other interference. Every photograph was taken hand-held, without a tripod, from exactly the same spot – the differences in framing result mainly from the different focal lengths of the lenses used in the phones.

Some of the photos, especially when viewed on a phone screen, may look confusingly similar. Sometimes the best-looking thumbnail is not the best once you zoom in. It is therefore worth viewing the photos below on a larger screen to form an accurate judgment. Galleries with a zoom function are at your disposal at the end of each series.

Highly dynamic photos

The HDR toggle is disappearing from camera settings, and there is a quite rational reason for this. For a photograph to reflect what our brain registers, every photo should have the brightness of its brightest areas appropriately reduced, and details enhanced and brightened where the frame is dark.
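The article does not describe any specific algorithm, but the core idea – lift the shadows, roll off the highlights – can be sketched with a simple, hypothetical tone curve (the formula and the constant `k` are illustrative, not what any phone actually uses):

```python
def tone_map(x, k=0.25):
    # x: linear scene brightness in [0, 1]; k controls the strength of the curve
    # dark areas are lifted (steep slope near 0), highlights are rolled off toward 1
    return x * (1 + k) / (x + k)

# shadows get noticeably brighter, highlights keep detail instead of clipping to white
print(round(tone_map(0.05), 2))  # deep shadow → 0.21
print(round(tone_map(0.95), 2))  # near-white highlight → 0.99
```

The curve maps 0 to 0 and 1 to 1, so nothing clips; everything in between is redistributed toward what the eye would have seen.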

Scenes on the border between shadow and bright light are a great challenge here. These two series from the Lublin Old Town reveal many flaws – both heavy noise on the people at the gate and a tendency to burn the blue sky out to white. Near a very bright light source, numerous color disturbances may also appear: chromatic aberrations at the edges (colored halos) or glows bleeding onto dark elements, reducing contrast.

Working against the light

All of us probably know perfectly well that pointing a phone towards the sun can cause the cameras a lot of problems. Yet no matter how hard we try, the situation may force such a shot. Phones can then completely burn out the star closest to our planet, underexpose everything around it, or render flares and bright dots in the photo – the result of internal lens reflections. There may also be problems with the sharpness of the entire photo, which becomes soft while the colors lose saturation. Here are samples from the cameras in this scenario of our photo adventure:

The color is uneven

Let's face it: modern phones are not meant to record reality the way we perceive it with our eyes. Smartphone manufacturers, following customer needs, try to sell us an attractive interpretation of the world around us. Sometimes they go too far and the photos start to look like overdone graphics. There is a method to this madness, however: although there are still many lovers of natural photos, most people, if they do anything with their photos, would rather publish them on social media than print them. Such users receive virtually ready-made files this way. Even knowing these practices, it is still surprising how much the greens and reds can differ between the unedited photos below.

There must be a wide angle

After years of pushing customers toward tiny 2x zoom lenses, VGA depth sensors or slightly more detailed macro lenses, one thing has fortunately become standard – next to the main camera there should be an ultra-wide-angle module, i.e. a camera with a fisheye-type lens. These cameras are usually much darker than the main module, due both to smaller sensors and to the limitations of the lenses themselves. However, they capture much more of the scene in the frame, making them perfect for photographing interiors and vast landscapes. There is also no shortage of techniques for creative, flattering portraits with a camera that slims people in the center of the field of view – but be careful, it can also add a lot of weight to people at the edge of the frame. In the samples below you can see significant differences in the width of the field of view between the phones:

Okay, okay, what about photos in the dark?

Photos are paintings painted with light: the less light there is, the emptier the canvas – in photography, the blacker the photo. To avoid running out of photographic paint, you can use the flash, but it rarely produces pleasant results.

The second way is to invest in the largest possible sensors, whose task is to register the energy quanta excited by a stream of photons. The larger the area of a single pixel – or of a cluster of pixels creating one image point in a photo – the greater the chance of capturing even the tiniest reflections of light from the objects in view. A large sensor also means a lower risk of energy quanta jumping between pixels, which causes the undesirable phenomenon of noise in photos.

Apart from sensor size, the relative aperture also counts – the fractional parameter, commonly just called the aperture, e.g. f/2.8. The smaller the number in the denominator, the more light reaches the camera sensor. And on top of all that there is software engineering. Taking a long exposure hand-held is virtually impossible without advanced optical image stabilization, but a burst of photos is less of a challenge. Huawei came up with this trick long ago: taking multiple shots without a tripod and then merging them into one bright photo. We remember such a night mode from the legendary Huawei P20 Pro and P30 Pro. In addition, especially recently, photos are processed with the help of machine-learning mechanisms – or AI, if you prefer – which allows better HDR and more effective noise removal.
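Why does merging a burst of frames produce a brighter, cleaner photo? The signal is the same in every frame while the noise is random, so averaging N frames shrinks the noise by roughly the square root of N. A minimal simulation of that statistical effect (the noise level and frame count are made-up illustration values, not any phone's actual pipeline):

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 100.0  # actual brightness of one image point in the scene

def noisy_reading():
    # one handheld frame: the true value plus random sensor noise (std dev 10)
    return TRUE_VALUE + random.gauss(0.0, 10.0)

# the same point measured from a single frame vs. averaged over a 16-frame burst
single = [noisy_reading() for _ in range(2000)]
stacked = [statistics.mean(noisy_reading() for _ in range(16)) for _ in range(2000)]

# averaging 16 frames cuts the random noise roughly 4x (sqrt(16))
print(round(statistics.stdev(single), 1))   # close to 10
print(round(statistics.stdev(stacked), 1))  # close to 2.5
```

The same principle lets night modes brighten the result in software afterwards: with less noise, the shadows can be pushed up without turning into colored grain.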

But enough talk, time for practice. The frame below was taken in a dark corner of the Center for the Meeting of Cultures in Lublin. The light came practically from behind the photographer, and the main light source was a "neon" sign marking the emergency exit. The challenge here was to show both the sign above the door and the information on the piece of paper stuck to it.

Close-up photography, called macro mode

There won't be too much theory here. Phone manufacturers take one of three paths to bring small objects closer in photos: a dedicated macro module, a digital crop from the ultra-wide-angle camera, or a telephoto lens with a suitably short minimum focusing distance.

The first path usually means very small sensors, poor low-light performance and an impression of over-sharpening (manufacturers are moving away from it). On the second, ultra-wide-angle cameras armed with 50 Mpix sensors can produce large magnifications, but we have to be almost touching the photographed object, which blocks the light. And finally telephoto lenses – here we get nice bokeh, i.e. soft separation of the subject from the background, but with a few exceptions the magnifications are often smaller than with the other two solutions. It is worth adding that telephoto lenses are also the main tool for close-up photography with full-size cameras.

The examples below show photos taken in each phone's dedicated macro mode. Manufacturers used both ultra-wide-angle cameras and telephoto lenses here.

Detail

Show me how many megapixels you have and I'll tell you who you are! No, that hasn't worked for a long time. Full-resolution photos from the sensors used in phones can only be taken in manual modes, which exclude any benefits of AI or HDR. And to put it briefly, they look bad, often with less detail than "automatic" photos downscaled to 12 or 20 Mpix. Those extra pixels are now used to aggregate several pixels into one super-pixel. If, for example, noise appears on 2 of the 9 pixels making up a super-pixel, the algorithm will assume it is not there and, after averaging the values, produce what should actually be in the final pixel of the photo. In short, once again the role of programmers and image-signal-processing engineers may turn out to be far more important than a bare megapixel count.
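The 9-pixels-into-one idea can be illustrated with a toy example. The values below are hypothetical sensor readings, and real pipelines are far more sophisticated, but they show why a super-pixel can shrug off a couple of noisy readings where a single pixel cannot:

```python
import statistics

# a hypothetical 3x3 super-pixel: nine sensor pixels looking at the same image
# point; two readings (45 and 210) were corrupted by random noise
super_pixel = [120, 118, 45,
               121, 119, 210,
               120, 122, 118]

plain_average = statistics.mean(super_pixel)   # still dragged toward the outliers
robust_value = statistics.median(super_pixel)  # ignores the two noisy readings

print(round(plain_average, 1))  # → 121.4
print(robust_value)             # → 120
```

A single full-resolution pixel that happened to read 45 or 210 would simply be wrong; nine readings combined with even a basic robust statistic land on the plausible value.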

We used the main cameras with modes set to automatic, and got a seemingly boring photo of the promenade in Lublin. Below, however, you will find an enlargement of the saint's figure so you can take a closer look at the differences in detail.

Photo fight – detail shots

Time for optical zoom

Classic lenses with small sensors, slightly better magnifications from periscope zooms, and the new product of the season – the tetraprism periscope zoom. Manufacturers have many ideas for delivering decent magnification, and the effort poured into its development reflects our needs. Yes, we like zoom: we use both small magnifications and binocular-grade ones boosted by AI. The best modules here inevitably sit in the most expensive phones on the market, and besides the R&D departments' struggles with zoom itself, there are also battles with image stabilization and shooting against the light.

The example below will probably not be hard for anyone to match to the listed phones. Only optical magnifications made the cut, so the sharper-eyed among you may even work out which number belongs to which phone. Share your picks in the comments 🙂

And now it's your turn – choose which phone takes the best photos by voting in the survey:
