iPhone 12 in-depth review: how good is the iPhone 12?
By now you have probably seen a flood of coverage. We will get to the specific details later; first, let's answer the question everyone cares about most:

How good is the iPhone 12 at photography?

In a word: outstanding.

So in this article we will start by working through the imaging capabilities of the 12 series.

Remarkably, Apple has finally started piling on camera hardware.

For a long time, Apple has been "stingy" with camera hardware, even as the entire Android camp waged dramatic "pixel wars" and "telephoto wars". This time a larger sensor has finally arrived: the sensor area has grown by 47%. Sensor area is largely decisive for overall image quality, especially in low light, and it is the fundamental reason Apple's night photography has trailed some Android models in recent years. With the bigger sensor, night image quality is bound to take a big step forward; Apple says low-light video capture improves by as much as 87%, a difference plainly visible to the naked eye.

More importantly, the iPhone has never chased high pixel counts. The Pro Max stays at 12 megapixels, so the larger sensor also brings a larger 1.7-micron pixel size, which helps keep low-light noise under control.
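A rough sanity check of how sensor area, resolution and pixel pitch relate. The 1.4-micron starting pitch for the previous generation is my assumption, not a figure from this article, so treat this as a back-of-the-envelope sketch rather than a specification:

```swift
import Foundation

// Same 12 MP resolution on a sensor with ~47% more area implies a pixel
// pitch scaled by sqrt(1.47). The 1.4 µm baseline is an assumed figure
// for the previous generation's main camera.
let previousPitchMicrons = 1.4
let areaGain = 1.47                                  // +47% sensor area, same pixel count
let newPitch = previousPitchMicrons * sqrt(areaGain)
print(String(format: "Estimated new pixel pitch: %.2f µm", newPitch))
// ≈ 1.70 µm, consistent with the 1.7-micron figure quoted above.
```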

That said, we have not yet introduced the three-camera setup of the Pro series.

It is important to emphasize that this year's Pro and Pro Max cameras are not the same.

Only the ultra-wide camera has identical specifications on both, and since we are on the subject: when we reviewed the 11 last year, we noted that while the ultra-wide lens delivers a pleasing wide field of view, its image quality dropped off sharply in low light. This year Apple has addressed that weakness directly, focusing on night performance and overall low-light image quality.

On the wide-angle camera, that is, what we usually call the main camera, the Max has a larger sensor than the Pro; as noted above, that translates into visibly better image quality.

The more obvious difference is the telephoto: the Max has a 65 mm equivalent focal length, while the Pro sits at 52 mm. In other words, the Max reaches longer. Don't underestimate those 13 mm, because 65 mm lands squarely in the "sweet spot" focal range for portraits. At this focal length a portrait gains a very pleasing sense of depth and subject separation, along with better control of facial distortion.
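For context, the equivalent "zoom factor" is just the ratio of the telephoto focal length to the main camera's. The 26 mm-equivalent main camera used here is an assumption, taken from the 26 mm example this article uses later:

```swift
// Zoom factor = telephoto focal length / wide focal length (35 mm equivalents).
// The 26 mm main camera value is assumed, not stated in this section.
let mainFocalLength = 26.0   // mm, assumed
let proTele = 52.0           // mm, Pro
let maxTele = 65.0           // mm, Pro Max
print("Pro optical zoom: \(proTele / mainFocalLength)x")      // 2.0x
print("Pro Max optical zoom: \(maxTele / mainFocalLength)x")  // 2.5x
```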

For photos, that means proper portrait work; for video, it means cleaner, tighter close-ups.

I know that, on paper, the Max does not reach as far as many Android phones. But a camera's focal length is not about size for its own sake; it should be judged by the subjects it serves. 65 mm is a very considered choice, because it ensures you can still shoot portraits indoors.

With a longer focal length, unless your home measures hundreds of square meters, the "portrait" you take is basically just a face. In other words, 65 mm is a very practical telephoto. Judging from Apple's sample shots, the texture it renders is also excellent.

And there is still one big trick we have not mentioned, because this time the Pro Max has a genuinely exclusive feature:

Sensor-shift optical image stabilization.

I know that description may sound baffling, but the technology is not complicated; it has been common in professional cameras for many years. If I remember correctly, this is the first time it has been applied to a phone.

To understand the technology, we first need a photographic concept called the "safe shutter speed": when the shutter speed drops too low, hand shake blurs the frame and degrades the whole image. Sensor-shift optical image stabilization makes the phone's sensor move to counter your motion: if your hand drifts down, the sensor actively shifts up, keeping the image steady on the sensor. It sounds simple, but it demands a huge number of measurements and corrections in a very short time, as many as 5,000 adjustments per second.
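A minimal conceptual sketch of that idea, and nothing more: read the estimated hand drift, then drive the sensor by the opposite offset. This is an illustration of the principle, not Apple's implementation, and the data types and values here are made up:

```swift
// Conceptual sketch of a sensor-shift stabilization loop (illustrative only).
struct MotionSample { let dx: Double; let dy: Double }   // hand drift, in microns

func compensate(_ motion: MotionSample) -> (x: Double, y: Double) {
    // Shift the sensor opposite to the measured drift to keep the image steady.
    return (-motion.dx, -motion.dy)
}

// The article quotes up to 5,000 adjustments per second,
// i.e. roughly one correction every 0.2 ms.
let samples = [MotionSample(dx: 1.2, dy: -0.4), MotionSample(dx: -0.7, dy: 0.3)]
for s in samples {
    let offset = compensate(s)
    print("drift (\(s.dx), \(s.dy)) → sensor offset (\(offset.x), \(offset.y))")
}
```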

This technology addresses two important photographic pain points:

1. For photos, it enables handheld long exposures. Without it, the handheld limit is roughly the reciprocal of the focal length: at 26 mm, the safe shutter speed is about 1/26 s, and anything slower starts to smear. With this technology, Apple claims handheld shutter speeds of up to 2 seconds. What does 2 seconds mean? It is enough to capture light trails on a busy road, and in some extreme conditions even the stars. (A rough calculation of the stabilization gain this implies follows the list below.)

2. For video, stabilization enters a new stage. In the past, iPhone video stabilization was largely computational, cropping and realigning the frame; now, with the sensor itself moving, less of the frame is sacrificed and transitions look more natural. Simply put, handheld shooting gets you smoother, more deliberate camera movement.
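Here is the back-of-the-envelope calculation referenced in point 1, using only the figures quoted above. The "stops of stabilization" number is my own derived estimate, not an official specification:

```swift
import Foundation

// Safe handheld shutter ≈ 1 / focal length (35 mm equivalent),
// so at 26 mm it is about 1/26 s. Going from 1/26 s to a claimed
// 2 s handheld exposure implies this many stops of stabilization:
let focalLength = 26.0                     // mm equivalent
let unstabilizedLimit = 1.0 / focalLength  // ≈ 0.038 s
let claimedHandheld = 2.0                  // s, Apple's claim quoted above
let stops = log2(claimedHandheld / unstabilizedLimit)
print(String(format: "Implied stabilization gain: about %.1f stops", stops))
// ≈ 5.7 stops, a rough estimate rather than an official figure.
```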

Beyond the changes above there are smaller refinements, such as the new seven-element wide-angle lens and the larger f/1.6 aperture, which make the whole image sharper.

Compared with previous models, the Max takes the biggest hardware leap: the conventional lens upgrades, plus the novelty of sensor-shift stabilization in a phone. It stays restrained on pixel count and focal length while remaining at the cutting edge of camera technology.

Even more impressive, Apple's computational photography has taken another big step.

At least for now, no one knows "computational photography" better than Apple.

Today, Apple's computational photography for stills rests on three pillars, each matched to a different lighting condition.

The first is Night mode for low light, which stacks multiple frames to improve the brightness, cleanliness and color of the picture. To be fair, Apple was not the first to ship a night mode. But having used the night modes of many different brands throughout the year, I would call Apple's the aesthetic high ground.
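To make "stacking multiple frames" concrete, here is a minimal sketch of the core principle: averaging N aligned frames suppresses random noise (roughly by a factor of the square root of N). This illustrates the idea only; the real pipeline also handles alignment, motion rejection and tone mapping, and is not shown here:

```swift
// Average several aligned frames, pixel by pixel (illustrative only).
func stack(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var sums = [Double](repeating: 0, count: first.count)
    for frame in frames {
        for (i, value) in frame.enumerated() { sums[i] += value }
    }
    return sums.map { $0 / Double(frames.count) }   // per-pixel average
}

// Three noisy captures of the same flat scene:
let frames = [[0.9, 1.2, 1.0], [1.1, 0.8, 1.0], [1.0, 1.0, 1.1]]
print(stack(frames))   // values converge toward the true brightness
```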

When the 11 introduced Night mode last year, I pointed out that the iPhone's version cares more about "color": it pursues faithful color in low light rather than ever-higher brightness, staying deliberately restrained on exposure and concentrating on recreating the atmosphere of the night. This year's Night mode keeps that character, and the sample below is the best illustration:

One of the two biggest changes to Night mode this year is coverage: it is no longer limited to the main camera but also works on the ultra-wide, and, more interestingly, on the front camera, making night selfies that much better.

The other, bigger change is that Night mode is now built into time-lapse. This is a feature I have always wanted, because the iPhone has long been the most reliable way to shoot time-lapses, yet once the sun set, never mind after dark, image quality dropped visibly. With Night mode as a helper, we can now shoot day-to-night transitions freely: the dark frames are effectively replaced with high-quality Night mode frames inserted directly into the time-lapse, making night time-lapses far richer.

Then there is Deep Fusion, which synthesizes multiple frames to recover subject detail in dim environments. It behaves more like an advanced, region-by-region sharpening that gives the image more texture. The important change this time is that every camera on the Max supports Deep Fusion.

And of course there is Smart HDR, the thing the iPhone does best, now in its latest version, Smart HDR 3. In the past our understanding of HDR stopped at regional brightness adjustment, for example lifting the dark areas of a frame. Smart HDR 3 goes further and segments the scene itself: it learns what kind of scene you are shooting and adjusts the light accordingly. For a portrait it recognizes your face and treats it as its own region; it also recognizes natural scenes, rendering the sky cleaner once it detects sky and giving cracked earth more texture once it detects ground. It is easy to see why Smart HDR 3 has become a key weapon for improving the texture of the image.
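A conceptual sketch of that "segment the scene, then adjust each region differently" idea. The region labels and tone-curve exponents below are invented for illustration and are in no way Apple's actual parameters:

```swift
import Foundation

// Scene-segmented tone adjustment, in miniature (illustrative only).
enum Region { case sky, face, ground }

func adjust(_ luminance: Double, region: Region) -> Double {
    switch region {
    case .sky:    return pow(luminance, 1.1)   // deepen and clean up the sky
    case .face:   return pow(luminance, 0.9)   // gently lift faces
    case .ground: return pow(luminance, 0.95)  // preserve texture in shadows
    }
}

let pixels: [(Double, Region)] = [(0.8, .sky), (0.35, .face), (0.2, .ground)]
for (value, region) in pixels {
    print("\(region): \(value) → \(adjust(value, region: region))")
}
```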

All of the above, of course, is refinement of what came before. What comes next is:

A new beginning.

From now on, the entire 12 lineup supports Dolby Vision recording; it is the first camera system that can shoot in Dolby Vision. Simply put, Dolby Vision brings two major advantages:

1. Better color, because recording moves from 8-bit to 10-bit. Don't underestimate a change of just 2 bits; anyone working in film and video will tell you the change is exponential, and the actual number of colors takes a huge leap, letting the iPhone capture 700 million colors, 60 times more than before. In other words, video shot on the iPhone 12 series will have more delicate, more faithful color. (A quick back-of-the-envelope on the bit-depth math follows the list below.)

2. Better dynamic range.
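Here is the bit-depth arithmetic referenced in point 1. Note that the "700 million colors" figure is Apple's own number for its capture pipeline; this calculation only shows the order of magnitude of the jump from 8-bit to 10-bit RGB:

```swift
import Foundation

// 8-bit RGB: 2^(8*3) ≈ 16.7 million colors; 10-bit RGB: 2^(10*3) ≈ 1.07 billion.
let colors8bit  = pow(2.0, 24.0)   // 8 bits × 3 channels
let colors10bit = pow(2.0, 30.0)   // 10 bits × 3 channels
print(String(format: "8-bit: %.0f colors", colors8bit))
print(String(format: "10-bit: %.0f colors", colors10bit))
print(String(format: "Ratio: about %.0fx", colors10bit / colors8bit))   // 64x
```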

It is worth noting that this workflow used to require a professional team and a great deal of post-production. The 12 series can now grade Dolby Vision while shooting, in real time and frame by frame. The Pro, for example, captures two images at different exposure values, analyzes them with the custom image signal processor to build a histogram of the tonal values of each frame, and then generates the Dolby Vision video metadata from that histogram.
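A minimal sketch of the per-frame idea described above: bucket a frame's luminance values into a histogram, from which per-frame metadata could be derived. This is purely illustrative; it is not Apple's ISP pipeline and says nothing about the actual Dolby Vision metadata format:

```swift
// Build a simple luminance histogram for one frame (illustrative only).
func luminanceHistogram(_ frame: [Double], bins: Int = 8) -> [Int] {
    var histogram = [Int](repeating: 0, count: bins)
    for value in frame {
        let clamped = min(max(value, 0.0), 0.999)        // luminance in [0, 1)
        histogram[Int(clamped * Double(bins))] += 1
    }
    return histogram
}

let frame = [0.05, 0.12, 0.40, 0.42, 0.77, 0.81, 0.95, 0.98]
print(luminanceHistogram(frame))   // tonal distribution for this frame
```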

And it is not just capture: the 12 series can also edit Dolby Vision, letting you cut Dolby Vision footage in real time.

Obviously, all of the above demands serious image processing and raw compute.

Which is why only the iPhone pulls all of this off: the 12 carries the most powerful phone processor around, the A14 Bionic.

This is not the chip's first appearance, but the essential point here is that it is what lets the iPhone move this fast in computational photography.

Most striking of all, iPhone photography has taken a path of its own.

LiDAR is a very ambitious move. Apple has been spreading it across its product lines, and it first appeared on the latest iPad Pro. Simply put, LiDAR lets the device understand the three-dimensional structure of its surroundings. In the past a phone could only reason about the scene in two dimensions and was helpless in the third; with LiDAR, it can now acquire information in all three.

That may sound abstract, and indeed, before the 12, users had few real uses for LiDAR. But Apple has cleverly applied it to photography on the 12, and it has become an iPhone speciality.

First, once you have three-dimensional data, one feature obviously takes a big step forward: portrait mode. Portrait mode works by identifying the spatial layers in a scene, separating them, and deciding which planes to blur. Until now it relied mostly on machine-learning "guesses" and parallax calculations across multiple cameras. That has a serious weakness: it depends on passively received light, so in dim environments portrait mode tends to fall apart; if the image cannot be captured, there is nothing to guess from. LiDAR, by contrast, actively emits its own signal, so even in near-total darkness, let alone dim light, it can still recover the three-dimensional structure of the scene.
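A conceptual sketch of what depth data buys you for portraits: given a depth value per pixel, keep pixels near the subject's depth sharp and blur the rest in proportion to their distance from it. The tolerance and blur scaling below are invented for illustration; this is not Apple's portrait pipeline:

```swift
// Depth-based blur decision, in miniature (illustrative only).
func blurRadius(pixelDepth: Double, subjectDepth: Double,
                inFocusTolerance: Double = 0.2) -> Double {
    let separation = abs(pixelDepth - subjectDepth)
    return separation <= inFocusTolerance ? 0.0 : separation * 4.0
}

let subjectDepth = 1.5                 // metres, e.g. from a depth map
let depths = [1.4, 1.6, 3.0, 6.0]      // sample pixel depths
for d in depths {
    print("depth \(d) m → blur radius \(blurRadius(pixelDepth: d, subjectDepth: subjectDepth))")
}
```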

So with LiDAR on board, the Pro series keeps performing impressively at night. With portrait mode succeeding far more often after dark, we can get lovely night shots with genuine lens-like bokeh, as in the following sample.

You will notice how naturally the blur falls off across the frame, and because night scenes are full of point light sources, the bokeh even resembles that of many large-aperture SLR lenses.

LiDAR also shines in focusing. Again because it actively emits its own signal, it can in principle track moving subjects under any lighting, and that is what gives the iPhone its very fast focusing. Apple claims low-light autofocus is up to 6 times faster than before.

Put simply, it can lock focus in near-darkness, and that is no gimmick; it makes shooting video at night far more workable.

Of course, there are still a few more things to cover.

If you watched this year's keynote, you will have noticed that most of the time was spent on photography, and much of the rest on 5G. 5G is already familiar to users in China, and here Apple is admittedly a step behind. But we also know that over the past year 5G has not fundamentally changed our lives, for many reasons. As Cook put it, for 5G to matter, hardware, software and carriers all have to move forward together.

Hardware support for 5G is a given, and carrier rollout is plainly visible; the real question is the software ecosystem. Apple's answer is faster speeds and lower latency. Intriguingly, the on-stage game demo was the mobile version of League of Legends, quite a moment in the spotlight for Tencent.

Still, we all know 5G is only a prelude, and many questions remain to be answered.

Another hot topic is, of course, the missing charger, which I know will be controversial.

Whether the reason is cost or Apple's environmental goals, we all know that the missing charger should not be the measure of whether to choose an Apple product. For users, the only question that really needs answering is this:

Has iPhone photography fallen behind? The debate has been loud over the past two years. But one thing is certain: photography is ultimately a test of

"the combination of art and art."

"Technique" here means the technical means, the cornerstone of everything; it is the progress of sensors and algorithms that has made today's phone photography possible.

"Art" is a choice of ideas, the thing that steers technique: an aesthetic sense of photography and an understanding of what photography is for.

Today, Apple may fall short on some point of technique, or take a different route, but in its understanding of the art it keeps a mature philosophy of its own. If you don't believe me, look back at all the samples in this article: aren't they still several steps ahead in aesthetics?

That is the real capital of phone photography: built by people who understand creation, for people who love to create.