It’s ironic, really, that a company known for its secrecy, clever marketing terms, and at one point in time, its “reality distortion field” has focused squarely (or cinematically, or panoramically) on a key theme when it comes to its phone photos: truth. Apple has said that it wants to give iPhone users a more true-to-life depiction of the things, people, and places being photographed, rather than an unnaturally brightened, oversaturated, or smoothed-out image.

Given Apple’s dominance in this space, it’s easy to overlook the fact that the cameras in the new iPhone 6S models are as much about Apple’s internal philosophy as they are about technical prowess and manufacturing partners. In other words: Apple’s smartphone cameras are not technically unachievable. Other smartphone makers could launch cameras like these; in fact, some have better technical specs. In the past year we’ve seen great strides from major competitors such as Samsung and LG, both of which have released smartphones with highly capable cameras. But Apple says that it’s the company’s decisions around this camera technology and the sum of the parts, not the parts themselves, that set its new cameras apart.

Deep trench isolation. Retina flash. 4K video. These aren’t just fancy terms used by Apple to market its new cameras — though, they are that, too. At this point, these are expectations. With smartphone makers offering increasingly advanced cameras to consumers, it’s almost easy to forget that when Apple first launched the iPhone eight years ago, the original smartphone’s 2-megapixel camera was basically … functional. The oft-repeated cliché that the best camera is the one in your pocket wasn’t true yet, because in many ways, the best cameras were still the ones we slung around our necks.

Want an image that makes the sky look paradise blue? Or a selfie that makes your skin look like putty? That’s fine, but Apple’s approach to such things is direct: do it in post-production; we’ll capture all those hard-earned wrinkles first. That might sound better in theory (or worse, depending on how you feel about your complexion), but how do the photos from the new iPhone 6S models really compare to those snapped with competing, market-leading smartphones? Here at The Verge we’ve been putting the new smartphones through the photo wringer, some of us for a week or more, others over the past few days. Some of the images we’ve captured are below, so you can judge for yourself which you think is the better photo. But, photographic evidence aside, let’s take a quick look at what went into the design of these new iPhone cameras.

First, the spec that everyone looks to first but that isn’t necessarily indicative of quality: megapixels. The new iPhone 6S and 6S Plus both have a 12-megapixel rear-facing camera, the highest megapixel count an iPhone has ever had. But more megapixels do not always equal a better photo; in fact, if you cram more megapixels onto a relatively small sensor, each pixel collects less light, often resulting in poorer image quality. Pixels can also bleed color information into one another when they are packed closer together. To counter this, Apple says it is utilizing a technology called deep trench isolation, which forms a wall between each pixel and supposedly produces better images with more accurate colors. The difference between the 8 megapixels of the last three generations of iPhone and the 12 megapixels in the iPhone 6S isn’t as great as the raw numbers would lead you to believe, but you can expect to be able to crop in a little closer on your images than before.
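The trade-off between pixel count and light gathering is simple geometry: for a fixed sensor area, more pixels means each one is smaller. A rough back-of-the-envelope sketch, in Python, makes the point. The 1/3"-class sensor dimensions below (about 4.8 × 3.6 mm) are an illustrative assumption, since Apple doesn’t publish the exact sensor size of the iPhone 6S:

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate the width of one pixel, in microns, for a given
    sensor size and resolution (assumes square pixels)."""
    total_pixels = megapixels * 1_000_000
    sensor_area_um2 = (sensor_width_mm * 1000) * (sensor_height_mm * 1000)
    return math.sqrt(sensor_area_um2 / total_pixels)

# Hypothetical 1/3"-class sensor (~4.8 x 3.6 mm), for illustration only.
print(round(pixel_pitch_um(4.8, 3.6, 8), 2))   # 8 MP  -> 1.47 (microns)
print(round(pixel_pitch_um(4.8, 3.6, 12), 2))  # 12 MP -> 1.2 (microns)
```

Under those assumed dimensions, going from 8 to 12 megapixels shrinks each pixel from roughly 1.5 to 1.2 microns, which is why techniques like deep trench isolation matter: smaller, closer-packed pixels collect less light and are more prone to bleeding color into their neighbors.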

Not surprisingly, Apple won’t say exactly who makes the sensor it is using in its iPhone cameras. Earlier teardowns have revealed that Apple has used a variant of a Sony sensor in past iPhone models, but until we get details from teardowns of the new iPhones, we can’t know for sure. Sony’s image sensors are hugely popular across imaging industries: you’ll find them in countless digital cameras and lots of smartphones, including some made by Samsung.

Then there’s the image signal processor, or ISP. Apple didn’t design its own image signal processor until the iPhone 4S model, but it’s been a mainstay in every iPhone since. (That phone also shot 1080p HD video at 30 frames per second, a feature that effectively put point-and-shoot video cameras like the Flip out of business.) Though other smartphone makers, including HTC and Samsung, now build their own ISPs, Apple says its camera ISP is one of its biggest advantages and, coupled with further tuning through the phone’s OS, is what produces top-notch photos.

One design decision that might leave photographers dismayed is that Apple retained the same f/2.2 aperture for both its rear and front cameras, whereas competitors like Samsung and LG offer brighter lenses in their flagship phones. And the inclusion of 4K video is great, but the iPhone 6S still doesn’t boast the optical image stabilization that the iPhone 6 Plus and iPhone 6S Plus have, which Apple claims is because the mechanical stabilization system requires more space (and not because it’s trying to establish a product differentiator).

The 5-megapixel front-facing camera might actually be the most significant improvement in this year’s batch of iPhones; it also shows that even though Apple is steering away from "beautified" smartphone photos, it had to bow to at least one cultural trend.
The front camera triggers something called Retina flash, which, as The Verge’s Nilay Patel pointed out in his review, won’t go unnoticed when you’re standing in a dark bar and your iPhone’s display is flashing white, but it really does make low-light selfies look a lot better. As part of Apple’s testing during the development of these cameras, members of the company’s camera engineering team took more than 100,000 photos over a period of several months. We didn’t take quite as many, but check out some of the images below (images have been resized and aligned but are otherwise unaltered from camera):

For our comparison, we chose the best smartphone cameras on the market: the Samsung Galaxy S6 Edge+ and Note 5, the LG G4, and (of course) the iPhone 6S Plus. The most interesting thing we’ve learned after putting them head to head is that there’s often very little discernible difference between them in the real world. In almost every scenario, each phone took perfectly usable (and sometimes exceptional) photos that are far better than what we could have expected just a few years ago.

One trend we noted was that the images from the iPhone often had a very slight green color cast, especially in portraits, and were not as pleasing as images from Samsung’s Galaxy S6 Edge+ or LG’s G4. Macro photos were also a struggle for the iPhone: where the other phones quickly and reliably locked focus on what we were attempting to shoot, the iPhone frequently missed.

But there were other instances, such as landscapes, where the iPhone produced the best image, going to show that the differences between the phones aren’t necessarily cut and dried. And images taken in a brightly lit outdoor courtyard looked blown out when they were snapped with the Samsung Galaxy Note 5 (which has the same camera as the other Samsung flagships), whereas the iPhone 6S Plus managed to meter exposure and keep colors true.