A version of this essay was originally published at Tech.pinions, a website dedicated to informed opinions, insight and perspective on the tech industry.

I’ve spent a little more than a week with the new iPhones. My first impression was, “Apple was not kidding when they said the only thing that has changed is everything.” This is the first iPhone experience I’ve had in quite some time where I felt a leap forward in user experience. The kind where you realize your behavior has changed, in a positive way, and you wonder how you lived without some of these new features. I’ve always considered it a good sign when a new feature is added to a device that makes you like it so much you wish you’d had it for years. The new iPhones bring several of these experiences.

Making moments out of pictures

While not the first to market with this feature, Apple has introduced a new way to think about photos on our iPhones. Note that I said not a new way to take photos, although that is also true, but a new way to think about them. Live Photos turns pictures into little moments. When I saw the feature demonstrated onstage, I knew it would show well, but the question I had was whether it would work as well in practice.

My first observation about Live Photos was that I liked them so much I wished all the photos I’d taken this past year had included this feature. There is a slight learning curve, but it’s one worth climbing, because the results are worth it. We used to take pictures by snapping the photo at the desired moment. Because a Live Photo captures 1.5 seconds before the shot and 1.5 seconds after it, it is good practice to keep the camera on the target for that extra 1.5 seconds after you take the picture. This feels odd at first, but the results justify it. Luckily, Apple included a small “LIVE” icon at the top of the screen to let you know the Live Photo is still capturing. Once it disappears, you can move the phone. As I said, it is a slight change to how we typically take photos.

This feature makes you think about how you take photos, partly because you start considering whether a given shot will make a good Live Photo. Sports and action photography, for example, work really well. One thing I learned quickly is that, since Live Photos capture at only 15 frames per second, any Live Photo that includes fast motion is not perfectly smooth; the missing frames add some jitter to the movement. I did not find this to be a huge deal, but it was part of my learning curve as I weighed my options when taking a photograph.

One of the most compelling use cases for Live Photos is the selfie, something most people take daily and younger people take dozens of times a day. Selfies are always odd moments, especially when they include more than one person. Some of the best moments occur as the group is getting organized, or focused on orienting their “good side” for the photo. Having that entire moment captured alongside the still photograph is incredibly compelling, and addictive once you try it. As I said, it is one of those features you wonder how you lived without for so long.

It will be interesting to see how companies like Facebook, Instagram and others integrate support for Live Photos into their apps. The mere presence of Live Photos changes how you think about photos.

3D Touch — The evolution of touch computing

There is a theme I’ve noticed running through Apple’s recent user experiences: eliminating friction. While Apple is not always successful across the board here, it is hard to argue the company isn’t better at this than most, most of the time. The Apple Watch in its entirety is exactly this, in my opinion; it is a consistently valuable part of my personal computing setup because it removes so much friction from my digital work and personal life. 3D Touch on the iPhone 6s and 6s Plus is an Apple innovation with the same emphasis on friction removal, and another feature you wonder how you ever lived without.

There are many great examples, but I will highlight a few. I do a lot of cooking in our household, which means I search for a lot of recipes. With the “Peek” feature in Safari, I firmly press on a link and get a quick preview of the recipe, so I can see whether it sounds interesting before pressing harder and going right to the page.

Now, of course, it is not that hard to tap the link, visit the page, decide whether I want to stay on it, and press the back button. But the removal of friction I mentioned before is key here: this is a time-saving convenience, and a hugely beneficial one from a user experience standpoint. One addition I would love to see in the future is the ability to scroll within a preview, perhaps by slightly moving your thumb up or down, so you can see more than the opening of the Web page, message or email. That would take the experience even further, in my opinion.

Quick Actions are an entirely new interface model for launching into apps: press firmly on an app’s icon, and a short menu of shortcuts into that app appears. These actions are super valuable, and could have a fascinating long-term impact on the interaction models developers build into their software. After using 3D Touch for Quick Actions, it becomes second nature so quickly that you want all your apps to have it, and it’s generally a good sign when you like a feature so much you want it to be pervasive. I truly hope developers dig into this feature, run with it, and think creatively about all the ways they can use 3D Touch.
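For developers curious what adopting this looks like, static Quick Actions are declared in an app’s Info.plist under the `UIApplicationShortcutItems` key. A minimal sketch follows; the reverse-DNS type string and the “Search Recipes” title are hypothetical examples, not values from any real app:

```xml
<!-- Info.plist fragment: one static Home Screen Quick Action.
     "com.example.recipes.search" is a made-up identifier; each app
     supplies its own type strings. -->
<key>UIApplicationShortcutItems</key>
<array>
    <dict>
        <key>UIApplicationShortcutItemType</key>
        <string>com.example.recipes.search</string>
        <key>UIApplicationShortcutItemTitle</key>
        <string>Search Recipes</string>
        <key>UIApplicationShortcutItemIconType</key>
        <string>UIApplicationShortcutIconTypeSearch</string>
    </dict>
</array>
```

When the user picks a Quick Action, the system launches the app and hands it the selected item through the app delegate, so the app can jump straight to the right screen.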

Death to the Lock Screen

When Apple said the second generation of Touch ID was fast, I initially thought to myself, “It is already super fast. How can it get faster?” Then you try second-generation Touch ID and realize it’s so fast you barely see your lock screen: from dark screen to home screen in a fraction of a second. There is no need to press and hold your finger on the sensor for a reading. Just press the home button, and in the time it takes to press, you are authenticated and at your home screen. I have a nice picture of my family on my lock screen that I now only barely glimpse. Here again, Apple is shaving split seconds off an experience to let you do the things you want to do with your phone faster and more efficiently.

As I stated at the beginning, this is the first time in a while that a number of new behaviors have emerged and become ingrained in my patterns of device usage. Overall, these latest-generation iPhones lay new ground for Apple to build upon in all future versions. There is simply no going back. A desktop-class CPU, 3D Touch and more are new foundations on which Apple can further develop its hardware, software and services.

Lastly, for fun, here are some Geekbench benchmarks of Apple’s new A9 processor in the 6s Plus. To put them in perspective, the iPhone 6s and 6s Plus compare favorably to the new MacBook in terms of performance.

Ben Bajarin is a principal analyst at Creative Strategies Inc., an industry analysis, market intelligence and research firm located in Silicon Valley. His primary focus is consumer technology and market trend research. He is a husband, father, gadget enthusiast, trend spotter, early adopter and hobby farmer. Reach him @BenBajarin.