This year has been difficult for smartphones, which is a bit of a paradox when you consider just how much better things have gotten compared to last year. With Snapdragon 820, 650, 652, and 625 we’ve finally moved past the shadow of the Snapdragon 810, 808, and 617/615. While there were Android devices that shipped with the Exynos 7420, they were often paired with a modem that was not necessarily the most power efficient. Despite all of this, there seems to be a general disappointment with smartphones. People are increasingly finding it hard to justify phones like the HTC 10 or Galaxy S7 with competition from OnePlus, Xiaomi, and even Apple with their iPhone SE.

In this context the Galaxy Note7 brings much of the flavor of the Galaxy S7 edge, but blends it with the S-Pen of the Note line and a few new features like the iris scanner. If you were paying attention to the industry around the launch of the Galaxy S6 and Galaxy Note5, this is very much more of the same rather than the major redesign we saw moving from the Galaxy S5 to the Galaxy Note 4. To better illustrate what I mean, we can take a look at the spec sheet.

Samsung Galaxy Note5 vs. Samsung Galaxy Note7

SoC
  Note5: Exynos 7420 (Samsung 14LPE): 4x Cortex-A57 @ 2.1GHz, 4x Cortex-A53 @ 1.5GHz, Mali T760MP8
  Note7 (US): Snapdragon 820 (Samsung 14LPP): 2x Kryo @ 2.15GHz, 2x Kryo @ 1.6GHz, Adreno 530
  Note7 (ROW): Exynos 8890 (Samsung 14LPP): 4x Exynos M1 @ 2.3GHz, 4x Cortex-A53 @ 1.6GHz, Mali T880MP12
RAM
  Note5: 4GB LPDDR4
  Note7: 4GB LPDDR4
NAND
  Note5: 32/64/128GB (UFS 2.0)
  Note7: 64GB (UFS 2.0, 1-lane MLC, Samsung KLUCG4J1CB-B0B1) + microSD
Display
  Note5: 5.7" 1440p SAMOLED
  Note7: 5.7" 1440p SAMOLED, dual edge
Network
  Note5: 2G / 3G / 4G LTE (Category 6/9 LTE), region dependent
  Note7: 2G / 3G / 4G LTE (Category 12/10/9 LTE), region dependent
Dimensions
  Note5: 153.2 x 76.1 x 7.6mm, 171g
  Note7: 153.5 x 73.9 x 7.9mm, 169g
Rear Camera
  Note5: 16MP w/ OIS, f/1.9, 1.12µm, 1/2.6" (Sony IMX240 / Samsung S5K2P2)
  Note7: 12MP w/ OIS, f/1.7, 1.4µm, 1/2.6" (Sony IMX260 / Samsung S5K2L1)
Front Camera
  Note5: 5MP, f/1.9, 1.34µm (Samsung S5K4E6)
  Note7: 5MP, f/1.7, 1.34µm (Samsung S5K4E6)
Battery
  Note5: 3000 mAh (11.55 Whr)
  Note7: 3500 mAh (13.48 Whr)
OS (at launch)
  Note5: Android 5 w/ TouchWiz
  Note7: Android 6 w/ TouchWiz
Connectivity
  Note5: 2x2 802.11a/b/g/n/ac, BT 4.2, microUSB (USB 2.0), GPS/GLONASS/Beidou, NFC, MST
  Note7: 2x2 802.11a/b/g/n/ac, BT 4.2, USB-C (USB 3.1 Gen 1), GPS/GLONASS/Beidou, NFC, MST
Fingerprint Sensor
  Note5: Capacitive (Synaptics)
  Note7: Capacitive (Synaptics)
SIM
  Note5: NanoSIM
  Note7: NanoSIM
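
As a sanity check on the battery row, the watt-hour figures follow directly from the milliamp-hour capacities at a nominal cell voltage of about 3.85 V. That voltage is inferred from the table's own numbers rather than taken from any Samsung documentation; a quick sketch in Python:

    # Convert battery capacity from mAh to Wh at a nominal cell voltage.
    # The 3.85 V figure is inferred from the table, not published by Samsung.
    def mah_to_wh(capacity_mah, nominal_voltage_v=3.85):
        return capacity_mah / 1000 * nominal_voltage_v

    print(mah_to_wh(3000))  # 11.55  -> matches the Note5's 11.55 Whr
    print(mah_to_wh(3500))  # 13.475 -> rounds to the Note7's 13.48 Whr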

On the spec sheet the Galaxy Note7 is almost identical to the Galaxy S7 edge, with a minor bump in size and the addition of the S-Pen. The Note7 is of course a big step up from the Note5, but for perspective it's generally more interesting to compare against recent smartphone launches. For the first time we're really starting to see the impact the S-Pen has on internal volume: the Galaxy S7 edge is slightly smaller than the Galaxy Note7 yet actually has a larger battery, which wasn't the case when comparing the Galaxy S6 edge+ and Galaxy Note5. The S-Pen does provide functional value if regularly used, so it's a trade-off the end user has to weigh. While we're on the subject, the S-Pen no longer breaks the phone if inserted backwards, and it gains a thinner 0.7mm tip and finer pressure sensing, which we'll take a closer look at later in the review.

Other than the addition of the S-Pen and a slightly larger display, the Galaxy Note7 also gains a USB-C port relative to the Galaxy S7 edge, which makes the connector reversible. The port supports USB 3.1 Gen 1, but the cable in the box is USB 2 only, a common trend among recent flagships. There's also the addition of the iris scanner, which can enroll a single pair of eyes. Beyond these changes, the Galaxy Note7 at a high level is rather difficult to tell apart from the Galaxy S7 edge.
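
To put the bundled USB 2 cable in perspective, here's a rough best-case comparison of the raw signaling rates. This is only a sketch; real-world throughput falls well short of both figures due to encoding and protocol overhead:

    # Theoretical lower-bound transfer times for a 1 GB file at each
    # link's raw signaling rate; actual transfers will take longer.
    FILE_SIZE_GBIT = 1.0 * 8  # 1 gigabyte expressed in gigabits

    for name, rate_gbps in {"USB 2.0": 0.48, "USB 3.1 Gen 1": 5.0}.items():
        print(f"{name}: {FILE_SIZE_GBIT / rate_gbps:.1f} s")
    # USB 2.0: ~16.7 s, USB 3.1 Gen 1: ~1.6 s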

Design

The similarities between the Galaxy Note7 and Galaxy S7 edge don't end at the spec sheet either. Both devices have the same curved, pebble-like design on the front with a physical home button, and pretty much the only noticeable difference is the revised curve of the edge display. There are also some extra sensors near the earpiece to enable the iris scanning functionality, such as the IR LED and IR camera, but Samsung is sticking to what works for them here.

Along the sides of the phone we start to see some differences. The Note7 uses the same radius of curvature on the front and back, so the front and rear glass feel identical as your finger approaches the edge of the device, rather than the more severe edge on the display lens and gentler one on the back seen in the Galaxy S7 edge. This also causes some noticeable changes in viewing angles, which we'll address in later sections. What is worth talking about right now is how edge swipes seem inconsistent: the edge panel requires a swipe that starts essentially right at the edge of the display, while apps seem to do best when an edge swipe starts at the junction where the display begins to curve. This is fairly annoying for the first few days you use the phone, and while you eventually get used to it, it remains a usability misstep.
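
To make the inconsistency concrete, here's a hypothetical sketch of the two start-zone rules. The zone widths are illustrative guesses on my part, not values measured from the Note7 or taken from Samsung's code:

    # Hypothetical edge-swipe start zones (widths are illustrative only).
    EDGE_PANEL_ZONE_PX = 10   # edge panel: must start almost at the glass edge
    APP_GESTURE_ZONE_PX = 40  # in-app gestures: start where the curve begins

    DISPLAY_WIDTH_PX = 1440   # Note7 panel width in portrait

    def swipe_registers(start_x_px, zone_px):
        """A right-edge swipe only counts if it begins inside the zone."""
        return start_x_px >= DISPLAY_WIDTH_PX - zone_px

    # The same touch can trigger an in-app gesture yet miss the edge panel:
    print(swipe_registers(1410, EDGE_PANEL_ZONE_PX))   # False
    print(swipe_registers(1410, APP_GESTURE_ZONE_PX))  # True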

The front and rear glass of the Note7 meet an aluminum frame that functions as the backbone of the phone. As the glass back is epoxied to this frame there is no removable battery, but this simplifies internal design and improves volumetric efficiency. The LG G5 remains the last notable flagship with a removable battery, and it has notable volumetric efficiency issues as a result, so I'm not sure a removable battery makes sense with the kinds of designs Samsung has been pursuing for the last year and a half. Still, it would be nice to see some method of removing the back cover without resorting to a hair dryer, if only as a concession to the likelihood of shattering a glass back in a drop.

Discussions of repairability aside, the aluminum frame also holds the SIM/microSD tray, microphone holes, and other cutouts. It's worth mentioning that button and port alignment is better than on the Galaxy S7 edge by a significant margin. The power and volume buttons are centered, ergonomically well placed, and fairly clicky, although if you're used to something like the HTC 10 they have a much softer break and feel. Presumably Samsung improved this in response to past complaints, but the microphone holes are still not aligned in any obvious way, so if Samsung really intends to sell a design story of symmetry, I suspect future generations will need to resolve these small niggles. To Samsung's credit, even the antenna insulation on the frame is symmetrical. The entire phone is essentially edge-less and feels like holding a pebble. The one area that does have a perceivable edge is the S-Pen, which has a slight lip that helps with pulling it out if you have slightly longer fingernails; if you don't, the head can be depressed to make it protrude for easy access. The device also retains a 3.5mm jack, which is nice to see when at least one recent device has shipped without one, though I'm still undecided on whether this really matters one way or another.

The back of the phone is pretty much identical to the Galaxy S7 and S7 edge. The curved glass helps with ergonomics, and the slightly rounded square camera lens now contains a subtle pattern of concentric circles between the cover lens and the camera module. I don't think this serves a functional purpose, and it calls attention to the rather strange mismatch of a squared-off cover lens over a circular camera. The coating underneath the glass is no longer as elaborate as the one on the Galaxy S6 or Note5, which had an extremely fine texture and multiple layers that produced a neat effect in the sun, but there are still some interesting optical effects here, as reflections diffuse vertically instead of mirroring cleanly.

Overall, the design of the phone is acceptable, but honestly at this point it's nothing special. If you're stepping up from a phone like the Galaxy Note 3 it will feel nicer in the hand, but the design is really just keeping up with competition like Xiaomi. I would argue that OnePlus and Google/Huawei have surpassed Samsung's industrial design from a lower price point, and that both Apple and HTC have done better for about the same price. We can argue about Apple's plastic liner or whatever minute detail is "better", but something as simple as the camera lens stands out: it's square even though a circular design would look less out of place given the concentric circles beneath it. The IR LED, front-facing camera, and iris scanner are visually unbalanced, and the microphone holes are not aligned in any sensible way. For all of the marketing bravado about symmetry and a focus on design, these things make it feel like Samsung is addressing press criticism rather than acting on an internal push for improved design.

The Galaxy S6 was a massive leap forward for Samsung, and as we approach the end of year two for this design I think it's time for Samsung to move forward once again. There are only so many ways to make a slightly rounded rectangle, but something other than the same curved glass and aluminum frame with a physical home button would go a long way towards refreshing the design. It's easy to argue that Samsung's strategy of reusing the same design with a slight twist has worked for a very long time, but looking at the past, Samsung and other OEMs usually only get about two generations out of a given industrial design before they need to refresh it. The edge display is pretty, but functionally it's at least mildly annoying, because edge swipes don't work the way you'd expect due to the abbreviated edge on the Note7 relative to previous devices. This is obviously my opinion, but the Galaxy Note7 feels designed in a disjointed manner: the right pieces are generally in the right places, but if you look too closely the seams of not-quite-symmetrical parts are still there. I found it hard to fault phones like the Galaxy S5 and Note 4 that were unabashedly functional, but this is a phone that aspires to style, so it's harder to be kind here. The Note7 isn't going to make you sad when you take it out of the box, and functionally there's nothing wrong with it beyond the strange ergonomics of the edge display, but look too closely and you'll notice the incongruities.

Comments

  • name99 - Friday, August 19, 2016 - link

    For crying out loud. Read the damn comments to that article.
    Bottom line is it doesn't prove what you and Andrei seem to think it proves.
  • CloudWiz - Sunday, August 21, 2016 - link

    Simply because the scheduler is able to schedule a workload across multiple threads does not mean it is taking "full advantage of 4 or more cores". Browser performance is still heavily single-threaded whether you like it or not. Read the comments on Andrei's article.

    I'm going by Anandtech results here, and the E7420 suffers incredibly on battery life when on LTE compared to Wifi. Without results for the E8890 I can't say for sure, but with not even Qualcomm having found a way to get LTE battery life to even EQUAL Wifi (sure they're getting close, but there's still a small delta) I severely doubt that Samsung can do it, given their E7420 difficulties. Also, personal experience is highly unreliable and unless you have some time-lapse video to prove it, I won't believe you when all the data available says otherwise.

    Sure Safari is more efficient. (It's also far more performant, but that's another story.) And yes I will concede that the 6s renders at a fairly low resolution. However you must keep in mind that the 6s Plus actually renders at 2208 x 1242, which is 75% of the pixels of 1440p, much closer than you would think. And render resolution on the GS6/7 might not even be the display resolution - most Android apps don't bother to render at such a high resolution because most phones don't have 1440p displays. And given screen technologies in 2016, if Apple switched to a 1440p LCD I doubt there would be a 2 hour+ impact to battery life. The HTC 10, with a 1440p LCD and the same chip as the GS7, can achieve better battery life than the GS7 with the same size battery. This is a testament not only to the S820's inefficiencies but also to Samsung's implementation inefficiencies. And no, the GS6/7 are not able to "keep up". I've already stated how the GS6 absolutely cannot compare with the 6s or 6s Plus, and even with the GS7 the S820 version barely edges out the 6s with a battery nearly twice the size, with even the more optimized E8890 version being unable to top the 6s Plus with a larger battery. These differences can't be attributed solely to browser inefficiencies or screen densities. Samsung's implementations are simply not efficient compared to Apple, or even HTC with their S820 implementation and Huawei with their Kirin 950.
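
    For reference, a quick Python check of that 75% figure (2208 x 1242 being the 6s Plus's internal render target before it downsamples to the panel):

        # Pixel counts: iPhone 6s Plus render target vs. a 1440p panel.
        plus_render = 2208 * 1242  # 2,742,336 pixels
        qhd = 2560 * 1440          # 3,686,400 pixels
        print(plus_render / qhd)   # ~0.744, i.e. roughly 75% of 1440p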
  • jlabelle2 - Monday, August 22, 2016 - link

    - if Apple switched to 1440p LCD I doubt there would be a 2 hour+ impact to battery life

    How do you explain, then, the huge drop in battery life going from the MacBook Air to the MacBook?

    - And no, the GS6/7 are not able to "keep up". I've already stated how the GS6 absolutely cannot compare with the 6s or 6s Plus

    The iPhone 6S did not exist when the GS6 was released. The S6 had (slightly) better battery life than the iPhone 6, despite a bigger screen with three times as many pixels to push.
    The facts do not back up your claims.
  • CloudWiz - Thursday, August 25, 2016 - link

    You have to consider what you're comparing here. On one side you have smartphones with screens not even 6 inches diagonally. On the other you have full-blown computers with screens more than twice that size. There's a reason most computers haven't moved far past the 1440p-1600p mark: the screens are so big that power consumption gets unreasonable, and the subpixels have to be much larger than on a phone. A 1440p LCD on the HTC 10, for example, will not consume the same power as a 1440p MacBook screen. In fact, it consumes far less, allowing the phone to have basically the same web browsing battery life with a much smaller battery. In addition, the MacBook has a 25% smaller battery than the Air, further putting it at a disadvantage.

    I stated that the 6s and 6s Plus destroy the GS6, correct? I never mentioned the 6 at all. Don't try to twist my words. The facts do indeed back up my claims and if you can't see that maybe you should take a look at those charts again. I do concede that the 6 series were terrible phones all around (terrible SoC, terrible display, terrible modem, terrible design, terrible Touch ID) but with the 6s Apple fixed almost all of these issues.

    Finally, the S6 actually had worse battery life than the 6 in web browsing and in gaming (taking into account frame rates). What are you basing your (non-existent) claims off of?
  • sonicmerlin - Sunday, August 28, 2016 - link

    You're hugely exaggerating the iPhone 6's deficiencies. Touch ID was a bit slow, but it was far more accurate than any other fingerprint scanner implemented on a mobile device up to that point.

    The design was the exact same as the 6S. The modem had more LTE bands than had ever been integrated into a single device. The screen wasn't vastly different than the 6S screen either. And the SoC had great single core performance. The battery even lasted longer than the 6S.
  • lilmoe - Saturday, August 27, 2016 - link

    Please don't be offended, but I cannot take you, or any other biased user, seriously when you're claiming that someone's argument is unreliable, then go on and prove the opposite using the same (and/or worse) approach they did. Yes, I've read the comments on that article (all of them actually), and contributed/replied to lots.

    -- Simply because the scheduler is able to schedule a workload across multiple threads does not mean it is taking "full advantage of 4 or more cores"

    A good example of what I mean above. Your statement may (and I say: may) be correct if there WASN'T any DIRECTLY related data to prove it wrong. When 4 or more cores are ramping up (and actually computing data) during app loading and scrolling (including browsers, particularly Samsung's browser), then it sure as hell means that these apps (all the ones tested actually) ARE taking advantage of the extra cores. Unless the scheduler is "mysteriously" loading the cores with bogus compute loads, that is (enter appropriate meme here). Multi-core workloads ARE the future, and Apple is sure to follow. It's taking a LONG while, but it's coming. We've also pointed out that there is LOTS of overhead in Android that needs work, and that most certainly won't be remedied with larger, faster big cores. On the contrary, it would be worse for efficiency.

    "I'm going by Anandtech results here"
    AHHHHHHH, right there is the caveat, my dear commentator. If YOU had actually read the comment section, you'd know just how much we're complaining about the inconsistency of Anandtech's charts. You never get the same phones/models, consistently, on a series of comparisons; you get the GS7 (Exynos) VS iPhone 6S+ on one, then the GS7e (S820) VS iPhone 6/6S (not the Plus) on the other, even though those iPhone models are NOT the same phone, with different screen sizes and resolutions (AND different process nodes even among the same models). It's relatively safe to claim, at this point, that those inconsistencies are intentional, while Anandtech's "excuse" is that not all phones are at the same "lab" at the same time. Selective results at its best. There's absolutely no effort in extracting external factors and testing HARDWARE for what it is. One could argue that the end user is getting a package as a whole, but that's also inconsistent with Anandtech's past and present testing methodology, where at times they claim they're testing hardware, and at other times you get a review largely clouded by "personal opinion" like ^this one and the one before. If you really were reading the comment section, you'd see us mention that this review is personal opinion, and adds nothing to what's being said and shown online already. We can deduce the outcome, but we want to know the REASON. Anandtech fails to deliver.

    Back to the argument about radios. You haven't tested the devices yourself. You haven't used any as your daily driver. I have. Anandtech's results, charts and whatnot don't only contradict my findings (and many others) in one department, but MOST. It's safe to assume that their testing methodology is flawed, seriously flawed. They absolutely do NOT take into consideration any real world usage, NOR do they completely isolate external factors to test hardware, NOR is there any sort of deep dive or tweaking to justify their claims. For WiFi, they cherry picked the least common scenario to fault the device and they didn't show any performance/reliability data for more common workloads.

    I own a Galaxy S7 Edge (Exynos), since day 1. I actually go out and know lots of people, for business and leisure. My family, friends and clients own iPhones (various generations, mostly latest), Galaxies, HTCs, LGs and Huaweis. Guess who has the best reception. Guess who has better and more reliable WiFi. Guess who has the better rear camera (front camera is almost a wash now after updates, and sometimes better. It was worse at launch), BATTERY LIFE, SCREEN, features, etc, etc, etc... This isn't limited to this generation of Galaxy S/Notes, it's been this way since the S4/Note 2 (aesthetics aside). The only drawback on current generation Galaxies is reception COMPARED to previous plastic built Galaxies; they're still BETTER than the competition.

    Again, what are we comparing here? Actual hardware? Or real world usage? Performance consistency? I don't know at this point. Anandtech reviews are anything but consistent, and nowhere near as easy for an objective user to extract the truer result from between the lines. They weren't perfect, but it was easy to deduce the facts from the claims. Now, you get a deep dive of why X is better than Y, but then you get a "statement" of why Z is better than X without any sort of rational reasoning (sugar coated with "personal taste").

    Did you read the camera review? What was Josh comparing there? If it's about post processing, then Samsung isn't worried about his "taste", what they're worried about is what MOST OF THEIR USERS like and want. Their USERS want colorful, vivid and more USABLE images, not a blurry mess. Nothing beats Samsung's camera there, especially with the new focusing system. Again, what's being compared here? Sensor quality or his own taste in post processing? If it's sensor quality, I haven't seen a side by side comparison of RAW photos using the same settings, have you??

    -- Sure Safari is more efficient.
    Let's stop there then, because we agree. At this point, software optimization can (and does) completely shadow any "potential" inefficiencies in radios. Everyone agrees that Chrome isn't the best optimized browser for ANY platform. Safari has built-in ad blocking, Chrome doesn't. Samsung's browser supports ad-blocking, but Anandtech didn't bother making the comparisons more apples-to-apples. They claimed that it performed worse or the same as Chrome. ***BS***. Samsung's Game-Tuner enables the user to run Samsung's browser (or any other app, not just games) in 1080p and even 720p modes, but again, Anandtech didn't bother. I sure as hell noticed a SIGNIFICANT increase in render performance, lower battery consumption, and lower clock speeds when lowering the resolution (I run my apps and browsers in 1080p mode exclusively, and my games at 720p with no apparent visual difference, but with HUGE performance benefits).

    Other than the FACT that these phones are NOT running the same software (OS, apps/games, even if they were the same "titles"), Javascript benchmarks, in particular, are an absolute mind-F***. You get VASTLY different results with different browsers on the same platform, and different results using the same browser on different platforms. Any sort of software optimization can drastically change the results more so than any difference in IPC or clock speed. Any reviewer (or computer scientist, for that matter) worth their salt would never, NEVER, claim that a freagin' CPU is faster based on browser benchmarks UNLESS those CPUs were running the SAME VERSION browser, on the SAME platform, using the SAME settings, AND the SAME OS. Anandtech, among others, are "mysteriously" refusing to bring this point to light, and instead choose to fool the minds of their audiences with deliberate false assumptions. Most commentators know this, so how do you expect me, and others, to take you seriously? With that said, javascript (at this point) is the least deciding factor of browser performance (especially on mobile).

    -- This is not only a testament to S820's inefficiencies but also Samsung's implementation inefficiencies
    How so? Where's the log data to back this up? Where's the deep dive? Where, in Samsung's software, is the reason for that? How can it be fixed?

    This community is infested with false claims, inconsistent results, and bad methods of testing. Youtube is littered with "speedtests" and "RAM management" tests that have no evident value in everyday user experience, and FAIL to exclude external factors like router-scheduling and Google Play Services. No one runs and shuffles 10s of apps and 5 games at the same time.

    I'm a regular commentator here, and I've bashed Samsung more than praising them on various subjects. I'm the first to point out the shortcomings of their tradeoffs. But I also know that these shortcomings are far, FAR less user-intrusive for the majority of consumers.
  • lilmoe - Tuesday, August 16, 2016 - link

    "The GT7600 was only beaten in GFXBench this year by the Adreno 530 and surpasses both Adreno 530 and the T880MP12 in Basemark (it also has equal performance to the T880 in Manhattan). You make it sound like the GT7600 is multiple generations behind while it is not. It absolutely crushes the Adreno 430 and the T760MP8 in the Exynos 7420. The GX6450 in the A8 was underpowered but the GT7600 is not."

    You mean better drivers, right? You mean a benchmark better optimized for a particular GPU on a particular platform, VS the "same" benchmark not optimized for any particular GPU on another platform...

    Even with that handicap, the GS6/7 still manage longer battery life playing games. Amazing right?
  • CloudWiz - Sunday, August 21, 2016 - link

    Whether or not a phone has "better drivers" or an "optimized benchmark" doesn't matter. Sure I can let Basemark go if you so wish, but GFXBench is cross-platform and not optimized (so far as I know) for either iOS or Android. The fact is that offscreen performance is very similar between all devices, and that your statement about the GT7600 'long being surpassed' is false. It's been half a generation since it's been surpassed, and I have no doubts that whatever goes in the A10 will again reclaim the GPU performance crown for Apple. And then Qualcomm/Samsung will pass it again next year - that's how technology works. But the fact that GT7600 is so close to Adreno 530 and T880MP12 despite being half a year older and on an inferior process is a testament to Apple and PowerVR. You couldn't say the same for GX6450 or even G6430.

    Also, have fun playing a game for 4 hours at 10 fps when it'd be far more enjoyable to play it for 2 hours at 30+ fps.
  • lilmoe - Saturday, August 27, 2016 - link

    What are we comparing here? Unused performance or efficiency as a whole? What matters in gaming desktops isn't the same for mobile devices (smartphones). Some benchmarks are comparing Metal to OpenGL ES 3.1 when they should be comparing a lower level API to its competitor, i.e. Vulkan.

    Mali GPUs have far surpassed PowerVR in efficiency, for a while. You can actually measure that in both benchmarks (battery portions) and real life gaming.

    -- Also, have fun playing a game for 4 hours at 10 fps when it'd be far more enjoyable to play it for 2 hours at 30+ fps.

    Part of the reason why I can't take you seriously (again, no offense). What game exactly runs at 10fps even at full resolution? I haven't seen any. But it's also good that you do acknowledge that Apple's implementation isn't exactly the most power efficient.

    That being said, I run all my games at 720p (just like the iPhone) using Samsung's game tuner app, and not only do they run faster now, but the battery life (which was class leading at full res) is even better. Complaining that Samsung has larger batteries is like complaining that Apple has larger/wider cores. Because, again, what are we comparing here? Hardware? Or user experience??? It's not clear at this point, but the GS7 wins on both accounts at this particular workload.

    Game-tuner (and the latest resolution controls in the Note7) has completely diminished my concerns with the resolution race. It doesn't matter to me anymore. They can go 4K (or completely wacko 8K) for all I care, as long as I can lower the resolution. I'm baffled that there isn't a complete section about this app/feature (and Samsung's underlying software to enable it). I also bring this up because a benchmark tuned down to 1080p (on my GS7 at least) yields more FPS than the 1080p "offscreen" test, for "some" reason. After seeing this, I'm even more conservative about these benchmarks.....
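
    For what it's worth, the pixel math behind those gains is simple (an illustrative Python check; GPU fill-rate work scales roughly with the number of pixels rendered, all else being equal):

        # Pixels saved by rendering below the native 1440p panel resolution.
        native = 2560 * 1440
        for name, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720)}.items():
            print(f"{name}: {1 - (w * h) / native:.0%} fewer pixels")
        # 1080p: 44% fewer, 720p: 75% fewer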
  • jlabelle2 - Monday, August 22, 2016 - link

    - the modem on the S6 makes it last a ridiculously short amount of time on LTE and even on Wifi the 6s lasts half an hour longer

    Do you realize, when you wrote that, that the iPhone has a... Qualcomm modem?
    There is nothing magic about iPhone hardware, despite what people try to make you believe.

    It is crazy when you realize that the Note 7, with a bigger screen, with 50% more pixels, can still browse on LTE longer than the iPhone 6S Plus, while being significantly smaller.
