Leaked Nvidia N1 details
Posted by darkmitsu@reddit | hardware | 35 comments
Details of the Nvidia N1 and N1X systems-on-chip (SoCs) have recently surfaced through motherboard leaks and shipping manifests. The series marks Nvidia's aggressive entry into the consumer "AI PC" market, competing directly with Apple's M-series and Qualcomm's Snapdragon X Elite.
Core Specifications
| Component | Leaked Detail |
|---|---|
| Architecture | Blackwell GPU + Arm-based Grace/MediaTek CPU |
| CPU Cores | 20 Cores (Heterogeneous: 10 Performance / 10 Efficiency) |
| GPU Cores | 6,144 CUDA Cores (Comparable to a mobile RTX 5070) |
| Memory | 128GB LPDDR5X (Unified Memory architecture) |
| Memory Speed | ~8,533 MT/s |
| Process Node | TSMC N3B |
| Packaging | CoWoS (Chip-on-Wafer-on-Substrate) |
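A quick sanity check on the bandwidth implied by the leaked memory spec. Note the bus width is not in the table above; a 256-bit bus is assumed here, matching the related GB10/DGX Spark configuration mentioned later in the thread.

```python
# Rough peak-bandwidth estimate from the leaked memory spec.
# Assumption (not in the leak itself): a 256-bit bus, as on the GB10/DGX Spark.
transfer_rate_mts = 8533   # MT/s per pin, from the leaked spec table
bus_width_bits = 256       # assumed, matching GB10
peak_gbs = transfer_rate_mts * bus_width_bits / 8 / 1000  # bits -> bytes, MB/s -> GB/s
print(f"~{peak_gbs:.0f} GB/s peak bandwidth")  # prints ~273 GB/s
```

For comparison, that would land well below Apple's M-series Pro/Max parts but comfortably above typical dual-channel laptop DDR5.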
Key Technical Leaks
- Unified Memory Powerhouse: Recent motherboard images from Chinese platforms (Goofish) show a densely packed PCB with eight SK Hynix modules, totaling 128GB of shared memory. This high capacity is specifically targeted at running large local LLMs and AI creative workflows.
- MediaTek Collaboration: Nvidia is reportedly handling the GPU and software stack, while MediaTek provides the Arm-based CPU expertise (likely leveraging Cortex-X925 and A725 designs).
- Comparison to DGX Spark: The N1 silicon is believed to be the consumer-grade variant of the GB10 "Superchip" used in the professional DGX Spark workstations.
- Windows on Arm (WoA): The chips are designed for a new "Bromine" branch of Windows 11, utilizing the Prism translation layer for x86 compatibility.
Launch & Availability
- Rumored Reveal: Expected at Computex 2026 (June).
- Target Devices: High-end 14-inch laptops and tablet hybrids (similar to the ROG Flow Z13).
- Confirmed OEM Partners: Leaked shipping manifests and support pages have linked Dell (XPS 16 Premium) and Lenovo (Yoga/Legion) to early N1X engineering samples.
- Timeline: While initial targets were H1 2026, some supply chain reports suggest a broader retail rollout in H2 2026.
hardware-ModTeam@reddit
Thank you for your submission! Unfortunately, your submission has been removed for the following reason:
Rumours or other claims/information not directly from official sources must have evidence to support them. Any rumor or claim that is just a statement from an unknown source containing no supporting evidence will be removed.
Please read the subreddit rules before continuing to post. If you have any questions, please feel free to message the mods.
HeRmEs3xx@reddit
There is an "AI-based" ARM PC market? Who is the actual target audience?
ea_man@reddit
Probably people with way more money than common sense who believe in miracles.
Or people who want to run big LLMs at low power and don't care about speed.
darkmitsu@reddit (OP)
It's niche. Nvidia seems like a bad consumer brand because they are pricey and elitist, but this is forced by their data center business.
Homerlncognito@reddit
It's just a DGX Spark In a mobile package.
Noble00_@reddit
This feels like it breaks rule 7 and regurgitates every GB10/Spark/N1/N1X post from the past year on this sub.
https://www.reddit.com/r/hardware/comments/1i0bd3p/nvidia_n1x_soc_could_be_coming_to_lenovo_laptops/
https://www.reddit.com/r/hardware/comments/1n3kx6e/nvidia_n1x_cpu_leak_unveils_rtx_5070sized/
https://www.reddit.com/r/hardware/comments/1ql16e5/leak_confirms_nvidia_n1x_in_windows_on_arm_gaming/
https://www.reddit.com/r/hardware/comments/1sgp2xv/videocardz_nvidia_n1_laptop_motherboard_has_been/
And many more. I don't think this needs its own post unless you want to share a source with new information.
darkmitsu@reddit (OP)
I can add N2/R1 information. I'm a data scientist, not a sweaty reddit user.
MBILC@reddit
Link to your source? It's always useful instead of just copying content with no reference to where it came from.
darkmitsu@reddit (OP)
There are multiple sources, even a pic of the board that corroborates past spec leaks. Since it's not official yet, linking the sources I gathered this from would defeat my purpose in creating this post, which is ad-free, unlike the places where you usually find the same info scattered across a website designed to make you accidentally click an ad.
MBILC@reddit
You can still make your post and just add links at the bottom... it shows you got your info from somewhere, vs. asking people to "just trust me"...
darkmitsu@reddit (OP)
so if I post this in an external website, would you trust me more? geez hold on I can create it right now https://nvidian1x.blogspot.com/2026/04/nvidia-n1-n1x-soc-leaked-end-of-x86.html
steik@reddit
I don't believe the memory specs for a second tbh. Willing to bet people are confusing bits and bytes. This is almost certainly 128 GigaBITS of memory, or 16 GigaBYTES. That is actually a reasonable and normal amount of memory for a laptop. 128GB is lunacy. This would be DOA for price reasons with 128GB of RAM.
R-ten-K@reddit
128GB is the usual memory config for the GB10, which the N1 is based on. OEMs will likely play with different chip densities to offer cheaper configs below the 128GB max on the same bus width.
Sopel97@reddit
A 16GB laptop of this kind would be e-waste.
bunihe@reddit
16GBytes is impossible for a full 256-bit-wide LPDDR5X bus; you won't even find chips of such low density. And if it were 16GBytes, how would it fit the higher-end profile of the N1?
FYI, the DGX Spark, which is basically this, comes with 128GBytes of LPDDR5X.
steik@reddit
Thanks, I appreciate that reference - almost certainly shows I'm wrong.
Side note: I looked at SK Hynix's website and they appear to use upper and lower case (Gb and GB) indiscriminately, even within the same document when they are clearly referencing the same thing. Bizarre for a company whose entire business revolves around these units :)
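The bit/byte arithmetic behind the confusion, as a quick sketch. The per-package density is an assumption consistent with the eight-module board photos and the 128GB total; it is not independently confirmed by the leak.

```python
# Eight LPDDR5X packages at an assumed 128 Gbit (16 GB) each.
GBIT_PER_MODULE = 128   # assumed density per package
MODULES = 8             # from the Goofish board photos
total_gbytes = MODULES * GBIT_PER_MODULE / 8  # divide by 8: gigabits -> gigabytes
print(total_gbytes)  # prints 128.0 -- gigaBYTES, not gigabits
```

So 128 Gbit is the plausible per-chip figure; the 128 GB system total comes from eight such chips, not from a bits/bytes mix-up.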
soggybiscuit93@reddit
128GB could just be the max memory capacity, not necessarily the only memory capacity.
battler624@reddit
Only thing that sucks about this is the X925 and A725, should've been C1 Ultra.
Vince789@reddit
10x P-cores + 10x E-cores is also a very weird setup.
Even back in 2023, Arm was already recommending 10+4 (X4+A720) for laptops.
12x X925 + 4x A725 would have been just 1mm² larger than GB10's 10x X925 + 10x A725.
Although yeah, the C1 CPU would have been FAR better; they could've gone Apple M5 Pro-style:
6x C1-Ultra + 12x C1-Premium would be a few mm² smaller than GB10.
Or even 6x C1-Ultra + 12x C1-Premium + 4x C1-Pro would be the same size as GB10.
R-ten-K@reddit
It's the same config as the desktop GB10 already uses.
FieldOfFox@reddit
If it’s using the smartphone Arm X CPU core stencils, then it’s going to be shit.
Arm has been exposed over and over again as unable to produce a (modern) desktop/laptop-class CPU.
R-ten-K@reddit
"core stencils" clearly shows you know what you are talking about LOL
Exist50@reddit
No way in hell it's using N3B. Either N3E or N3P.
darkmitsu@reddit (OP)
Cutting-edge hardware is reserved for data centers; they won't cannibalize their own business.
Exist50@reddit
Yet that's exactly what they're using. Seriously, where did you get the idea they were using N3B? The only ones that did were Intel and Apple, and only because they were adapting designs for the original N3. N3B is completely useless vs N3E/P, and TSMC probably doesn't even have it as an option.
darkmitsu@reddit (OP)
It could be a typo, since it's just from leaked documents that didn't specify the exact node (limited to "N3"), but I expect something similar to GB10's N3E since the codename for this chip is GB20B. But again, I don't expect the best node currently available, because they use that for their data center chips.
Exist50@reddit
It's all the same equipment, so they're dealing with that tradeoff regardless. And for that matter, Nvidia's GPUs are still on N4.
alabasterskim@reddit
Did you AI generate this post
darkmitsu@reddit (OP)
AI-structured and summarized; it's clean and easier to read. Do you have a problem with that?
martsand@reddit
A mobile 5070 has 4608 cores, so I suppose this will be very low power and clocked much slower if it only matches that in performance.
darkmitsu@reddit (OP)
The good thing is it will be able to sustain that performance while unplugged. Only Apple does better memory bandwidth with LPDDR5X. Nvidia could in theory use HBM4 for their Vera Rubin version and hit 1 TB/s, but that's extremely expensive; they'll probably go LPDDR6 for the N2/R1.
yllanos@reddit
This thing will not be cheap
darkmitsu@reddit (OP)
Sure it will not. Nvidia is not going after the mainstream but the AI mobile workstation; expect pricing similar to the DGX Spark.
IBM296@reddit
If this is better than X2-Elite in performance and efficiency, that would be great.
DeconFrost24@reddit
So this will be super expensive!