Video ID: GY8YudAnxu0
YouTube URL: https://www.youtube.com/watch?v=GY8YudAnxu0
Added At: 13-06-25 21:17:16
Processed: No
Sentiment: Neutral
Categories: Tech, Education
Tags: AI, Machine Learning, Data Science, Nvidia, Mini PC, GB10, 128GB RAM
Summary
Nvidia announces a new Mini PC called Project Digits, featuring the Nvidia GB10 processor and 128GB of RAM. It's aimed at data science, machine learning, and AI workloads. The device will be available in May for $3,000.
Transcript
So, CES is in full swing in Las Vegas and Nvidia has made its keynote announcements. Right at the end of the presentation it announced the Project Digits mini PC. That's right, it's announced a mini PC, and it's got a new processor in it, called the GB10, from Nvidia itself. Now, if you want to find out more, please let me explain.

Okay, so let's dive into Nvidia Project Digits. I'll explain what all of that means in a moment. The key thing here is that it features a new chip, the Nvidia GB10; that's a Grace Blackwell superchip, and of course Nvidia have got the GB100 and the GB200 that they use for their AI supercomputers. What we're looking at here is a mini PC. It can be used as a workstation, or of course it can be used remotely: remote desktop, secure shell, or just connecting to the various web services it offers for machine learning and so on. The design Nvidia are currently showing is kind of a shrunk-down version of the DGX supercomputers or modules they've got, with all of their GPUs and so on inside.

So, Digits, what's that? We found out during the keynote that it stands for Deep Learning GPU Intelligence Training System. DIGITS was the internal name for basically all of the different systems Nvidia have been working on for many, many years now, which resulted in, well, the DGX systems. It was shortened from DIGITS to DGX to match RTX, AGX and so on, but really it means deep learning GPU system. So I don't think this mini PC will actually be called Project Digits in the end; that's its current name, and they'll probably come up with something like the DGX Mini or something like that, I don't know. But the key thing is that it features this Nvidia Grace Blackwell superchip, the GB10, and it offers a petaflop of AI computing performance at 4-bit precision, so that's
really quite mega-powerful, especially for that form factor. Now, it's got an Nvidia Blackwell GPU, so that's the latest GPU architecture coming out of Nvidia, with the latest CUDA cores and fifth-generation Tensor cores, and it's got the NVLink chip-to-chip interconnect to connect the GPU to the CPU on the actual chip, something Nvidia uses in many of its other Grace-based superchips, the Grace Hopper superchips and so on. Now, the CPU part is Arm-based, and we'll talk more about that in a moment.

It comes with 128 GB of unified, coherent memory. What that means is that, like other SoC-type systems, the memory is part of the package, so it's a fixed amount of memory, and they've gone with 128 GB because you're going to need that if you're doing AI-type workloads. Then there's up to 4 terabytes of NVMe storage; there'll be different versions available with different amounts of storage. Again, I think that storage is on the chip itself, so it's kind of an SoC design.

Now, with that amount of memory, and of course the storage, you can run up to a 200-billion-parameter large language model if you're using 4-bit quantization. Today, when we're talking about running, say, Llama 3.2 on your desktop, or on an Nvidia Jetson Orin or something like that, we're talking about seven billion parameters, or three billion parameters. So being able to run a 200-billion-parameter model is a whole different scale of things, and that's the combination of that unified memory, 128 GB of it, plus the ability to talk very quickly to the GPU and the Tensor cores to get the best performance out of that model.

The CPU is a 20-core Arm-based CPU, so you get 10 Cortex-X925 CPU cores (I've got videos about the X925 here on this channel) and then another 10 Cortex-A725 CPU cores (and again, I've got videos about that here on this channel). So it's heterogeneous in the sense that
you've got two different types of core, but of course they are architecturally compatible. The A725 is slightly more power-efficient, whereas the X925 gives you that maximum performance, so they've given you this 10-plus-10 setup. A nice 20-core CPU setup along with a Blackwell GPU is certainly something to be interested in.

Now, earlier I kept mentioning that it's kind of an SoC design, and that's important, because this was made in collaboration with MediaTek; that was even said during the keynote. So MediaTek have had their input into the way of putting this all together, the interconnects and everything. They've worked with Nvidia, which personally I find surprising, because Nvidia have got a big history of working with Arm-based CPUs, so there was obviously something they wanted from MediaTek, and maybe we'll get more details about that as time goes by. MediaTek have also published a press release saying they worked with Nvidia, but it's thin on the details; what it actually was that they provided will be interesting to find out.

It runs Nvidia DGX OS Linux, so it's a Linux-based system, based on Ubuntu, just like other Nvidia AI products, and it's compatible. That means you can run all of Nvidia's AI stack: all the CUDA core programming, the Tensor programming, all the things you could run on a bigger DGX system you can run on this, because it's running the same OS, you've got the same Arm CPUs, and you've got the same Nvidia GPUs, so it's compatible across the range. Because this is a smaller one, if you put it in comparison to some of the bigger ones, the new Blackwell GPUs in their fully specced-out versions can offer maybe 40 times more performance. But this is something that's not in a huge briefcase-sized, rack-sized thing; this is something that's in a mini PC, and they really have specced it out to the maximum for the size of the little box you're getting.

So, there was lots of talk in 2024 about
MediaTek's collaboration with Nvidia. We knew there was a collaboration going on, and the main thrust was that there's going to be a Windows on Arm PC from MediaTek with an Nvidia GPU. That's not what's been announced today. This is a Linux-based machine that's coming purely from Nvidia; it's not being sold by MediaTek, although MediaTek have collaborated on the design of the SoC, as I said.

So is Project Digits this Windows on Arm PC? Absolutely not, that's definitely not what it is. Could it be the foundation for something like that? Maybe. If Nvidia and MediaTek have been working together, maybe MediaTek has got some kind of deal where it can get access to these chips, because normally Nvidia chips are in Nvidia products. There are a couple of cases where they're in other products, like, for example, the Nintendo Switch. So maybe MediaTek and Nvidia have got some kind of collaboration going on, but whatever it is, it won't be this configuration. This is a very specific configuration: 128 GB of RAM. If you're selling a Windows on Arm PC, you're not going to sell it with 128 GB of RAM; maybe 64 if you wanted to be crazy, 32, 16, that kind of thing, but 128, definitely not. So it's not going to be in this configuration.

And there's the other thing, and that's the price, which we'll talk about right now, because these boxes will be available in May of this year, so we're looking at about another five months from now, and they will be sold directly by Nvidia and Nvidia's partners, starting at $3,000. So you're going to really have to want an AI machine: you can't afford a DGX rack-sized thing, you want something you can do AI inference on at your desk, and you've opted not to go down the route of an Nvidia or AMD GPU inside a beefy PC. You're going to have to fork over $3,000, which is a lot of money, particularly when you consider the price of the new M4 Mac Mini, which is quite an amazing little device. But of course this has got the extra things: all that memory, 128
GB of memory, and I'd imagine the GPU is many, many times faster. But if this could be a Windows PC, you're not going to sell it at that price point either; no one's going to say, "Yeah, I'm going to abandon x86 and go over to your mini PC at three grand." That's just not going to happen. So it will be interesting to see what unfolds, but in terms of AI products this is absolutely amazing: a 20-core CPU, a Blackwell GPU, 128 GB of RAM, all in a mini PC running DGX OS Linux.

Okay, so there we go: Project Digits, a mini PC aimed squarely at data science, machine learning and so on, with that $3,000 price tag. I'd love to hear what you think. Too expensive? Actually, it's a lot cheaper than a big DGX system. Is it worth it? What about it being a stepping stone over to Windows on Arm in collaboration with MediaTek? Love to hear your thoughts on that as well. Okay, that's it. My name's Gary Sims, this is Gary Explains, and I really hope you enjoyed this video. If you did, please do give it a thumbs up, and if you like these kinds of videos then stick around and subscribe to the channel. Okay, that's it, I'll see you in the next one.
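The 200-billion-parameter claim in the transcript can be sanity-checked with a quick back-of-the-envelope calculation. This is only a sketch: real inference also needs memory for the KV cache, activations, and runtime overhead, so the largest practically usable model is somewhat smaller than the raw weight math suggests.

```python
def model_weight_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate decimal GB needed just to hold the model weights."""
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1e9

# A 200B-parameter model at 4-bit quantization needs roughly 100 GB of
# weights, which fits inside the 128 GB of unified memory:
print(model_weight_gb(200, 4))   # 100.0

# The same model at 16-bit precision would need about 400 GB,
# far beyond the unified memory on this machine:
print(model_weight_gb(200, 16))  # 400.0
```

By the same arithmetic, the 7B-class models mentioned for desktops and Jetson boards need only about 14 GB even at 16 bits, which is why a 200B model really is a different scale of machine.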