Where I live, DRAM-less SSDs are a lot cheaper (half the price). Most sources online say “go for an SSD with DRAM”. But I wonder, are there cases in which a DRAM-less SSD will do just fine?
My main focus is resurrecting old laptops (from 2006 to 2015) by installing GNU/Linux, and sometimes investing in an SSD will give them a performance boost, but the budget is limited because I can’t sell such an old laptop at anything other than a very budget price.
In most cases, going DRAM-less makes little difference for the average user. The biggest difference shows up in very large transfers, like copying large games between drives. Either way, it’s an easy 3-to-5 times performance upgrade compared to an HDD.
I don’t even understand how an SSD with DRAM would be significantly faster outside of benchmarks.
The OS caches everything in the PC’s DRAM and then sends it out to the SSD, so adding more RAM to your PC would have the same effect.
In benchmarks, the DRAM SSD appears much faster by returning control to the OS much sooner. But a DRAM-less SSD is pulling data from the PC’s DRAM cache via DMA, which doesn’t add CPU load, so it’s not really improving the speed.
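To make that concrete, here is a minimal Python sketch (Linux-ish assumptions; the file path is hypothetical, use any large file you have) showing the OS page cache at work: the second read of the same file is served from RAM, not from the drive, whether or not the SSD has its own DRAM.

```python
import time

PATH = "/tmp/testfile"  # hypothetical test file, e.g. a few hundred MB created beforehand

def timed_read(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        # Stream the whole file in 1 MiB chunks.
        while f.read(1024 * 1024):
            pass
    return time.perf_counter() - start

cold = timed_read(PATH)   # likely hits the SSD (unless the file is already cached)
warm = timed_read(PATH)   # almost certainly served from the OS page cache in RAM
print(f"first read: {cold:.3f}s, second read: {warm:.3f}s")
```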
It allows the drive to be used more quickly. If you’ve ever tried using a computer while the disk is at 100% usage, you’ll have noticed that anything you do that requires disk access slows to a crawl. With DRAM on the drive, it takes more to overload it, and smaller transfers become nearly instant, as data gets buffered in the much faster DRAM rather than written straight to the NAND.
Like I mentioned though, in most cases the average user won’t notice a difference. If you really want to squeeze a bit of extra performance out of your drive, that’s where you’ll want the DRAM. If you’re just trying to get old laptops running well again, it’s basically a non-factor.
It allows the drive to be used more quickly
But no more so than adding the same amount of DRAM to the PC. The path is CPU -> system DRAM -> SSD DRAM -> NAND. It will only show a performance difference in benchmarks, or if your PC’s RAM is completely full. You could get more performance by adding DRAM to the PC and telling the OS to never go below X amount of disk cache.
makes smaller transfers nearly instant, as data gets buffered into the much faster DRAM rather than directly to the SSD.
That’s not actual speed but benchmark speed. A copy goes to the PC’s cache first and then gets written out to the SSD. Having DRAM on the SSD just lets it tell the OS “done” sooner, even though the total time is the same.
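A rough sketch of that “done sooner” effect at the OS level (Python; the scratch path is hypothetical): a plain write() returns as soon as the page cache has the data, and only fsync() actually waits for the drive, regardless of whether the drive has its own DRAM.

```python
import os
import time

PATH = "/tmp/writetest"               # hypothetical scratch file
data = os.urandom(64 * 1024 * 1024)   # 64 MiB of random data

start = time.perf_counter()
with open(PATH, "wb") as f:
    f.write(data)                     # returns once the OS page cache holds the data
    buffered = time.perf_counter() - start
    f.flush()
    os.fsync(f.fileno())              # now wait until the drive reports the data as written
    durable = time.perf_counter() - start

print(f"write() returned after {buffered:.3f}s, fsync() finished after {durable:.3f}s")
os.remove(PATH)
```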
If you plan to boot your OS off it, the performance benefits of a DRAM-less SSD over a traditional hard drive are somewhere between negligible and non-existent. It may give you a bit better battery life, but that is all.
So, according to most answers, a DRAM-less SSD would do for me. Thanks!
If you’re using a modern NVMe SSD you can simply ignore the presence or lack of a DRAM cache. Modern PCIe devices can use Host Memory Buffer (HMB) to map part of your system RAM as the drive’s cache, and because an NVMe drive already talks to the CPU and system RAM directly over PCIe, the added latency is minimal. The end result is that an extremely heavy I/O benchmark can indeed measure the difference, but if you’re loading programs, saving files, playing games and whatever else, it really doesn’t matter.
For SATA SSDs the difference is way more significant, but then again, if you’re just restoring old laptops, a DRAM-less SATA SSD will be so much faster at responding to each request than those little laptop HDDs that the upgrade will be more than worth it anyway, and spending extra on a DRAM cache might not be worth it for the machine you’re dealing with. The end result will likely be that your file write speeds won’t be super impressive, but your read speeds and latency will be great, so for most purposes it will behave like any other SSD and give you the same benefits.
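If you want to check the HMB point above on a given NVMe drive, something like this sketch might work. It assumes Linux with the nvme-cli tool installed and root access; the JSON field names (hmpre/hmmin) follow the NVMe Identify Controller abbreviations but could vary between nvme-cli versions, so treat it as illustrative rather than definitive.

```python
import json
import subprocess

DEVICE = "/dev/nvme0"  # adjust to your drive

# Ask the controller to identify itself; nvme-cli can emit JSON.
out = subprocess.run(
    ["nvme", "id-ctrl", DEVICE, "-o", "json"],
    capture_output=True, text=True, check=True,
).stdout
ctrl = json.loads(out)

# HMPRE = preferred Host Memory Buffer size, HMMIN = minimum, both in 4 KiB units.
hmpre = ctrl.get("hmpre", 0)
hmmin = ctrl.get("hmmin", 0)
if hmpre:
    print(f"HMB supported: prefers {hmpre * 4 // 1024} MiB, minimum {hmmin * 4 // 1024} MiB")
else:
    print("No HMB support advertised (or field not reported by this nvme-cli version)")
```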
As was already mentioned, Host Memory Buffer can replace the DRAM cache to some degree (if the SSD supports it, and even then the implementation can be poor). But the specification is from 2014, so it’s unlikely that laptops from up to 2015 will support it.
When there is no DRAM cache on the SSD, the controller uses the NAND flash cells themselves as cache. This results in more wear and a shorter lifetime. Also, when the SSD fills up, it gets significantly slower, since there are fewer free NAND cells left to use as cache.
I think calling it a “cache” is not precise. The primary function of the DRAM is to hold the dictionary for translating logical addresses (e.g. sectors) from the OS to the physical addresses (which NAND chip, which bank etc.). This indirection is needed for the controller to do wear leveling without corrupting the filesystem.
On a SATA SSD without DRAM, each read I/O could mean two actual reads: first the dictionary to find the data, and then the actual data itself. As you said, HMB helps by eliminating this extra read.
The read and write caching is just a use of the remaining DRAM capacity. Since modern operating systems use main RAM for the same purpose, it usually gives only a small increase in throughput.
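As a toy illustration of that mapping-table idea (not how any real controller is implemented), here is a sketch where a plain dict plays the role of the DRAM-resident logical-to-physical table; on a DRAM-less drive that table lives in NAND, which is where the extra read per I/O comes from.

```python
class ToyFTL:
    """Toy flash translation layer: logical pages map to physical NAND pages."""

    def __init__(self, physical_pages):
        self.l2p = {}                          # logical page -> physical page ("the dictionary")
        self.free = list(range(physical_pages))
        self.nand = {}                         # physical page -> data

    def write(self, logical_page, data):
        # Out-of-place write: pick a fresh physical page (this is what makes
        # wear leveling possible) and just update the mapping.
        # Garbage collection of the old physical page is omitted here.
        phys = self.free.pop(0)
        self.nand[phys] = data
        self.l2p[logical_page] = phys

    def read(self, logical_page):
        # Step 1: consult the mapping table (held in DRAM, or costing an extra
        # NAND read on a DRAM-less drive). Step 2: read the actual data.
        phys = self.l2p[logical_page]
        return self.nand[phys]

ftl = ToyFTL(physical_pages=16)
ftl.write(0, b"hello")
ftl.write(0, b"hello again")   # the rewrite lands on a new physical page
print(ftl.read(0), ftl.l2p)
```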
The primary function of the DRAM is to hold the dictionary for translating logical addresses (e.g. sectors) from the OS to the physical addresses (which NAND chip, which bank etc.). This indirection is needed for the controller to do wear leveling without corrupting the filesystem.
That data is still only cached in the DRAM, since DRAM loses its contents when it is no longer powered.