219 points
67 points

Once we have super fast reliable internet we’ll likely have the whole computer as a service. We’ll just have access terminals basically and a subscription with a login, except for the nerds who want their own physical machine.

75 points

Bro just reinvented mainframes.

16 points

No. Just no.

And get off my lawn, ya whippersnapper.

10 points

Mhmm… Computer as a service. Why does that sound familiar…?

10 points

RAM as a service can’t happen. It’s just far too slow. The whole computer can, though. Its RAM can be local so it can access it quickly; then it just needs to stream the video over, which is relatively simple, even if it introduces some latency to deal with.

8 points

Given how so many of us communicate, work, and compute using cloud platforms and services, we’re basically already there.

How many apps are basically just a dumb client using a REST API?

7 points

you will own nothing and be happy!

6 points

Wait, we already had that in the 70s.

5 points

Sweaty gamers and nerds, as always, unite over having proper physical PCs rather than online services or consoles.

4 points

Given the digital literacy of many “regular people” (e.g. my father, and seemingly every other one of my friends), the idea is appealing. Especially as most of them don’t care about privacy. Give them decent availability, and they will throw money at you. And if you also give them support, I will, too.

4 points

That’s exactly how it works right now with VDI. I’m using one at work.

3 points

Honestly, cloud gaming is very good… when it is good. Sometimes it sucks. But when it’s good, it’s incredible how much it feels like gaming locally.

3 points

Unsubscribe

37 points

It’ll never be fast enough. An SSD is orders of magnitude slower than RAM, which is orders of magnitude slower than cache. Internet speed is orders of magnitude slower than even the slowest hard drive, which is itself way too slow for anything that needs its memory back anytime soon.
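
Rough numbers put those gaps in perspective. A sketch using the usual order-of-magnitude latency figures (approximations, not benchmarks):

```python
# Ballpark access latencies; order-of-magnitude figures, not benchmarks.
latency_ns = {
    "L1 cache": 1,
    "RAM": 100,
    "NVMe SSD read": 100_000,    # ~100 us
    "HDD seek": 10_000_000,      # ~10 ms
    "internet RTT": 50_000_000,  # ~50 ms, and that's optimistic
}

# Express everything relative to a RAM access.
for name, ns in latency_ns.items():
    print(f"{name:>13}: {ns / latency_ns['RAM']:>9,.0f}x RAM")
```

Each step down the hierarchy costs two to three orders of magnitude, and the internet sits several steps below the slowest local storage.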

11 points

Need faster than light travel speeds and we can colocate it on the moon

6 points

A SATA SSD has ballpark 500MB/s, a 10g ethernet link 1250MB/s. Which means that it can indeed be faster to swap to the RAM of another box on the LAN than to your local SSD.

A Crucial P5 has a bit over 3GB/s but then there’s 25g ethernet. Let’s not speak of 400g direct attach.
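
Back-of-envelope, the comparison checks out (this ignores protocol overhead and, more importantly, latency):

```python
# Bandwidth comparison using the figures from the comment above.
sata_ssd = 500e6       # SATA SSD, bytes/s (~500 MB/s)
nvme_p5 = 3e9          # Crucial P5 NVMe, bytes/s (~3 GB/s)
eth_10g = 10e9 / 8     # 10 Gbit/s ethernet -> 1.25e9 bytes/s
eth_25g = 25e9 / 8     # 25 Gbit/s ethernet -> 3.125e9 bytes/s

print(eth_10g > sata_ssd)  # 10GbE out-runs a SATA SSD
print(eth_25g > nvme_p5)   # 25GbE edges out the NVMe drive
```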

11 points

You can do it today, just put your swapfile on sshfs and you’re done.
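
Something like this sketch, roughly (the host name `remote` is made up, and the loop device is needed because the kernel refuses to put a swapfile directly on a FUSE mount; please don’t actually do this):

```shell
# Joke sketch: network swap over sshfs. "remote" is a hypothetical ssh host.
mkdir -p /mnt/netswap
sshfs remote:/tmp /mnt/netswap

# Create the remote-backed swapfile and wrap it in a loop device,
# since swapon rejects files on FUSE filesystems.
dd if=/dev/zero of=/mnt/netswap/swapfile bs=1M count=1024
LOOP=$(losetup --find --show /mnt/netswap/swapfile)
mkswap "$LOOP"
swapon "$LOOP"
```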

11 points

So I could download more RAM?

36 points

It will crash as soon as it needs to touch the swap due to the relatively insane latency difference.

5 points

So use a small area in memory as cache

5 points

the infinite memory paradox. quaint. (lol)

22 points

Imagine doing this on a dial-up 56K modem
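
For scale, idealized arithmetic (real modems never hit the full 56 kbit/s):

```python
# Time to move a modest 512 MiB swapfile over a perfect 56 kbit/s modem.
swap_bytes = 512 * 2**20  # 512 MiB
modem_bps = 56_000        # bits per second, best case

seconds = swap_bytes * 8 / modem_bps
print(f"{seconds / 3600:.1f} hours")  # roughly 21 hours
```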

20 points

A:\SPICYMEMES\MODEMSOUND.WAV

9 points

Bwa-hahahahhah "A:" 🤣

2 points

For those too young to remember: the A:\ and B:\ drive letters were reserved for floppy drives, typically the soft-sleeved 5.25" disks and, later, the hard-shelled 3.5" ones. The C:\ drive went to the new HDDs when they came out, and because A:\ and B:\ stayed reserved for floppies, C:\ became the standard after that.

18 points

wait, didn’t some tech youtubers like LTT try using cloud storage as swap/RAM? afaik they failed because of latency

7 points

I remember using ICMP data to bypass my high school’s firewall. TCP and UDP were very locked down, but they allowed pings. It was slow though - I think I managed to get a few KB per sec. Maybe there’s faster/fancier firewall bypass methods these days. This was back in the 2000s when an entire school would have a single OC-1 fiber connection.

5 points

Afaik they used it as redundant off-site backup

5 points

I wonder if there would be a speed boost from setting up two gdrives as RAID 0 for off-site backups

13 points

I feel like this might be a giant gaping security risk.

13 points

So is pretty much all of the cloud services the average user already subscribes to. People still use them though.

4 points

Agreed. This is especially bad, though, because if it’s compromised they basically have hardware-level access to your machine. Unless you’re using encrypted swap, and I’m not sure how standard that is.

5 points

Obviously you should set up device mapper to encrypt the gdrive device then put the swap on the encrypted mapper device.
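
A sketch of that setup using plain dm-crypt with a throwaway random key (`/dev/nbd0` stands in for however the remote storage ends up exposed as a block device):

```shell
# Hedged sketch: encrypt the (network-backed) block device, then swap on it.
# Keying from /dev/urandom means the swap contents are unrecoverable after
# reboot, which is exactly what you want for swap.
cryptsetup open --type plain --key-file /dev/urandom /dev/nbd0 cryptswap
mkswap /dev/mapper/cryptswap
swapon /dev/mapper/cryptswap
```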

6 points

If your kernel isn’t using 90% of your CPU resources, are you really even using it to its full potential? /s

12 points

Oh wow, I didn’t even know Gdrive offered a 1 petabyte option 😂

15 points

They don’t, to my knowledge. I believe that’s mounted through rclone, which usually just sets the filesystem size to 1PB so that it doesn’t have to query what the actual limit is for the various providers (and your specific plan).

14 points

Once upon a time, Google offered unlimited drive storage as part of some GSuite tiers. They stopped offering it a while ago and have kicked most/all legacy users off of it in the past few months. It was glorious while it lasted 😢

9 points

Guess they ran everyone out of business that they needed to, so now the premium features get yanked and your choice of alternatives is curtailed. Hooray for enshittification.

5 points

And Google docs/sheets/slides used to not count in your used space.

3 points

At one point they offered unlimited storage for Play Music only. You could literally upload your entire collection. They later changed it to consume your Drive storage. The plans were cheap enough, so I subscribed. Then they killed off Play Music. I’m still salty about that.

1 point

Yea, where do you get that? I can’t see anything on their pricing page; it only goes up to 2TB.

4 points

Even better:

Free cloud storage that doesn’t require an account and provides no limit to the volume of data stored

https://github.com/yarrick/pingfs

2 points

The image doesn’t load.

3 points

I posted that 10 months ago.

That being said, it seems to still work for me.

207 points

Protip: Put swapfile on ramdisk for highest speed

90 points

Unironically that’s how zram works

29 points

Don’t do my boy zram dirty; it has a ton of utility when you have ample spare compute and limited RAM.

4 points

Is that not how it works though? Lol

21 points

Doesn’t it compress the contents that it’s storing to help kind of get the best of both worlds?

You get faster storage because it’s in ram still, but with it being compressed there’s also “more” available?

I could be completely mistaken though

15 points

You are correct, although zram uses more CPU power since it compresses things. It’s not really an issue if you’re not using a potato :)
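
That trade (CPU time for effective capacity) is easy to demo, since real memory pages tend to be full of zeroed regions and repeated structures. zlib stands in here for zram’s actual lz4/zstd compressors:

```python
import zlib

# 32 fake 4 KiB "pages" with the kind of redundancy real memory has.
pages = b"".join(bytes([i % 16]) * 4096 for i in range(32))

packed = zlib.compress(pages, 1)  # zram likewise favors fast compression levels
ratio = len(pages) / len(packed)
print(f"{len(pages)} bytes -> {len(packed)} bytes ({ratio:.0f}x smaller)")
```

Real-world ratios are far more modest than this toy case, but 2-3x on typical workloads is why zram effectively buys you “more” RAM.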

88 points

Hopefully that swap is on an SSD, otherwise that query may not ever finish lol
Once you’re deep into swap, things can get so slow that there’s no recovering from it.

59 points

WHAT FUCKING QUERY ARE YOU RUNNING TO USE UP THAT MUCH MEMORY DAMN

68 points

In a database course I took, the teacher told a story about a company that would take three days to insert a single order. Thing was, they were the sort of company that took in one or two orders every year. When it’s your whole revenue on the line, you want to make sure everything is correct. The relations in that database were checked to hell and back, and they didn’t care if it took a week.

Though that would have been in the 90s, so it’d go a lot faster now.

35 points

What did they produce? Cruiseships?

29 points

No idea, but I imagine it was something big like that, yes. I think it was in northern Wisconsin, so laker ships are a good guess.

18 points

We have a company like that here somewhere. When they have one job a year, they have to reduce hours, if they have two, they are doing OK, and if they have three, they have to work overtime like mad. Don’t ask me what they are selling, though. It is big, runs on tracks, and fixes roads.

14 points

A very very badly written one no doubt…

11 points

Why stop at just one full table scan?

48 points

Exactly how I plan to deploy LLMs on my desktop 😹

14 points

You should be able to fit a model like LLaMa2 in 64GB RAM, but output will be pretty slow if it’s CPU-only. GPUs are a lot faster but you’d need at least 48GB of VRAM, for example two 3090s.
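
Rough arithmetic behind those numbers (weights only; this ignores activations, KV cache, and runtime overhead, so real requirements run higher):

```python
# Rough memory math for a 70B-parameter model: bytes per parameter
# times parameter count, at common precisions.
params = 70e9

for name, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = params * bytes_per_param / 1e9
    print(f"{name}: ~{gb:.0f} GB of weights")
```

Full fp16 weights come to ~140 GB, which is why quantized variants are what actually fit in 64GB of RAM or a pair of 24GB GPUs.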

6 points

Amazon had some promotion in the summer and they had a cheap 3060, so I grabbed that, and for Stable Diffusion it was more than enough, so I thought oh… I’ll try out llama as well. After 2 days of dicking around, trying to load a whack of models, I spent a couple bucks and spooled up a runpod instance. It was more affordable than I thought, definitely cheaper than buying another video card.

4 points

As far as I know, Stable Diffusion is a far smaller model than Llama. The fact that a model as large as LLaMa can even run on consumer hardware is a big achievement.

2 points

*laughs in top of the line 2012 hardware 😭

3 points

I need it just for the initial load of transformers-based models, so I can then run them in 8-bit. It is ideal for that situation.

2 points

That does make a lot of sense

2 points

Same. I’m patient


Programmer Humor

!programmerhumor@lemmy.ml


Post funny things about programming here! (Or just rant about your favourite programming language.)

Rules:

  • Posts must be relevant to programming, programmers, or computer science.
  • No NSFW content.
  • Jokes must be in good taste. No hate speech, bigotry, etc.

Community stats

  • 3.7K

    Monthly active users

  • 1.5K

    Posts

  • 35K

    Comments