
AK1174B

AK1174@alien.top
0 posts • 4 comments

I’d probably buy everything used other than the power supply and drives.

Well, a web server is a pen-testable thing, and it's also a very commonly pen-tested thing, so the background knowledge is useful.

You could use LocalAI or Ollama, but neither is going to work with 300 MB of RAM, and you need a fair amount of compute for response speed to be usable. These models are also not very capable compared to OpenAI's GPT models, though whether that matters depends on what your goal is.
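For context, here's a minimal sketch of what talking to a local model through Ollama's HTTP API looks like. It assumes Ollama is already running on its default port (11434) and that a small model (llama2 here, purely as an example) has been pulled; ask_local_model is just an illustrative name, not part of Ollama itself.

import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama2") -> str:
    # Build a non-streaming request so we get a single JSON object back
    # instead of a stream of token chunks.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The generated text comes back under the "response" key.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("What hardware do I need to run you comfortably?"))

Even a call this simple needs the model weights loaded into memory on the Ollama side, which is where a few hundred MB of RAM falls far short.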

all of the above + more???
