abruptly8951

abruptly8951@lemmy.world
Joined
0 posts • 22 comments
For me, the Infinity subscription bypass stopped working, so I finally made the switch

You need rebase instead. Merge just creates useless commits and makes the diffs harder to comprehend (all changes are shown at once, whereas with rebase you fix the conflicts in the commit where they happened)

Then, instead of your branch-of-branch strategy, you just rebase onto main daily and you’re golden when it comes time to PR
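That daily-rebase flow can be sketched end to end in a throwaway repo. This is a hypothetical, self-contained demo (branch names, file names, and the `git()` helper are all made up for illustration; it assumes git ≥ 2.28 on PATH):

```python
# Hypothetical demo: build a scratch repo, then rebase a feature branch onto main.
import os
import subprocess
import tempfile
from pathlib import Path

def git(*args):
    """Tiny illustrative wrapper around the git CLI."""
    return subprocess.run(["git", *args], check=True, capture_output=True, text=True)

os.chdir(tempfile.mkdtemp())
git("init", "-q", "-b", "main")
git("config", "user.email", "demo@example.com")
git("config", "user.name", "demo")

Path("file.txt").write_text("base\n")
git("add", "."); git("commit", "-qm", "base")

git("checkout", "-qb", "feature")              # your feature branch
Path("feature.txt").write_text("feature work\n")
git("add", "."); git("commit", "-qm", "feature work")

git("checkout", "-q", "main")                  # meanwhile, main moves on
Path("main.txt").write_text("main change\n")
git("add", "."); git("commit", "-qm", "main change")

git("checkout", "-q", "feature")
git("rebase", "main")   # replay the feature commits on top of main: no merge commit,
                        # and any conflict is fixed in the commit where it happened

log = git("log", "--oneline").stdout
print(log)              # linear history, newest first: feature work, main change, base
```

If a conflict does come up mid-rebase, you fix the files, `git add` them, and run `git rebase --continue` — the fix lands in the commit that caused it.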

You could try Dexed, it’s a Yamaha DX7 clone: https://github.com/asb2m10/dexed/releases

Privacy-preserving federated learning is a thing: essentially, you train a local model and send the weight updates back to Google rather than the data itself…but it’s also early days, so who knows what vulnerabilities may exist
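A minimal sketch of that weight-update loop, assuming federated averaging over a toy linear model — the client/server split, round counts, and all names are illustrative, not Google's actual API:

```python
# Toy federated averaging: clients train locally and send back only weight deltas.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train on-device; only the weight delta leaves the device, never X or y."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w - weights                       # the update that gets sent back

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])               # ground truth the clients' data encodes
global_w = np.zeros(2)

for _ in range(20):                          # each round: clients train, server averages
    deltas = []
    for _ in range(3):                       # three simulated clients with private data
        X = rng.normal(size=(32, 2))
        y = X @ true_w
        deltas.append(local_update(global_w, X, y))
    global_w += np.mean(deltas, axis=0)      # server only ever sees averaged updates

print(np.round(global_w, 2))                 # converges toward true_w
```

The server learns a good global model without any raw `(X, y)` pair ever leaving a client — though, as noted, the updates themselves can still leak information, which is where the open vulnerability questions live.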

There is a huge difference though: one is making hardware, and the other is copying books into your training pipeline.

The copy occurs in the dataset preparation.

This sounds like a fun project “samples of chemistry”

This is an extremely odd outlook to have. Good luck with it.

Unfortunately, the answer to your question is to not post at all — though if your contributions are worthwhile, that’s hardly an ideal solution

Ooh edgy, say more

You seem a bit troll-y too but I’ll bite.

The only one suggesting to add women because they are women is the edgy beb. The change I want to see is edgies being made to feel small, which I have achieved, judging by their anemic comeback :)

The person who started this thread wants to see the world you’re suggesting I create, and in doing so has opened up a whole new class of YouTuber for our dear OP to explore…the analogy about fish, fishing, and teachers comes to mind

It’s definitely not indexed. We use RAG architectures to add indexing to data stores that we want the model to have direct access to; the relevant information is injected directly into the context (the prompt). This can somewhat be equated to short-term memory

The rest of the information is approximated in the weights of the neural network, which gives the model general knowledge and intuition…akin to long-term memory
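The retrieval half can be sketched with a toy example — bag-of-words cosine similarity standing in for a real embedding model and vector index, with made-up document contents:

```python
# Toy RAG retrieval: score documents against the query, then inject the best
# match directly into the prompt (the "short-term memory" part).
import math
from collections import Counter

docs = [
    "The DX7 is a 1983 FM synthesizer made by Yamaha.",
    "Rebase replays your commits on top of another branch.",
    "Federated learning trains models without centralizing data.",
]

def bow(text):
    """Bag-of-words vector; real systems use dense embeddings instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    return dot / (math.sqrt(sum(v * v for v in a.values())) *
                  math.sqrt(sum(v * v for v in b.values())))

def retrieve(query):
    q = bow(query)
    return max(docs, key=lambda d: cosine(q, bow(d)))

question = "how does git rebase work"
context = retrieve(question)                 # indexed lookup over the data store
prompt = f"Context: {context}\n\nQuestion: {question}\nAnswer:"
print(prompt)
```

The model never has to have memorized the documents; whatever the index surfaces is handed to it verbatim at inference time, while everything else falls back on the approximate knowledge baked into the weights.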
