azl

azl@lemmy.sdf.org
0 posts • 39 comments

What’s the difference between one technology you don’t understand (AI-assisted diagnosis) and another you don’t understand (a human-staffed radiology laboratory)?

Regardless of whether you (as a patient hopelessly unskilled in diagnosing any condition) trust the method, you probably have some level of faith in the provider who selected it. And while they will most likely choose whatever is most beneficial to them (weighing the cost of providing accurate diagnoses against the cost of providing less accurate ones), hopefully regulatory oversight and public influence will force them to use whichever is most effective, AI or not.


They could have gone with a “visor” frame design that would have been more fashionable, but I think this is pretty impressive for demonstrating the bare minimum amount of plastic needed to house holographic transparent displays, internal/external tracking sensors, and a sound system.

What they claim these glasses can do is absolutely incredible (we won’t really know because they are only being used internally for further development).


There’s a place for this, if it’s entertaining. Memes, comedy, maybe some more legitimate uses too. A lot of YouTube is some guy just sitting in front of a camera in the most boring, perfectly curated home office. Throw in something visually interesting that enhances the subject matter and I may watch more.


Can’t argue with that… if you don’t want to VR, that’s a nice setup. Easier to find your beer, too.


This would ideally become standardized among web servers, with an option to easily block various automated aggregators.
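
For what it’s worth, part of that “easily block” option already exists in the Robots Exclusion Protocol; it is just voluntary. A minimal robots.txt sketch using a few of the published AI-crawler tokens (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google’s AI training; whether a given crawler actually honors them is another matter):

    # Opt out of a few known AI/aggregator crawlers (compliance is voluntary)
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /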

Regardless, all of us combined are a grain of rice compared to the real meat and potatoes AI trains on: social media, public image hosting, copyrighted media, etc. All those sites with extensive privacy policies that are signing contracts to permit their content to be used for training.

Without laws (and I’m not sure I support anything in this regard yet), I do not see AI progress slowing. Clearly, inbreeding AI models by training them on their own output has an effect similar to inbreeding in nature. Fortunately, there is enough original digital content out there that this does not need to happen.


If it doesn’t offer value to us, we are unlikely to nurture it. Thus, it will not survive.


I don’t want to get in the way of your argument re: Usenet, but spinning hard drives will last longer if they stay on. Starting and stopping the spindle motor imparts the greatest wear. As long as the thermals are managed, a spinning disk is a happy disk.


Just curious if you had a reference for this statement since it seems to be false in multiple ways.


This also works for binary cable or interface connectors formerly known as “male” and “female”.


I want Ars content to be part of whatever training data is provided to the best models. How does that get done without it looking like they are being bought?

Even if their contract explicitly states that it is a data-sharing agreement only, and that the media organization’s output (articles/investigations) is not grounds for breach or retaliation, people will assume there is now some partiality in future reporting.

So, for all media companies, the options seem to be:

  1. Contribute to the greater good by openly permitting site scraping (for $0)
  2. Allow data sharing with contracted parties only (for a fee)
  3. Publicly or privately prohibit use of any data, then seek damages down the road for theft/copyright infringement once the legal framework has been established.

Is there a GPL-style or other license structure that permits data sharing for LLM training in a way that prevents it from being transformed into something evil?
