So Apple is paying them in exposure? First time I’ve ever seen that where it might actually be worth something.
It’s worth something more often than you think; it’s just usually hawked by random “influencers” with a paltry 50k followers or some shit, and that kind of exposure is worthless.
I’ve heard from photographers and artists before that they will consider exposure offers, but only if the other party has actual status, like a major brand, a near-celebrity, or a top-50 social media “influencer” with millions upon millions of followers.
And they usually want a contract, i.e. “you have to pin our work for X days, you have to tag us,” etc.
They’re getting ALL the data
If you look at the announcement, they’re pretty damn boxed in. They can’t scrape the local device or iCloud. OpenAI only gets queries that the dumber Apple models think would be better served by OpenAI, and each of those queries is gated behind a dialog that says “Do you want me to use ChatGPT to do that? Cancel / Use ChatGPT” (rough sketch below).
That said, on stage Apple briefly mentioned that ChatGPT Plus users would get more functionality. I’ll bet money that’s the real play: LLM subscriptions in the App Store. Apple loves that sweet, sweet App Store and subscription money.
Question is, do they take a cut like with Spotify, or is basic, free GPT-4 access payment enough?
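For what it’s worth, the consent gate they described boils down to something like this. Purely a hypothetical sketch in Python: none of these names are Apple’s or OpenAI’s actual APIs, and the confidence check is a made-up stand-in for however the on-device model actually decides.

    def on_device_confidence(query: str) -> float:
        """Stand-in for the local model scoring whether it can handle a query."""
        return 0.2 if "encyclopedic" in query.lower() else 0.9

    def ask_user(prompt: str) -> bool:
        """Stand-in for the 'Cancel / Use ChatGPT' dialog."""
        return input(f"{prompt} [Use ChatGPT / Cancel] ").strip().lower().startswith("u")

    def answer_locally(query: str) -> str:
        return f"(on-device answer to: {query!r})"

    def answer_with_chatgpt(query: str) -> str:
        # Only reached after explicit, per-query consent.
        return f"(ChatGPT answer to: {query!r})"

    def handle_query(query: str) -> str:
        if on_device_confidence(query) >= 0.5:
            return answer_locally(query)       # never leaves the device
        if not ask_user("Do you want me to use ChatGPT to do that?"):
            return "Cancelled."
        return answer_with_chatgpt(query)      # the only data OpenAI ever sees

    if __name__ == "__main__":
        print(handle_query("Summarize the note I wrote yesterday"))
        print(handle_query("Give me an encyclopedic history of the Punic Wars"))

Point being, per the announcement, the only thing OpenAI can ever see is the query text the user explicitly okayed, one query at a time.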
What data? The data that the user affirmatively agrees to send them that is anonymized? That data?
I’m sure you understand this, but “anonymized” doesn’t mean it can’t be deanonymized. Given the right kind of data, or enough context, someone can figure out who you are fairly quickly.
Ex: you could “anonymize” GPS traces, but they would still show the house you live in and where you work unless you strip out a lot of the info (sketched below).
http://androidpolice.com/strava-heatmaps-location-identity-doxxing-problem/
Now with LLMs, sure, you could “anonymize” which user said or asked for what… but if something identifying is sent in the request itself, it won’t be hard to deanonymize that data.
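To make the GPS example concrete, here’s a minimal sketch of why stripping the name off a trace isn’t enough. The coordinates, times, and the “night-time points = home” heuristic are all made up for illustration:

    from collections import Counter
    from datetime import datetime

    # A single "anonymous" trace: (timestamp, lat, lon). Coordinates made up.
    trace = [
        (datetime(2024, 6, 1, 2, 0),  47.6101, -122.2015),   # middle of the night
        (datetime(2024, 6, 1, 3, 0),  47.6102, -122.2016),   # middle of the night
        (datetime(2024, 6, 1, 14, 0), 47.6205, -122.3493),   # daytime (workplace?)
        (datetime(2024, 6, 2, 1, 30), 47.6101, -122.2015),   # night again
    ]

    def likely_home(points, decimals=3):
        """Most frequent location between midnight and 5am, rounded to ~100 m."""
        night = [
            (round(lat, decimals), round(lon, decimals))
            for ts, lat, lon in points
            if 0 <= ts.hour < 5
        ]
        return Counter(night).most_common(1)[0][0] if night else None

    # The name is gone, but the repeated night-time cluster is still a street address.
    print(likely_home(trace))

Swap “night-time cluster” for the addresses, employers, or names people type into the prompt itself and you get the LLM version of the same problem.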
So you would rather submit your non-anonymized data? Because those bastards will find a way to deanonymize it. Is Apple doing the right thing or not?
The point is that they can use that data for further training. They want to build a monopoly like Google is for search.
They want to build a monopoly like Google is for search.
There’s Bing, and some others. I’m using Kagi. You can pretty much drop one in for another.
Google has a significant amount of market share, but it doesn’t really have the ability to dictate the terms on which a consumer can get access to search services, which is what makes a monopoly a monopoly.
They’ve got a monopoly over providing some services to Android users, maybe.
Like Google did with user queries and crawling data. I’m just saying everyone is happily giving these companies data. You are welcome to not use the GPT functionality just like you are welcome to use DuckDuckGo. I’m not getting the hostility to Apple. Microsoft on the other hand…
I’ll take bets here.
I don’t think user data is where OpenAI makes its money. It’s the $20 a month ChatGPT Plus subscriptions.
Apple announced that Plus users would get more functionality. Also, OpenAI is limited to collecting data from queries that the user explicitly agrees to send to a third-party model. Each GPT-4 query prompts the user with “Do you want me to use ChatGPT to do that?”
Apple’s not really in the business of selling data, but they are famously and infamously in the business of selling subscriptions to shit.
And this is the way it has to be! Fuck all this “free” stuff paid for with data and ads!
Google: tries to track your behavior so they can sell targeted ads in their freeware
Apple: aggressively tries to lock you into its platforms and ecosystem so they can sell you subscriptions, apps, and hardware
Both shitty, but one company’s business model is much more dependent on user data to exist.
You forgot to add that Apple tracks your behavior so they can sell you targeted ads.
Pretty sure the data they’ll be getting will be payment enough.
How will this be financially viable for OpenAI? It costs lots of money to run this crap.