By the way, this also happens to be how WE humans associate and process information. We're complicated predictive / associative models.
Yes, I'm sure a toddler has to see a million pictures of a cat so that it doesn't confuse it for a piece of toast. No, it doesn't: it has to see a cat only once, and then it knows what a cat is for the rest of its life. Whether it's a white cat, a black cat, a ginger cat, or a calico cat, it is never going to make the kind of stupid mistake that some of the smartest deep learning models in existence make even after being trained on gigantic amounts of data.
 
Have you been living under a rock? OpenAI has over 1 million subscribers ($20/mo), and that doesn't even count API users and deals with big companies like Microsoft.

OpenAI is literally a money printing machine right now.
The number of subscribers is nowhere near enough to cover operating costs, and their user count is currently on a downward trajectory.

So no, as you can see, I'm quite up to date on the matter.
 
Yes, I'm sure a toddler has to see a million pictures of a cat so that it doesn't confuse it for a piece of toast. No, it doesn't: it has to see a cat only once, and then it knows what a cat is for the rest of its life. Whether it's a white cat, a black cat, a ginger cat, or a calico cat, it is never going to make the kind of stupid mistake that some of the smartest deep learning models in existence make even after being trained on gigantic amounts of data.

You're missing the point.

The point is that no human being has a single, unified idea of what a thing is. We break that thing down into many simple, atomic concepts, and when we think back on it, we stitch them together again. That is essentially how GPT was modelled (but in a simpler way).

Case in point: if you ask 100 humans "what is a cat", "what is a dog", or "what is a car", without any other criteria, you are going to get 100 different answers that share common points.
 
I love that you live in a world where this isn't an absurdly, paint-eatingly wrong take.

Honestly I'd love to live in that world too.
I love it that you have such a wonderfully descriptive turn of phrase - "paint-eatingly" wrong. Excellent! I have visions of you gnawing at the door-frame.
 
Do you talk to ChatGPT? You write it out. Siri's biggest weakness is understanding speech... an ongoing problem no one has solved perfectly yet.

Hope Apple can make some progress on implementing useful generative AI. The little boost they gave SD was nice, but the Mac still sucks for creative and hobbyist generative AI right now.

Absolutely, I'm "talking to ChatGPT" every single day. GPT and generative AI drive several rapidly growing healthcare dictation and documentation platforms, like 3M MModal (currently in its direct speech version, which will evolve to ambient-listening scribing - Align - soon, once its beta testing at Mayo Clinic and other major hospitals completes), Freed (getfreed.ai), and Dragon (the most well-known healthcare dictation tool, which has been around much longer than generative AI but now relies on it). These apps all have an astounding level of accuracy in comprehension of the spoken word, combined with comprehension of context and content, producing full-scale, technical, understandable documentation - just as GPT-4 would with typed content - based not just on spoken words but on conversations happening between multiple people, some muffled, some on distorted lines, and still getting it right so often that it seems more reliable than my own ability to listen. (So every time Siri is incapable of understanding even the most basic message, I think: wow, I just talked to, and near, an AI platform all day that actually understood not just what words I said, but what I meant by them and the context in which they were said, and predicted other specific topics and messaging around those words... and Siri can't even understand the words more than half the time, let alone act on their meaning.)
 
AI should be something everyone is entitled to, which means giving every citizen access to their own AI, one they can keep as a sort of digital companion, privately and across time.

I hope what Apple may be offering in the future is a personalised AI subscription, one per Apple account: a personal assistant of sorts, far beyond Siri but maybe sharing her name.

An AI bot that is personalised to you, remembers every conversation you've ever had with it, and knows all your files and history across devices. It would serve as a valuable personal assistant: unfiltered, uncensored, and fully programmable by each customer through use.

Tie that in with potential real-time image generation experiences through visionOS: you describe environments and actions taking place, and the AI generates immersive, animated scenes in real time.
Kind of like Shuri's personal AI in Wakanda Forever? I like it.
 
The number of subscribers is nowhere near enough to cover operating costs, and their user count is currently on a downward trajectory.

So no, as you can see, I'm quite up to date on the matter.
Last I heard, OpenAI was bringing in $80 million a month, or about $1b a year. That's a lot of money considering you said that no one wants to pay for AI. Microsoft has invested over $10b in OpenAI, and they recently launched a $9k per month enterprise plan. I'm pretty sure they have clients already.

Midjourney is another very successful AI product with lots of customers.

Same with GitHub Copilot, which is now two years old.

The truth is AI is very hot right now and people want to spend money on AI products.
 
recently launched a $9k per month enterprise plan
For ChatGPT models it's a $100k-$380k minimum buy-in per month, depending on whether you want GPT-3 or GPT-4, and there are plenty of customers taking that. There will be cheaper, smaller SKUs in the future, but they haven't launched yet.
You may be thinking of Bing Chat Enterprise, which is cheaper but less flexible.
 
Well, ChatGPT 4 does not "understand" things, at least not in the way a real AI would (if one ever comes about).

These generative machines are extremely elaborate mimeograph machines.

Whether or not Apple plans on competing with ChatGPT remains to be seen. Tim Cook is rather good at being nebulous.

It’s still debatable whether anything is actually “conscious” or “understands” things since we still can’t quite agree on what that even means. Not sure I want to live in a world where the machines are conscious.
 
It's nowhere near a fad. Generative AI is incredibly useful for content generation (images and written content), programming, and many other applications.

Remember that GPT is a LANGUAGE model, so it particularly excels at language-related tasks. For example, it absolutely shines if you use it to improve the flow of badly written text.

Of course, you have to KNOW what to ask. But if you do, it's pure gold.
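To make that concrete, here's roughly what such a request looks like through the OpenAI Python SDK. This is just a sketch: the model name and prompt wording are placeholder choices of mine, and the input text is borrowed from earlier in this thread.

```python
# A minimal sketch of using a GPT model to improve the flow of rough text.
# Assumes the official OpenAI Python SDK (v1) and an OPENAI_API_KEY set in
# the environment; the model name and prompt are illustrative, not gospel.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

rough_draft = (
    "Yes, I'm sure toddler has to see million pictures of a cat "
    "so that it doesn't confuse it for a toast."
)

response = client.chat.completions.create(
    model="gpt-4",  # any chat-capable model would do
    messages=[
        {"role": "system", "content": "Rewrite the user's text so it flows "
                                      "naturally. Keep the meaning and tone."},
        {"role": "user", "content": rough_draft},
    ],
)

print(response.choices[0].message.content)
```

The KNOWing-what-to-ask part lives entirely in that system message; change it and you get a summarizer, a translator, or a tone adjuster instead.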

I think this is where Apple will use it to improve Siri and other things. My phone hears my voice and dictates my text with about 90%+ accuracy. Siri then interprets that with about 50% accuracy on a good day.

I think Apple will, at least at first, just make it a front-end to translate to Siri’s simple inputs to make it work better right away. Then, use things that are somewhat transparent to the user as AI prompts. Users won’t prompt it directly. But camera input, location input, sensor data, all that stuff the little spy device in your pocket already knows.

Kind of what Google promised with Google Now ten years ago. But then, typical Google, they got distracted, just started using it to show ads, kind of lost interest, and then killed it in favor of Assistant, which of course suffered the same cycle.

That’s where those little niceties like the phone suggesting a maps location when it connects to car Bluetooth came from. That fad is back in full force and probably not going away this time.
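To spell out the front-end idea from a couple of paragraphs up, here's a toy sketch. Every name and signal in it is invented; it's just how I picture ambient context being folded into a prompt so the model only ever emits one of Siri's existing simple commands.

```python
# Hypothetical sketch of an LLM front end for Siri: ambient signals the phone
# already has become part of the prompt, and the model's only job is to pick
# one of Siri's existing simple commands. All names here are made up.
from dataclasses import dataclass

SIRI_COMMANDS = ["TURN_ON_ALARM", "START_NAVIGATION", "PLAY_MUSIC", "UNKNOWN"]

@dataclass
class DeviceContext:
    location: str             # e.g. derived from GPS
    connected_bluetooth: str  # e.g. the car stereo
    local_time: str

def build_prompt(utterance: str, ctx: DeviceContext) -> str:
    """Fold sensor context into the prompt so the model can disambiguate."""
    return (
        f"Context: user is at {ctx.location}, connected to "
        f"{ctx.connected_bluetooth}, local time {ctx.local_time}.\n"
        f"User said: {utterance!r}\n"
        f"Answer with exactly one of: {', '.join(SIRI_COMMANDS)}"
    )

ctx = DeviceContext("home", "car stereo", "07:55")
print(build_prompt("get me to work", ctx))
# The model's one-word answer (presumably START_NAVIGATION here) would then
# be handed to the existing Siri intent machinery unchanged.
```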
 
I think this is where Apple will use it to improve Siri and other things. My phone hears my voice and dictates my text with about 90%+ accuracy. Siri then interprets that with about 50% accuracy on a good day.

I think Apple will, at least at first, just make it a front-end to translate to Siri’s simple inputs to make it work better right away. Then, use things that are somewhat transparent to the user as AI prompts. Users won’t prompt it directly. But camera input, location input, sensor data, all that stuff the little spy device in your pocket already knows.
But this is the standard approach. It's usually called "intent classification": you essentially turn a user sentence into a variable that captures the intention.

For example:

  • Turn on alarm
  • Siri, turn on alarm
  • I want to turn on the alarm
  • I want the alarm to be turned on
All these sentences can be converted into the variable TURN_ON_ALARM, for example, and then Siri will proceed to the next step, asking the user what time they want the alarm set for. Usually, you train a model with thousands of variations, and eventually Siri will be able to recognize sentences even if they are not directly listed in the model.

For example, suppose the user instead says:

Siri, I want you to turn on the alarm

Notice that none of those sentences exactly corresponds to the user's new one. But because the new sentence is similar to the sentences the model already knows, there is a high chance it will correctly infer that "Siri, I want you to turn on the alarm" means TURN_ON_ALARM.

This is also how our heads work, by the way. We compare what someone is saying to the sentences we already have in our head to infer their intention (but in a much more complex, nuanced way).
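Here's a toy version of that matching step, just to illustrate the principle. Real systems use learned embeddings rather than this word-overlap score, but the idea of "closest known sentence wins" is the same:

```python
# Toy intent matching by similarity: compare a new sentence to known training
# sentences and adopt the intent of the closest match. Real systems use
# learned embeddings instead of this bag-of-words cosine similarity.
from collections import Counter
from math import sqrt

TRAINING = {
    "turn on alarm": "TURN_ON_ALARM",
    "siri turn on alarm": "TURN_ON_ALARM",
    "i want to turn on the alarm": "TURN_ON_ALARM",
    "i want the alarm to be turned on": "TURN_ON_ALARM",
    "play some music": "PLAY_MUSIC",
}

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(sentence: str) -> str:
    words = Counter(sentence.lower().split())
    best = max(TRAINING, key=lambda s: cosine(words, Counter(s.split())))
    return TRAINING[best]

# This sentence appears in none of the training examples, but it is close
# enough to one of them that the right intent still wins.
print(classify("Siri, I want you to turn on the alarm"))  # TURN_ON_ALARM
```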
 
I have been thinking about things I would like to see Siri do that she is not doing already. I have no need for Siri to write an essay because I'm not in school anymore, but it would be nice to see her summarize a long article, TL;DR-style. I desperately want her to be able to search my photos, like Spotlight does, as I've mentioned before; and since she used to do that, I don't know why it isn't achievable again. I would like her to have access to my notes so that she can add dictation to a note; I don't know why she doesn't have access to those when she has access to Contacts. Other than those things, I'm not sure what would constitute a massive Siri upgrade for me. But then I am not a student and I am retired, so my needs are not as great. I'm wondering what else other people would like to see Siri be capable of before they stop calling her dumb and saying "Siri sucks". Consistently being able to do the things she can already do?
 
But this is the standard approach. It's usually called "intent classification": you essentially turn a user sentence into a variable that captures the intention.

For example:

  • Turn on alarm
  • Siri, turn on alarm
  • I want to turn on the alarm
  • I want the alarm to be turned on
All these sentences can be converted into the variable TURN_ON_ALARM, for example, and then Siri will proceed to the next step, asking the user what time they want the alarm set for. Usually, you train a model with thousands of variations, and eventually Siri will be able to recognize sentences even if they are not directly listed in the model.

For example, suppose the user instead says:

Siri, I want you to turn on the alarm

Notice that none of those sentences exactly corresponds to the user's new one. But because the new sentence is similar to the sentences the model already knows, there is a high chance it will correctly infer that "Siri, I want you to turn on the alarm" means TURN_ON_ALARM.

This is also how our heads work, by the way. We compare what someone is saying to the sentences we already have in our head to infer their intention (but in a much more complex, nuanced way).

Not sure if you’re saying this is how the LLM version would work or if this is how Siri works now. But yeah a smart LLM just to help interpret intention is the main thing Siri needs. That, and the ability to do far more offline.
 
Not sure if you’re saying this is how the LLM version would work or if this is how Siri works now. But yeah a smart LLM just to help interpret intention is the main thing Siri needs. That, and the ability to do far more offline.

That's how language models in general extract the user's intention. I'm not sure if Siri works that way, but it's the usual approach.
 
That's how language models in general extract the user's intention. I'm not sure if Siri works that way, but it's the usual approach.

From what I've read, I'm pretty sure Siri is just a giant table of phrase-action pairs. If you don't say the exact phrase it wants, apparently it's programmed to just play something from Apple Music.

I seem to recall someone already did this: a giant shortcut that runs through a paid OpenAI account, piping GPT-4 to Siri to translate what you said into the very limited set of phrases Siri understands.
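If you wanted to build that yourself, the GPT-4 call inside such a shortcut would look something like this. The phrase list and prompt are my own invention; only the OpenAI SDK call itself is real, and it assumes an OPENAI_API_KEY in the environment.

```python
# Sketch of the "GPT-4 in front of Siri" shortcut idea: collapse freeform
# speech into one of the exact phrases Siri already understands. The phrase
# list and prompt below are invented for illustration.
from openai import OpenAI

KNOWN_PHRASES = [
    "Turn on the alarm",
    "Play my driving playlist",
    "Call Mom",
]

client = OpenAI()  # assumes OPENAI_API_KEY is set

def to_siri_phrase(utterance: str) -> str:
    """Ask GPT-4 to map freeform speech onto one exact known phrase."""
    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content":
                "Map the user's request to exactly one of these phrases, "
                "verbatim, or answer NONE: " + "; ".join(KNOWN_PHRASES)},
            {"role": "user", "content": utterance},
        ],
    )
    return reply.choices[0].message.content.strip()

# e.g. to_siri_phrase("can you get my alarm going") -> "Turn on the alarm",
# which the shortcut then passes along to Siri on your behalf.
```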
 
Notice "we're investing" and not "we've invested". Sounds like they're late to the party with respect to AI. Siri is clear evidence of this.

Apple is lucky that their products are in such demand that they can just take their time with whatever product they're making. #complacency
 
Now if Siri would just work without an Internet connection. Even on a limited basis.
 
Now if Siri would just work without an Internet connection. Even on a limited basis.
We lost our Internet connection a few days ago, and I was very happy that Siri was still able to turn on lights and tell me the time. I know that she can do some other things too, like opening apps. I would imagine getting more out of her offline will require revved-up processors and neural engines. I'm here for it.
 