
MacRumors

macrumors bot
Original poster


Anthropic has finally added web search capabilities to Claude 3.7 Sonnet, allowing the AI chatbot to access up-to-date information beyond its knowledge cutoff date of October 2024.

[Image: claude-ai.jpeg]

Announced on Thursday, the new feature enables Claude to search the internet for current events and information, so its accuracy should be significantly better when answering questions about recent developments.

When Claude uses web search to inform its responses, the interface provides clickable citations that allow you to verify sources and fact-check information, meaning you won't have to conduct separate searches.

The feature is currently available only to paying subscribers in the United States. Anthropic says it plans to roll out web search to free users and additional countries "soon," but the company gave no specific timeline.

Web search functionality has become a standard feature among leading AI chatbots. OpenAI began introducing ChatGPT Search to paying subscribers last fall and eventually made the feature available to all users – including those without a ChatGPT account – early last month.

Anthropic is pitching the web search function as particularly valuable for professionals across various fields. The company suggests sales teams can use it to analyze industry trends, financial analysts can assess current market data, and researchers can build stronger literature reviews by searching across primary sources.

[Image: claude-web-search.jpg]

For everyday users, the feature also promises to simplify comparison shopping by evaluating product features, prices, and reviews from multiple sources simultaneously.

Paying Claude users in the US can access web search by enabling the feature through their profile settings menu. The functionality is currently limited to Claude 3.7 Sonnet, which is Anthropic's first "hybrid reasoning model" capable of both quick responses and step-by-step problem solving.

Article Link: Anthropic's Claude Finally Gets Web Search, Months After ChatGPT
 
Gemini Deep Research is reasonably decent, and you get a limited number of free searches.

Also, for a local LLM, Google's recent Gemma 3 release is pretty good, with a number of sizes available too. You can run the 27B model on a base M2 Studio reasonably well.
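If anyone wants to try that, here's a rough sketch of what it looks like with the llama-cpp-python bindings; the GGUF file name and settings below are placeholders for whatever quantized build you actually download:

```python
# Rough sketch: run a quantized Gemma 3 27B GGUF locally via llama-cpp-python.
# Assumes `pip install llama-cpp-python` and a GGUF file already on disk
# (the path below is a placeholder, not an official file name).
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-3-27b-it-Q4_K_M.gguf",  # placeholder path to your local GGUF
    n_ctx=8192,        # context window; lower it if you run short on RAM
    n_gpu_layers=-1,   # -1 offloads all layers to the GPU (Metal on Apple silicon)
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me a one-paragraph summary of RISC-V."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```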

I know Google gets some deserved hate, but their AI offerings are rapidly improving and, shockingly, either free or better priced than a lot of the competition.
 
“Finally”, “months after”

Oh, praise be!
Spent endless hours waiting. Prayed every day. Hoped. Rolled to sleep.

Now the world is saved. Now we can rescue everything.
/s

And then there are people who actually wonder why Apple releases unfinished products. Media outlets like MacRumors are begging for it with their impatience.
 
“Finally”, “months after”

Oh, praise be!
Spent endless hours waiting. Prayed every day. Hoped. Rolled to sleep.

Now the world is saved. Now we can rescue everything.
/s

And then there are people who actually wonder why Apple releases unfinished products. Media outlets like MacRumors are begging for it with their impatience.

We live in the age of impatience, where nothing is ever good or new enough.

Our insatiable demands have become our new North Star, the way in which we relate to and orient ourselves in the world around us.

The price we pay for our endless appetite is the destruction of the planet and ourselves.

By all accounts, we are well on our way.
 
Some people I know really like Claude. I've tried it a few times but always end up back on ChatGPT; o3-mini-high has been excellent at coding for me.
 
Why would you spend the time to input this long rambling prompt 👇

[Attached image: Screenshot 2025-03-21 at 09.21.05.png]


So it can input this 👇 simple concise search that you could/should just input yourself

[Attached image: Screenshot 2025-03-21 at 09.21.08.png]



Which leads you to web-links to click on and read and, you know ... "go learn something"

This is all so ridiculous
 
Why would you spend the time to input this long rambling prompt 👇

View attachment 2494465

So it can input this 👇 simple concise search that you could/should just input yourself

View attachment 2494466


Which leads you to web-links to click on and read and, you know ... "go learn something"

This is all so ridiculous
It could be the guardrails they've been trying to implement that delayed web search? They're trying to be the safest GPT and all that, but yeah, their progress on new features has been slow... still rooting for them though 😆
 
Why would you spend the time to input this long rambling prompt 👇

View attachment 2494465

So it can input this 👇 simple concise search that you could/should just input yourself

View attachment 2494466


Which leads you to web-links to click on and read and, you know ... "go learn something"

This is all so ridiculous
Have you used it? It doesn't stop there.

I love Claude; it beats ChatGPT, for what I need it for, by miles.
Yes, I'm glad it's flying under the radar a little bit because I hate ChatGPT, but this search was sorely missing. Now that you can sort of pre-prompt with known-good links, it just got an order of magnitude more useful to me. I just ran a test case and it pulled a lot of the same information I did manually for some chip architecture research I performed recently. It was kind of wild seeing it find the same esoteric GitHub project that I came across last month and that helped me understand some things.

...

LLMs aren't perfect and never will be, but used carefully and with a full understanding of how they work (which takes a lot of time and an informed POV), they can be quite an accelerator.

That said, I strongly think no one should use them without fully absorbing Wolfram's excellent write-up. It is a bit out of date now and doesn't cover more modern techniques, but the bones are still correct: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
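To make the core idea in that write-up concrete: generation is just repeated next-token prediction over a probability distribution. Here's a tiny illustration using GPT-2 through Hugging Face transformers, chosen only because it's small to download; the mechanism is the same for the big models:

```python
# Illustration of next-token prediction, the mechanism Wolfram's essay walks through.
# GPT-2 is used only because it is tiny; any causal language model works the same way.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The best way to check an AI chatbot's answer is to"
inputs = tok(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits[0, -1]   # scores for the single next token
probs = torch.softmax(logits, dim=-1)
top = torch.topk(probs, k=5)

# Print the five most likely continuations and their probabilities.
for p, idx in zip(top.values, top.indices):
    print(f"{tok.decode([int(idx)])!r}: {p.item():.3f}")
```

A chat session is just this loop run over and over, with the sampled token appended to the prompt and the distribution recomputed.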

I also really like Kagi (sans the AI features) and am considering paying up long-term for it, but I wish they had some cheaper 'lite' tier. It's very useful for surfacing actionable information, which most search engines just totally fail at now.

Apple has a ton of catching up to do if they are ever even going to enter this space. If I were them, I would probably buy Anthropic outright and keep it running as a stand-alone service forever while integrating some of the useful parts into the OS, assuming they have internal research making progress toward eventual world models, which, who knows.

That's where there's real possibility of society-changing technology and I think Meta and possibly Google are the only ones resourced and focused well enough to accomplish it. OpenAI and Sam et. al. are just lying outright when they say that scale will solve their problems, and they know it but that investor money is hard to resist.

The next 10 years are going to be nuts, assuming the research actually works out. I trust in what Yann LeCun is doing even though I loathe Meta.
 
Good to see it finally, but it might take some more time to become available on the free plan. Claude is really good and works well for me.
 
3.5 was good, but 3.7 with the extended reasoning has been amazing. I tried the web search a couple of times and it's been really additive to the quality of the responses I've gotten. I already appreciate the ability to make artifacts and visualizations and organize stuff in Projects, so being able to do that with web search data is a big deal for the workflow I use Claude for.
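For anyone who wants that same extended reasoning outside the app, this is roughly what it looks like through the Anthropic Python SDK; the model ID, token budgets, and prompt here are illustrative, so check the current docs before relying on them:

```python
# Rough sketch: Claude 3.7 Sonnet with extended thinking via the Anthropic Python SDK.
# Assumes `pip install anthropic` and ANTHROPIC_API_KEY set in the environment.
# Budget values are illustrative only; max_tokens must exceed the thinking budget.
import anthropic

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-7-sonnet-20250219",
    max_tokens=4096,
    thinking={"type": "enabled", "budget_tokens": 2048},
    messages=[{"role": "user", "content": "Walk me through amortized analysis of dynamic arrays."}],
)

# The reply interleaves "thinking" blocks with the final "text" blocks; print the text.
for block in response.content:
    if block.type == "text":
        print(block.text)
```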
 
Or…do your own research?
How hard is it to go to google.com and type your search in the little box?

AI is a bad non-solution to a nonexistent problem.

Unless you count the recent total lack of actual innovation in consumer electronics and the need to wow the consumer to move sales, of course…
 
Integrating web search into a remote AI is essential. If you don’t need web search, run a local LLM.

My use case recently has been using Grok to figure out how best to use a local LLM. I ventured into the world of setting up the llama.cpp CLI, and I used Grok every step of the way. Far better to have a single point of contact for everything than to spend the day Googling everything.

Even when Grok got it wrong (providing the wrong Git repo for llama.cpp) and initially insisted it was correct, it ultimately went off, discovered why its repo choice was wrong, and then provided the correct one.

It would have taken me a lot longer to do this manually. I don’t want to spend all day setting something like this up - I want to get on and use it.
 
Or…do your own research?
How hard is it to go to google.com and type your search in the little box?

AI is a bad non-solution to a nonexistent problem.

Unless you count the recent total lack of actual innovation in consumer electronics and the need to wow the consumer to move sales, of course…
Actually, AI has already ruined search for everyone, because even if you don't use AI for search, the web is now filled with AI-generated content that is plainly made up and wrong.
 
I actually have high hopes that the AI ******** train will collapse in on itself and that a new (old?) better human web will be reborn.
 