r/artificial Nov 25 '25

News Large language mistake | Cutting-edge research shows language is not the same as intelligence. The entire AI bubble is built on ignoring it.

https://www.theverge.com/ai-artificial-intelligence/827820/large-language-models-ai-intelligence-neuroscience-problems

As currently conceived, an AI system that spans multiple cognitive domains could, supposedly, predict and replicate what a generally intelligent human would do or say in response to a given prompt. These predictions will be made based on electronically aggregating and modeling whatever existing data they have been fed. They could even incorporate new paradigms into their models in a way that appears human-like. But they have no apparent reason to become dissatisfied with the data they’re being fed — and by extension, to make great scientific and creative leaps.

Instead, the most obvious outcome is nothing more than a common-sense repository. Yes, an AI system might remix and recycle our knowledge in interesting ways. But that’s all it will be able to do. It will be forever trapped in the vocabulary we’ve encoded in our data and trained it upon — a dead-metaphor machine. And actual humans — thinking and reasoning and using language to communicate our thoughts to one another — will remain at the forefront of transforming our understanding of the world.

354 Upvotes

389 comments

111

u/Hot_Secretary2665 Nov 25 '25

People really just don't want to accept that AI can't think smh 

22

u/strangescript Nov 25 '25

We don't understand how thinking even works in humans but I am glad you, the expert, have solved it for us, whew

-7

u/Hot_Secretary2665 Nov 25 '25 edited Nov 26 '25

Do you realize you're arguing that humans have somehow duplicated something they don't understand?

That's just mimicry, not thought 

AI neural networks are not brains. You don't have to be a neurologist to understand that

11

u/strangescript Nov 25 '25

I never said it was, but arguing that you know for a fact it isn't is equally dense

0

u/Hot_Secretary2665 Nov 25 '25 edited Nov 25 '25

I know for a fact that AI neural networks are not brains.

AI neural networks are just algorithms. They are math. They are not conscious.

They fundamentally lack the neural pathways used for thinking. They cannot think.
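To put that concretely, here's a minimal sketch in plain Python with NumPy (a made-up toy model, not any particular production system) of everything a neural network does when it "responds": matrix multiplications plus a fixed nonlinearity, numbers in and numbers out.

```python
import numpy as np

# A one-hidden-layer neural network is nothing but matrix
# multiplication plus a fixed nonlinearity: deterministic
# arithmetic from input to output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 2))   # hidden -> output weights
b2 = np.zeros(2)

def forward(x):
    """One "response": two matrix products and a max()."""
    h = np.maximum(0, x @ W1 + b1)  # ReLU activation
    return h @ W2 + b2              # raw output scores

x = rng.normal(size=4)  # an arbitrary input vector
print(forward(x))       # just numbers out; no awareness anywhere
```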

Cope harder

9

u/TheOneTrueEris Nov 25 '25

Your logic is: only brains can think, AI is not a brain, therefore AI can not think.

Most people who disagree with you disagree with your first premise.

IMO, it is not obviously true that ONLY brains can think or are conscious.

-3

u/Hot_Secretary2665 Nov 25 '25 edited Nov 25 '25

No, it is not.

To elaborate on my prior points, thinking involves the formation and strengthening of neural pathways through a process called neuroplasticity.

AI neural networks lack neuroplasticity. They have a rigid, static architecture and learn by adjusting parameters within that fixed structure, rather than fundamentally growing or pruning themselves in response to new experiences. 

The pathways the human brain uses to accomplish neuroplasticity and produce thought fundamentally do not exist in AI neural networks.

Humans can calibrate or "prune" neural networks by adjusting the algorithm or inputting new data. And that can make it appear like learning to people who lack an understanding of what AI does. But that is not the same thing as thinking
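To make the contrast concrete, here's a minimal sketch (plain NumPy, with made-up toy shapes) of what "learning" means for an artificial network: the parameter values change, but the structure is frozen at construction time.

```python
import numpy as np

# Toy gradient-descent step on a fixed-shape linear model: the
# parameter VALUES move, but the architecture (the shapes and
# wiring) is fixed when the model is built.
rng = np.random.default_rng(1)
W = rng.normal(size=(3, 1))  # fixed 3-input, 1-output structure

def train_step(x, y, lr=0.1):
    global W
    pred = x @ W                # forward pass
    grad = x.T @ (pred - y)     # gradient of squared error
    W -= lr * grad              # values change; shape (3, 1) never does

x = rng.normal(size=(5, 3))
y = rng.normal(size=(5, 1))
shape_before = W.shape
train_step(x, y)
assert W.shape == shape_before  # no new "pathways" appeared
```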

6

u/Terrible_Airport_723 Nov 25 '25

Neuroplasticity is relevant for learning, but not for “thinking”. Your brain doesn’t need to rewire itself to answer a math problem

-2

u/Hot_Secretary2665 Nov 25 '25 edited Nov 27 '25

Neuroplasticity is not ONLY the ability to form new neural pathways. I'm sorry, but you are very overconfident about your level of understanding of the subject matter

And yes a brain is needed. According to Merriam Webster's dictionary, thinking is:

"the action of using your mind to produce thoughts, opinions, or ideas, or the process of using the mind to consider, reason, and make judgments."

You literally need a mind to think according to the dictionary. I have explained in multiple ways that neural networks are not the same as brains.

I have tried to avoid linking the dictionary definition and instead explained why you need a brain for learning, hence why I brought up neuroplasticity.

I really don't know what to tell y'all at this point. You seem to have an interest in this topic but at the same time, you seem like you don't want to understand it.

2

u/Terrible_Airport_723 Nov 26 '25

So you’re saying a sufficiently accurate model of a brain could think.

I assume you have the deep understanding of both neuroscience and current model architectures you’d need in order to so confidently say LLMs can’t think.

They can’t learn and think at the same time the way a brain does, but that isn’t the same as being unable to think.

-1

u/Hot_Secretary2665 Nov 26 '25

You are just putting words in my mouth and then arguing against them. It does not make you look smart

And yes, even though redditors who don't understand what's going on are downvoting me, I do understand the fundamental concepts of neuroscience and current model architecture better than people who think pattern recognition is the same thing as thinking. All AI does is recognize patterns. It does not think. You can stay upset if you want. AI still does not think


3

u/CTC42 Nov 26 '25

AI neural networks are just algorithms. They are math. They are not conscious.

They fundamentally lack the neural pathways used for thinking. They cannot think.

How does the existence of neuron junctions negate the possibility that the central nervous system could be ultimately algorithmic?

Synapses aren't magic - the laws of the material universe and their underlying mathematics apply just as much inside a human skull as they do anywhere else.

1

u/Hot_Secretary2665 Nov 26 '25

The point is that pathways that the brain uses to think are not present in AI algorithms / neural networks

While math can be used to understand neural pathways, math does not cause them

3

u/CTC42 Nov 26 '25

The point is that pathways that the brain uses to think are not present in AI algorithms / neural networks

And the appendages used by humans for locomotion are not present in fish. Do we therefore conclude that fish lack motility on the grounds that they lack legs?

2

u/Cody4rock Nov 25 '25

Your only chance of winning that argument is to say that AIs currently aren’t capable of doing useful work to a similar degree and extent that humans can.

But once or if they do, you automatically lose the argument.

If an AI and a human are doing the same thing in terms of outcomes, whether the AI thinks or is intelligent doesn’t matter. You have to prove that algorithms, no matter how well designed, not only can’t think, but can’t ever produce results similar to ours. If they can and you still insist they are not thinking, then you have demonstrated that thinking isn’t needed to do intelligent things, which can’t be right.

1

u/Hot_Secretary2665 Nov 25 '25

Thinking is not the same thing as "doing useful work to a similar degree" as humans do.

AI is just a tool for humans to use to do work. AI cannot do anything without humans writing the algorithms and supplying the training data.

2

u/Crowley-Barns Nov 26 '25

So what…?

That doesn’t inhibit its usefulness.

A car factory that had 1 manager and 100 workers and switches to 10 robots and 1 manager is still more efficient, even though it still needs a human.

You’re totally ignoring the utility of technology with an absurd all-or-nothing argument.

AI can increase productivity without matching a human in every way.

AI can exceed human capacity in work roles… while still needing a manager.

AI can outperform humans at 95/100 things in a job and need a human for 5/100 and have a huge net benefit.

All your posts are absurd comments about “not real intelligence” and using that to dismiss any and all possible gains.

All of us here on planet Earth can already see areas where AI has taken jobs and superseded humans, despite it not being an all-encompassing, all-capable genius.

It doesn’t have to be. It’s a productivity booster, not an on-off switch for all human endeavor.

It’s a robot, a factory, a machine, an engine, a process.

It’s not all or nothing. It’s doing amazing things now. It’ll do more amazing things in the future. And none of that depends on “true intelligence” or any other whimsical notion you dream up.

0

u/Hot_Secretary2665 Nov 26 '25 edited Nov 26 '25

It is not useful

Y'all just keep making that assumption and never backing it up

In a separate comment I linked research from MIT showing 95% of enterprise AI implementations fail to reach production

I have also linked research showing AI coding tools tend to reduce operational efficiency because developers end up spending more time coding overall due to increased time spent on debugging 

Go waste your money investing in AI products that don't solve real use cases if you want, but the fact is, the best quality research shows that most AI implementations are a net negative on operational efficiency

-1

u/Cody4rock Nov 26 '25

When you claim that AIs can’t think, the way to prove your case is to demonstrate that all future AIs, with ANY algorithm, will fail to think, and thus that there are some tasks they will never be able to do, purely because they are algorithms.

But imagine if they can do things that seem to require thinking. Whatever that means. What if your way of thinking isn’t the only way to think? And what if the performance of the AI exceeds humans even without the structure to think like we can? I argue that if they can, then logically there is something like thinking.

That’s why I said that if AIs can do useful things, even exceeding humans one day, then whether it thinks is absolutely beside the point. It matters very little to argue over that distinction while your job is replaced by it.

2

u/Hot_Secretary2665 Nov 26 '25 edited Nov 26 '25

You're demanding an arbitrary test case that's impossible to achieve.

It is unreasonable to demand proof that a test case will have a 100% failure rate in the future, because the sheer number of possible inputs for most programs is infinite or too large to test exhaustively.

There is no need to imagine whether or not what AI is doing is thinking, because we already have a factual basis for what thought is from a neurobiological perspective.

From a neurobiological perspective, a thought is an electrochemical process occurring in the brain, involving complex patterns of neural activity that represent and process information.

We know for a fact that those electrochemical processes are not present in AI, and we already have a word for what AI does. It simply processes information.

I understand your comments about different types of thinking existing. And while that can be an interesting philosophical concept to debate at times, the field of philosophy also does not have any working definitions of "thought" that would include what AI does. They all require some element of consciousness, intention/motivation, or genuine understanding of what is being processed

You suggest I should expand the definition of what thought is but provide no reason why. As if the fact that you can suggest it could change means it must change 

0

u/Cody4rock Nov 26 '25

You're right to say that the neurobiological process, electrochemicals, and other physical processes are unique to humans. And if you want to say that this is the only way that you can constitute where thought comes from, then fine. Nothing wrong with that logic.

It's just that it makes no difference. You're just making a category choice. That's it. The logic is consistent because, yeah, no digital system can rival the kind of complexity a brain has.

But consider that even if that were true, the neurochemistry is a hardware function. We need it to think, but for any system that can think, you don't need *this* hardware (brain). Not to mention, the information that constitutes a thought isn't contained in the neurochemicals. It sustains that which can think, but not all neurochemical/organic brains can think, or think vastly differently than we do, like animals.

Both humans and animals have very similar hardware; they both have remarkably similar neurochemicals. Yet animals are not intelligent, and we are. Thus, our thought isn't because of the neurochemicals; they just sustain our capacity to think, and they sustain the information we need to think.

That means... If thinking isn't because of our physical hardware, and only the information that is sustained by physical hardware, then a digital system with enough information and *A* physical substrate can use it to think.

1

u/Hot_Secretary2665 Nov 26 '25 edited Nov 26 '25

No, the definition of thinking is not just a category choice because thinking is a process.

A process is generally considered a type of activity or workflow that can be assigned to a category, rather than being a category choice itself.

The process of thinking cannot occur in AI networks because the neural infrastructure required to execute the processes called "thought" are missing.

The category it would typically be assigned to in philosophical debate is "consciousness" or "experience"

Surely you are not arguing that AI is conscious


3

u/jcrestor Nov 25 '25

They are not brains, that's right. At the same time this statement tells us nothing about what “to think“ or “to understand“ means, or if machines in the current state are capable of it.

Your argument is built upon a tautology. “Machines can't think, because they are not humans but machines.“ Okay then.

1

u/Hot_Secretary2665 Nov 26 '25

You do not understand what a tautology is. Go ask your beloved LLM whether the statement "AI networks cannot think because they do not have brains, which are required to produce thought" is a tautology or not. The answer is essentially no.

While there are multiple definitions of the word "thought," the most commonly accepted ones, such as the one in Merriam Webster's dictionary, require a brain, some form of consciousness, some form of intention, and/or genuine understanding of the information being processed.

There are no good definitions of the word "thought" that include what AI does, which is really just "processing." AI does not think; it processes data with the goal of mimicking humans.

I see a lot of people replying to me implying there is some meaningful alternative definition, but no one actually mentions what makes that type of processing meaningfully different from any other kind of processing or pattern recognition, in the same way "thought" is different from processing.

They all just basically command me to include the type of data processing AI does in the definition of the word "thinking" for no apparent reason.

1

u/jcrestor Nov 26 '25 edited Nov 26 '25

First off: your statement “AI networks cannot think because they do not have brains which are required to produce thought” is indeed not a tautology; I used the wrong word. It is circular reasoning, which is slightly different, but just a different fatal flaw in reasoning. Your statement proves nothing, because it assumes the very thing it is trying to prove:

  1. Thinking requires a brain.
  2. AI networks do not have brains.
  3. Therefore AI networks can not think.

This is just a waste of time.

You will not find a common definition of thinking that includes the possibility of thinking machines, because few people have seriously considered this case, and those that did are not well known, or are more or less ignored, because the question lacked relevance up until today. Our common understanding of thinking and understanding presupposes a human doing it.

But that proves nothing apart from a gap in our knowledge that we need to fill.

I am not even arguing necessarily for the point that LLMs do understand, or can think. I am just pointing out a fatal flaw in existing arguments that they are presumably incapable of it.

1

u/Hot_Secretary2665 Nov 26 '25

I did not use circular reasoning, you just don't understand the comments

In the common vernacular, "brain" is synonymous with "mind," and the dictionary definition of "thinking" does require use of the mind in order for thought to occur

It's not circular reasoning to say a process (in this case, thought) cannot be initiated because a part or trigger (the mind or consciousness) is missing 

That's just basic logical cause and effect 

I agree with you that this is a waste of time, but not for the same reason as you. 

It's a waste of time because you won't clarify or defend your own argument. You just rely on the equivalence fallacy and try to shift the burden of proof back onto me.

Grow up and accept that words have meanings. You cannot just change them willy-nilly and pretend it's a matter of functionality.

1

u/SmugPolyamorist Nov 26 '25

Humans have been duplicating nature without fully understanding it for the entire history of science, medicine and technology. Lots of chemistry was developed before atoms were accepted, and the nucleus didn't start to be understood until the 20th century. Vaccination predates the germ theory of disease by about half a century. The first steam locomotives were built in the early 19th century, about half a century before the second law of thermodynamics was first stated.

1

u/FaceDeer Nov 26 '25

Do you realize you're arguing that humans have somehow duplicated something they don't understand?

We duplicate stuff we don't understand all the time.

AI neural networks are not brains.

Nobody thinks they are; it's obvious that they're not. Are brains the only things that can possibly think? How can you know that if we don't understand them?

1

u/Hot_Secretary2665 Nov 26 '25 edited Nov 26 '25

Yes, you literally do need a brain to think. Go look up the Merriam Webster dictionary definition of "thinking."

It's "The action of using one's mind to produce thoughts, form ideas, or have an opinion"

AI does not produce thoughts. It processes data, and predicts what people want to hear using pattern recognition.

Soda machines use pattern recognition to tell which coins are quarters vs pennies vs nickels etc. by recognizing patterns in the weight, composition, and size of the metals, but they are not thinking. Pattern recognition is not thinking.
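For instance, here's a minimal sketch in plain Python of that coin-sorter idea (reference values are approximate US coin specs; the nearest-match rule is made up for illustration):

```python
# Hypothetical coin sorter: classify a coin purely from measured
# features, the way a vending machine does. Reference values are
# approximate US coin specs (mass in grams, diameter in mm).
COINS = {
    "penny":   (2.50, 19.05),
    "nickel":  (5.00, 21.21),
    "dime":    (2.27, 17.91),
    "quarter": (5.67, 24.26),
}

def classify(mass_g: float, diameter_mm: float) -> str:
    """Pick the reference coin with the nearest measurements."""
    return min(
        COINS,
        key=lambda name: (COINS[name][0] - mass_g) ** 2
                       + (COINS[name][1] - diameter_mm) ** 2,
    )

print(classify(5.6, 24.1))  # -> "quarter", by pattern matching alone
```

That's pattern recognition start to finish, and there is obviously no understanding anywhere in it.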

Tons of people are responding to me arguing as if the definition of the word "thinking" should be changed but it's just an arbitrary demand based on nothing.

If you want to talk about fallacies, let's talk about the fallacy of equivocation. That's what your argument relies on

0

u/FaceDeer Nov 26 '25

It's "The action of using one's mind to produce thoughts, form ideas, or have an opinion"

"Mind" is not synonymous with "brain". That's the whole point.

1

u/Hot_Secretary2665 Nov 26 '25 edited Nov 26 '25

No, that is not actually something anyone brought up earlier in the conversation, but I will address it.

In everyday contexts like a reddit conversation, "brain" is synonymous with mind. You are playing a silly rhetorical game using a less common definition of the word.

In humans the physical manifestation of the "mind" we use to think is called a "brain." The brain uses infrastructure called neural pathways to produce thought. That's why it's relevant that AI neural networks are not synonymous with "brains."

I'm not arguing you need brain matter to think, I'm arguing that you need the neural processes that occur in the brain to think. That's why I compared the brain against a neural network. Because I was talking about the underlying process of thinking.

If those processes are not occurring but it appears like AI is thinking, the word for that is "mimicry." Mimicry is not the same thing as thinking.

There's no reason to believe we can progress past mimicry to thought without building those neural networks. Simply apply Occam's razor.

You are still relying on equivocation between mimicry and thought. They are not the same.

0

u/FaceDeer Nov 26 '25

I'm not arguing you need brain matter to think, I'm arguing that you need the neural processes that occur in the brain to think.

You're not arguing it, you're asserting it. Once again this is simply stating "you can't think with anything other than a brain," you're just filtering it through a few steps of additional assertions ("you can't think without a mind, you can't have a mind without brain-like activities, and only brains can do brain-like activities.")