Connectionism and Real Neurons

Questions and Answers

Which of the following best describes the relationship between connectionist computation, parallel distributed processing, and artificial neural network (ANN) models?

  • They are distinct approaches with different underlying principles.
  • They are different terms that refer to the same concept. (correct)
  • Connectionist computation is a subset of parallel distributed processing, which is a subset of ANN models.
  • ANN models are used for connectionist computation, while parallel distributed processing is a theoretical framework.

According to Rumelhart (1989), which aspect of the brain is emphasized in connectionist models?

  • The gross anatomical structures such as lobes and hemispheres.
  • The metabolic rate and energy consumption of different brain regions.
  • The specific neurotransmitters used for communication.
  • The computational architecture, including algorithms and processes. (correct)

In the context of cognitive science and simulating brain functions, what does Marr's algorithmic level primarily concern?

  • The transfer of information between inputs and outputs. (correct)
  • The physical implementation of neural networks.
  • The hardware components required to run simulations.
  • The subjective experience of cognitive processes.

How does a real neuron transmit signals to another neuron?

Answer: Through the release of neurotransmitters at synapses, triggered by electrical-chemical signals.

What determines whether a signal passed from one neuron to another is excitatory or inhibitory?

Answer: The type of neurotransmitter released and its effect on the receiving neuron.

In the context of real neurons, what determines whether a neuron will 'fire'?

Answer: If the total input received by the neuron exceeds its threshold.

In artificial neurons within connectionist models, what is represented by the 'weight' attached to an input?

Answer: The strength and nature (excitatory or inhibitory) of the input signal.

What is the role of the threshold in an artificial neuron?

Answer: To prevent the neuron from firing if the level of activation is too low.

Which of the following describes a 'linear' output function in the context of artificial neurons?

Answer: The output signal increases in direct proportion to the total input.

What is a key difference between single-layer and multi-layer networks of artificial neurons?

Answer: Multi-layer networks have hidden layers that allow them to learn more complex representations.

According to the Computational Theory of Mind (CTM), what is the relationship between mental processes and computation?

Answer: Mental processes are a type of computation.

According to the formal conception of computation (FCC), what aspect of internal symbols is relevant to computations?

Answer: The syntax or formal structure of the symbols.

How is information stored in a conventional programmable Physical Symbol System (PSS) compared to a connectionist model?

Answer: In a PSS, information is stored explicitly in symbols, while in a connectionist model, it is stored implicitly in the connections.

What is a key characteristic of how information is processed in a connectionist approach compared to a PSS?

Answer: Connectionist models process information in parallel through interactions among many units, while PSS models use serial steps.

What is meant by 'graceful degradation' in the context of connectionist models?

Answer: The ability of the model to continue functioning even when some processing elements are damaged.

How does an artificial neural network (ANN) 'learn'?

Answer: All of the above: by developing new connections, losing existing connections, and modifying existing connection strengths.

In the delta rule/perceptron convergence rule, what is the amount of learning proportional to?

Answer: The difference between the actual activation and the target activation.

Why is backpropagation necessary in multilayer networks?

Answer: To calculate the difference between the target and current activation levels in the hidden layers, which are not directly accessible.

How does reprogramming a PSS differ from adjusting an artificial neural network (ANN)?

Answer: A PSS requires reprogramming of its algorithmic rules, while an ANN adjusts itself through error signals.

In an artificial neural network, the output signal is equal to the total input signal, and the threshold is 1. If the input signals are 1, 1, and 0.5 with corresponding weights of 1, -1, and 1, will the node fire?

Answer: No.

Which statement accurately describes the truth table result of a logical 'AND' operation?

Answer: The output is true only if both inputs are true.

What is the primary reason why a single-layer network might fail to solve the XOR problem?

Answer: Single-layer networks can only implement linear functions.

In the context of logic gates, what output would you expect from an inclusive 'OR' gate when both inputs are true?

Answer: True.

Which of the following is an example of a constraint satisfaction problem that ANNs can solve?

Answer: Perceiving a Necker cube with a single, stable interpretation.

What is an example of 'mutually interacting constraints' in the context of ANNs and constraint satisfaction problems?

Answer: When feature A is present, feature B is expected to be absent.

In Rumelhart's constraint satisfaction networks, what happens after the weights are modified?

Answer: The system changes from satisfying fewer constraints to satisfying more.

According to the material, what is the key requirement for pattern recognition when using ANNs?

Answer: The involvement of hidden layers.

What happens to an input pattern in the memory system of an ANN when it is unfamiliar?

Answer: The input pattern becomes weaker ("dampened").

According to Rumelhart, what is a key advantage of ANNs over conventional PSS-based AI programs regarding adaptation?

Answer: ANNs can easily adapt to their environments, while PSS-based AI programs have difficulty modifying themselves.

What does it mean for ANNs to 'directly represent' similarities among patterns according to the provided text?

Answer: Similarities are encoded in the connection weights, leading to similar effects for similar patterns.

What is the role of hidden layers in enabling similarity-based generalization in ANNs?

Answer: Hidden layers allow ANNs to "learn" appropriate internal representations that map similar input patterns to similar output patterns.

During the learning process in ANNs, what happens if the target values are not actually attained by certain units?

Answer: The weights are modified (typically using backpropagation).

According to Fodor and Pylyshyn (1988), on what level does the comparison between classical cognitive architecture and connectionism primarily occur?

Answer: The implementational level.

According to the source material, what is one major difference between classical cognitive architecture and connectionism?

Answer: Classical cognitive architecture relies on the von Neumann architecture, while connectionism does not.

According to the source material, what is a fact about ANNs?

Answer: All ANNs are implemented on computers with the von Neumann architecture.

Flashcards

Connectionism

A neurally inspired approach also known as connectionist computation, parallel distributed processing, or artificial neural network (ANN) modeling.

Brain's Architecture

The computational architecture (algorithms and processes) of brains. Connectionist models aim to simulate this on a computer.

Excitatory Signals

Signals that promote neuron firing.

Inhibitory Signals

Signals that inhibit neuron firing.

Total Input (Neuron)

The strength of the signal at each dendrite is the value of the incoming signal multiplied by the corresponding weight; the total input is the sum over all inputs.

Threshold (Neuron)

If total input exceeds this, the neuron will fire.

Weight (Artificial Neuron)

The strength and nature (excitatory if positive, inhibitory if negative) of an input signal to an artificial neuron.

Threshold (Artificial Neuron)

The level of activation that an artificial neuron must reach before it transmits an output signal.

Level of Activation

The sum of the activation values of all inputs to a neuron.

Linear Output Function

Output signal increases in direct proportion to total input.

Threshold Linear

No output until threshold is reached, then output increases linearly.

Binary Threshold

Zero output below threshold, max output above threshold (on-off switch).

CTM (Computational Theory of Mind)

Mental processes are computations.

Explicit Information (PSS)

Information stored explicitly in the states of units and in the symbols being manipulated.

Implicit Information

Information implicit in the structure of the device that carries out the task (patterns of connections & weights).

Parallelism

Processing elements carry out computations simultaneously.

How ANNs Learn

Changing connection weights via development/loss of connections or by modifying strengths.

Delta Rule

The amount of learning is proportional to the difference between the actual and the target activation.

Backpropagation

Information is transmitted forwards; error signals (the difference between actual and target activation) are propagated backwards through the network.

Connectionist Models

Unlike PSS, connectionist models do not operate by following programmed rules.

Translated Constraints

Constraints mathematically translated into positive or negative connections between units.

Evolving Satisfaction

After the weights are modified, the system changes from satisfying fewer constraints to satisfying more.

Pattern Recognition

Requires the involvement of hidden layers; learning rules are automatically employed to configure the pattern of connections.

Similarity-Based Generalization

Unlike conventional PSS-based AI programs, which could not modify themselves and had difficulty adapting to their environments, ANNs generalize automatically: similar patterns have similar effects.

Study Notes

Connectionism

  • Connectionist computation is also known as parallel distributed processing.
  • It is also known as artificial neural network (ANN) models.
  • Connectionist models are neurally inspired, according to Rumelhart (1989).
  • Connectionist models simulate the computational architecture of brains on a computer.
  • It is useful to consider whether connectionist models reflect how the brain processes information.
  • There is no evidence of backpropagation in the brain.
  • Simulating the cognitive architecture of brains means carrying out the same algorithms.
  • The simulation is functional and algorithmic, not implementational.
  • Marr's algorithmic level involves how information is transferred between inputs and outputs.

Real Neurons

  • Real neurons pass electrical-chemical signals to other neurons through synapses along their axons.
  • Dendrites receive synaptic input signals from axons.
  • Signals fired by a neuron's synapses are either excitatory or inhibitory.
    • Excitatory signals promote firing
    • Inhibitory signals inhibit firing.
  • The total input received by a neuron is the sum of the signal strengths at its dendrites.
    • Each strength is the value of the incoming signal * the corresponding weight.
  • Excitatory signals from synapses have positive weights, while inhibitory signals have negative weights.
  • If the total input exceeds the threshold, the neuron will fire.

Artificial Neurons

  • Artificial neurons in connectionist models receive input signals from many units/nodes.
  • Each input (I) is assigned a weight (W) between -1 and 1.
    • This simulates the excitatory/inhibitory signals from the synapses.
  • For each input, the activation value (W × I) is the input value multiplied by the weight.
  • The level of activation of each neuron is the sum of the activation values of all the inputs.
  • A neuron transmits an output signal if the level of activation reaches the threshold.
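
As a concrete illustration of the bullets above, here is a minimal sketch of such an artificial neuron in Python. The input values, weights, and threshold are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of an artificial neuron: weighted inputs, summed
# activation, and a threshold test. All numbers are illustrative.

def activation_level(inputs, weights):
    """Sum of the activation values (W * I) of all inputs."""
    return sum(w * i for w, i in zip(weights, inputs))

def fires(inputs, weights, threshold):
    """The neuron transmits an output signal if activation reaches the threshold."""
    return activation_level(inputs, weights) >= threshold

# Two excitatory inputs (positive weights) and one inhibitory input
# (negative weight): total activation is 0.8 + 0.3 - 0.4 = 0.7.
print(fires([1.0, 0.5, 1.0], [0.8, 0.6, -0.4], threshold=1.0))  # False: 0.7 < 1.0
```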

Output Signals

  • Output signals will be transmitted to the next neuron.
  • This depends on the output function
    • Linear output: signal increases in direct proportion to total input
    • Threshold linear: no output signal until threshold is reached, then linear
    • Sigmoid: roughly linear between threshold and max firing rate, similar to real neurons
    • Binary threshold: zero output below threshold, max output above threshold (on-off switch)
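
The four output functions can be sketched directly. Note that "threshold linear" admits more than one reading; the version below (zero until the threshold is reached, then proportional to total input) is one plausible interpretation.

```python
import math

def linear(total_input):
    # Output increases in direct proportion to total input.
    return total_input

def threshold_linear(total_input, threshold=1.0):
    # No output signal until the threshold is reached, then linear.
    return total_input if total_input >= threshold else 0.0

def binary_threshold(total_input, threshold=1.0, max_output=1.0):
    # On-off switch: zero below the threshold, maximum output above it.
    return max_output if total_input >= threshold else 0.0

def sigmoid(total_input):
    # Roughly linear in the middle, flattening toward zero and the
    # maximum firing rate, similar to real neurons.
    return 1.0 / (1.0 + math.exp(-total_input))
```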

Network Layers

  • Networks may be single-layer or multilayer; multilayer networks add hidden layers between input and output units.

Computational Theory of Mind (CTM)

  • CTM proposes that mental processes are computational.
  • The mind should be seen as software executed by the hardware of the brain.
  • Mental states are internal symbols that are contentful and representational.
  • The formal conception of computation (FCC) states that computations are only sensitive to the syntax of internal symbols.

PSS vs Connectionism

  • The PSS (Physical Symbol System) approach and connectionism can be directly compared.
  • Rumelhart (1989) said that all knowledge is in the connections (Textbook, p. 204).
  • In a conventional programmable PSS, information is explicitly stored in the states of units and symbols being manipulated.
  • In a connectionist model, information is implicitly stored in the structure of the device that carries out the task.
  • Information is distributed across the entire network in the patterns of connections and in the weights/strengths of those connections.
  • The connectionist approach holds that information processing relies on interactions among large numbers of processing units.
  • Parallelism means that many processing elements carry out computations simultaneously.
    • Information processing is implemented by huge numbers of processing elements, realized by the patterns of connections between the units.
    • There can be different constraints for different processing elements.
  • In a PSS, algorithmic rules have serial steps.
  • Parallelism implies fast processing and graceful degradation.
    • Serial processing is slow when it involves large numbers of steps, since each step takes time.
    • Parallelism allows steps to be processed simultaneously, saving time.
    • Graceful degradation means damage to some processing elements will not interfere with the others, since they work in parallel.
  • An ANN learns by changing weights
    • This happens through the development of new connections
    • Loss of existing connections, and/or modification of the connection strengths that already exist
  • Learning is successful when the configuration of the network allows it to produce the right outputs.
  • A single-layer network follows the delta rule (also called the perceptron learning rule or the perceptron convergence rule); see the code sketch after this list.
    • The amount of learning is proportional to the difference between the actual activation and the target activation.
      • Δwᵢⱼ = ε(tᵢ − aᵢ)oⱼ (Textbook, p. 209), where Δwᵢⱼ is the amount of learning (the change in the weight from input unit j to output unit i), ε is the learning rate, tᵢ is the target activation, aᵢ is the actual activation, and oⱼ is the signal from input unit j.
  • Multilayer networks do not know the target activation levels of the units in the hidden layers, so they cannot calculate the difference in activation levels.
    • Backpropagation is a generalized version of the delta rule.
    • Information is transmitted forwards, while error signals are propagated backwards throughout the network.
  • Weights are modified starting at the "second to last layer", and the error is then sent back through each earlier layer.
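
The code sketch referenced above: a single unit trained with the delta rule. The linear output unit, the learning rate, and the AND training set are illustrative assumptions; only the update formula itself comes from the text.

```python
# Delta rule sketch: delta_w = epsilon * (target - actual) * input.
epsilon = 0.1  # learning rate (illustrative value)

def train_step(weights, inputs, target):
    actual = sum(w * o for w, o in zip(weights, inputs))
    error = target - actual  # target activation minus actual activation
    return [w + epsilon * error * o for w, o in zip(weights, inputs)]

# Learn the AND function; the third input is a constant bias of 1.
patterns = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
weights = [0.0, 0.0, 0.0]
for _ in range(500):
    for inputs, target in patterns:
        weights = train_step(weights, inputs, target)

for inputs, target in patterns:
    actual = sum(w * o for w, o in zip(weights, inputs))
    print(inputs[:2], target, round(actual, 2))  # thresholding at 0.5 reproduces AND
```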

Programming

  • In a PSS, the information carried by the symbols is distinct from the algorithmic rules about how to manipulate those symbols.
  • Connectionist models, by contrast, do not follow programmed rules.
  • To change the activity of a physical symbol system, the algorithmic rules need to be reprogrammed.
  • To change the activity of an artificial neural network, you point out where it is going wrong and it automatically adjusts itself.
  • In an ANN, for each node (unit):
    • The output signal equals total input signal.
      • The threshold is 1.
      • The node will fire when the total input signal is greater than or equal to the threshold.
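
This is the same setup as the quiz question above (inputs 1, 1, and 0.5 with weights 1, -1, and 1); a short check confirms the node does not fire.

```python
# Worked example: output equals total input, threshold is 1.
inputs = [1, 1, 0.5]
weights = [1, -1, 1]
total = sum(i * w for i, w in zip(inputs, weights))  # 1 - 1 + 0.5 = 0.5
print(total >= 1)  # False: 0.5 is below the threshold, so the node does not fire
```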

AND Gate

  • A logic gate that returns true only if all inputs are true.

OR Gate

  • A logic gate that returns false if all inputs are false.

XOR Gate

  • The XOR ("exclusive or") gate returns true if and only if exactly one input is true.
  • The XOR problem “cannot be solved by networks that lack hidden units” (Textbook, p. 213).
  • The problem is solvable by adding a hidden unit.
    • This hidden unit takes the value 1 if and only if both inputs are 1.
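
The three gates can be built from binary threshold units. The weights and thresholds below are one illustrative choice, not the only workable one; the XOR unit follows the recipe above, using a hidden unit that fires if and only if both inputs are 1.

```python
# AND, OR, and XOR from binary threshold units.

def unit(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def AND(x1, x2):
    return unit([x1, x2], [1, 1], threshold=2)  # fires only if both inputs are 1

def OR(x1, x2):
    return unit([x1, x2], [1, 1], threshold=1)  # fires if at least one input is 1

def XOR(x1, x2):
    # Hidden unit: value 1 iff both inputs are 1. Its strongly negative
    # weight suppresses the output unit when both inputs are present.
    h = AND(x1, x2)
    return unit([x1, x2, h], [1, 1, -2], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "XOR:", XOR(a, b))
```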

Rumelhart's Examples

  • Rumelhart gives three examples of what ANNs do well: constraint-satisfaction problems, pattern recognition, and automatic similarity-based generalization.
  • ANNs can solve constraint-satisfaction problems very well.

Constraints and Interactions

  • Mutually interacting constraints arise when the presence of feature A means feature B is expected to be present (or absent).
  • In the Necker cube example, one constraint is that each vertex has a single interpretation.
  • The other constraint is that the interpretation of any vertex is constrained by the interpretations of its neighbors.
  • Constraints can be mathematically translated into positive or negative connections between the units.

Each Unit

  • Each unit has three positively connected neighbors.
  • Each unit has two negatively connected competitors.
  • Each unit has one positive input from the stimulus (p. 210).
  • The degree of constraint satisfaction ("goodness") increases as the weights modify the system: it changes from satisfying fewer constraints to satisfying more.

Stable State

  • Eventually, one subnetwork will have all of its units activated, while the competing units will not be activated.
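
A minimal sketch of such a network follows. The four-unit network, its weights, and the stimulus values are illustrative assumptions (the textbook's Necker cube network has more units); the point is that each local update can only raise the degree of constraint satisfaction ("goodness"), until one coalition of units wins.

```python
# Tiny constraint-satisfaction sketch: units 0-1 support interpretation A,
# units 2-3 support rival interpretation B. Positive weights encode mutual
# support, negative weights encode competition. All values are illustrative.
weights = {
    (0, 1): 1.0,   # A-units support each other
    (2, 3): 1.0,   # B-units support each other
    (0, 2): -1.5,  # corresponding A/B units compete
    (1, 3): -1.5,
}
stimulus = [0.4, 0.4, 0.3, 0.3]  # slight bias toward interpretation A
state = [0, 0, 0, 0]

def net_input(i):
    total = stimulus[i]
    for (a, b), w in weights.items():
        if i == a:
            total += w * state[b]
        elif i == b:
            total += w * state[a]
    return total

def goodness():
    pairs = sum(w * state[a] * state[b] for (a, b), w in weights.items())
    return pairs + sum(s * a for s, a in zip(stimulus, state))

for sweep in range(3):
    for i in range(4):  # update one unit at a time; goodness never decreases
        state[i] = 1 if net_input(i) > 0 else 0
    print(sweep, state, round(goodness(), 2))
# Stable state: [1, 1, 0, 0] -- all A-units active, all B-units off.
```
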
Pattern Recognition

  • Pattern recognition requires hidden layers; learning rules are automatically employed to configure the pattern of connections.
  • In the recognition response, patterns become stronger if they are familiar (previously stored).
  • The input pattern becomes weaker ("dampened") if unfamiliar.
  • Familiar input patterns can be filled in and "assimilated".

Similarity-Based Generalization

Conventional PSS-based AI programs could not modify themselves or adapt to their environments. ANNs can: "similarities among patterns are directly represented along with the patterns themselves in the connection weights. ANN can do this in such a way that similar patterns have similar effects" (Textbook, p. 212, emphasis added). Rumelhart proposes that ANNs automatically make correct similarity-based generalizations about the environment.

  • Similar input patterns are mapped to similar output patterns.
  • Some hidden layers serve a similar function to “internal representations".
  • In the XOR problem, the added hidden unit and its weights function to "represent" a constraint from the environment.
  • For example, when the two input stimuli are both present, the output pattern should be 0.
  • The learning procedure involves two stages:
    • The ANN receives an input.
    • After a period of time, certain units of the ANN receive target values.
  • If the target values are actually attained, the weights between those units and their input units remain unchanged.
  • Otherwise, the weights are modified (this is where the hidden layers come in).
  • In backpropagation, this learning procedure starts from the output units.
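
To make the procedure concrete, here is a compact backpropagation sketch on the XOR problem. The 2-2-1 architecture, sigmoid units, learning rate, and iteration count are all illustrative assumptions; only the flow (forward pass, error at the output unit, error signals propagated back to the hidden layer, weight updates) follows the description above.

```python
import math, random

random.seed(1)
# Hidden layer: 2 units, each with 2 input weights and a bias.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
# Output unit: 2 hidden weights and a bias.
W2 = [random.uniform(-1, 1) for _ in range(3)]

def sig(x):
    return 1 / (1 + math.exp(-x))

def forward(x):
    h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
    y = sig(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, y

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
eps = 0.5  # learning rate (illustrative)
for _ in range(10000):
    for x, t in data:
        h, y = forward(x)
        # Error signal at the output unit (delta-rule-style difference).
        d_out = (t - y) * y * (1 - y)
        # Error signals sent backwards to the hidden units, weighted by
        # the connections they pass through.
        d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Update the output-layer weights first, then the hidden layer.
        W2 = [W2[0] + eps * d_out * h[0],
              W2[1] + eps * d_out * h[1],
              W2[2] + eps * d_out]
        for j in range(2):
            W1[j] = [W1[j][0] + eps * d_hid[j] * x[0],
                     W1[j][1] + eps * d_hid[j] * x[1],
                     W1[j][2] + eps * d_hid[j]]

for x, t in data:
    print(x, t, round(forward(x)[1], 2))
# Outputs should approach 0, 1, 1, 0, though backprop can occasionally
# stall in a local minimum depending on the random initialization.
```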

Classical vs Connectionism

  • According to Fodor and Pylyshyn (1988), the comparison between classical cognitive architecture and connectionism occurs primarily at the implementational level.
  • Arguments against Classical cognitive architecture depend on properties that are not intrinsic to Classical architecture.
  • Properties of classical architectures as implemented on current computers "need not be true of such architectures when differently (e.g., neurally) implemented".
  • ANNs are an alternative to PSS at the algorithmic level (Slides #3-4).
  • All ANNs are implemented on computers with the von Neumann architecture.
