
Commit c298d9e

Update README.md renumber Qs to match new sim order
1 parent 0abba38 commit c298d9e

File tree

1 file changed (+3 −3 lines)


ch10/sem/README.md (+3 −3)
@@ -22,7 +22,7 @@ You should observe sparse patterns of weights, with different units picking up o
One of the most interesting things to notice here is that the unit represents multiple roughly synonymous terms. For example, you might see the words "act," "activation," and "activations," or "add," "added," "adding," and "additional."
- > **Question 10.9:** List some other examples of roughly synonymous terms represented by this unit.
+ > **Question 10.1:** List some other examples of roughly synonymous terms represented by this unit.
This property of the representation is interesting for two reasons. First, it indicates that the representations are doing something sensible, in that semantically related words are represented by the same unit. Second, these synonyms probably do not occur together in the same paragraph very often. Typically, only one version of a given word is used in a given context. For example, "The activity of the unit is..." may appear in one paragraph, while "The unit's activation was..." may appear in another. Thus, for such representations to develop, it must be based on the similarity in the general contexts in which similar words appear (e.g., the co-occurrence of "activity" and "activation" with "unit" in the previous example). This generalization of the semantic similarity structure across paragraphs is essential to enable the network to transcend rote memorization of the text itself, and produce representations that will be effective for processing novel text items.
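As a concrete illustration of the co-occurrence point made in the paragraph above, here is a minimal, hypothetical sketch (it is not the simulation's actual learning code): each word is summarized by a vector of the other words it appears with across paragraphs, and two words are compared by the Pearson correlation of those vectors, the same kind of correlation comparison used in the Words tests below. The toy paragraphs and the `similarity` helper are invented for the example.

```python
# Hypothetical sketch (not the simulation's Hebbian-learning code): why
# "activity" and "activation" can end up with similar representations even
# though they rarely appear in the same paragraph. Each word gets a vector
# of the other words it co-occurs with, and words are compared by Pearson
# correlation.

import numpy as np

paragraphs = [
    "the activity of the unit is high when the input is strong",
    "the activation of the unit was high when the input was strong",
    "spelling depends on the mapping from sounds to letters",
]

vocab = sorted({w for p in paragraphs for w in p.split()})
index = {w: i for i, w in enumerate(vocab)}

# Binary paragraph-by-word presence matrix.
presence = np.zeros((len(paragraphs), len(vocab)))
for row, p in enumerate(paragraphs):
    for w in set(p.split()):
        presence[row, index[w]] = 1.0

# Word-by-word co-occurrence counts; ignore self co-occurrence.
cooc = presence.T @ presence
np.fill_diagonal(cooc, 0.0)

def similarity(w1, w2):
    """Pearson correlation between two words' co-occurrence vectors."""
    return float(np.corrcoef(cooc[index[w1]], cooc[index[w2]])[0, 1])

# High: the two words share most of their context words ("unit", "input",
# "strong", ...), even though they never occur in the same paragraph.
print("activity vs activation:", round(similarity("activity", "activation"), 2))

# Low: almost no shared context beyond "the".
print("activity vs spelling:", round(similarity("activity", "spelling"), 2))
```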

@@ -50,7 +50,7 @@ You should see that attention and spelling are only related by around 0.06, indi
* Compare several other words that the network should know about from reading this textbook. (Tip: click `Envs` in the left control panel, then `Train`, then `Words` in the window that appears to see a list of all the words; scroll through it to see which words are in the valid list, i.e., words with frequency greater than 5 that are not purely syntactic.)
- > **Question 10.10:** Report the correlation values for several additional sets of Words comparisons, along with how well each matches your intuitive semantics from having read this textbook yourself.
+ > **Question 10.2:** Report the correlation values for several additional sets of Words comparisons, along with how well each matches your intuitive semantics from having read this textbook yourself.
# Distributed Representations of Multiple Words

@@ -78,7 +78,7 @@ The similarity does now decrease. Thus, we can see that the network's activation
You should see that the similarity goes back up. Thus, this is potentially a very powerful and flexible form of semantic representation that combines rich, overlapping distributed representations and activation dynamics that can magnify or diminish the similarities of different word combinations.
- > **Question 10.11:** Think of another example of a word that has different senses (that is well represented in this textbook), and perform an experiment similar to the one we just performed to manipulate these different senses. Document and discuss your results.
+ > **Question 10.3:** Think of another example of a word that has different senses (that is well represented in this textbook), and perform an experiment similar to the one we just performed to manipulate these different senses. Document and discuss your results.
# A Multiple-Choice Quiz