> There are millions of examples of usage of the word red - none of them communicate the subjective experience of seeing red.
The trick is that all of them provide perspectives, and the model composes those perspectives in its embedding space. "Red" is related to apples but also to communism in the model's internal vector space. It is also related to every situation in which humans used the word "red" while expressing emotion, so emotional valence gets encoded as well.
I think the confusion comes from how we expect models to represent things. People assume models represent the thing in itself, but they actually represent how each thing relates to every other thing. Inputs thus have dual status, as both content and reference, and can build this semantic space out of themselves, without any external referent.
In a relational system you don't need access to the object in itself, only to how its perception relates to other perceptions.
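This relational idea can be made concrete with a toy sketch. The corpus below is invented for illustration, and counting co-occurrences in a small window is a deliberately crude stand-in for how real embedding models are trained, but it shows the core point: each word's vector is built only from its neighbors, never from the referent itself, yet similarity between those vectors still places "red" nearer to "apple" than to "sky".

```python
# Minimal sketch: purely relational word vectors from co-occurrence counts.
# No vector ever "contains" redness -- only the contexts "red" appears in.
from collections import Counter
from math import sqrt

# Hypothetical toy corpus, chosen only for illustration.
corpus = [
    "the red apple fell from the tree",
    "she ate a red apple for lunch",
    "the red flag of communism waved",
    "a blue sky above a blue sea",
    "this blue sky turned dark",
]

# Each word's vector is its pattern of neighbors within a +/-2-word window.
vectors: dict[str, Counter] = {}
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        context = words[max(0, i - 2):i] + words[i + 1:i + 3]
        vectors.setdefault(w, Counter()).update(context)

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# "red" ends up closer to "apple" than to "sky", purely from relations.
print(cosine(vectors["red"], vectors["apple"]))
print(cosine(vectors["red"], vectors["sky"]))
```

The design choice mirrors the argument: the system never sees an apple or the color red, only which perceptions (here, words) co-occur with which, and structure emerges from those relations alone.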