Denotative

I found this word. You might like to consider it. It means taking the literal meaning of words, free from emotion or other associations.

In some discussions with AI, I have noticed it gives overly emotional responses that I don't expect, or that my writing comes across as cold, with no emotion. This also aligns with some real-world interactions. I write precisely, which is a strength in environments where words have specific meanings, but a problem in general communication.

It also affects interpretation of sentences or questions. I find quizzes, market surveys, autism screening questions, various training confirmation questions, etc. difficult because the way I interpret the words is not quite what was meant. It can lead to discussions or claims of pedantry, rather than being viewed as clarification of the meaning.

It is more subtle than not getting humour or sarcasm. I think this is an underlying mechanism behind some communication issues. Words are intended to be somewhat sloppy.

When tied with cognitive empathy but limited affective empathy, you get confusion.

I realise I model things, people, processes, interactions, everything. They are all systems with rules, which require evaluation, precise descriptions and precise language. I can't not do it. It is subconscious but comes out in language.

I think this is a feature of how strongly you systematize.

Parents
  • Certain writing requires precise wording. But when you don't need to write like that, can you?

    The fact that people have problems with the questions on the AQ50, and have to think about the intent of the question rather than just answering it, shows there are issues with processing the words. I think the literal meaning is taken, not a broader contextual one.

    I have always answered questions based on what I think the intent of the question is, not on what the words literally say. There is a translation layer based on experience, but it is effortful.

    Asking for clarification to simple questions suggests an internal model is needed before the concept can be understood. But once modelled the answers are almost instant.

    Ambiguity causes confusion and deadlock, rather than being glossed over. It is an example of a processing difference.

    And when writing (or, I think, speaking) informally, can you include emotion, whether through emotive words, phrasing, or implication from context? I don't think I can.

    In some discussions with AI I realised I didn't interpret the sentence the same way. I was missing the emotional context. I need to check more, but it aligns with other things. I get emotional responses to things I think are factual, and vice versa. Yet I don't see the difference.

    In a work conversation it is less obvious than in close interpersonal ones. When you get an unexpected response, try and remember the wording and then see if AI (or you, later) can see what was wrong.

    I think it is less about bluntness than about factually accurate speech that lacks emotion.

    It can be obvious, or more subtle, depending on how much you have learnt or adapted your communication.

    Words that are used incorrectly jar or jump out instantly because they break the accurate modelling. They are hard to ignore because the processing system throws an error.

    I have been thinking for months, identifying differences and traits and looking for the underlying causes, so I can work around them.


Children
No Data