living with trauma, basically]
A thread that keeps coming up in the speculative fiction I'm reading at the moment (which is probably more indicative of what I'm seeking out than of any publishing trends?) is the necessity for artificial intelligences to have emotions, in order to facilitate making arbitrary choices (the Imperial Radch; the Wayfarers; ...). Logic alone isn't adequate for a complex responsive intelligence: they'd stall out agonising over minutiae.
I've also been having a fair few (they say, wryly) conversations around emotional reactions and responses to contexts and events. I've known for a long time that going "okay, but that's not what's going on; here's a coherent model for my actions and behaviour and motivations that demonstrates that the thing you're scared of isn't actually happening" doesn't seem to have as much effect on most people's decision-making and behaviour as I'd (naively) expect. And then yesterday my interlocutor said: doesn't impact how I feel about the thing ;-) just what I logically conclude.
... and -- oh. oh.

Between the BPD or c-PTSD or whatever and the depression, I've in fact had to spend a lot of time working on... precisely that, right? I have to spend a lot of time and energy directing myself away from reacting based on compelling emotional "truths" and toward responding based on logical frameworks. I don't have to act as though people I'm close to want me to vanish absolutely from their lives unless they directly tell me that in fact they have changed their mind and they do*.

For me, having a logical framework that contradicts my emotional understanding of the world doesn't stop me having feelings. It just -- informs what I do with them? I can free up a lot of processing power, because I stop "having to" worry about how accurate they are, how much I should be taking them into account, whether I should be acting based on them. The solution to the feelings then becomes self-validation ("wow, yep, feeling like this is pretty rubbish; have some hot chocolate and do some stretches"), rather than their being an additional constraint I have to solve for, one that's usually mutually exclusive with what other people are actually telling me they want.
"This information changes what I logically conclude about the situation" seems to be pretty powerful for me in a way that, as far as I can tell, it perhaps isn't for many folk? And I'm just... amused by having fitted together a model for why "no, that's not what's happening" doesn't do what I expect, one that's superficially such a contradiction to the fiction.
I think it isn't, of course: this is how emotion interacts with making big decisions, not trivial ones. I'm simultaneously (still) exploring the potential of having unjustified or arbitrary preferences, particularly in the context of modern art. Just: goodness, but the inherently contradictory nature of existing. Think, two things on their own and both at once.
* Yes, we're aware that puts them in potentially awkward positions, but we've negotiated this very carefully in specific instances where I get the strongest compulsions to Just Vanish.