I think one aspect an AI system would struggle to handle convincingly is the blend of conflicting emotions we feel in response to incidents in our personal lives. Because the system operates on probabilities, it will tend to smooth out the less common or less expected emotional responses - what I call the 'bland airline food phenomenon'.
When we write, we tend to include examples from our own lives that illustrate particular points: moments that have elicited strong emotions or prompted telling insights.
But as I say, there will often be conflicting or overlapping feelings in there, which I don't think AI is likely to reproduce. For all its famed love of tapestries, it will tend to follow one conventional strand of 'thought' rather than intertwining several different emotional and logical responses.