There is this famous TED talk by Daniel Kahneman on experience and memory - http://www.youtube.com/watch?v=XgRlrBl-7Yg. The "experiencing self", as he calls it, makes me think of a machine that stores a likely state of the present situation. It stores the context (the state of the present situation), and then the "remembering self" comes into effect, passing this context through a "filter" to "decide" whether it would be worth retrieving in the future.
This analogy sounds simple to reason about, but are there only two selves? Funnily enough, I think the answer would be a unanimous NO. Why, then, does the reasoning about the two selves seem sound? The reason could be the way the question was presented: "had it ruined the experience of the event or the memory of it?" The question is no doubt very interesting: do we have a probabilistic matching scheme where we generate predictions for a finite state and decide to make error corrections based on the feedback? The prediction, when spending money on a concert, is to have a pleasant experience. This matching is violated if we have an unpleasant experience. Does this change if we did not have to pay for the concert? Does it change if the concert was inexpensive? Does it change if the artist dies right after the concert?
In such a probability matching mechanism, it would be highly pertinent to store this feedback when the prediction held, and even more pertinent when it was violated (you know, to make better predictions). The higher the deviation from the prediction, the higher the probability the event should be stored. Of course, we run into capacity constraints, processing bottlenecks, and plain random chance. These factors make the process inefficient and weaken the feedback effects on the matching. That said, I think it's worth investigating which events bypass one system completely (of the two identified) and go straight into the next. Can we remember without experiencing, and can we experience without remembering? Can we improve the experience without improving the memory? Can we improve the memory without improving the experience? Can we reduce the experience and still improve the memory?
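To make the mechanism concrete, here is a minimal toy sketch (my own hypothetical names and numbers, not anything Kahneman proposes): predictions and outcomes are scored on a 0-1 pleasantness scale, the deviation between them acts as the probability of storage, and a hard capacity limit forces the least surprising memories out.

```python
import random

class RememberingSelf:
    """Toy model: store events with probability proportional to how far
    the outcome deviated from the prediction, under a capacity constraint."""

    def __init__(self, capacity=5):
        self.capacity = capacity
        self.memories = []  # list of (event, deviation) pairs

    def observe(self, event, predicted, actual):
        # Surprise = deviation of the outcome from the prediction,
        # clamped to [0, 1] so it can act as a storage probability.
        deviation = min(abs(actual - predicted), 1.0)
        if random.random() < deviation:          # plain random chance
            self.memories.append((event, deviation))
        if len(self.memories) > self.capacity:   # capacity constraint:
            # evict the least surprising memory to make room
            self.memories.remove(min(self.memories, key=lambda m: m[1]))
        return deviation

remembering = RememberingSelf()
# Paid for a concert expecting a pleasant experience (0.9), got 0.2:
# a large violation, so the event is very likely to be stored.
remembering.observe("expensive concert", predicted=0.9, actual=0.2)
# A free concert that went roughly as expected is rarely kept.
remembering.observe("free concert", predicted=0.5, actual=0.6)
print(remembering.memories)
```

Even in this crude version, the questions above show up as knobs: the clamping, the eviction rule, and the random draw are each places where experience and memory could come apart.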
The bigger question becomes: what other projective selves are evident in such weird defects? Is there an action self, which tries to predict action in unlikely scenarios (say you were hit by a tornado or an earthquake)? What information is used in this situation? Do we create false experiences to rely on? Do we freeze in a processing bottleneck?
What other processing selves can we think of: the save-your-soul self, the don't-be-stupid self, the oh-no-she-didn't self!!