The terms “narrative” and “development” would appear to be difficult to relate to one another. While “narrative” frequently connotes movement backward in time and would thus seem to be a retrospective concept, “development” connotes movement forward in time and would thus seem to be a prospective concept. In this article, I seek to rethink both of these terms in such a way as to render them more compatible. In doing so, I focus on the idea of narrative identity, which, I suggest, is not only about the self but about the other-than-self, especially those goods that draw the process forward.
I am going to take the liberty of beginning this article with a brief bit of intellectual autobiography in order to establish some context for the ideas to come. As a student in the Committee on Human Development at the University of Chicago back in the early 1980s, I became fascinated by the concept of development, not only as developmental psychologists such as Piaget and Kohlberg talked about it, but in a more "existential" way, focused on the very ends of being human. This interest is still with me today in many ways, but at this point I am much more likely to speak about the good—which is to say, I am much more likely to speak in explicitly ethical terms. One way or the other, I continue to be fascinated by the idea that human lives are characterized not only by change but, at times, by some form or other of progress or growth, such that we move from a lesser state of being or knowing to a better one.
Another fascination emerged around this time as well. Not long after I arrived at Chicago, I had the great good fortune of studying with the philosopher Paul Ricoeur, first in a seminar on the "Phenomenology of Time Consciousness," and later in courses such as "Historicity, History, and Narrative" and "Mythical Time." These courses were extraordinary. Related ideas were being pursued elsewhere in the University, too—in the Department of English Language and Literature, for instance, along with several others. Especially noteworthy in this context was the publication of the edited volume On Narrative (Mitchell, 1981), which seemed to bring an entirely new field of inquiry into being. Not surprisingly, these ideas had begun to seep into the Committee on Human Development as well, particularly through the work of Bert Cohler (e.g., 1981), a psychoanalyst and student of the life course, who was among the very first psychologists to pursue the study of lives through the lens of narrative. Narrative was therefore very much in the air during my early years at Chicago, in disciplines ranging from philosophy to psychology and beyond.
As exciting as this period was, it created a tension of sorts. For while "development" generally connotes movement forward in time and would thus seem to be an essentially prospective concept, "narrative" generally connotes movement backward in time and would thus seem to be an essentially retrospective concept.1 How might I work through this tension? One of my challenges, both then and since, has been to try to relate these two constellations of ideas. As I suggested back in 1984, in a piece entitled "History, Narrative, and Life-Span Developmental Knowledge,"
This is not unrelated to what Paul Ricoeur, in his remarkable book on Freud (1970), referred to as the "dialectic of archeology and teleology," a dynamic process in which the unearthing of the buried past is, at one and the same time, a revelation of what is to come.
Basically, then, my argument at the time was that the trajectory of the past articulated through narrative bore within it a certain momentum, a directedness toward the future. But that was not all. In a distinct sense, I also argued, that which we often conceptualize in prospective terms—for instance, what predicts what, as determined through longitudinal inquiry—itself relies on retrospection and narration: we only know what the predictors are after the data are in; the results at time 2 (or 3 or whenever) reveal the potential of time 1; with the "ending" in hand, the story of the relationship between the two can finally be told. This seemed to be true of development as well: only by knowing the ending, the telos, could one even begin to talk about it. Development, therefore, and prospective inquiry more generally, turned out to be a function of historical understanding, of a synoptic act of historical seeing that transforms events into episodes in an evolving narrative.2 Ricoeur (1991) refers to this process of emplotment as a "synthesis of heterogeneous elements" (p. 21). Development and narrative were thus not so far apart as they had initially seemed. Positing the former required the latter. Development was an ongoing story to be told. That was phase one. It was useful but quite incomplete.
Later on, in another series of attempts to work through this set of issues (e.g., Freeman, 1985, 1991, 1993; Freeman & Robinson, 1990), I found myself moving in a somewhat different direction. Development, it was argued (especially in a 1990 piece by Rick Robinson and me called "The Development Within"), might profitably be understood not as the movement toward some pre-specified end (as in Piaget's or Kohlberg's theories) but as the revision of ends. Moreover,
Along the lines being drawn here, therefore, development was to be understood as a perpetual process of reconstructing or “rewriting” one’s own ends, goals, teloi, in the service of something better.
The good news (for some) was that we had aspired in this work to think beyond the traditional cognitive-developmental point of view and, in a sense, "free" individuals to articulate and pursue their own ends rather than those posited by developmental theorists. Our "model," such as it was, was ipsative rather than normative: it was the experiencing individual who called the developmental shots. We thus encouraged a "'research attitude' that acknowledges and is informed by the need to remain open to the variety of ways in which the process of development might be manifested in the lives we study" (p. 70). The not-so-good news was not unrelated to these very ideas. As my colleague and friend Bernard Kaplan (Professor Emeritus from Clark University, who passed away several years ago) asked, shortly after I spoke to him about this new way of thinking about development, "So if Charles Manson decided that he was developing by slaughtering innocent people (in the service of some 'ideal'), that would make it so? Development is ultimately a matter of individual preference? It's simply my story of why this new end is better than the old one? And what exactly is meant by better, anyway?" I was always grateful to Bernie for his comments. So that was phase two.
During the next phase, I would obviously have to come to terms with some of these thorny questions. The process of development, I went on to argue, "taken . . . in the broad sense of a progressive movement toward desired ends, is necessarily tied to the moral [domain], for the simple reason that this progressive movement is itself unthinkable outside of some conception of where it ought to be heading. To this extent," I added, "it can plausibly be maintained that the concept of development, whatever the specific domain of interest, is intrinsically bound up with both the idea of narrative and the idea of the moral" (Freeman, 1991, p. 83). So far so good: it wasn't just individual preference that was involved but morality and ethics. Even then, however, an important and still-thorny question remained. Here is how I put it: "If, in fact, there is no objectively-based, universally prescribable 'ought' to be identified a priori, that is, ahead of the dynamic movement of life itself, then how are we to speak about development at all?" (p. 83). The movement toward a better way would have to be argued for and justified. We know what is better "precisely by the juxtaposition of our newly fashioned interpretive context against our old one, which becomes inadequate in the very process of its supersession. . . . The notion of better, of developmental progress, therefore, derives not from a comparison of two readings of experience held apart from one another in putatively objective fashion, but rather from their relationship, from the transformation of the one into the other" (p. 97).
Was this formulation any better? Yes and no. On the one hand, this notion of the relationship between "worse" and "better" served to clear a more adequate space for thinking about the moral and ethical realm. Plus, this perspective seemed to better handle the relationship between past and future. Nevertheless, there remained what appeared to be an overly individualistic orientation to the entire framework. The story of development was still the story of the sovereign self, struggling with life, trying to decide how best to forge ahead, dizzied by the fact that there was no certain path to follow. It eventually struck me not only as overly relativistic, as Kaplan had suggested, but also as a bit too existentially self-enclosed. That, in any case, was the third phase. Clearly, there was a good deal more to be done. But what?
The story continues. Alongside the tension between the idea of development and the idea of narrative, there began to emerge another tension, equally vexing and equally urgent to work through. For much of the time I had been working on the ideas considered thus far, the primary category of interest was the self—not the self-enclosed, monadic self that so many had critiqued but the narrative self, located in language, culture, and history. I was also interested in exploring memory and, most recently, the phenomenon of hindsight (Freeman, 2010), my perspective being that this process of looking backward over the terrain of the past is of central importance not only in "the examined life," but, more specifically, in moral and ethical life, where there is a special tendency to act first and think later. I refer to this state of affairs as "moral lateness."
One very tragic instance of this lateness is found in Primo Levi's chapter on "Shame" from The Drowned and the Saved (1989), where he describes the horror, following liberation from the Nazi concentration camp in which he had been imprisoned, of looking backward and seeing his own moral and ethical failings during his time there. "Few survivors feel guilty about having deliberately damaged, robbed, or beaten a companion," Levi notes. "By contrast, however, almost everybody feels guilty of having omitted to offer help" (p. 78). There were, of course, good reasons for their behaving the way they did at the time; they did what they could to survive. But this did little to ease their guilt and shame. He goes on to tell a brief story that had continued to haunt him. In August of 1944, Auschwitz was extremely hot, and Levi and his fellow prisoners were tormented by thirst. During the course of an assignment that involved clearing out rubble from a cellar, he had come across a two-inch pipe with a spigot above the floor. It looked like it might contain water, so he stretched out on the floor and had a few drops. "How much," he had asked, "can a two-inch pipe one or two meters high contain? A liter, perhaps not even that. I could have drunk it all immediately. . . . Or save a bit for the next day. Or share half of it with Alberto," his friend. "Or reveal the secret to the whole squad." He chose the third, "that of selfishness extended to the person closest to you. . . . We drank all the water in small, avaricious gulps, changing places under the spigot, just the two of us. On the sly." Shortly after, Levi had encountered a man named Daniele, "all gray with cement dust, his lips cracked and his eyes feverish." "I exchanged a look with Alberto; we understood each other immediately and hoped nobody had seen us. But Daniele had caught a glimpse of us in that strange position, supine near the wall among the rubble, and had suspected something, and then had guessed" (p. 80). They had been discovered.
Looking at this incident now, through the terrible "wisdom" of hindsight, Levi cannot help but be ashamed of what he had done. But of course what is also of central importance in this example is that this process of hindsight, this "work" of hindsight, is very much about our relationship to others and to the larger world. Stories of this sort, along with a number of others, led me to be dissatisfied with the way I had been formulating things. Some of this dissatisfaction also stemmed from other readings I was pursuing at the time, particularly the works of Emmanuel Levinas, Iris Murdoch, and Charles Taylor. For Levinas (e.g., 1985), the primary category for understanding the human realm is not the self but the Other, the face of the Other, the flesh and blood person. For Murdoch (e.g., 1970) too, personal and spiritual growth requires that we be oriented to what is other than ourselves, to that which "unselfs" us, as she puts it, and takes us beyond the perimeter of our own egocentric concerns. For Taylor, whose work (1989) is perhaps most directly applicable in the present context, the main sources of inspiration for our commitments and passions must likewise derive from without, from those external "goods" that serve to direct the flow of our lives and their resultant stories. For each of these thinkers, then, it was precisely the other-than-self that deserved center stage in thinking about both narrative and development. Moreover, they had each posited the centrality of relationship in thinking about these issues—to other people (Levinas especially), but also to those other "objects of attention" (see especially Weil, 1952/1997)—for instance, art, nature, God—that draw the self outward, beyond its own borders. I also began exploring the seminal work of Martin Buber (e.g., 1965, 1970), who speaks of the importance of "the life in which the individual . . . is essentially related to something other than himself" (p. 166). For Buber, "the genuineness and adequacy of the self cannot stand the test in self-commerce, but only in communication with the whole of otherness" (p. 178). As such, "The question of what man is cannot be answered by a consideration of existence or of self-being as such, but only by a consideration of the essential connexion of the human person and his relations with all being" (p. 180). Along these lines, it came to seem that personal narrative was as much about the Other as about the self and that relationship—or, perhaps more appropriately, relatedness (see Freeman, 1999, 2007)—is constitutive of the stories people tell about their lives.
In much of this work, I have thus sought to rethink the idea of narrative and thereby to rethink the idea of development as well. Rather than thinking of narrative mainly in terms of its orientation to the past, I have tried to suggest that it bears upon the future as well: the process of rewriting the self is at one and the same time a process of articulating the self-to-be, or the self that ought to be. At the same time, rather than thinking of narrative mainly in terms of this category of the self, I have tried to suggest that narrative is also very much about the other-than-self, about the ends—and the goods—that are operative in the process at hand. In fact, as I argue in my recent book, The Priority of the Other: Thinking and Living Beyond the Self (2014), this category of the Other is primary; that is, it comes before the self, which in turn suggests that we may need a new language for conceptualizing significant dimensions of human experience.
The challenge has been to try to bring these two tensions—development/narrative and self/other—together in some serviceable whole. In some recent work, in which I explore the idea of narrative identity, especially as it is found in Ricoeur (e.g., 1992), I consider the dynamics of what I have come to call the "double triad" of narrative identity (Freeman, 2013). The first triad, which I call "spheres of temporality," suggests that narrative identity emerges in and through the interplay of past, present, and future in the form of remembering, acting, and imagining. As for the second triad, which I call "spheres of otherness," I suggest that this temporal interplay is itself interwoven with our relation to other people, to the non-human world (e.g., nature, art), and to those moral and ethical goods that serve to orient and direct the course of human lives. By linking these two triadic spheres together, therefore, my aim is to arrive at a picture of narrative identity appropriate to the complexities of its formation. I should confess that I am generally not the model-building type, and it still isn't entirely clear how useful this particular model is. So far, I have tested it out, in writing, on only two cases: my mother, a 91-year-old woman with dementia (among other things), and Keith Richards (of Rolling Stones fame), who, as represented in his book Life (2010), seemed particularly appropriate for the model (see Freeman, 2013a, 2013b). The model is thus very much a work in progress. Nevertheless, my hope is that, with further refinement, it will prove to be a useful tool for explicating some of the complexities of narrative identity.
The ideas discussed thus far remain rough and need to be unpacked substantially further. Let me therefore try to put some flesh onto them by turning briefly to one of the very few pieces I ever did on youth and youth identity (which I wrote with Mihaly Csikszentmihalyi and Reed Larson almost 30 years ago [Freeman, Csikszentmihalyi, & Larson, 1986]) that does fairly well, in some ways, at highlighting the relationship we have been considering. Let me hasten to add that there is lots of good (and up-to-date) work being done on youth identity, much of it deriving from Dan McAdams's ideas about narrative identity, in adolescence and beyond (see, e.g., Habermas & Bluck, 2000; McLean, 2008; McLean & Pratt, 2006; McLean, Pasupathi, & Pals, 2007). But this particular piece actually speaks more directly to the narrative/development connection. Indeed, I consider it a real stroke of luck to have had the opportunity to pursue the study on which this piece is based, because it really served to focus and sharpen some of the issues I had been exploring, especially the difference between life as lived and life as told.
In this study, my colleagues and I focused on the relationship between ongoing emotional experience and remembered experience; that is, we wanted to know the quality of participants' emotional experience as it was actually transpiring as well as their recollection of that experience. That there might be a disparity between the two would come as no surprise. As is well known, recollection, especially of the "taking-stock" sort being considered, is an interpretive and reconstructive process. For this reason, of course, it is often held in suspicion, the supposition being that it cannot help but distort and falsify the past. This sometimes happens, to be sure. But recollection of this sort—which entails what I (e.g., 2010) have come to call narrative reflection—can also perform important "developmental work," as we called it. The challenge in this case was to try to determine what kind of developmental work it might perform. By looking at both immediate emotional experience and recollection and examining the intersection of the two, we would be able to have in hand some helpful information about what might be happening developmentally.
Following the lead of Erikson (e.g., 1968) and others, we noted that this developmental work is especially salient in adolescence, at least in much of contemporary Western culture. The process of identity formation is frequently one of taking-stock, of looking inward, and trying to discern who and what one is, in the eyes of others as well as in one's own inner depths. Following on the hindsight idea, this looking inward frequently assumes the form of looking backward. When an adolescent looks back on his or her past, Erikson had suggested, he or she does so as a person in transit and, perhaps, flux. One tries to find one's place in what may feel like a shifting, unstable world. Through narrative reflection, some of this shifting ground can be stabilized, the disparate dimensions of one's life brought together, if only temporarily. In this respect, narrative reflection, in adolescence and beyond, is often assumed to be in the service of personal coherence—which it sometimes is. This isn't always the case, however. For some people, in fact, the challenge at hand is to disrupt and deconstruct this very coherence in order that a new self and a new identity might be born. In both cases, it should be noted, the developmental work being done is, at one and the same time, moral and ethical work, oriented toward how one ought to behave and, more substantively, how one ought to live.
Returning briefly to the study, we examined immediate emotional experience and remembered experience in the context of time with family, with friends, and alone. In regard to the former, we looked at the quality of experience at two different times, separated by a span of two years, through the Experience Sampling Method (ESM), which asks people to describe their experiences via a brief questionnaire throughout the day over the course of a week or two and to indicate how they felt at the time. (When a randomly-set "beeper" goes off, they are to pause and fill out the questionnaire.) In regard to the latter, participants in the study were asked to look back over this two-year span and reflect on whatever changes they believed themselves to have undergone. Without getting into all the details, what this study showed was that, for this particular group of adolescents, there was little change in the quality of their immediate experience in the three different contexts across the course of the two years. Their recollections, however, generally suggested that significant positive change had occurred. So the question was: "How, if at all, are these adolescents able to substantiate these subjective changes in the absence of more 'objective' ones?" (p. 175).
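For readers unfamiliar with the Experience Sampling Method, the following minimal sketch illustrates how a randomly-set signaling schedule of the kind just described might be generated. It is offered purely as an illustration: the parameters (a one-week window, signals between 8 a.m. and 10 p.m., seven signals per day) are assumptions made for the sake of the example, not the settings used in the study discussed above.

```python
import random
from datetime import datetime, timedelta

# Minimal sketch of an Experience Sampling Method (ESM) signaling schedule.
# All parameters (7 days, an 8:00-22:00 waking window, 7 signals per day)
# are illustrative assumptions, not the settings used in the study above.

def esm_schedule(start_date, days=7, signals_per_day=7,
                 wake_hour=8, sleep_hour=22, seed=None):
    """Return randomly placed signal times ("beeps") for each day."""
    rng = random.Random(seed)
    block_minutes = (sleep_hour - wake_hour) * 60 // signals_per_day
    schedule = []
    for d in range(days):
        day_start = (start_date + timedelta(days=d)).replace(hour=wake_hour, minute=0)
        # One random beep per equal-sized block, so beeps span the waking day.
        for b in range(signals_per_day):
            offset = b * block_minutes + rng.randrange(block_minutes)
            schedule.append(day_start + timedelta(minutes=offset))
    return schedule

if __name__ == "__main__":
    # Print the first day's beeps; at each beep, participants would pause
    # and fill out the brief questionnaire about their current experience.
    for beep in esm_schedule(datetime(2024, 5, 6), seed=1)[:7]:
        print(beep.strftime("%a %H:%M"))
```

The blocked randomization shown here is simply one common way to keep signals unpredictable while ensuring they are spread across the waking day; comparing responses gathered this way with later retrospective accounts is what allows the life-as-lived and life-as-told perspectives to be set side by side.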
I would frame this question differently now, especially in light of my thinking about hindsight. I would ask instead: What might these adolescents have been able to discern, now, looking backward, about the movement of their experience across the two years? I offer this reframing for an important reason. In line with what was said earlier, there is a tendency to think of immediate experience, in the present moment, as somehow more real and objective than remembered experience, indeed to see it as a kind of baseline of the real. From this perspective, the truth of the matter would be that there hadn't been any change at all in the nature of their emotional lives across the two years but that, whatever the reason, they believed, erroneously, that there had. That is possible. But the stories these adolescents told about how and why they had changed pointed us in a different direction. With family, for instance, they talked about a developing commitment to communication and understanding along with an increased recognition of others' own needs and desires, quirks and flaws. This didn't mean that conflicts had ended or that their parents and siblings no longer annoyed them, only that they were somehow able to situate their experience in a broader, more expansive frame of reference. In the context of time with friends, they recounted their ability to move beyond surface talk and dig deeper into people's inner lives. As with family, conflicts had remained. But their newfound capacities for understanding and intimacy allowed them to re-locate these conflicts, to put them in their proper place. As for time alone, it had apparently become a more positive space for reflection, for being with oneself and not feeling isolated or worrying about what others might think.
Again, it is possible that these accounts are rationalizations or illusions, fictions they told themselves, perhaps in order to assure themselves—and us—that they had in fact been growing, moving forward. In this case, faux-recollection would bring with it faux-development. It may be worth noting that these (hypothetical) fictional changes could still have had very real effects on their identity and sense of self. On some level, in other words, it may be enough for one to believe one has been developing even when, by all indications, it is not the case. But we would be hard-pressed to call this sort of process "development." For true development can only issue from true insight into the meaning of the past. In any case, this study proved to be a remarkable vehicle for exploring both the emotional life of this group of adolescents and, more specifically, the difference between life as lived, moment to moment, and life as told, looking backward. By way of note, I am not inclined to erect too firm a boundary between the two. "Life," I have argued, is itself narratively constituted and structured (e.g., Freeman, 1998).
However, there still remains a difference between this dimension of lived narrativity and that form of narrativity that emerges when we "take the time" to tell. Let us therefore examine in greater detail the nature of this difference and its possible significance for understanding the development of identity.
Now that we have undertaken this brief exploration of adolescent experience, we can return to some of the theoretical issues introduced earlier. To fashion one's identity is also to fashion what Charles Taylor (1989) has referred to as a "framework." "To articulate a framework," he writes, "is to explicate what makes sense of our moral responses" (p. 26). It is a structure of hierarchically ordered commitments, an identification of one's priorities. Indeed, he offers, "we are only selves insofar as we move in a certain space of questions, as we seek and find an orientation to the good" (p. 34). I would modify this statement slightly by saying that we only shape and reshape our identity insofar as we move in this questioning and questing space. To form an identity, then, is not only about articulating who and what one is, but also about what one stands for, what one considers right and good. And "this sense of the good," Taylor tells us,
The movements from self to Other, and from past to future, thus turn out to be intimately related to one another.
Development, Taylor goes on to suggest (in a decidedly more sophisticated version of some of my own ideas described earlier), is a process of "reasoning in transitions. . . . It aims to establish, not that some position is correct but rather that some position is superior to some other. It is concerned, covertly or openly, implicitly or explicitly, with comparative propositions" (p. 72). I am not completely sure my aforementioned colleague and friend Bernie Kaplan would like this either; in the absence of a correct position, an absolute position, it might be argued, there inevitably remains in the picture a measure of individual preference. What I would suggest in response, however, is that the positing of a superior position, rather than being a matter of mere preference, may instead be understood as a matter of "narrative gain," as we might put it—which is to say, a matter of articulating, through narrative, a view of one's existence that is demonstrably more adequate, capacious, truthful, and/or ethically sound than the one it has superseded. It should be noted that there was something of a confounding factor involved in the adolescence study just discussed by virtue of the fact that we asked our respondents how their emotional lives had changed. By doing so, we called forth a measure of narrative reflection that may not otherwise have taken place. There may have been some positive bias involved too; few people would want to speak in detail about having gone downhill. These qualifications notwithstanding, there still remains an important connection between narrative and development. For it is only by coming to terms with one's past, in the present, that one can move toward the future armed with a better, more adequate vision of who and what one ought to be.
With this set of ideas in mind, let me offer a few cautious and somewhat tentative words about the contemporary world of adolescence—which, admittedly, I know mainly through my own daughters (now 24 and 27) and their friends as well as the many college-age young people I have taught over the course of the last few decades. There is no question but that social media in particular, and the technologization of communication more generally, is having some impact on the shaping of identity (e.g., Boyd, 2014; Gardner & Davis, 2013). The question is what kind of impact it has. I don't think there is a single, unequivocal answer to this question, and the main reason is that it all depends on how one thinks about life and what it's all about. I could tell my two daughters to look away from their iPhones once in a while because it is detracting from "real" communication and real emotional connection, and I could also tell them that they are missing out on the world by virtue of attending to a little machine. But they would likely be mystified by my perspective and turn back to their iPhones.
Rather than waxing nostalgic, then, I want to voice a concern that grows out of the ideas I introduced earlier, one that is less about the possible unreality of technologically-mediated communication and more about the idea of narrative and its relation to development. I shall try my best to be descriptive here rather than prescriptive, at least for the time being. Here is what I am seeing, especially in terms of the use of social media such as Facebook. First, there appears to be a shrinkage, of sorts, in what I earlier referred to as "spheres of temporality," a tendency to be focused more on the present—and, on some level, the transient and ephemeral—than on past or future, reflecting or imagining. By all indications, many no longer "take the time," as I put it before, to engage in considered reflection, narrative reflection. Instead, they are often swept along by the tide of this or that momentary concern or crisis. Perhaps this is okay; some might even portray it as a Buddhist-like privileging of the present moment, "the power of now." But I am concerned about this momentariness and fleetingness. And there is a reason: insofar as the examined life depends, to some degree, on narrative reflection, on this "pausing" that I have been referring to, this process of examination is being cut short, diminished, leading to a kind of "de-narrativized" self. Perhaps I have phrased this wrong. Perhaps I have got this wrong. It may be that we are in the process of witnessing new forms of life examination, self-inspection, and new forms of emotional life, neither better nor worse than any other. Nevertheless, I wonder about this presentness, this rootedness in the (mediated) moment. Interestingly enough, the only other context in which I have used the term "de-narrativized" is in the work I have done on my mother, a 91-year-old woman who suffers from dementia. I am not suggesting that today's youth is suffering from dementia! But the fact is, the world many of them inhabit is transforming experience and memory in radically new ways. There is more focus on the immediate moment, the now, of the text message or tweet, and more heterogeneity; there is new information coming in constantly, from different sources, addressing different things. It can be exciting, to be sure. It undoubtedly allows people (some people, at any rate) to feel recognized, affirmed, perhaps wanted. This may well play out positively in one's identity. But with this increased focus on the present moment and with the relentless influx of information, there is also likely to be less narrative synthesis, less reflective taking-stock, less time or inclination to look backward and see what it all might mean. Does this matter?
Some may say no. As my friend and colleague Michael Bamberg (e.g., 2006, 2011) might argue, there still remain plenty of opportunities for narrative synthesis, for hearing and telling stories; they are just smaller stories, tied more to the demands of the moment. Instant messaging is a good example of just this sort of micro-synthetic small story exchange. Developmental identity work is being done in such exchanges too, he would likely add, only in a different way, one that's more ongoing, ever under construction. There is surely some validity to this perspective.
But my concern stands. One of the things I argued in my book Hindsight (2010) was that "Self-understanding occurs, in significant part, through narrative reflection, which is itself a product of hindsight" (p. 4). I also went on to suggest that "hindsight plays an integral role in shaping and deepening moral life" (p. 5). This is because through hindsight, of the sort we find in narrative reflection, there emerges the opportunity not only to see things anew but in and through this very seeing to move beyond our previous view in service of a better one. This process is particularly salient in the case of moral life, where there is a tendency to act first and think later; we can correct the errors of our earlier ways and thereby move ahead. If narrative reflection of this "larger" sort is a requisite condition for moral and ethical development and if the opportunities for such reflection are being curtailed by the increasing primacy of the present moment, the now of instantaneous communication, it may be that development is being curtailed as well.
I do not offer these ideas as assertions, only possibilities. It may be that narrative reflection, as I have presented it here, is not a requisite condition for moral and ethical development at all. It may be that narrative reflection is not being curtailed either; it may just be occurring in a different way. Moreover, it may be that the idea of "development" is an outdated one, less relevant to the kind of life-world that has emerged. So too for the idea of "identity," which presumes a continuity-in-and-through-difference that may be in the process of being superseded by the heterogeneity and "multiphrenia" (Gergen, 1991) that characterizes much of the contemporary landscape. Having offered these qualifications, I do wonder about the pervasiveness of social media technology and where it will all lead. What has happened in the course of the last decade or so is quite extraordinary, and technological "development" continues apace. How it will play out in terms of human development—assuming the idea still retains some sense—remains to be seen.
A related difficulty has to do with attention, which would appear to be more dispersed, even scattered, through the world of social media (Carr, 2011; Jackson, 2009). There would appear to be correlates in emotional experience as well; it too is frequently being dispersed and spread wide, moving from encounter to encounter, surging forth with the latest post, only to recede and give way to the next. Perhaps this isn't a problem either. Perhaps what we are seeing are new forms of attention, more "distributed" (if we wanted to be charitable about it). But I do wonder about this as well, and I especially wonder about its consequences for moral and ethical life. I am not merely referring to the fact that there is frequently a certain solitariness to what is going on, the fact that it's you and the screen rather than a flesh-and-blood person. I am thinking instead of the quality and kind of attention that is involved.
In this context, I think of the work of Iris Murdoch (1970), who was extremely interested in the idea of attention and its consequences for moral and ethical life. Her comments on the importance of art for sharpening and deepening attention may be instructive. "Art," she notes, "presents the most comprehensible examples of the almost irresistible human tendency to seek consolation in fantasy and also of the effort to resist this and the vision of reality which comes with success." Success is rare and difficult: "To silence and expel self, to contemplate and delineate nature with a clear eye, is not easy and demands a moral discipline." Much the same may be said for the appreciation of art. And yet, "The appreciation of beauty in art or nature is not only (for all its difficulties) the easiest available spiritual exercise; it is also a completely adequate entry into (and not just analogy of) the good life, since it is the checking of selfishness in the interest of seeing the real. Of course great artists are 'personalities' and have special styles. . . . But the greatest art is 'impersonal' because it shows us the world, our world and not another one, with a clarity which startles and delights us simply because we are not used to looking at the real world at all" (p. 63). As Murdoch goes on to suggest, "It is important too that great art teaches us how real things can be looked at and loved without being seized and used, without being appropriated into the greedy organism of the self." As noted earlier, she refers to this process as "unselfing," and it is central to moral and ethical life. For, in becoming more attentive and better attuned to the otherness of works of art, we may also become more attentive and better attuned to the separateness and differentness of people. And, "The more the separateness and differentness of other people is realized, and the fact seen that another man has needs and wishes as demanding as one's own, the harder it becomes to treat a person as a thing" (p. 65).
In speaking about attention in this way, Murdoch certainly wouldn't want to claim that the relationship at hand, between attention and moral life, is a necessary one. There have surely been plenty of people who have been highly attentive to this or that work of art or literature or activity who were quite bad people. Likewise, there are surely quite inattentive people who are pretty decent folks. But if Murdoch is right, the scattering or diminution of attention is bound to play itself out negatively in the socio-moral realm, the human realm. This is because, in the absence of undivided attention to what is other, our own ego-driven projections and fantasies are bound to step in and occlude what—and who—is there.
This brings me to a second feature I am seeing in many young people. Just as there has, arguably, been a shrinkage of sorts in the aforementioned spheres of temporality, there has been a concomitant shrinkage in what I called "spheres of otherness," in relation to people, to the non-human world, and to those basic goods, outside the self, that serve to orient and bring purpose to people's lives. What has emerged, therefore, is a kind of double ego-centricity, both temporal and relational, and in turn a possible involution of certain aspects of emotional life. I could be wrong about all of this too. What may be more solitary and self-enclosed on one level (insofar as it's just you and the screen) may be more social on another. As such, and again, we may simply be encountering new forms of sociality, less focused on the fleshy intimacy of the face-to-face relation and, once again, more "distributed."
It is difficult, however, for me not to see some of what is going on as a problem, a loss. It is a loss of the flesh-and-blood other, the face of the other, and if Levinas (e.g., 1985) is right about the significance of the face, there cannot help but be a diminution in responsiveness and responsibility. From Levinas's perspective, moreover, there cannot help but be a diminution in one's very being, for "responsibility [is] the essential, primary and fundamental structure of subjectivity" (p. 95). The comment I just made about the loss of world—the non-human Other—is relevant here too. In attending to a machine, in some instances for many hours per day, one deprives oneself of the existential "nourishment" the world can provide, and without this nourishment the self is sure to be depleted in some way. As I suggest in The Priority of the Other (2014), the dispersion of attention—not just in the contexts of social media but in the context of busy, chaotic, even fragmented lives (of the sort many of us live)—can result in a kind of existential undernourishment or malnourishment; we can become self-enclosed, even "autistic" in a way, just focused on the next task, the next thing to be done.
I don't want to go too far with this way of thinking either. For one, others have already done it (e.g., Turkle, 2011). For another, the world of today's youth is, again, different from the one that shaped my own identity, and it is important to be cautious about proclaiming it a lesser one. Indeed, it may be that different criteria—indeed different kinds of criteria—need to be brought to bear on the issues in question, ones that are more hermeneutically appropriate to what is going on now. This is surely what my own daughters would say (though not in these exact words!). I love them dearly. But they are hardly the last word.
In closing, I want to address one final idea, more directly related to the kind of emotional evocation that concerns some of the others whose pieces are included in the present volume. According to Paul Ricoeur (1981b), human action and human speech may themselves be regarded as "texts" of a sort. In the era of social media, the textuality of human action and language is in the process of being transformed. So too is the process of "reading" others. The question is how these new forms of textuality and reading might affect the dynamics of emotional education—whether, for instance, they serve to blunt the kind of emotional tonality that accrues from the intimacy of the face-to-face relation. Consider the emoticon in this context. In a recent text message to one of my daughters, I ended with a smiley face, a heart, and a beer mug. I am sure she had some idea of what I was feeling; all she had to do was add up the three symbols, do a bit of emotional synthesis, and arrive at the feeling in question. Clearly, I was a happy, loving, beer-quaffing dad, raising my mug in celebration of something or other. Is there a need to tell her more than this? Or am I pining for the old pre-emoticon days, when the messages and end-of-message sign-offs were fuller, more complete?
There is no doubt but that the kinds of changes we have been considering in the last few pages will have a significant impact on our moral and ethical lives. It is already happening. How it will all turn out in terms of the larger realms of culture and history is difficult to say.