
Available in: French, English, Arabic, Swahili

(Re)discovery

The Digital Revolution and the Attention Economy


A New Age of Capitalism?

Bado Ndoye

Professor of Philosophy

Cheikh Anta Diop University, Dakar

ndoyenne1789@gmail.com

Issue:

Économie numérique en Afrique

Digital Economy in Africa

Uchumi wa Kidijitali Barani Afrika

الاقتصاد الرقمي في إفريقيا


Published on:

December 20, 2024

ISSN: 

3020-0458

08.2024

Given how digital technology is overturning the order of our lives, reconfiguring not only the architecture of knowledge but all areas of our daily existence, it has become commonplace to view this transformation as a revolution. Its consequences may already be out of all proportion to the great technical revolutions that have punctuated human history. For the time being, we cannot take their exact measure: partly because, as contemporary witnesses, we lack the historical hindsight to make definitive judgments, and partly because the revolution has not yet finished unfolding its effects. Some broad outlines, however, are already taking shape. In particular, a new economy is being put in place, built around the discovery of a new scarcity - attention - and it is visibly reconfiguring our modes of production, exchange, and communication.

In what follows, I would like to show that the so-called “attention economy” marks a new stage in the evolution of capitalism, which, in its frenzied quest for new territories, is now turning to our psychic resources. To this end, I proceed in a double movement: first, I show what the radical novelty of the digital age consists in, by tracing its genealogy and the history and epistemology behind it; second, I characterize the attention economy, showing how it constitutes a new phase in the development of capitalism. This means identifying the strategies by which it has established itself as an economic model, before pointing out the potential dangers it implies.


Keywords 

Attention economy, digital revolution, capitalism

Plan of the paper

Introduction


Grammatization


History


A new phase of capitalism?


Conclusion

Introduction

The history of technology abundantly shows that major innovations are not those that prolong and reinforce an established tradition, but those that bring about breakthroughs. The latter disrupt the infrastructure of societies, generating new issues that can no longer be understood in the terms and canons of the prevailing culture. When such ruptures occur, history changes regime, and the radical newness of the issues at stake demands a critical rethinking of the tools and concepts by which reality has hitherto been apprehended.
Given how digital technology is overturning the order of our lives, reconfiguring not only the architecture of knowledge but all areas of our daily existence, it has become commonplace to view this transformation as a revolution. Its consequences may already be out of all proportion to the great technical revolutions that have punctuated human history. For the time being, we cannot take their exact measure: partly because, as contemporary witnesses, we lack the historical hindsight to make definitive judgments, and partly because the revolution has not yet finished unfolding its effects. Some broad outlines, however, are already taking shape. In particular, a new economy is being put in place, built around the discovery of a new scarcity - attention - and it is visibly reconfiguring our modes of production, exchange, and communication.
In what follows, I would like to show that the so-called “attention economy” marks a new stage in the evolution of capitalism, which, in its frenzied quest for new territories, is now turning to our psychic resources. To this end, I proceed in a double movement: first, I show what the radical novelty of the digital age consists in, by tracing its genealogy and the history and epistemology behind it; second, I characterize the attention economy, showing how it constitutes a new phase in the development of capitalism. This means identifying the strategies by which it has established itself as an economic model, before pointing out the potential dangers it implies.

Grammatization

At first glance, it may seem odd to see the use of computers and their associated applications as revolutionary practices that, what is more, imply a new age of capitalism[1]. It should be clear, however, that the technical object is by definition plural, in the sense that, beyond its strictly utilitarian function, it is also defined by its capacity to reconfigure social relations and redistribute them along new lines, as we will show below. From this perspective, it becomes possible to put to work the central hypothesis of this paper, namely that digital technology, like all major technological advances, but perhaps more so than those that preceded it, is above all an “anthropological constituent”. In other words, through the interaction of machine and brain that it establishes, digital technology inaugurates a new phase in the process of hominization. To establish this point, following Bernard Stiegler[2] and Sylvain Auroux, we propose to define it as a new stage in the process of grammatization, owing to the extremely dynamic collective intelligence it brings into play, which is visibly transforming the face of contemporary culture. What does grammatization mean? For Sylvain Auroux, grammatization is at the origin of the invention of writing. It is defined as a process of externalizing human mental and behavioral content into discrete units spread out in space[3]. More generally, it is a process by which movement is spatialized: its different parts are isolated through technical processes of discretization, so as to make it visible and therefore reproducible, and easier to manipulate, since it has thereby been de facto automated. In a word, grammatization is a process of externalizing the life of the mind by materializing thought, language, or behavior in manipulable symbols. The aim of this objectification is to construct an object of knowledge that could not otherwise exist as such. In this way, Sylvain Auroux demonstrates that insofar as writing manifests language by making it visible on a material support, addressing the eye rather than the ear, it makes visible the structures that organize language, which cannot be perceived through oral speech alone. It is this spatialization of language on a material support, made possible by writing, that lies at the heart of the invention of grammar, since it provides a means of observing the structures of language, which grammar then takes as its object. What objectification enables, then, is the constitution of a scientific object that now lends itself to the practice of manipulation. The cognitive order - that of meaning - is superimposed on a material order that reproduces it, so to speak, and makes it possible to manipulate it with technical tools.
Digital technology is the final stage in this process of externalizing our cognitive structures, and in a way it also brings it to a close. This means that we can only grasp its full significance by situating it within the history of technology, precisely the history that stretches from the emergence of writing to the invention of computers. For this reason, it is important to define it, as a first approximation, as a form of writing whose main characteristic is its automaticity. Digital writing is indeed an automatic form of writing that makes use of powerful algorithms and computational resources, a form of writing that is carried out and transmitted via networks, and whose main characteristic consists of converting information from a medium - text, audio, video, or image - or from an electrical signal into digital data that is then processed by computing devices. This data takes the form of a sequence of characters and numbers representing or symbolizing the information in question. Even more than alphabetical writing, digital technology is a cognitive and memory prosthesis with almost infinite possibilities of reproduction and storage. Given these unprecedented possibilities of storage, reproducibility, data processing, and access, it ushers in a new paradigm in which the totality of knowledge is being reconfigured according to dynamics that it would be tedious to analyze here.
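To make the notion of discretization concrete, here is a minimal sketch in Python of what the conversion just described amounts to in its simplest case: a fragment of language is turned into a sequence of numbers, then into the binary digits a machine can store, copy, and manipulate. The example is purely illustrative; the string and the encoding are arbitrary choices, not anything specific to the authors discussed.

```python
# A minimal illustration of digital "grammatization": discretizing a
# piece of language into manipulable symbolic units (numbers, then bits).

text = "grammatization"

# Step 1: discretize the text into a sequence of numeric code points.
code_points = [ord(ch) for ch in text]

# Step 2: materialize those numbers as binary digits, the form in which
# computing devices store, reproduce, and process them.
bits = " ".join(f"{cp:08b}" for cp in code_points)

print(code_points)  # [103, 114, 97, 109, 109, 97, 116, 105, 122, ...]
print(bits)         # 01100111 01110010 01100001 ...
```

Once language exists in this form, the operations of storage, reproduction, and manipulation described above become mechanical.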
Posing the question of the digital within the general framework of the history of technical innovations, a history which is at the same time that of successive grammatizations, amounts to writing the history of hominization, by which humanity increasingly separates itself from its natural origin through a process of artificialization of its life. What this approach should make clear is that the great ruptures that have marked the intellectual history of mankind were largely made possible by technical innovations. To sum up: when mankind invented writing, geometry appeared; when it invented printing, the experimental sciences were born with Galileo. In both cases, what made the invention possible was the fact that the human mind, relieved of the task of memorization because it now had prostheses for this purpose, became available to create a new science, which in each case reconfigured the economy of knowledge. Now that all our cognitive functions have been externalized and objectified in machines, we might think that our minds are once again available to create something new. This means that we could be on the threshold of a great bifurcation that could bring the technical and social systems out of alignment[4]. In this respect, we should take a close look at how our minds are used.
We should likewise take seriously Bertrand Gille's[5] thesis that every society is founded on a more or less stable coupling between the individual, the socio-cultural system, and the technical system, which is constantly evolving, sometimes at prodigiously rapid rates, often in unexpected leaps and bifurcations, producing maladjustments that can destroy social systems, which, as we know, evolve at extremely slow rates. This is particularly true from the Industrial Revolution onwards, when the accelerating pace of technical change took on hitherto unknown proportions. However, the full extent of these misalignments between technical and social systems can only be appreciated by looking at three moments in the history of technology: the transition from orality to writing some 4,000 years ago, Gutenberg's invention of printing during the Renaissance, and the advent of digital technology in the 20th century. Each of these three revolutions fundamentally altered the economy of knowledge and the modes of cultural transmission - in a word, the habits of thought and ways of being that, over time, have been codified into the laws, customs, and rules by which people understand themselves and define their relationship with the world.
 

History

Let us start with the transition from orality to writing. Before the emergence of writing, the human body itself served as the medium of thought. The fundamental cognitive functions of producing, processing, storing, and transmitting information were performed exclusively by the brain. In such an ecosystem, it is easy to understand why a function as strategic as memory had to play a central role. The type of cognition that characterizes this ecosystem is memorization, because the concern is conservation rather than innovation. We know, for example, that when Albert the Great lectured on Aristotelian cosmology and physics at the Sorbonne in the Middle Ages, his students took no notes and simply listened, because even though writing had existed for millennia, parchment was still extremely rare. Years later, they were able to reproduce almost everything they had been taught, down to the last comma. The same is true of all societies that have remained traditional, particularly in Africa, where the old man and the griot are emblematic figures precisely because they are the living memory of their societies[6].
With the switch to the written word, the brain's function of memorization is externalized onto a material support. It is no longer the brain that stores information, but stone, animal skin, papyrus, or parchment. This is the first externalization onto inert matter of a subjective function that had until then been the preserve of the human mind. This transition from subject to object is the source of all the changes that would transform man's way of being. What changes is everything that writing makes possible in the way of derivative inventions, strictly unthinkable in the oral stage. We might mention the emergence of money among the Phoenicians, which put an end to the barter economy, but above all the birth of geometry, the science of the written word par excellence[7].
These inventions, made possible by the coupling of the written word with a material medium, gave humanity an incredible leap forward in cultural, social, and political terms. One constant emerges in the light of these various revolutions: when a breakthrough of this scale occurs, it is always met with resistance. Take Plato's famous dialogue, the Phaedrus[8], written at a time when writing had long since been invented, yet Athens was still debating what is lost and what is gained in the transition from orality to writing. Socrates develops the famous theory that writing not only empties our minds and causes us to lose our memory, but could also undermine the foundations of the city[9]. In this dialogue, Socrates likens the written word to a pharmakon, i.e., both a remedy and a poison: something that cures and kills. This means that, while writing lies at the origin of the constitution of the city, because it produced the positive knowledge on which the city is based, it could also be what destroys it. Here are the words Socrates puts into the mouth of King Thamus, who rebukes Theuth, the inventor of writing:
Indeed, this art will produce forgetfulness in the souls of those who have learned it, because they will cease to exercise their memory: putting their trust in the written word, it is from outside, by means of foreign imprints, and not from inside, by their own means, that they will perform their acts of recollection; it is therefore not for memory, but for recollection, that you have found a remedy. As for science, it is the semblance of it that you provide your disciples, not the reality. When, thanks to you, they have heard many things without having been taught, they will seem to have much science, whereas in most cases they will have no science at all; moreover, they will be unbearable company, because they will have become semblances of scholars instead of scholars[10].
The apparent paradox here is that writing, which is supposed to overcome oblivion because it enables us to archive everything, will actually produce the loss of memory, since memory passes into the object. It should be pointed out, however, that the memory Socrates refers to is obviously not the mnemonic skill that enabled students in the Middle Ages to reconstruct their lectures years later, but what he calls anamnesis in the Phaedrus and the Meno: the ability to appropriate knowledge so as to be able to reproduce it for oneself, because one possesses it as if it were written on one's soul, and is for this reason in a position to criticize it[11]. What Socrates is saying is that, as a result of writing, this ability to reconstitute the circuits of knowledge could be lost in favor of the simple repetition of what we do not understand. In a word, what he criticizes the sophists for is developing manipulative techniques that consist of putting stereotypes into the heads of the Athenians, by means of the new science that writing made possible - rhetoric - to the detriment of the positive knowledge on which the City is founded. For Socrates, the logography practiced by the sophists amounted to short-circuiting the life of the mind as it ought to be grounded.
With the transition to digital technology, an old utopia that has long haunted philosophy comes true, one that dates back at least to the Middle Ages' search for the perfect language and for the means to overcome what was then known as the “curse of Babel”. The aim was to find a universal, unambiguous language capable of dispelling the misunderstandings and ambiguities that distort the correct use of natural languages[12]. It was in the 17th century, however, with Leibniz and his project to formalize and mechanize the operations of thought, that this research program moved decisively towards its technical realization. Leibniz assumed that when we reason, we are in fact combining symbols[13]. All thought can therefore be considered a combination of symbols representing simple notions. On the basis of this conviction, he argued that if we could draw up a systematic table of the simplest, most elementary notions that enter into our thoughts, we could then devise computational procedures for discovering all possible, non-contradictory combinations, and hence all possible thoughts. Leibniz's conviction is that thought is in fact a calculation, but a spontaneous, unconscious calculation that can err, grope, and go astray. He wanted to turn it into a conscious calculation of perfect rigor, so that it could be done by a machine. This was the very first project to mechanize thought, and it led, via Boole's binary logic, to the creation of computers as we know them today. Computers, as they exist today, are Turing machines. Turing's project was to have a theoretical machine reproduce mechanical calculation. It was from this project that computer science and its corollary, Artificial Intelligence, were born. In 1969, the Pentagon set up a vast rhizome-shaped interconnected network, the ARPANET (Advanced Research Projects Agency Network), to speed up communications within the U.S. military. The project was absorbed by the National Science Foundation in 1986. But it was only in 1993, when the invention of the HTTP (HyperText Transfer Protocol) and HTML (HyperText Markup Language) standards by CERN researchers enabled the creation of websites, that the Internet became a reality accessible to the general public[14].
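To give a concrete sense of what "a theoretical machine reproducing mechanical calculation" means, here is a minimal sketch of a Turing machine in Python. It is an illustrative toy, not drawn from the sources cited above: the machine's states, symbols, and task (inverting a string of binary digits) are invented for the example; only the principle - a finite table of rules that reads and writes symbols on a tape - is Turing's.

```python
# A toy Turing machine: a finite rule table reading and writing symbols
# on a tape. This hypothetical machine inverts a string of binary digits.

from collections import defaultdict

# Transition table: (state, symbol read) -> (symbol to write, head move, next state)
RULES = {
    ("invert", "0"): ("1", +1, "invert"),
    ("invert", "1"): ("0", +1, "invert"),
    ("invert", " "): (" ", 0, "halt"),  # blank cell reached: stop
}

def run(input_string):
    # The tape is unbounded; unwritten cells read as blanks.
    tape = defaultdict(lambda: " ", enumerate(input_string))
    state, head = "invert", 0
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in range(len(input_string)))

print(run("010011"))  # -> 101100
```

Everything the machine "does" is contained in the rule table: calculation reduced to the blind, step-by-step manipulation of symbols, which is precisely the reduction Leibniz anticipated and Boole's binary logic made tractable.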
Like all the great technical revolutions that have shaped human history, the digital revolution raises all kinds of questions, which we will not address in detail here. We will, however, show that with the automation of industrial production we are entering a new age of capitalism, as announced above, which raises highly complex questions that the social sciences in Africa have not yet seriously begun to investigate. What is new is that machines will gradually replace workers in almost all production activities, as Marx predicted as early as 1857[15]. This is a major trend driven by the lowering of labor costs, and we can assume that in the not-too-distant future, if robot production costs fall - as they inevitably will - all industrial production will move in this direction. With this in mind, the Belgian newspaper Le Soir of July 19, 2014, announced, on the basis of American prospective studies, that France, Belgium, Italy, the USA, and Poland could lose between 40% and 50% of their jobs in the coming years. In most developed economies, this trend is already clearly perceptible. Bernard Stiegler draws attention to the fact that when Marx spoke of the proletarianization of workers in the 19th century, he was referring not only to their impoverishment, but also to the more serious fact that the ancestral knowledge of craftsmen was being transferred to machines, which capture and automate it in order to reproduce it on an industrial scale[16]. Of course, new types of jobs will be created by the technological innovations themselves. Just as the new book economy led to the disappearance of certain professions and the creation of new ones, new professions will emerge from digital technology. However, no one yet knows how long this will take, or even whether they will fill the gap left by the jobs destroyed. The immediate economic consequence is that this could spell the end of the Fordist-Keynesian consumerist model, founded on the idea that growth can only be based on consumption, which must in turn be sustained by productivity gains distributed to workers in the form of wages. Clearly, if there are fewer and fewer workers, there will be less consumption to sustain growth, and it is hard to see how capitalism could avoid collapsing in a crisis of overproduction far worse than that of 1929. Henry Ford's genius was to see that, to cope with the effects of competition and revive the production machine, it was not enough to reorganize work according to the famous Taylorist principles; the market also had to be extended to all those excluded from it. In this case, he saw the workers as potential consumers: they had to be paid well enough to buy, on credit, the cars they themselves built. In this way, the tendency Marx famously predicted, the “falling rate of profit”, could be curbed, or at least postponed. The author of Capital brilliantly demonstrated that this fall was inevitable and that, in the long term, it would condemn capitalism to a systemic crisis from which it could never recover. In our view, it is this aporia that explains the need to constantly find new markets, to colonize new spaces - in this case, the mental spaces of consumers, i.e., their attentional capacities - so that the infernal dialectic of production and consumption never comes to an end.
 

A new phase of capitalism?

We therefore need to link the automation of industrial production with the development of modern techniques for capturing and manipulating attention, induced by new advertising techniques. Neuromarketing, for example, a commercial technique that consists of inundating consumers with subliminal messages, is a first illustration of the death of the citizen and his replacement by the consumer. In 2004, Patrick Le Lay, then head of TF1, made this admission, which caused quite a stir: “From a business perspective, let's be realistic: basically, TF1's job is to help Coca-Cola, for example, sell its product. But for a message to be perceived, the viewer's brain has to be available. The aim of our programs is to make it available: that is, to entertain it, to relax it, to prepare it between two messages. What we sell to Coca-Cola is available human brain time”[17]. This means that TV programs have a single purpose: to condition viewers' brains and prepare them to receive advertising messages without resistance. Today, the Internet goes even further, and the whole ecosystem of new screens exposes us to a ubiquitous advertising that has become scientific. We believe that it destroys the traditional processes of education, which it short-circuits by means of highly sophisticated techniques whose aim is to make the public's attention foreign to itself, because it has become a commodity. Attention, having thus become an object of covetousness, finds itself at the heart of commercial, political, and educational issues, testifying to an unprecedented mutation of capitalism. Until now, scarcity concerned only the production of material resources. Everything points to an inversion that has shifted scarcity from production towards new, softer forms[18], largely based on communication.
In fact, the question of what we pay attention to has become such a crucial issue that in recent years many voices have called for going beyond the categories inherited from classical economics in order to think about this new reality, now subsumed under the concept of the attention economy. Here is how the French philosopher Yves Citton sums up the situation:
The new scarcity is no longer to be found in the material goods to be produced, but in the attention required to consume them. With this somewhat disconcerting practical consequence, which quickly takes on the form of a prophecy: my publisher has taken advantage of your naivety and of our ancestral economic ideology to sell you the book you are holding in your hands (or the digital file currently scrolling across your reading tablet), as if it were he who held the rare and precious resource (the book and its contents); in reality, it is you, the readers, who now hold the knife by the handle, without anyone daring to tell you, and without you even realizing it yet, since, faced with the plethora of books written and distributed every month, it is your attention - the attention you are mobilizing right now to follow the unfolding of this sentence - that is now the rarest and most ardently desired resource. In all fairness and logic, it is I, the author of these lines, who should not only thank you, but pay you for the grace of devoting your precious time to reading this book, rather than the millions of texts, songs, and films available to you on the Internet. Hence the prophecy: within a few years or decades, we will be able to ask to be paid for giving our attention to a cultural good, instead of having to pay for the right to access it, as is still demanded of us in this backward age.[19]
We have quoted this text at length because it seems to us to characterize very clearly the apparent paradox of this new economy, in which the author's prediction is already being fulfilled. As we know, search engines and platforms like Google and YouTube spend millions of dollars every year to offer consumers all kinds of cultural products (music, books, films, etc.) free of charge, in exchange for nothing but their attention. Because attention has become the standard by which we measure the value we place on objects (but also on people), it acquires the status of the principal parameter of the new market order. This is why, a few lines further on, Citton writes: “If a product is free, then the real product is you”[20]; in other words, the attention we devote to it, which the Internet giants, using extremely sophisticated algorithms, manage to capture and resell to advertisers. That is to say, it is the “available human brain time” of TF1 viewers that is captured for resale to Coca-Cola[21].
It should be pointed out, however, that if digital technology has given the attention economy a hitherto unsuspected dimension, the phenomenon existed long before, at least from the 1920s, with the invention of radio, in the context of the emergence of consumerism. It was no longer just a question of producing, but above all of winning market share, in an environment marked by increasingly aggressive competition. More than any other medium, radio, the birthplace of an advertising that had become scientific, developed hitherto unknown strategies for capturing attention and controlling behavior, building up mass audiences and exposing listeners to an ever-growing flow of advertising messages. It was in this context that Herbert Simon's work, following on from Gabriel Tarde's pioneering studies in the early 20th century, laid the foundations of the discipline. In a lecture published in 1971, Simon posited what would become the basic axiom of the attention economy: “a wealth of information creates a scarcity of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients”[22].
But it was in the mid-1990s, with the explosion of digital technologies whose power, infinitely greater than that of traditional media such as radio and cinema, unified the planet, that the attention economy came fully into its own, testifying to a profound mutation of capitalism towards a post-industrial phase essentially focused on communication and information. In this new “ecology of attention”, where everything is fair game, cultural life in the broadest sense is necessarily parasitized by a ubiquitous advertising discourse from which no one can escape. The term ecology of attention refers here to the way in which our current material environment, largely determined by advertising and by the configuration of the new media, is transforming the conditions under which attention is exercised, particularly among young people - but also among adults - whose difficulty in concentrating for long periods is increasingly deplored. Indeed, everything seems to indicate that it is the very configuration of the Internet as a medium that is making the kind of attention education requires, and its corollary, the ability to read deeply, increasingly problematic, posing serious problems for schools and, more generally, for democracies.
In a particularly enlightening article, Katherine Hayles has attempted to take stock of this situation, showing that we are living through a crucial period characterized by what she calls a “generational shift” in modes of cognition between, on the one hand, those often referred to as digital natives and, on the other, adults whose education was provided by books[23]. For this author, the generational shift from “deep attention” to “hyper-attention” is characterized as follows: deep attention is defined by the capture of our interest by a single object over a long period of time, such as reading a novel or a philosophical text, whereas hyper-attention is characterized by the dispersion of attention, resulting in rapid fluctuations and oscillations between several activities and several objects, within several heterogeneous streams of information[24]. Hayles' thesis is that, as a result of the widespread use of digital tools, we - young people and adults alike - are moving from deep attention to hyper-attention. It is important to take the measure of this mutation, because what is at stake is the way in which the brain allows itself to be shaped by the cognitive artifacts that extend our minds. As we know, the brain's main characteristic is its ability to reconfigure itself in response to experience and the material environment. Given that synaptic connections expand and evolve as a function of that environment, we can assume that children educated in environments dominated by digital tools will have brains connected and structured differently from those of their elders (the reading brains), who grew up and matured in contexts where education was essentially based on the paradigm of deep attention.
The cognitive revolution Katherine Hayles describes is therefore to be taken seriously, all the more so as learning, which used to involve books, isolation, and intellectual effort, has now become a visual, playful, and impoverished experience. To fully grasp the significance of this contrast, we need to show how the book has been an agent of human progress. For Daniel Bougnoux, this achievement is essentially due to the austere typography of the book, which contrasts point by point with the exuberance of oral discourse.
The black-on-white written text, with its lines of clearly justified alphabetical characters, is perhaps the most sensorially impoverished, the most austere process that men have devised to represent the world or their history. By retaining only the alphabetical form of the oral chain, the book elides the speaker's rich polyphony, the theater of his body, and the relational warmth that surrounds him; it isolates the sender of the message, and at the same time internalizes his consciousness by concentrating it solely on the content of the work and its logic, to the detriment of any external seduction[25].
To understand the mechanisms by which this typographic severity of the book produced the epistemic revolution of the graphosphere (Régis Debray), we need to compare the written text, as Bougnoux invites us to do, with oral discourse. It is not hard to see how, in the presence of a gifted orator, external elements or “noises” that have nothing to do with the intrinsic content of the message can and do distort communication. These “noises” are numerous: the speaker's facial expressions, the timbre and intonation of his voice, the way he is dressed, the place where he speaks - an amphitheater, for example, which can add solemnity to the speech. In short, all these external elements, which have no relevance in strict terms of semantic content, act on the message's receiver and incline him, so to speak, to acquiesce to what he is being told. All this disappears in the solitary act of reading a book. A text is indeed reduced to very little: simple signs that we patiently decode. The point of this simplification, which reduces the entire flowering of oral discourse to austere signs, is to ensure that the book can deliver thought, and nothing else that might interfere with it. It is clear, then, to what extent this face-to-face encounter with the book enables the development of a critical mind, because reading is never a matter of passively receiving the content of knowledge. Such an exercise consequently strengthens autonomy through the acquisition of personal knowledge, and develops attention and the capacity for analysis and synthesis, all of which constitute the conditions of possibility for the moral and spiritual liberation of the citizen-individual. In short, the strength of the book lies in the fact that the reader is not a passive consumer but an active subject who participates in the elaboration of knowledge as he acquires it, which means that he cultivates himself and thus becomes an autonomous subject capable of self-determination[26].

Reading is thus without doubt humanity's most decisive cultural invention, the one that most clearly demonstrates the extent to which the process of hominization, far from being natural, is part of an ongoing process of artificializing life through technical means. While the use of speech is in a sense innate, because it is a product of evolution, reading and writing, being recent acquisitions, are purely cultural inventions that owe nothing to our natural constitution. As the brain has not had time to evolve and acquire these skills naturally, it has no neurons naturally destined for them. They must therefore be acquired from existing brain modules, in particular the visual and auditory modules, originally designed to process shapes and sounds. This means that the brain has the capacity to recycle already specialized neurons, redirecting them towards the acquisition of new skills. This theory of “neuronal recycling” is now widely accepted in the brain sciences[27]. To understand what it involves, we need to look at how the auditory and visual modules interact to make reading possible. These two modules are areas whose function is to process information such as sounds and shapes; for reading to occur, circuits must be established between them.
This characteristic is a rather special case of what is known as cerebral plasticity or neuroplasticity, that prodigious capacity of the brain to modify and reconfigure itself. The “wiring” of neuronal circuits, far from being fixed once and for all in a definitive form, as was long believed, is constantly evolving with experience and learning. In other words, it is as if the brain were programmed to deprogram itself, i.e., to evolve and transform itself not only as a function of the psycho-social environment, but above all under the influence of the technical environment. What many neuroscientists now suspect is that, with the digital age, we may be witnessing a new phase in this plasticity, because reading on a screen does not excite the same neurons, nor the same areas of the cortex, as reading a book. This suggests that our children, these digital natives whose brains may have been shaped in the image of microprocessors, may represent the beginnings of a new stage in the hominization process[28].
 

Conclusion

If we have widened our scope to explore disciplinary fields far from economics on the knowledge map, it is because the subject of the economics of attention is not exclusively economic. Its ramifications make it a social issue, involving considerations that can only be problematized by a holistic approach that embraces the philosophy of technology, cognitive psychology, pedagogy and literature, as well as the history of science and the biology of the brain.
The fact that we are on the threshold of a great bifurcation, as we have tried to demonstrate, is clear enough for those who reflect on the potentially disastrous consequences of manipulative techniques placed at the service of increasingly innovative and efficient industrial logics. However, it must be emphasized with the utmost force that, in drawing attention to the dangers of these techniques and the new economy they bring about, we are not succumbing to any form of technophobia. The fact that technology, like writing, is both a poison and a remedy, as we pointed out above, should be sufficient proof that our aim is a critical examination in the strict sense of the term, i.e., to identify the political implications of this new state of affairs, as recommended by the ethical imperative of theoretical lucidity.
The fact that our psychic resources have become the battleground of competing capitalist logics that seek to subjugate them can only mean one thing: we are now the objects of the new capitalism's covetousness. It is impossible not to recall Socrates' ancient recriminations against the sophists, for today, as in the past, what is at stake remains the same: what is to be written on souls - a science that liberates citizens, or a knowledge that manipulates them? We are at a crossroads.

Notes

[1] Bernard Stiegler has brilliantly highlighted the reasons why technology has so far been neglected by philosophers, even though it is constitutive of the objectification of human “nature”, and why it should be at the heart of their preoccupations: “Today, we need to understand the process of technical evolution, because we experience a strong opacity in contemporary technology: we do not immediately understand what is really at stake and what is being profoundly transformed, even though we are constantly having to make decisions whose consequences, we increasingly feel, escape us. And in day-to-day technical news, we cannot spontaneously distinguish between spectacular but ephemeral events and transformation processes that are set to last. (...) The question is whether it is possible to foresee and direct the evolution of technology - that is, of power. (...) The confidence that has governed this question since at least Descartes is no longer valid. This is also because the division originally drawn by philosophy between tekhnè and epistèmè has become problematic.” Stiegler, B. (1994). La Technique et le temps I. Galilée.

[2] Stiegler, B. (1994). La Technique et le temps I. Galilée.

[3] Auroux, S. (1994). La révolution technologique de la grammatisation. Mardaga.

[4] See Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford University Press.

[5] Gille, B. (1978). Histoire des techniques. Techniques et civilisations, techniques et sciences. Gallimard.

[6] It is worth noting in this connection that Plato's dialogues have a distinctive oral tone because they are first and foremost narrated dialogues, which he reconstructs with his characteristic literary genius. Two interlocutors meet, and one of them asks about a conversation of Socrates, years after his death. When the other answers, he faithfully reproduces a conversation that took place years before.

[7] As for the monotheisms born of Abraham, those religions of the One God that define themselves as religions of the Book (Ahl al-Kitab), they are strictly impossible without writing. Régis Debray, who situates mediology in the tradition of thought opened up by Leroi-Gourhan, writes that the emergence of the one God can be understood, to a certain extent, as the result of a technical conditioning that favored its expansion and universality. What writing makes possible is the liberation of the Divine from its territorial anchorage, i.e., from its establishment in a single place. For this reason, we can say of the God of the revealed religions, as Régis Debray affirms, that it is “a portable God, insofar as he is no longer, as in pagan antiquity, inscribed in monuments, in stone altars; he is inscribed in alphabet letters on papyrus - later parchment. This papyrus is rolled up and taken away, all the better if one has a cart with wheels. So God = alphabet + invention of the wheel. I admit the formula is reductive. A mediologist studies the technical conditioning of culture, and in both directions: what technology does to culture, and what culture does to technology. Hence the words mediation, interface, etc. But when you look historically at the formation of God and the history of the one God, you come up against the fact that oral culture cannot think the one God, because it has trouble producing the universal; it does not have the tools of abstraction, and God - what could be more abstract? The tools of abstraction belong to analytical thought, which is written thought. Oral societies do not have a single God; the single God is, I would not say produced, but in any case induced, by written societies”. Clearly, there is something peremptory and excessive in this assertion, and the most edifying counter-example is that of Chinese ideogrammatic writing, which did not produce monotheism. It is therefore impossible to establish a strict causal relationship between writing and monotheistic religion. But it is just as clear that without the conveniences offered by writing - out of all proportion to anything known before - the revealed religions would not have spread to the extent that has made them universal religions.

[8] Socrates and Plato are arguably at odds in their respective assessments of the written word. Whereas Socrates prefers the living word and distrusts writing, which he accuses of petrifying thought in dead signs, Plato, no doubt because he is a mathematician, founds a school of philosophy in which geometry, the science of writing par excellence, plays a crucial role. In the Meno, Socrates is unable to complete his mathematical demonstrations without at one point drawing figures on the ground, as if to retain knowledge he would otherwise have lost. Writing, as a support against forgetting and a condition of possibility for transmission, is the condition of science.

[9] In the perspective opened up by Leroi-Gourhan, this loss is rather good news, and should be interpreted in terms of liberation. The loss of memory frees the mind from the drudgery of memorization, making it available for new, higher and more intelligent tasks. It follows in the footsteps of other losses, such as the loss of the hand for locomotion, which have produced the aptitudes by which man has constituted himself as such.

[10] Phaedrus, 275a-b, GF-Flammarion, French trans. Luc Brisson, followed by Jacques Derrida, La pharmacie de Platon.

[11] For Plato, anamnesis is the capacity to internalize and appropriate knowledge, while hypomnesis is the technical support, such as writing, through which it is externalized, and which Socrates, in the Phaedrus, calls the death of knowledge. It is this opposition that still structures our relationship with the technologies of knowledge today.

[12] On the question of these research programs throughout European history, see Eco, U. (1994). La recherche de la langue parfaite. Seuil.

[13]  See Couturat, L. (1901). La logique de Leibniz d’après des documents inédits. Alcan.

[14] For more on the history of the Internet, see Hauben, R. (2003). À la recherche des pères fondateurs d'Internet. Multitudes, 11(1), 193-199. https://doi.org/10.3917/mult.011.0193

[15] Marx, K. (2011). Manuscrits de 1857 dits « Grundrisse ». Les Éditions sociales.

[16] Stiegler, B. (2012). États de choc. Bêtise et savoir au xxie siècle. Fayard/Mille et une nuits.

[17] Télérama, no. 2852, September 9, 2004.

[18] In his remarkable book, Jonathan Crary reports on unusual experiments underway in the United States to reduce people's need for sleep, with a view to exposing them longer to advertising and, eventually, to “creating” a consumer who no longer sleeps. In the ruthless logic of profit, the consumer's sleep is seen as a hindrance to the efficient operation of the capitalist system: “Given its profound uselessness and essentially passive character, sleep, which also has the disadvantage of causing incalculable losses in terms of production, circulation, and consumption time, will always come up against the demands of a 24/7 universe. Spending an enormous part of our lives asleep, free from the quagmire of factitious needs, remains one of the greatest affronts human beings can make to the voracity of contemporary capitalism. Sleep is an uncompromising interruption of the theft of time that capitalism commits at our expense. Most of the seemingly irreducible necessities of human life - hunger, thirst, sexual desire, and recently, friendship - have been converted into commodified or financialized forms. Sleep imposes the idea of a human need and an interval of time that can neither be colonized nor subjected to an operation of massive profitability - which is why it remains an anomaly and a place of crisis in today's world.” Crary, J. (2014). 24/7. Le capitalisme à l'assaut du sommeil. Éditions Zones, 14.

[19] Citton, Y. (2014). Pour une écologie de l’attention. Seuil, 25-26.

[20] Citton, Y. (2014). Pour une écologie de l’attention. Seuil, 25-26.

[21] It should be pointed out, however, that this shift towards the attention economy is not a total and definitive shift towards a new form of economy that would abolish the traditional one. Clearly, the former could not exist without the latter, which enables it to exist to a certain extent. Rather, it is an expansion into a field of activity hitherto excluded from market exchanges, which completely reconfigures the discipline.

[22] Quoted in Citton, Y. (2014), p. 21.

[23] Hayles, N. K. (2007). Hyper and deep attention: The generational divide in cognitive modes. Profession, 2007. http://www.mlajournals.org/doi/abs/10.1632/prof.2007.2007.1.187. “(...) we are in the midst of a generational shift in cognitive styles that poses challenges to education at all levels, including colleges and universities. The younger the age group, the more pronounced the shift; it is already apparent in present-day college students, but its full effects are likely to be realized only when youngsters who are now twelve years old reach our institutions of higher education. To prepare, we need to become aware of the shift, understand its causes, and think creatively and innovatively about new educational strategies appropriate to the coming changes.”

[24] "Deep attention, the cognitive style traditionally associated with the humanities, is characterized by concentrating on a single object for long periods (say, a novel by Dickens), ignoring outside stimuli while so engaged, preferring a single information stream, and having a high tolerance for long focus times. Hyper attention is characterized by switching focus rapidly among different tasks, preferring multiple information streams, seeking a high level of stimulation, and having a low tolerance for boredom.”, “Hyper and deep attention: the generational gap in cognitive modes”, online article: http: //www.mlajournals.org/doi/abs/10.1632/prof.2007.2007.1.187.

[25] Bougnoux, D. (1998). Introduction aux sciences de la communication. La Découverte, 92.

[26] Marcel Proust sees the miracle of reading as residing in the fact that it gives us more to think about than what it expressly tells us, as if each text were inhabited by a constitutive semantic ambiguity that opens it up to a plurality of interpretive possibilities its author could not foresee: “We feel very well that our wisdom begins where the author's ends, and we would like him to give us answers when all he can do is give us desires. And these desires he can only awaken in us by making us contemplate the supreme beauty which the final effort of his art has enabled him to attain. But by a singular and providential law of the optics of minds (a law which perhaps means that we cannot receive truth from anyone, and that we must create it ourselves), what is the end of their wisdom appears to us only as the beginning of ours, so that it is at the moment when they have told us all they can tell us that they give us the feeling that they have not yet told us anything.” On Reading (Sur la lecture): http://beq.ebooksgratuits.com/auteurs/Proust/Proust-lecture.pdf

[27] On this subject, see the work of Stanislas Dehaene, in particular Le code de la conscience (Odile Jacob, 2014) and Apprendre à lire (Odile Jacob, 2011).

[28] On the basis of these considerations, it seems that digital natives would not have the same brain structure as book natives, hence the opposition between the “reading brain” and the “digital brain”. On this subject, see Maryanne Wolf's (2007) magnificent book, Proust and the Squid: The Story and Science of the Reading Brain. Harper.

