COMMENTARY

Narratives, emotions and artificial intelligence: a reading of artificial intelligence from emotions

Adrian Scribano1  · Mairano Victoria Maria2

Received: 7 June 2021 / Accepted: 2 August 2021 / Published online: 1 September 2021

© The Author(s), under exclusive licence to Springer Nature Switzerland AG 2021

Abstract

Since the advent of the technological revolution 4.0, the development and massification of new technological systems such as the Internet of things, advanced robotics, virtual reality and augmented reality, Big Data, artificial intelligence, additive manufacturing, GIG economy, among others, have taken place. Such changes and transformations in social practices and relationships have reconfigured the forms of social structuring of current societies. Artificial intelligence is an emerging technology used in various industries, which generates changes in our ways of seeing, feeling and relating to the world. Here, we focus on the relationship between artificial intelligence, algorithms, digital emotions and narratives as a social phenomenon of the twenty-first century.

Keywords Society 4.0 · Narratives · Emotions · Artificial Intelligence · Politics of sensibilities

Introduction

The concept of Society 4.0 refers to the economic and social processes that were structured from the so-called “Fourth Industrial Revolution”, around the planetary massification of the new information technologies. Technologies such as the Internet of things, advanced robotics, virtual reality and augmented reality, cybersecurity, Big Data, artificial intelligence and “additive manufacturing” converge and evolve towards accelerated growth and increased net productivity like never before (Laurent and Boer 2018). These transformations caused mutations in the social relations between subjects and therefore in the forms of social structuring.

* Mairano Victoria Maria mairanovicky@gmail.com

Adrian Scribano adrianscribano@gmail.com

1 National Council for Scientific and Technical Research (CONICET), University of Buenos Aires, Buenos Aires, Argentina

2 Center of Research on Consumption, Sensibilities and Creativity (CICSEC), University of Palermo, Buenos Aires, Argentina

Artificial Intelligence (AI) is an emerging technology that presents various characteristics depending on the context in which it manifests itself; it affects different areas of society, such as work, interpersonal relationships, mobility, logistics, medical care, communication, education and politics, among others. In this sense, it is worth reflecting on the disposition of bodies/emotions in the transformations promoted by these developments and on the management of life through algorithms and applications based on AI.

Indeed, the implementation of new technological systems has generated a series of transformations, such as the reformulation of work based on the automation of tasks and the machine processing of large volumes of information; the redefinition of old inequalities within the workplace; the management of health and disease-control devices; new policing procedures; and personal digital assistants that act and communicate for us. By putting the relationship between artificial intelligence and society in tension, two issues are expressed: the link of AI with the current society, and its possible link with the future society. Moreover, the development of these technologies and their implementation depend on the environment and context in which their progress takes place, which makes it difficult to reflect on their long-term impact.

Our aim is to problematize the implementation and development of artificial intelligence from the narrative of emotions, in the Global South. The question that will guide the reflection is: what changes and/or permanencies are expressed in the system of capitalist sensibility that these new practices convey?

Society 4.0, emotions and artificial intelligence

As we mentioned before, we are facing the changes caused by the 4.0 industrial revolution. From a critical perspective of the sociology of body/emotions, the new millennium manifests the predatory expansion of capitalism on a planetary scale, and the consequent manifestation of a series of changes and transformations in social processes on a world scale; among them: the mobile/digital revolution that implies the large-scale increase in the use of the Internet and mobile devices, the reconfiguration of work into digital work, and the new political economy of morality (Scribano 2019). One of the central effects of the relationship between Society 4.0, digital work and the social structuration process is the change in the politics of sensibilities, and with it the reconfiguration of the notions of space/time in the virtual/mobile/digital landscape (Scribano 2019).

If we focus on Artificial Intelligence, in recent years it has expanded and diversified. Today it is considered “a branch of science oriented towards the creation of intelligent machines with abilities to learn, adapt and act autonomously” (Sossa Azuela 2020, p. 22). According to various versions, these machines become intelligent by being able to make appropriate decisions in uncertain circumstances, as well as having the potential to learn to improve their behavior based on their experiences.

Regarding the studies on AI, a considerable number of them have tried to conceptually delimit the social practices around the new technology. Without pretending to be exhaustive, we can highlight the studies by Crompton and Song (2021) on the potential of AI in higher-level education; the use of intelligent systems for the detection of pedophile content on the Internet (Beltrán Gómez and Ordóñez Salinas 2014); possible impacts of Artificial Intelligence in the social sciences (Dwyer 2001); the role of AI in industry 4.0 (Sossa Azuela 2020); the role of machine learning in industry 4.0 in times of pandemic (Romero Bravo et al 2021); AI in Latin American justice (Corvalan 2018); and applications, potentialities and limitations of AI in the new era (Williams 2021); among others.

Currently AI presents various contributions such as image recognition systems, virtual assistants, assisted driving, chatbots, robots, intelligent applications for logistics and transport processes, etc. According to Williams (2021), AI can be used in various areas of industry. In the case of medicine, it is used for the diagnosis and analysis of patient prognoses and helps in clinical decisions, as well as in repetitive work in surgery and the care of hospitalized patients. Regarding the transport industry, there are already AI-based driver assistance systems, for example, progressive travel controls and automatic exit. These applications reduce emissions, energy use and waiting time. In the agricultural industry, advances in AI are used for the expansion of crop yields, for the evaluation of yield and soil condition, and to provide informative contributions to farmers on changes in climatic conditions.

In the case of the social media industry, AI is a key factor for the “systematization/monitoring” of the data of millions of profiles of users of social networks such as Facebook, Twitter and Snapchat, among others. In the entertainment industry, AI is already used by providers such as Amazon and Netflix, which require the management of algorithms to present proposals for movies, shows, etc. Likewise, in the space industry, AI and Machine Learning become indispensable for processing the amount of data that space explorations and campaigns require in order to solve complex problems of the universe. These technological systems are applied to multiple tasks that modify our relationship with the world, based on machine learning, deep learning, natural language processing, algorithmic game theories, collaborative systems, the automation of robotic processes, and so on.
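To make the reference to algorithmic “proposals” tangible, the sketch below computes item-to-item similarity over a tiny user–item rating matrix and suggests unseen titles; the ratings, the titles and the use of Python/NumPy are invented for illustration and do not reproduce any provider’s actual system.

```python
# Illustrative sketch only: an item-based recommendation logic of the kind
# alluded to above, NOT any provider's actual algorithm. All data is invented.
import numpy as np

# Rows = users, columns = titles; 0 means "not watched/rated".
ratings = np.array([
    [5, 4, 0, 0],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [0, 0, 4, 5],
], dtype=float)
titles = ["Drama A", "Drama B", "Sci-Fi C", "Sci-Fi D"]

def cosine_sim(a, b):
    """Cosine similarity between two columns of the rating matrix."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

# Item-item similarity matrix built from past user behaviour.
n_items = ratings.shape[1]
sim = np.array([[cosine_sim(ratings[:, i], ratings[:, j])
                 for j in range(n_items)] for i in range(n_items)])

def recommend(user_idx, k=2):
    """Score unseen titles by their similarity to what the user already rated."""
    seen = ratings[user_idx] > 0
    scores = sim[:, seen] @ ratings[user_idx, seen]
    scores[seen] = -np.inf                      # never re-propose what was watched
    ranked = np.argsort(scores)[::-1]
    return [titles[i] for i in ranked if np.isfinite(scores[i])][:k]

print(recommend(user_idx=0))  # proposes the titles closest to this user's history
```

Production systems, of course, work over millions of profiles and combine many further signals; the point here is only the general shape of the computation that turns accumulated traces into a “proposal”.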

According to the Inter-American Development Bank (IDB), the effective and regulated adoption of these technologies is expected to generate significant growth in economies and national productivity, as well as help increase consumption. However, there are also fears regarding the adaptability of human beings to these technologies, such as the loss of jobs and lower wages, the professional over-training that is required, new ecological problems, the deregulation of data management, cybersecurity and ethical problems.

These technological advances would not be possible without one of the new assets of the global power dispute: data. This massive production of data is possible thanks to the continuous, generalized and planetary use that is made of different technologies: websites, applications, services, sensors incorporated in devices, Internet searches, social networks, laptops, smartphones, GPS devices, among others (Martínez Devia 2019). In this way, information and the management of large volumes of data (Big Data), structured or not, become an essential element for the creation of Artificial Intelligence algorithms. In this sense, the processed data is essential so that intelligent systems can train, learn and correct deviations in order to obtain more reliable and effective algorithms. These large volumes of data are obtained from human sources, biometrics, machine-to-machine communication, large transactions, web use and social networks.

However, as AI applications become increasingly everyday, appearing in every aspect of our lives such as work, home, health, mobile devices, mobility and transportation, it is worth wondering about the operation of these intelligent systems and about what decisions exist behind these machines. In this sense, it is interesting to delve into the ways in which the processing and analysis of data that AI requires for its proper functioning is carried out in pursuit of the formulation of certain algorithms that will manage our lives. Although AI requires invaluable information management, this information is not systematized in an analogous way, but is processed based on certain dispositions. These dispositions refer to a specific sensibility regime permeated by the processes of social domination and the structuring processes of the twenty-first century. In this way, new modes of sensibilities, practices and representations that put domination into words (see Scribano 2017) are reinstalled in our daily events, conveyed by and through algorithms.

Emotions and narratives

In order to understand what we mentioned in the introductory section, we must articulate the schematic diagnosis with the following conceptual instruments regarding our approach to a sociology of bodies/emotions (Scribano 2017, 2019). Social agents know and grasp the world through their bodies. Perceptions, sensations and emotions build a tripod that allows us to understand where sensibilities are founded.

Thus, a set of impressions impacts the ways subjects “exchange” with the socio-environmental context. Such impressions of objects, phenomena, processes and other agents structure the perceptions that subjects accumulate and reproduce.

Perception, from this perspective, constitutes a naturalized way of organizing the set of impressions that are given in an agent. This weaving of impressions configures the sensations that “produce” what can be called the internal and external world: social, subjective and “natural” worlds. Such configurations are formed in a dialectical tension between impressions, perceptions and their results, which gives sensations the “meaning” of a surplus or excess. At the same time, organic and social senses also enable what seems unique and unrepeatable, as individual sensations are, and elaborate the “un-perceived work” of incorporating social elements turned into emotions.

Consequently, the policy of bodies (i.e., the strategies that a society accepts in order to offer a response to the social availability of individuals) acts as a vehicle towards the instruction manual of power. These strategies are tied to and ‘strengthened’ by the policies of emotions that tend to regulate the construction of social sensibility.

The forms of sociability and experience are strained and twisted as if contained in a Moebius strip along with the sensibilities that arise from regulatory devices and the aforementioned mechanisms. The need to distinguish and link the possible relations between sociability, experience and social sensibilities becomes crucial at this point.

Sociability is a way of expressing the means by which agents live and coexist interactively. Experience is a way of expressing how the world is phenomenologically perceived while being in physical proximity with others, as a result of experiencing the dialogue between the individual body, the social body and the subjective body, on the one hand, and the natural appropriation of bodily and social energies, on the other. As has already been said, the politics of sensibilities are understood as the set of cognitive-affective social practices tending to the production, management and reproduction of horizons of action, disposition and cognition (Scribano 2017, p. 244).

These horizons refer to: (1) the organization of daily life (day-to-day, vigil/sleep, food/abstinence, etc.), (2) information to sort preferences and values (adequate/inadequate, acceptable/unacceptable, bearable/unbearable), and (3) parameters for time/space management (displacement/location, walls/bridges; enjoyment). Having said this, it is possible to understand how the digitalization of the world coexists with the emotionalization of the processes of domination and of everyday life. We are facing a social system that is globalized by producing/buying/selling emotions in and through the media, social networks and the Internet. In this sense, it becomes possible to understand what may be called the emergence of “sensibilities of platform” (Scribano 2019). We live in a virtual/digitally connected world shaped by the technological transformations of the last 10 years (Graham and Dutton 2014). Internet and mobile telephones are two vectors that set the stage for three strong changes in the politics of sensibilities: (a) the organization of day/night unlinked from the experience of the subjects that live it, (b) the modification of the sensations of classification, and (c) new valuations of world modifications. Each society has a preponderant way of managing labour, and this constitutes a central axis of the politics of sensibilities. The 4.0 society implies the massification of digital labour. A “sensibilities of platform” emerges in this 4.0 society that is immediate in three senses: (a) in the vehicle the action resides (it is the feeling of always being “on line”), (b) it is a society that “is during use”, “between”, “in passing”, and (c) it is pure presentification (here/now). Much of digital labour has the same characteristics and, in this direction, the political economy of morality consecrates this way of “feeling the world”.

The sense of the immediate appears to be functional to the connections between the digitalization of society and the consolidation processes of societies normalized in immediate enjoyment through consumption. The ephemerality of enjoyment resembles what many authors associate with digital labour as disruptive or creative destruction. The immediate is similar to the “on demand” platform strategy. Lastly, “in the course”, “in this action becoming permanently”, in this idea of the immediate and the ephemeral, we should interpolate the world with silence: an act of listening where one feels the other in a different way. Silence is the starting point of dialogue as a matrix of knowledge, and life becomes personal interaction.

Artificial intelligence, algorithms and "speaking the world"

As we have already mentioned, one of the characteristics that define the changes expressed in the 4.0 technological revolution is the ability to manage machines and devices. AI is the procedure created for the learning, self-learning and expansion of these new narratives, through the constitution of algorithms fed by various data collection, storage and analysis systems.


In such a way that Big Data is nothing more than the coal that feeds the machine, which is going to create an artificial intelligence that is self-learning and self-solving through refined classifying and/or predictive algorithmic gears. In this sense, machine learning is one of the main approaches to artificial intelligence. It is a trend in digital innovation, a subfield of AI, which gives computers the ability to learn about something for which they have not been explicitly programmed (Romero Bravo et al 2021). However, AI models do not only learn but constantly adapt, recording other learnings as new volumes of data are processed and gaining greater precision in those learnings through deep learning.
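As a minimal sketch of that idea, that a model is not given explicit rules but infers them from labelled examples and typically gains precision as more data is processed, the toy example below trains a classifier on progressively larger samples and compares its accuracy; the synthetic data and the choice of scikit-learn are our own illustrative assumptions.

```python
# Illustrative sketch only: the model is not programmed with explicit rules;
# it infers them from labelled examples, and (typically) improves as the
# volume of processed data grows. Data is synthetic.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for n in (100, 1000, len(X_train)):          # grow the volume of training data
    model = LogisticRegression(max_iter=1000).fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>4} examples -> test accuracy {acc:.3f}")
```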

That is why AI and algorithms are constituted as the new ways of speaking the world. In this sense, the world is spoken through algorithms, which are constituted and built from our own previous experiences, the traces we leave behind and deposit on mobile, digital and virtual devices. That is why there is a direct relationship between narrativities, this way of telling the world, algorithms and artificial intelligence. In this regard, changes in the politics of sensibilities that refer to the modification of the sensations of classification and the new evaluations about global modifications are expressed in the form of narratives conveyed by increasingly efficient algorithms that allow the operation and management of AI.

These self-learning machines learn from the emotions that we have digitized, making them self-justifying. The central point of an algorithm is that it can create narratives that, when self-taught, generate different ways for things to happen.

In technical terms, this process is called backpropagation: a technique typical of AI models that allows adjustments to be made through newly aggregated data or training until “the correct answers” are reached, generating greater precision through deep learning. So, there is no narrative of the world if the multi-ability to process information does not create algorithms that allow us to understand how there can be virtual assistants, nano boxes, that explain what that world is like.
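Backpropagation itself, as invoked above, can be pictured with the tiny NumPy sketch below: the error of the network’s current answers is propagated backwards through its layers and every weight is adjusted until the answers approach the targets. The miniature architecture and data are invented stand-ins for the far larger models the text discusses.

```python
# Minimal backpropagation sketch (illustrative only): a tiny network adjusts its
# weights from the error on training data until its answers approach the targets.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input  -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0                                        # learning rate

for step in range(10000):
    # Forward pass: the network's current "answer".
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error towards the earlier layers
    # (gradient of the cross-entropy loss with a sigmoid output).
    d_out = (out - y) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Nudge every weight in the direction that reduces the error.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # typically close to the targets 0, 1, 1, 0
```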

AI is proposed as a cognitive-affective practice that is not only used but embodied; it becomes a skill that allows organizations to connect with an analytical form that grows and will win them over. In their article “Winning with AI is a state of mind”, Meakin and his colleagues argue:

More and more organizations are adopting these basic practices, and those that do tend to report the highest bottom-line impact from AI. But successful organizations don’t just behave differently; our experience in thousands of client engagements around analytics and AI over the past five years shows that they also think differently about AI. At these companies, AI is etched in the collective mindset (“We are AI enabled”), rather than simply applied opportunistically (“Here’s a use case where AI can add value”). (Meakin et al 2021, p. 2)

AI is a grammar of action that is recorded in the collective, organizational, group “consciousness”. It is a narrative of being able to do, of being enabled, much more than a mere instrument. AI is a new language that even needs its new translators, just as 500 years ago the forms of colonization of the planet demanded people who “put into contact” modes of expression and different cultures, and who “could express” habits and practices of exchange, enabling change in all spheres of life.


A relatively new class of expert, analytics translators, can play a role in identifying roadblocks. These people bridge the data engineers and scientists from the technical realm with the people from the business realm—marketing, supply chain, manufacturing, risk personnel, and so on. Translators help ensure that the AI applications developed address business needs and that adoption goes smoothly. Early in the implementation process, they may survey end users, observe their habits, and study workflows to diagnose and fix problems. (Fountaine, McCarthy and Saleh 2019, p. 9)

In this context it is easy to appreciate how AI is clearly articulated with the connections between narratives and emotions, these being one of the contemporary pillars of the commodification of life. As Lehmann, Liedtke, Rothschild and Trevino argue:

Sentiment. To keep track of how a brand is perceived online, companies must go beyond counting clicks and followers. They must also capture user sentiment, particularly as expressed in social media, and do it quickly. Brand equity built over decades can evaporate in a heartbeat when bad news goes viral. Companies can no longer afford to wait for quarterly brand-tracking results. Waiting even a week can be six days too long. The good news is that state-of-the-art tools allow companies to capture indicators such as buzz volume and user sentiment in real time with up to 90 percent accuracy, provided the right method is used to decode the context of a given statement. (Lehmann et al 2020, p. 2)

The society normalized in immediate consumption involves this capturing, recording and managing of emotions as one of the fundamental pieces of the current social structuration process: desire, instantaneousness, consumption in real time.

In the same vein, the centrality of emotions is associated with their analysis in and through procedures that use artificial intelligence:

Currently, prevalent sentiment analysis methods can be classified into three conceptually distinct groups: (1) lexicons, (2) traditional machine learning, and (3) artificial neural networks. These groups reflect two major methodological transitions: first, from labor-intensive, hand-crafted lexicons to automatic machine learning methods, trained on high-dimensional, sparse bag-of-words (BOW) features; and second, learning low-dimensional, dense embeddings from texts through artificial neural networks. (Heitmann et al 2020: 2)

Both “traditional machine learning” and “artificial neural networks” are forms of information inquiry and analysis in which AI is used; they consist of forms of the grammar of action where different ecologies of emotions, put at the service of the production, circulation and consumption of sensibilities, are “captured”.
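To make the contrast in the quotation concrete, the short sketch below places a hand-crafted lexicon score next to a “traditional machine learning” classifier trained on sparse bag-of-words (BOW) features; the word lists, example sentences and use of scikit-learn are our own illustrative assumptions, not the benchmark setup of Heitmann et al.

```python
# Illustrative sketch of the first two families named in the quotation:
# (1) a hand-crafted lexicon and (2) a traditional bag-of-words (BOW) classifier.
# Word lists and training sentences are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

POSITIVE = {"love", "great", "wonderful"}
NEGATIVE = {"hate", "awful", "terrible"}

def lexicon_sentiment(text: str) -> int:
    """Count positive minus negative words; the sign gives the predicted polarity."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Traditional machine learning: high-dimensional, sparse BOW features + linear model.
train_texts = ["I love this brand", "great service, wonderful staff",
               "I hate the new app", "awful support, terrible experience"]
train_labels = [1, 1, 0, 0]                        # 1 = positive, 0 = negative

vectorizer = CountVectorizer()
X_train = vectorizer.fit_transform(train_texts)    # sparse BOW matrix
clf = LogisticRegression().fit(X_train, train_labels)

new_post = "the staff was wonderful"
print("lexicon score:", lexicon_sentiment(new_post))
print("BOW classifier:", clf.predict(vectorizer.transform([new_post]))[0])
```

The third family in the quotation, artificial neural networks learning dense embeddings, would replace the sparse BOW matrix with learned vector representations while keeping the same overall pipeline.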

One of the AI Now 2019 Report’s recommendations points to the risks and precautions of using machine learning, where it is possible to note the close connection between AI and the politics of sensibilities today:

Machine learning researchers should account for potential risks and harms and better document the origins of their models and data. Advances in understanding of bias, fairness, and justice in machine learning research make it clear that assessments of risks and harms are imperative. In addition, using new mechanisms for documenting data provenance and the specificities of individual machine learning models should also become standard research practice. Both Model Cards and Datasheets offer useful templates. As a community, machine learning researchers need to embrace these analyses and tools to create an infrastructure that better considers the implications of AI. (Crawford et al 2019, p. 8)

It is very interesting to note that the prevention of possible risks also involves the development of “other” narratives regarding an AI monitored by other models of recording and analysis: the narrative of AI is evident, and so is its connection to sensibilities.

One of the other “direct applications” of AI is realized in Artificial Intelligence Assistants (AIAs), which, on the one hand, show the importance of AI in daily life and sensation building and, on the other hand, show their potential problems:

Moreover, AIAs are a form of connectivity that change the connectivity relations and this change requires investigation as it is a key contextual medium that supports effective risk contextualisation of AIAs. Potentially, the use of a connectivity risk narrative and the contextualisation provides an efficient means of supporting risk communication. In this way, a connectivity risk narrative provides a useful means of understanding AIA in terms of digital risks as well as providing a medium that can support transparency and explainability. (Cunneen, Mullins and Murphy 2020, p. 3)

These “problems” express how the construction of digital narratives and the development of sensation regulation devices are “causally” linked in the virtual/mobile/digital world.

Guides to open Pandora’s box

So far we have put in tension the dynamic relationships between artificial intelligence, emotions and society, through the problematization of the implementation and development of AI from the narrative of digital emotions. In this sense, starting from conceiving AI as the procedure for learning and expanding new narratives and ways of speaking the world, we have identified the importance of handling large volumes of data for the creation, training and learning of these algorithms, which will allow the operation of intelligent machines.

In this way, that necessary information management refers to a specific sensibility regime where the processes of social domination and social structuring of the twenty-first century are expressed, which allowed us to delve into the ways in which the practices that emerge from this emerging technology convey new senses and sensibilities typical of the new stage of capitalism reconfigured by 4.0 technologies.


As a provisional closure, we are interested in emphasizing three axes that allow us to advance in the understanding of the scope of the narratives involved in Artificial Intelligence as a social phenomenon of the present century. The first is its seductive/problematic trait, which we metaphorically embody in its proximity and distance from Pandora’s box; the second supposes its particular look at verisimilitude; and the third, the impact on the elaboration of digital emotions.

At first, we must accept what is seductive in the power of AI, which allows us to refer to the central features of the myth, an acceptance that is in itself a clue to opening the famous forbidden box. In the relationship between AI, emotions and narratives, proximities and distances are presented with the myth of Pandora’s box: beauty, curiosity and calamity. On the one hand, the danger of unleashing calamities; on the other hand, the seductive force of the beautiful gift; and, finally, the power of the indeterminate extension of knowledge. The narratives of the twenty-first century are felt/lived by human beings, elaborated as algorithms and produced by digital devices, many times perceived as incommensurable.

We also have the keys that allow us to open Pandora’s box because, since AI is a procedure of speaking the world that is self-learning and that gives possibilities of choice, we would have self-learning processes that are not human, but are made based on the logic of human perception.

The promise of the curiosity of Pandora’s box is also inscribed in AI; it is in the narrative origin of the metaphor of Pandora: if you open it, the monstrous appears. In this sense, the narrativity that AI implies as a way of speaking the world should make us ask: what doors are we opening by lifting the lid of this box? With which emotional ecologies, which learn by themselves from an analysis of our own emotions put/translated into the digital/mobile/virtual, are we communicating? Here lies the relationship between the Greek myth and our own desires transformed into machines to create emotional narratives.

Second, AI is a sensory and algorithmic way to discuss the truth: about the risk of opening Pandora’s box and about the production of truth-pretending narratives. What we feel comes back to us as the result of an algorithm that challenges the relativisms of our perspective.

In this process of the 4.0 revolution, there are two important transformations, in line with the changes in the politics of sensibilities: transformations expressed in the political economy of morality (determining new ways of feeling the world) and in the political economy of truth, also associated with the effects of the industrial-technological revolution that is happening to us. This is how the new emotional states and ways of reading the truth are articulated from the crossing and interweaving between different forms of empathy, perception and sensations of capture, passing through intelligent regimes of emotional regulation to alternative spiritualities (Scribano 2019). We feel, then we are: that is what the refined machinery for the elaboration of the algorithms that make us feel has captured, as the axis of the new dispute over truth, verisimilitude and non-truth.

Finally, AI is proof of the existence of digitally created and experienced emotions. From this point of view, digital emotions exist, and it becomes important to elucidate why and how they exist. For this, it is necessary to retake the relationship between the politics of the senses, the relationship between touching and seeing, hearing and seeing, in terms of perceiving the world through a digital device. Perceptions and digital sensations are configured in the consecration of a culture of touch, articulating new ways of seeing and knowing the world. In this way, we are facing the expression of new policies of touch and gaze, which redefine the world of work and the processes of classification of these senses. Digital emotions are the expression of a set of interactions between consequences of the politics of sensations that become mobile/virtual/digital, and to these experiences correspond particular ways of narrating. Particular forms of narrating that refer, on the one hand, to narrating through: the narration of Instagram, the narration of WhatsApp, the narration of mobile applications; and, on the other hand, to the expression of narrative logics through photography, memes, drawings, emojis, etc.; a beyond the word.

In this sense, all narrative constitutes a structure of expressiveness of the experience, be this experience personal, collective, or due to a phantom or a fantasy. Phantoms and fantasies are mechanisms of social bearability and devices for regulating sensations. Both refer to the systematic denial of social conflicts; some are the reverse of the others; they never close, but they always operate and become practical. The promise of fantasy brings with it the threat of the phantom, producing an inability to act. By putting in tension the particular, the universal, fantasies and phantoms, we can elucidate that fantasies invert the place of the particular as a universal, making it impossible to include the subject in the fantasized lands, while phantoms repeat the conflictual loss, devaluing the possibility of counter-action in the face of loss and failure (Scribano 2008). Therefore, digital emotions include new narratives, which are constitutive of how society 4.0 is experienced and explained.

Finally, Pandora’s box contains its own interstitiality, the “I feel, therefore I am”; it narrates the impossibility of making emotions from “pure” algorithms.

Funding Not applicable.

References

Beltrán Gómez A, Ordóñez Salinas S (2014) Sistema inteligente para la detección de diálogos con posibles contenidos pedofílicos. Rev Virtual Univ Católica Norte 42:164–181

Corvalan JG (2018) Inteligencia artificial: retos, desafíos y oportunidades – Prometea: la primera inteligencia artificial de Latinoamérica al servicio de la Justicia. Rev Invest Const 5(1):295–316

Crawford K, Dobbe R, Dryer T, Fried G, Green B, Kaziunas E, Kak A, Mathur V, McElroy E, Nill Sánchez A, Raji D, Lisi Rankin J, Richardson R, Schultz J, Myers West S, Whittaker M (2019) AI Now 2019 Report. AI Now Institute, New York. https://ainowinstitute.org/AI_Now_2019_Report.html

Crompton H, Song D (2021) The potential of artificial intelligence in higher education. Rev Virtual Univ Católica Norte 62:1–4. Fundación Carolina del Norte, Colombia. https://www.redalyc.org/jatsRepo/1942/194265735001/194265735001.pdf

Cunneen M, Mullins M, Murphy F (2020) Artificial intelligence assistants and risk: framing a connectivity risk narrative. AI & Soc 35:625–634. https://doi.org/10.1007/s00146-019-00916-9

Dwyer T (2001) Inteligência artificial, tecnologias informacionais e seus possíveis impactos sobre as Ciências Sociais. Sociologias 5:58–79. Universidade Federal do Rio Grande do Sul, Porto Alegre, Brasil. https://www.redalyc.org/articulo.oa?id=86819570004

Fountaine T, McCarthy B, Saleh T (2019) Building the AI-Powered Organization. Technology isn't the biggest challenge. Culture is. Harvard Business Review, July–August 2019 issue: 62–73. https://hbr.org/2019/07/building-the-ai-powered-organization

Graham M, Dutton WH (2014) Society and the internet: how networks of information and communication are changing our lives. Oxford Scholarship Online. https://doi.org/10.1093/acprof:oso/9780199661992.001.0001

Heitmann M, Siebert C, Hartmann J, Schamp C (2020) More than a feeling: benchmarks for sentiment analysis accuracy (July 31, 2020). SSRN: https://ssrn.com/abstract=3489963 or https://doi.org/10.2139/ssrn.3489963

IDB (2020) Inteligencia artificial. Gran oportunidad del siglo XXI. En: Documento de reflexión y propuesta de actuación. Banco Interamericano de Desarrollo

Laurent H, De Boer E (2018) The Next Economic Growth Engine. Scaling Fourth Industrial Revolution Technologies. World Economic Forum

Lehmann S, Liedtke N, Rothschild Ph, Trevino E (2020) The future of brand strategy: It's time to 'go electric'. In: Marketing & Sales, McKinsey & Company, April 2020: 1–5, USA. https://www.mckinsey.com/business-functions/marketing-and-sales/our-insights/the-future-of-brand-strategy-its-time-to-go-electric

Martínez Devia A (2019) La inteligencia artificial, el Big Data y la era digital: ¿una amenaza para los datos personales? Revista La Propiedad Inmaterial n.º 27: 5–23, Universidad Externado de Colombia, enero–junio 2019. https://doi.org/10.18601/16571959.n27.0

Meakin TH, Palmer J, Sartori V, Vickers J (2021) Winning with AI is a state of mind. McKinsey & Company, April 2021: 1–9. https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/winning-with-ai-is-a-state-of-mind

Romero Bravo GJ, Macgluf Issasi A, Rodríguez Rodríguez LA, Espinoza Maza JJ, Suárez Álvarez A (2021) Aplicación de Machine Learning en la industria 4.0 en tiempos de pandemia. Interconectando Saberes, 2021. ISSN: 2448-8704. https://doi.org/10.25009/is.v0i11.2692

Scribano A (2008) Fantasmas y fantasías sociales: notas para un homenaje a T.W. Adorno desde Argentina. Intersticios Rev Soci Pensamiento Crít 2(2). ISSN 1887-3898

Scribano A (2017) Neo-colonial religion as current form of the political economy of morality. In: Normalization, enjoyment and bodies/emotions. Argentine Sensibilities. Nova Science Publishers

Scribano A (2019) Introduction: politics of sensibilities, society 4.0 and digital labour. In: Scribano A, Lisdero P (eds) Digital labour, society and politics of sensibilities. Palgrave Macmillan

Sossa Azuela JH (2020) El papel de la Inteligencia Artificial en la industria 4.0. Universidad Nacional Autónoma de México, Mexico

Williams S (2021) Artificial Intelligence in the new era. Int J Innov Sci Res Technol 6(4), April 2021. ISSN: 2456-2165
