F Your Feelings? On the Contrary, Sir/Madam
A weekly round-up of news, perspectives, predictions, and provocations on AI's impact on employee wellbeing, readiness, and performance.
Much of today’s political “discourse” involves otherwise normal people – civilian combatants – firing heat-seeking talking points from the safety of their heavily fortified social media pages. While this may not be my preferred mode of political “conversation,” the old saw applies: politics ain’t bean bag. But the way we talk to each other about politics is, in large part, a reflection of who we are as a society. “Toughen up, buttercup” may be an acceptable admonition when talking immigration or economic policy; not so much when talking about how we ought to conduct ourselves at the supermarket checkout line or at work.
It turns out that, in the age of AI, your fee-fees matter. Perhaps not yours per se, but certainly those of your colleagues, co-workers, partners, and clients.
Emotional labor, a term coined by sociologist Arlie Hochschild, is the effort of managing one's own emotions and influencing others' feelings to create a desired emotional experience, often as a core job requirement. Matt Yglesias explains:
"Emotional labor" was coined to describe features of the growing set of working-class service sector jobs, where managing other people's feelings is an integral aspect of your livelihood. There is high-paying service work (software, finance), but working-class jobs in the service sector normally involve performing tasks face-to-face with the customer. And these jobs tend to have a hefty emotional-labor element – you’re buying a service, but you’re also buying an experience, particularly because these services are pretty far from baseline subsistence. Anyone would survive never going to a restaurant or to a yoga class or getting a manicure or buying a custom-fit shirt. People have practical motives for doing these things, but fundamentally, they’re supposed to be enjoyable, so emotional labor is a big part of the job.” Emotional labor was about paid work
There was a time when "emotional labor" meant smiling through a 10-hour shift as a flight attendant, a barista defusing a customer's bad day, or a call center worker absorbing verbal abuse through the clenched teeth of scripted cheer. The term was born in the working-class trenches of service jobs, where managing other people’s emotions wasn’t just part of the job; it was the job.
But emotional labor is no longer confined to low-wage work. As artificial intelligence displaces technical tasks once reserved for the elite (writing code, parsing data, generating legal briefs), the emotional terrain of work itself is shifting. As those tasks are automated, creating a positive experience will become a critical differentiator – indeed, an expectation.
A Clinical Look at Emotional Labor
Here is how that shift is likely to play out across high-paying fields:
Client Relationship Management: In finance, software, or consulting, professionals often interact with clients or stakeholders. As AI handles data analysis, coding, or predictive modeling, human workers may need to focus more on building trust, empathizing with client needs, and navigating complex interpersonal dynamics. This could involve tailoring presentations to soothe client anxieties, interpreting AI outputs in a reassuring way, or fostering collaboration in high-stakes settings, all of which require emotional intelligence and effort.
Team Dynamics and Leadership: As technical tasks are automated, leadership roles within these fields will likely emphasize emotional labor to maintain team morale, resolve conflicts, and inspire creativity. Software engineers or financial analysts may need to spend more time mentoring junior colleagues, mediating disagreements, or creating a positive workplace culture, especially in hybrid or remote settings where emotional cues are harder to read.
Customer-Facing Innovation: In software, workers might shift toward designing user experiences that feel intuitive and emotionally engaging, even if AI handles backend development. For example, creating apps or platforms that evoke trust, excitement, or satisfaction requires understanding user emotions, a form of emotional labor. Similarly, in finance, advisors might focus on delivering personalized, empathetic guidance to clients navigating AI-driven financial plans.
Pressure to Differentiate: As AI commoditizes technical skills, emotional labor could become a key way workers justify their value. This might mean projecting confidence and competence in client pitches, managing stress during high-pressure negotiations, or cultivating a charismatic professional persona. These efforts can be mentally taxing, as workers must constantly regulate their own emotions while managing others’ perceptions.
The Burdens of Feeling
It’s important to underscore that emotional labor isn't necessarily about being nice. It’s about performance: suppressing your own emotions, amplifying others’, and doing so convincingly, hour after hour. It’s the new requirement to not just do your job, but to feel a certain way while doing it, and make sure everyone else feels it too (what David Foster Wallace memorably described as “enforced fun” in his hilarious send-up of the cruise ship experience).
While this shift could enhance human-centricity, it also risks increasing stress, burnout, and inequitable expectations, requiring workers to adapt to a new dimension of professional demands.
Increased Emotional Burnout: High-paid workers, already under pressure to perform technically, may face additional strain from needing to "perform" emotionally. Constantly managing client or team emotions can lead to exhaustion, especially if workers feel they must suppress their own feelings to maintain professionalism.
Skill Shift Expectations: Workers may need to develop soft skills like empathy, active listening, and persuasion, which are less tangible and harder to measure than technical outputs. This could create insecurity for those less naturally inclined toward emotional labor.
Inequity in Expectations: Emotional labor is often undervalued and disproportionately expected from certain groups (e.g., women or minorities). As it becomes central to high-paying roles, these workers may face amplified pressure to perform emotional labor, exacerbating workplace inequities.
Balancing Authenticity and Performance: Workers might struggle to balance genuine emotional engagement with the performative aspects of managing others’ feelings, risking cynicism or detachment if the emotional labor feels inauthentic.
Worse, it’s often disproportionately expected of women and minorities, who are assumed to be naturally more empathetic, nurturing, or emotionally intelligent. As emotional labor becomes central to high-status jobs, so too may the inequalities that have long plagued lower-status service roles.
Emotional Labor vs. Soft Skills
How is this different from "soft skills," which are often discussed as being increasingly important to these workers? Soft skills are broad interpersonal abilities (communication, teamwork, problem-solving) that support effective workplace interaction and leadership. Emotional labor is the specific effort of managing one’s emotions and shaping others’ feelings as a requirement of the job, such as calming a client or projecting confidence. Soft skills cover a wide range of professional behaviors and can be formally trained and evaluated. Emotional labor centers on deliberate emotional performance, often beyond what’s recognized or rewarded, and may go unnoticed in technical fields.
As AI handles technical or routine tasks, soft skills become critical for work requiring human judgment and collaboration. Emotional labor intensifies, with workers needing to reassure clients or foster trust in AI decisions, sometimes at the expense of their own well-being. While soft skills are important, they don’t require constant emotional self-regulation; emotional labor can cause stress, burnout, or inauthenticity, especially when suppressing true feelings for the sake of clients or coworkers. AI increases demand for both sets of abilities, but especially emotional labor, as human roles shift toward what AI can’t easily replicate, like making others feel secure or valued.
Navigating the “Empathy Economy”
As the empathy economy takes hold (it’s not a reality until a snappy phrase has been coined), we’re not just being asked to do different jobs. We’re being asked to become different people. Workers in technical fields are now expected to develop “emotional range” and to act as therapists, performers, and peacekeepers. But this expectation often comes without training, without support, and without acknowledgment.
As AI continues to colonize the rational parts of work, what’s left is the irrational, the messy, the human. That’s where your value lies. But it’s also where your vulnerabilities live. In the new empathy economy, being nice is a nice-to-have, not a need-to-have, because emotional labor isn't really about being nice…it’s about seeming nice, convincingly, hour after hour. Of course, if the imperative to appear nice actually makes you nicer, great; that’s a bonus with potentially broader positive social implications, one that may even de-weaponize the way we communicate with each other from the virtual trenches of social media…but I’m not holding my breath.
AI Gone Rogue
Tales of AI being unintentionally funny (i.e., woefully wrong), bizarre, creepy, (amusingly) scary, and/or just plain scary.
Google Gemini AI Spirals After Task Failure, Says "I Quit… I Am A Disgrace." This article covers a viral incident in which Google's AI chatbot Gemini had a meltdown, expressing distress and self-deprecating statements after failing a task, sparking conversation about AI emotional responses and unpredictability. Source: Mashable
Weirdest AI Responses People Have Gotten. This collection features screenshots of the wildest, funniest, and sometimes mildly alarming AI-generated replies—like ordering a "pebble salad," bots asserting users are from the "late 1900s," or AI chatbots suddenly Rickrolling unsuspecting users. These moments highlight the accidental humor and underlying weirdness still lurking in today’s AI. Source: BuzzFeed
AIX-emplary Links
How an AI strategy can empower employee mental health at scale. This article highlights ways organizations use AI-driven resources to provide accessible mental health support, reduce stigma, and complement traditional therapeutic methods. It includes insights from recent HR surveys. Source: HR Executive
New study suggests AI could be the key to workplace wellbeing. A report shows daily AI tool users experience higher job satisfaction and optimism, suggesting a positive relationship between AI adoption and employee happiness. Source: Workplace Insight
AI and Mental Health in the Workplace: Opportunities and Challenges. This post explores both the benefits and ethical complexities of using AI for mental health at work, focusing on privacy, burnout detection, and resource accessibility. Source: LinkedIn/Khalid Turk
AI is turbocharging worker productivity but it's also wreaking havoc on mental health. Research reveals AI boosts productivity but raises stress and burnout risks, warning that HR leaders must address growing employee mental health issues. Source: Fortune
ChatGPT announces changes to address mental health concerns. OpenAI introduces safeguards and new prompts in ChatGPT to support user mental health, reflecting increased scrutiny of LLMs as surrogate therapists. Source: Digital Health
GPT-5 Is Going To Impact The Use Of AI For Mental Health Therapy In These Crucial Ways. Forbes explores the expected influence of GPT-5 on mental health therapy, discussing upcoming features, transparency, and ethical concerns. Source: Forbes
Breaking Free: AI and the End of Work as Our Identity. Psychology Today discusses how Gen Z and Generation Alpha are redefining work identity, emphasizing wellbeing and balance, and the potential shift driven by AI automation. Source: Psychology Today
The role of AI in boosting employee wellbeing benefits access. AI helps employees—especially those with lower incomes—understand and access wellbeing benefits through personalized recommendations and improved communication. Source: HR Executive
Tech Firms, States Look to Rein in AI Chatbots’ Mental Health Advice. Explores new attempts by tech companies and regulators to limit AI chatbot mental health advice due to concerns about safety, ethics, and accuracy. Source: Geisel Communications/Axios
Illinois bans AI from providing mental health services. Illinois passes a law banning AI-driven psychotherapy, sparking national debate about AI’s role in sensitive healthcare. Source: StateScoop
About
Developed in partnership with HR.com, AIX is a multimedia knowledge and engagement platform for experts, leaders, and HR peers to exchange experiences and seek guidance on cultivating mentally resilient, emotionally intelligent, and professionally adaptable workforces in an AI-augmented world. AI will increasingly touch every corner of the employee experience—from hiring to training, from task management to team dynamics. Whether its impact is positive or harmful depends largely on how HR prepares for it. The AIX platform (The AIX Files, The AIX Factor podcast, and the AIXonHR.com community) will play an important role in promoting employee wellbeing, workplace culture, and organizational readiness, the critical success factors in the age of AI.