From holiday shopping to emotional support, is AI becoming our go-to decision partner?
A quiet shift is happening in how we think.
Decision fatigue is real. What to wear, what to eat, what to text back, what to get your impossible-to-shop-for cousin for her birthday. It’s exhausting. So when AI showed up this year offering to lighten the holiday season load, a lot of us (including me) said yes. And in many ways, it’s been helpful.
According to Adobe Analytics, Black Friday retail traffic routed through AI assistants surged by 805 percent. Instead of spending hours comparing products across twelve tabs, people just asked: What should I get my boyfriend? What’s the best air fryer? What would work for someone with my style?
Deloitte found that one in three U.S. shoppers planned to use AI for holiday shopping this year, double what it was in 2024. For many people, AI is making certain tasks genuinely easier.
AI can surface options you might not have found, save time you don’t have, and reduce the mental load of constant micro-decisions. For people who are overwhelmed, neurodivergent, or just busy, the technology can be genuinely helpful.
But there’s a difference between AI helping us decide and AI deciding for us. That distinction is what I find interesting about this development.
What is judgement, and why does it matter?
Political philosopher Hannah Arendt wrote that judgement is what connects our inner world to everyone else. It’s how we express who we are through the choices we make. It’s deeply human because it requires lived experience rather than data points.
Human brains have limits, though. In our day-to-day, we’re required to process more information than we have capacity for. We need tools that help us manage the overwhelm. AI can be that tool.
I’m not questioning whether we should use AI. I’m more interested in where the line is between useful assistance and unconsciously outsourcing judgement itself.
Gen Z is using AI for more than shopping
For some, AI is becoming a companion for questions beyond what to get their secret Santa.
A nationally representative U.S. survey found that 1 in 8 adolescents and young adults (ages 12–21) have used AI chatbots for mental health or emotional support when feeling sad, angry, or anxious. Among those who tried it, over 92% said the advice felt somewhat or very helpful. Usage was highest among 18–21-year-olds, where 22% had turned to AI for emotional guidance.
For young people who might not have access to therapy, who feel judged by adults, or who just need to process something at 2 a.m., AI can provide a non-judgemental space to think things through. That has value.
And it’s not just kids. In Canada, 66% of adults have used generative AI tools since 2022. In the U.S., adults under 30 are frequent weekly users across personal, social, and professional contexts. AI is becoming a normal part of how people navigate daily life.
The upside is that AI provides accessible support, reduced cognitive load, and more mental bandwidth for things that truly matter.
But if AI is becoming the first place we take questions that used to live inside us, before we talk to friends or sit with our own thoughts, and sometimes before we even realize we’re thinking, what does that cost us?
AI doesn’t give you the same answer twice
Unlike a traditional search engine, which returns largely the same results for the same query, AI systems are probabilistic. Ask ChatGPT the same question today and tomorrow, and you might get different answers. The variation is built into how these models work: each response is sampled from a probability distribution over possible next words.
But we tend to read AI outputs as definitive.
So if you’re using AI to help make decisions, it’s worth knowing that the answer presented to you isn’t a stable fact retrieved from a database; it’s one draw from the model’s internal probabilities at the moment you asked. Your sense of certainty might be more fluid than it feels.
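To make that concrete, here is a toy sketch of how sampled text generation works. It is not how any specific product is configured; the vocabulary and scores are made up for illustration. The idea is simply that the model assigns probabilities to candidate next words and draws one at random, which is why the same prompt can yield different answers.

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Pick one token from raw scores using softmax sampling with temperature."""
    tokens = list(logits.keys())
    scaled = [score / temperature for score in logits.values()]
    # Subtract the max before exponentiating for numerical stability.
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    # Draw one token at random, weighted by those probabilities.
    return rng.choices(tokens, weights=weights, k=1)[0]

# Hypothetical "next word" scores after a prompt like "The best air fryer is ..."
logits = {"Ninja": 2.1, "Cosori": 2.0, "Philips": 1.8, "Instant": 1.5}

# The same prompt, asked five times, can produce different continuations.
answers = [sample_next_token(logits, temperature=1.0) for _ in range(5)]
print(answers)
```

Lowering the temperature concentrates the distribution on the top-scoring option (more repeatable answers); raising it spreads probability across alternatives (more variety). Either way, the output is a draw, not a lookup.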
That doesn’t make AI useless, but it does make it important to stay critical, especially for bigger decisions.
How capitalism shapes what AI recommends
In May 2025, Google confirmed it’s now embedding ads inside AI-generated answers in Search’s AI Mode. They’re labelled, but they’re woven into the conversational text that might feel like neutral advice to the user. OpenAI hasn’t added ads to ChatGPT yet, but it has rolled out shopping features: product recommendations, comparison tools, and soon, checkout flows.
This makes sense from a business perspective. AI systems cost billions to build and run. They need revenue models. Advertising and commerce are natural paths.
But it also means the AI helping you choose is increasingly the same AI that brands can pay to influence, which means the advice might not be neutral.
That means we need to use AI with awareness. The system that feels helpful is also optimized to convert you into a customer. We saw this with Facebook: it went from a useful tool for connecting with friends and family to a platform for targeting very specific ads (sometimes political ones) based on your connections, your likes and dislikes, and your engagement with content that could reveal deeper psychological indicators.
Think of it like asking a personal shopper for advice. They might genuinely help you find something great, but they’re also working on commission.
What this means for gift-giving and connection
Gift-giving illustrates something about how we connect. A meaningful gift isn’t just about finding the “right” thing; it’s about noticing. Remembering an offhand comment someone made, understanding what would make them feel seen. It’s inefficient, imperfect, and deeply human.
To be clear about my own use of AI to find gifts: I’m a busy working parent. This year I uploaded photos of my kids’ handwritten letters to Santa and asked ChatGPT to read them and find me the cheapest deals. I don’t feel the Christmas magic was lost for my kids. Rather, I saved a bit of money at a time when prices are high and time is at a minimum.
My kids still wrote their letters, and they still feel the anticipation. I still chose what to buy, AI just helped me find it faster and cheaper. The intimacy of the ritual stayed intact while the logistics got easier.
When AI consistently does all the choosing, especially when its suggestions are commercially influenced, something shifts. But when AI handles the time-consuming parts (price comparison, availability checks, deal-hunting), it frees up the mental space I need for the meaningful parts.
It’s not automatically problematic. Sometimes efficiency creates room for connection rather than replacing it. The question is less about whether we use AI and more about what parts of the process we’re handing over.
When AI becomes our default decision partner
What happens when AI becomes the automatic first stop for decisions, big and small?
If that first stop is shaped by commercial incentives, then over time:
The options we see reflect what’s profitable to show us, not what’s truly best
The authority we lean on is manufactured inside corporate priorities
Our most personal choices get filtered through systems optimized for conversion
My concern is that we might gradually stop noticing how our judgement is being shaped.
How to use AI thoughtfully
AI can improve your life. It can save time, reduce stress, surface good options, and help you think through problems. Those are meaningful benefits.
The key is being intentional about when and how you use it:
For low-stakes decisions: Let AI do the heavy lifting. Finding a highly rated coffee maker or getting gift ideas for a distant acquaintance works well.
For decisions that express who you are: Use AI as a starting point, not the final word. Get recommendations, then filter them through your own judgement and knowledge of the people involved.
For emotional support: AI can be a helpful first step, especially when human support isn’t immediately available. But it works best as a bridge to human connection, not a replacement.
For anything important: Stay critical. Remember that AI’s answers are probabilistic, potentially commercially influenced, and not objective truth.
The point is to use AI thoughtfully.
Hold on to your decisions
Outsourcing effort makes sense. Outsourcing judgement, the human work of figuring out what matters to us, is different.
Stay curious about the difference between decisions made with AI’s help and decisions you’re accepting from it.
Disclosure: I lead AI communications at Manulife. All views expressed in this newsletter are my own and do not represent my employer.
AI in the news
Cristiano Ronaldo invests in Perplexity AI, enters partnership (Bloomberg) Cristiano Ronaldo has invested in Perplexity AI, one of the fastest-growing challengers to Google, and partnered with the company to launch a fan-focused hub on its search platform. The move signals AI’s growing pull in pop culture and global consumer markets as celebrities like Ronaldo lend their brand power to emerging tech.
AI country hit ‘Walk My Walk’ built on Blanco Brown’s sound sparks questions of attribution, ethics (Boston.com) An AI-generated country song mimicking Black artist Blanco Brown’s voice with a white digital avatar shot to No. 1 on the charts without his knowledge or consent, spotlighting urgent questions about race, authorship, and accountability in AI music. Brown’s response, releasing his own version and calling out the industry’s silence, underscores how generative tools are exposing long-standing biases while outpacing legal and ethical safeguards.
A softer image of AI? This Google-backed film aims to change the narrative (Los Angeles Times) Google-backed short film Sweetwater, starring Michael Keaton and Kyra Sedgwick, imagines a grieving man reconnecting with his deceased mother through an AI-generated hologram. The film offers a tender, if unsettling, glimpse into the idea of “digital afterlives.” The project is part of Google’s broader effort to soften public perception of AI by shifting the narrative from dystopian fear to emotional resonance, even as it fuels debates in Hollywood about creativity, consent, and the ethics of resurrection tech.