Abstract
Telling customers they are interacting with an AI sales agent can reduce trust and sales, even if the bot works perfectly. Our research shows this negative effect is driven by a consumer bias called "speciesism"—a preference for humans over machines. The solution is strategic: disclose the bot's identity before the conversation begins, and design it with a high "social presence" using emojis, humor, and personalization. This approach maintains transparency while preserving customer trust and purchase intentions.
| Original language | English |
|---|---|
| Type | Policy Briefs |
| Publisher | Universidade Católica Portuguesa |
| Number of pages | 4 |
| Place of Publication | Porto |
| DOIs | |
| Publication status | Published - Oct 2025 |
Publication series
| Name | Policy Briefs |
|---|---|
| Publisher | Research Centre in Management and Economics |
Fingerprint
Dive into the research topics of 'When sales agents aren’t human: how identity disclosure, social presence, and speciesism shape trust'. Together they form a unique fingerprint.
Projects
- 1 Active
CEGE 2025-2029: CEGE - Research Centre in Management and Economics: UID/731/2025. Pluriannual 2025-2029
Vlačić, B. (PI)
1/01/25 → 31/12/29
Project: Research