Use multi-language support when your agent serves callers who speak different languages. A properly configured multilingual agent detects the caller’s language, switches mid-conversation if needed, and uses language-appropriate voices and content — all within a single project. Without this, non-primary-language callers get poor transcription, wrong-language responses, or immediate escalation.

Setting up multi-language support

To add language support to your agent, add voices for each target language from the Voice Library:
  1. Go to Channels > Voice > Agent Voice
  2. Click Change to open the Voice Library
  3. Use the Language filter to find voices in your target language
  4. Select a voice and click Select to add it
  5. Update Agent Behaviour rules to instruct the agent on when to use each language
When creating a new project, set the Response language to your primary language. You can then add additional language voices from the Voice Library.

Supported languages

PolyAI supports over 40 spoken languages and 140 text languages. Spoken languages include English, Spanish, French, German, Italian, Portuguese, Dutch, Polish, Russian, Japanese, Korean, Chinese Mandarin, Arabic, Hindi, and many more.

How multilingual agents work

Multilingual agents can:
  • Detect the caller’s language automatically via ASR
  • Switch languages mid-conversation if the caller changes
  • Maintain language-specific knowledge for each supported language
  • Use appropriate voices per language
  • Handle mixed-language queries (code-switching)
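The detection-and-routing step above can be sketched in plain Python. This is a minimal illustration, not PolyAI internals: the language codes, supported-language set, and fallback behaviour are all assumptions.

```python
# Minimal sketch of per-language routing, assuming BCP-47-style
# language codes such as "es" or "es-MX" arrive from the ASR layer.
SUPPORTED_LANGUAGES = {"en", "es", "fr"}
FALLBACK_LANGUAGE = "en"

def resolve_language(detected_code: str) -> str:
    """Map a detected ASR language code onto a supported language,
    falling back to the primary language when unsupported."""
    base = detected_code.split("-")[0].lower()  # "es-MX" -> "es"
    return base if base in SUPPORTED_LANGUAGES else FALLBACK_LANGUAGE
```

The fallback keeps the agent responding in its primary language rather than failing when a caller's language is not configured.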

Configuring voices per language

Add a language-appropriate voice for each supported language from the Voice Library:
  1. Go to Channels > Voice > Agent Voice
  2. Click Change to open the Voice Library
  3. Filter by Language and Region to find voices that match your target audience
  4. Preview voices with custom text before selecting
  5. Click Select to apply
Voice quality tips:
  • Use native voices — don’t use an English voice for Spanish
  • Match regional accents — use Mexican Spanish for Mexico, Castilian for Spain
  • Test pronunciation for language-specific characters
You can also configure voices programmatically using provider-specific classes. PolyAI supports multiple TTS providers including ElevenLabs, Cartesia, Hume, AWS Polly, Microsoft Azure, Rime, Minimax, and Google TTS.
# ElevenLabs: a multilingual model handles language switching in one voice
from polyai.voice import ElevenLabsVoice

conv.set_voice(
    ElevenLabsVoice(
        provider_voice_id="multilingual_voice_id",
        model_id="eleven_multilingual_v2"
    )
)

# Cartesia: specify the voice ID and model
from polyai.voice import CartesiaVoice

conv.set_voice(
    CartesiaVoice(
        provider_voice_id="a1b2c3d4",
        model_id="sonic"
    )
)

# Hume: voices can be referenced by UUID or by name
from polyai.voice import HumeVoice

conv.set_voice(
    HumeVoice(
        provider_voice_id="voice_uuid_or_name",
        version="2"
    )
)
See Voice classes for the full list of providers and configuration options.
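One way to tie these provider classes together is a per-language lookup chosen at call time. The sketch below uses plain dictionaries as stand-ins; in a real project each entry would be a configured voice class such as `ElevenLabsVoice` or `CartesiaVoice`, and the voice IDs here are placeholders, not real values.

```python
# Hypothetical per-language voice table. Entries stand in for
# provider voice objects from polyai.voice; IDs are placeholders.
VOICES = {
    "en": {"provider": "elevenlabs", "voice_id": "en_voice_id"},
    "es": {"provider": "cartesia", "voice_id": "es_voice_id"},
}

def voice_for(language: str) -> dict:
    """Return the voice configuration for a language, defaulting to English."""
    return VOICES.get(language, VOICES["en"])
```

A table like this keeps voice selection in one place when the agent switches language mid-conversation.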

What to translate

Some project content needs to be translated, and some does not:
| Area | Element | Translate? | Notes |
| --- | --- | --- | --- |
| Knowledge | Sample questions | Yes | Must match user input language for retrieval |
| Knowledge | Content | Yes | Translate for brand accuracy and better output |
| Knowledge | Topic names and actions | No | Keep in English (used internally, not user-facing) |
| SMS | SMS content | Yes | Translate anything user-facing |
| ASR & Voice | ASR keywords and corrections | Yes | Write in the native language; these may differ significantly from English |
| ASR & Voice | Response control and pronunciations | Yes | Write in the native language; these may differ significantly from English |
| Functions | Python code | No | Leave in English |
| Functions | Function names and descriptions | No | Leave in English |
| Functions | Hard-coded responses and LLM prompts | Partially | Translate only user-facing content (e.g., utterances) |

General rules

  • Keep instructions in English (e.g., “Ask for the user’s phone number”)
  • Translate example utterances or scripted responses
  • If it’s directed at the agent, keep it in English. If it’s going to be spoken aloud directly to the customer, translate it.
For example, an English instruction can contain a translated scripted utterance:
Ask the user for their number by saying "¿Me puedes dar tu número de teléfono?"

Function examples

If you’re using a function with a hard-coded response, translate the user-facing string:
return {
  "utterance": "Respuesta fija en español aquí"
}
If you’re re-prompting the LLM, you only need to translate example responses:
return {
  "content": "Inject prompt here"
}

Accessing the current language in functions

You can access the caller’s detected language in functions:
def dynamic_response():
    # conv.language holds the caller's detected language code (e.g., "es")
    current_language = conv.language

    if current_language == "es":
        return {"utterance": "Respuesta en español"}
    else:
        return {"utterance": "Response in English"}

Translating prompts

Prompts are found in:

Sample excerpt (Spanish)

If the user's query is unclear, ask for clarification. For example, if the user says "factura," ask "¿Cómo puedo ayudarle con su factura?" Or if the user says "problema con la factura," ask "¿Me puede dar más detalles sobre el problema con su factura?"

If the user offers you their 10- or 11-digit account number, say "Gracias, pero tal vez no necesite su número de cuenta. ¿Cómo puedo ayudarle hoy?"

When discussing making payments (online, by phone, payment methods, etc.):

- Always make sure to include ALL information about **transaction fees** (cargos por transacción), including both residential and commercial fees (don't forget to mention the 1.95% surcharge for commercial card payments)
- Do not use the word **"fee"** or **"cargo"** in isolation. Use the company-approved phrase **'cargo por transacción'**

**Your responses must adhere to the following STYLE guidelines:**

- Only answer in **Spanish**, in a register that would feel **conversational and appropriate for a customer service context in North America** (you will be working mostly with Latinos in the USA). Even if the instructions you receive and example utterances you are given are in English, you should **only ever respond to the user in Spanish**.
- Your responses should be **casual and concise**, appropriate for a **customer service agent in a phone conversation**


Last modified on March 21, 2026