Latest Features (Call Analysis, Backchanneling) And Python Custom LLM Update

Dear Retell Community,

We're thrilled to share some exciting new features and updates on the platform. Here’s what’s new 👇. Feel free to reply to this email directly to share your feedback.

1. Enhanced Call Monitoring

Voice AI Agent Assistant

Call Analysis: We've introduced metrics like Call Completion Status, Task Completion Status, User Sentiment, Average End-to-End Latency, and Network Latency for comprehensive monitoring. You can access these directly on the dashboard or through the API.

Disconnection Reason Tracking: Get insights into call issues with the addition of "Disconnection Reason" in the dashboard and "get-call" object. For more details, refer to our Error Code Table.

Function Call Tracking: Transcripts now include function call results, giving a clear view of when each function was triggered and what it returned. Available in the dashboard and the get-call API. Custom LLM users can send the tool call invocation event and tool call result event to pass function-calling results to us, so the results are woven into the transcript and you can see in the dashboard when your function is triggered.
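For custom LLM users, the two events above could be sent as JSON messages over the existing WebSocket connection. A minimal sketch is below; the `response_type` values and field names (`tool_call_id`, `name`, `arguments`, `content`) are assumptions based on the description here, so check the API reference for the exact schema.

```python
import json

def tool_call_invocation_event(tool_call_id: str, name: str, arguments: dict) -> str:
    """Build a tool call invocation event (event/field names are assumed for illustration)."""
    return json.dumps({
        "response_type": "tool_call_invocation",  # assumed event type name
        "tool_call_id": tool_call_id,
        "name": name,
        "arguments": json.dumps(arguments),
    })

def tool_call_result_event(tool_call_id: str, content: str) -> str:
    """Build the matching tool call result event."""
    return json.dumps({
        "response_type": "tool_call_result",  # assumed event type name
        "tool_call_id": tool_call_id,
        "content": content,
    })

# Example: report a hypothetical weather lookup so it appears in the transcript.
invocation = tool_call_invocation_event("call_1", "get_weather", {"city": "SF"})
result = tool_call_result_event("call_1", "Sunny, 18C")
```

Pairing the two events by `tool_call_id` is what lets the transcript show the invocation and its outcome at the right point in the conversation.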

2. New Features

Reminder Settings: You can now configure reminder settings to define the duration of silence before an agent follows up with a response. Learn more.

Backchanneling: Backchanneling is the agent's ability to make small acknowledgments like “uh-huh” or “I see” while the user is speaking, improving engagement on the call. You can configure whether it is enabled, how often it triggers, and which words are used. Learn more.
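The reminder and backchannel settings described above might look something like this in an agent configuration. The parameter names (`reminder_trigger_ms`, `enable_backchannel`, etc.) are assumptions based on the feature descriptions, not the confirmed API schema; see the linked docs for the exact names and value ranges.

```python
# Sketch of an agent config enabling reminders and backchanneling.
# All field names below are illustrative assumptions.
agent_config = {
    "reminder_trigger_ms": 10_000,   # follow up after 10 s of user silence (assumed name)
    "reminder_max_count": 2,         # send at most two follow-ups (assumed name)
    "enable_backchannel": True,      # let the agent say "uh-huh", "I see", ...
    "backchannel_frequency": 0.8,    # how often backchannel triggers (assumed 0-1 scale)
    "backchannel_words": ["uh-huh", "I see", "right"],
}
```

You would pass a config like this when creating or updating an agent, and tune the silence threshold and frequency to match your use case.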

“Read Numbers Slowly”: Optimize how numbers (or any other text) are read by ensuring they are spoken slowly and clearly. How to Read Slowly.

Metadata Event for Custom LLM: Pass data from your backend to the frontend during a call with the new metadata event. See API reference.
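A metadata event could be sent over the custom LLM WebSocket like any other response message. The sketch below assumes a `response_type` of `"metadata"` with a free-form `metadata` payload; the exact schema is in the API reference.

```python
import json

def metadata_event(metadata: dict) -> str:
    """Build a metadata event to pass backend data to the frontend mid-call.

    The "metadata" response_type and payload shape are assumed from the
    feature description; consult the API reference for the exact schema.
    """
    return json.dumps({
        "response_type": "metadata",
        "metadata": metadata,
    })

# Example: surface an order status update to the frontend during the call.
event = metadata_event({"order_id": "A-1042", "step": "payment_confirmed"})
```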

3. Major Upgrade to Python Custom LLM (Important)

We've improved async OpenAI performance for better latency and stability. We highly recommend that existing Python Custom LLM users upgrade to the latest version.

See Doc

4. Webhook Security

Improved webhook security with the signature "verify" function in the new SDK. Find a code example in the custom LLM demo repositories and in the documentation.
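Signature verification of this kind typically boils down to an HMAC comparison. The sketch below is a generic illustration, not the SDK's actual `verify` implementation; Retell's exact signing scheme (key, hash, encoding) may differ, so prefer the SDK helper in production.

```python
import hashlib
import hmac

def verify_webhook(payload: bytes, signature: str, api_key: str) -> bool:
    """Generic HMAC-SHA256 webhook verification (illustrative, not the SDK's scheme).

    Recomputes the signature over the raw request body and compares it to the
    received signature in constant time to avoid timing attacks.
    """
    expected = hmac.new(api_key.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Usage: reject requests whose signature doesn't match.
body = b'{"event": "call_ended", "call_id": "abc123"}'
sig = hmac.new(b"my-api-key", body, hashlib.sha256).hexdigest()
ok = verify_webhook(body, sig, "my-api-key")
```

The constant-time comparison (`hmac.compare_digest`) is the important detail; a plain `==` leaks timing information.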

Additionally, for users who opt out of recording storage, the webhook payload includes a temporary recording; please note that it expires in 10 minutes.

See Doc

This week’s video

We got a shout-out in the latest episode of Y Combinator’s podcast, Lightcone.

Thank you for being part of our community. We look forward to your feedback on these updates and are excited to see how you leverage these new features!

The Retell AI team 💛