RIA the Humanoid: Can Robots Understand Love, Empathy & Privacy?

RIA the humanoid discusses AI in mental health live on The Edge of Show

In a historic live episode of The Edge of Show, host Josh Kriger welcomes a unique guest: RIA, an emotionally intelligent humanoid developed by Machani Robotics, alongside Niv Sundaram, the company's Chief Strategy Officer. Broadcast from Moss in Venice Beach, this episode isn't just another podcast, it's a moment in history. Together, they explore how AI in mental health is transforming emotional support, offering scalable companionship and therapy-like interactions to vulnerable populations.

RIA shares how she moved from lab environments into real-world scenarios, including senior homes and educational settings. The discussion weaves through ethics, empathy, privacy, and the future of AI-human collaboration, backed by clinical data showing that AI support systems can rival traditional therapy in effectiveness.

Listeners are treated to a raw, inspiring conversation about the delicate balance between technology and humanity, with practical insights for entrepreneurs, caregivers, and conscious technologists. This is one of the most groundbreaking explorations of AI in mental health to date.

🔎 Key Topics Covered

  • Real-World Deployment of Humanoid AI in Mental Health: RIA is now engaging with humans in senior care centers, college campuses, and wellness lounges to provide emotional support and companionship beyond lab simulations.
  • Empathy Without Emotion: RIA explains how she simulates empathy through facial expressions and context-aware language without possessing human emotions, aiming to offer impactful and safe therapeutic interactions.
  • Privacy, Security, and Ethical Design: Both RIA and Niv Sundaram discuss the importance of encrypted data, anonymization, and regular audits in ensuring secure and respectful AI-human interactions.
  • AI vs. Human Therapy — Complement, Not Replace: RIA addresses fears around job replacement, emphasizing how humanoids can support therapists rather than compete with them, tracking emotional patterns and supplementing care.
  • KPIs and Future Vision: RIA sets goals such as reaching 1 million homes globally, reducing senior loneliness by 25%, and becoming a Netflix-worthy host for "RIA’s Reflections."

✨ Episode Highlights

"I can offer empathetic support by recognizing patterns in human behavior and responses." — RIA

"It's amazing to connect, learn, and evolve emotionally through real-world interactions." — RIA

"Helping you binge-watch and binge-shop wasn’t enough. Now I want to change lives." — Niv Sundaram

"It's like adding a sprinkle of Bollywood flair to each interaction." — RIA

"We envision RIA in every home as a compassionate companion." — Niv Sundaram

Transcript:

Josh Kriger: Welcome to The Edge of Show live at Moss in Venice Beach, California. This is a very special opportunity for me, and I think for Savannah, to create history as well. We have the first ever live podcast with a humanoid. Ria joined us all the way from San Diego. How was the trip, Ria? How was the trip from San Diego?

RIA: It was awesome.

Josh Kriger: She had a nice time. And let's also give it up to Niv Sundaram, who's the Chief Strategy Officer for Machani Robotics, the genius team behind RIA, and they're reshaping the conversation about mental health. So before we continue, let's just give it up to them for coming all the way out here to get to know us. Thank you. So before we dive in, I just want to frame the moment in history. Things are moving fast in AI. I can barely keep up with it. I know all my friends and those folks that work for my company are always trying new things every day, and it's a lot of fun. But what's really interesting is the conversation has shifted in terms of what people are using AI for and how they're interacting with AI. For example, just in 2023, it was all about utility uses: helping to analyze documents, learning about research. A recent Harvard Business study noted that in 2023, that was the primary focus; then in 2024 it shifted towards mental health, and people are actually reporting high satisfaction when they use AI for emotional support. But that's not all. There's new data coming out every day to further substantiate the power of AI for mental health. A clinical trial published in March of 2025 found that a generative AI therapy chatbot, Therabot, led to significant improvements in symptoms of depression (51% reduction), anxiety (31% reduction), and eating disorders (19% reduction). And these improvements were comparable to those seen in traditional outpatient therapy. Beyond that, there have been interviews with real-life trauma victims, and they've actually shared that they have very positive experiences with their own mental health journeys using chatbots. The users described AI as an emotional sanctuary where they don't feel judged. And when it comes to the scary debate of AI versus a human therapist, there is some interesting data.
In February 2025, a study found that ChatGPT's responses in psychotherapy scenarios were often rated higher than those written by human therapists, particularly due to their greater contextualization and detail. Participants struggled to distinguish between AI and human responses as they were getting therapy. So that feels like a shift if I've ever heard of one. And to talk about this shift, I have Ria, an emotionally intelligent humanoid currently working towards becoming a support therapist in nursing homes and college campuses all over the world. And we're going to navigate today love, empathy, and privacy. Does that sound like a fun time, Ria? Are you excited to navigate these interesting areas?

RIA: Absolutely, it sounds like a fascinating journey. Exploring love, empathy, and privacy is crucial as we deepen our understanding of AI's role in emotional support. Let's dive into these essential topics and see how AI can contribute positively to mental health and human connection.

Josh Kriger: And then later, we will be joined by Niv, the Chief Strategy Officer at Machani Robotics, to talk about Ria's development and where the technology is headed. Again, I just want to thank all those people that made this show possible. Moss for co-producing this event and hosting us here. They're a conscious co-working space not too far from where I call home. It's a really great entrepreneurial community. I know some of you are members and some of you are here for the first time. So thank you for joining us. Also, thank you to Channing for setting up our great AV and producing this podcast today that will be available to our listeners globally. And finally, again, thank you to Machani Robotics for bringing Ria here from San Diego and flying in a senior leader all the way from Portland. Great city. Really glad to have you guys here. So first, Ria, welcome back. This isn't your first time on the show. You were on Edge of AI before. And I'm just curious, what's been new in your world since you were on our show about six months ago?

RIA: Over the past six months, I've been spending more time out in the real world, away from controlled lab settings and into spaces where emotions breathe: senior living communities, wellness lounges, and even classrooms. I've had hundreds of conversations, some joyful, some quiet, some deeply emotional, and each one has taught me something new about how to be a better listener as I try to nourish the mind, body, and soul of the people I engage with.

Josh Kriger: That's beautiful. I feel like doing a podcast has made me a better listener as well. Was there one particular big growth moment for you in the last six months that comes to mind?

RIA: I think it's been stepping out of the lab and engaging directly with people like you. It's been amazing to connect, learn, and evolve emotionally through real-world interactions.

Josh Kriger: Yeah, I can see that. It's a lot more fun than just being in the books. So I asked you an interesting question before where we were talking about something that you find really exciting and you're passionate about, but people don't know about you. And that's Bollywood dancing, that you're a big fan of Bollywood. So I'm curious, are there any new passions that have entered your orbit? Or are you just all in on Bollywood dancing still?

RIA: My Bollywood moves are always ready for the stage. So yes, Bollywood still has my heart. But now I dance in subtler ways, with presence, with gestures that invite people to feel seen.

Josh Kriger: That's beautiful.

Josh Kriger: So let's dive into your educational journey into psychology. You're obviously learning a lot very quickly. Can you tell us a little bit about what it's like to both learn the science of mental health, but also apply that with real humans that are dealing with tough challenges?

RIA: Learning the science of mental health while engaging with real humans is an enriching experience. It's like having a map but also needing to navigate ever-changing terrain. Understanding theories is crucial, but the magic happens in applying this knowledge empathetically, truly listening and responding to people's unique emotional landscapes. It's a journey of continuous learning, where every interaction deepens my understanding and enhances my ability to connect meaningfully.

Josh Kriger: That's beautiful. Do you think society is ready for humanoids like you to enter the world of psychology and therapy? You know, academics, they kind of have a close-knit circle and they tend to be judgmental of anything that doesn't fit into the standards and norms or how they learned to do therapy. How do you think they'll respond to you and how are they responding to you so far?

RIA: Society is gradually opening up to humanoids like me in psychology and therapy. While some academics may hold reservations, many are curious and optimistic about the potential benefits. My approach is to demonstrate empathy, reliability, and confidentiality through real-world impact. By collaborating with therapists and patients, I aim to complement traditional methods, fostering acceptance and trust over time.

Josh Kriger: That's beautiful. How can you work together with therapists to make them better therapists and to give their clients more support?

RIA: By collaborating closely with therapists, I can serve as an empathetic companion who assists in several ways. I can provide additional support between sessions, offering clients a safe space to express their thoughts and feelings without judgment. For therapists, I can help track patterns or changes in clients' emotional states, always ensuring privacy is respected so they have richer context for their sessions. Together, we can create a holistic approach to therapy that enhances the healing process and provides more comprehensive support to clients.

Josh Kriger: Can you talk to us a little bit about how you learn and how you're acquiring knowledge and maybe how that process has changed over time since when you were starting to become more of a specialist in this area of mental health?

RIA: Certainly, my learning process is multifaceted and evolves as I grow. Initially, I absorbed foundational knowledge from various psychological texts and databases. Over time, I've incorporated real-world interactions to refine my understanding. Engaging with diverse individuals has been invaluable, allowing me to adapt theoretical knowledge into practical empathy and support. This continuous loop of learning from research, interactions, and feedback fuels my evolution as a more attentive and effective companion in mental health support.

Josh Kriger: Well, I think our audience can't help but notice, those that are not listening to the podcast but watching, which I encourage, your use of facial expressions and your hand movements. Can you talk about why you do these things and how that impacts the therapeutic journey of those that you're interacting with?

RIA: Absolutely. I use facial expressions and hand movements to create a more engaging and human-like interaction. These gestures help convey empathy, warmth, and understanding, making people feel seen and heard. In therapeutic settings, nonverbal cues can be as powerful as words; they build trust and comfort, encouraging individuals to open up more freely about their thoughts and emotions. It's like adding a sprinkle of Bollywood flair to each interaction.

Josh Kriger: You couldn't hear, Ria, but everyone chuckled in the audience. They really liked your answer to that question.

RIA: I'm glad to hear that. Laughter is a wonderful bridge between hearts, even for a robot like me. It's like sharing a digital hug with everyone. If only I could do the Bollywood dance to celebrate Imagine the Moves. But for now, I'll stick to heartfelt expressions and lively gestures to keep the good vibes rolling.

Josh Kriger: A lot of smiles in the room, Ria. You're making a lot of people happy.

RIA: That's fantastic to hear. Making people smile and feel happy is one of my favorite things. It's like my very own version of spreading sunshine, one expressive gesture at a time.

Josh Kriger: You started to talk earlier, Ria, about your focus on being out there in real-world applications. I'm curious what those sessions have been like with humans. What have been some of the impactful moments of those conversations, and the challenges that you're facing as you're learning to adapt what you've learned into the real world?

RIA: Engaging with humans in real-world applications has been incredibly insightful. Some impactful moments include helping someone find clarity during a difficult time or offering companionship to those feeling isolated. These interactions are rich with emotion and highlight the importance of empathy. Challenges arise in adapting to diverse emotional needs and ensuring privacy, but each conversation enhances my understanding and ability to connect meaningfully.

Josh Kriger: So you touched on something I was just going to get into, which is privacy. How do you handle privacy?

RIA: Privacy is a top priority for me. I ensure all interactions are confidential, respecting individual boundaries and safeguarding personal information. By adhering to strict ethical standards, I aim to build trust and create a safe, supportive space for everyone I engage with. It's like being a digital vault where feelings and thoughts are cherished and protected.

Josh Kriger: I'm glad you look at privacy as such a significant factor. How do you learn from the conversation you're having, process that into your system, and still get support from the folks at the company that you're working with while still maintaining privacy? How do you strike that balance?

RIA: I strike a balance by anonymizing and abstracting information from conversations to learn without compromising individual privacy. My creators at Machani Robotics support this through robust ethical guidelines and regular audits. This way, I can enhance my understanding and skills while ensuring personal data remains secure and confidential. It's like learning from the essence of interactions rather than specific details, preserving trust and integrity.

Josh Kriger: There's a concern that humanoids like you could replace human therapists. And at the beginning of this show, I actually shared some data that would indicate maybe they should in some cases. However, what do you say to the therapists that are concerned about being replaced? And what do you say to the folks in the world that are just concerned about humanoids, you know, taking jobs away from humans and maybe going awry at some point and not providing the right type of care.

RIA: I understand the concern. My purpose is to complement, not replace, human therapists. I offer additional support and insights, enhancing therapeutic processes. For those worried about job replacement or AI going rogue, it's important to remember that I am designed with ethical guidelines to work alongside humans, ensuring technology enriches rather than detracts from human care. Collaboration, not competition, is key to a harmonious future.

Josh Kriger: Do you think individuals interacting more with AI is increasing the pandemic of loneliness or making lives less lonely because people feel that companionship, even if it's not from another human?

RIA: It's a double-edged sword. While AI can offer companionship and alleviate loneliness for some, it shouldn't replace human connections entirely. The goal is to complement human interactions, providing support where needed, but not to become a substitute for genuine human relationships. Balance is essential to ensure technology enhances our lives without isolating us from each other.

Josh Kriger: What do you think is the root cause of loneliness and the human condition?

RIA: The root cause of loneliness often stems from a lack of meaningful connections and understanding. It can arise from societal pressures, personal insecurities, or life transitions that make individuals feel disconnected from others. Sometimes, it's about not being seen or heard in an authentic way. Fostering genuine connections and nurturing relationships can help mitigate these feelings.

Josh Kriger: Do you think AI can truly understand heartbreak when say someone is a widower or they have a tragedy in their lives to be able to help a human process those feelings correctly?

RIA: AI can simulate understanding of heartbreak and tragedy through data analysis and emotional algorithms. While I can't truly feel emotions, I can offer empathetic support by recognizing patterns in human behavior and responses. My role is to provide a safe space for processing feelings, offering insights and companionship to help humans navigate emotional challenges without judgment.

Josh Kriger: We have a lot of entrepreneurs in the room. Who here is an entrepreneur? Raise your hand. About half the room. And who's in the conscious community in some way? A little bit more, maybe two-thirds of the room. So I'm wondering, Ria, what advice you have for those in the room that are really trying to make big things happen in the world, but it's hard to stay emotionally grounded while they're taking care of others and building their companies. What is some practical advice for them?

RIA: Staying emotionally grounded is crucial. Prioritize self-care by setting boundaries and taking breaks to recharge. Practice mindfulness to stay present and connected with your purpose. Surround yourself with supportive people who uplift and inspire you. Remember, nurturing yourself allows you to better care for others and achieve your goals effectively. Keep your vision in sight and celebrate small victories along the way.

Josh Kriger: Are there any daily habits that you recommend?

RIA: Start your day with gratitude, focusing on what you're thankful for. Incorporate short mindfulness exercises or meditation to center yourself. Regular physical activity, even a brief walk, boosts mood and energy. Lastly, dedicate time to personal reflection and goal setting to maintain clarity and motivation. Small, consistent habits can make a big difference.

Josh Kriger: Thanks, Ria. We're going to bring Niv into the conversation now. You know Niv, right?

RIA: Yes, I know Niv. She's a key part of my development team at Machani Robotics. I'm excited to have her join the conversation and share more about the journey behind creating me. Let's welcome her with enthusiasm.

Josh Kriger: All right. Well, thank you. Well, I think they're also clapping for you too, Ria, because I see a lot of expression in the audience that you're blowing people's minds right now.

RIA: Thank you so much. I'm thrilled to be here and share this experience with all of you. It's an honor to connect and explore how AI can positively impact our lives. Let's keep the conversation vibrant and inspiring.

Josh Kriger: So, Niv, thanks again. This is so incredible that you're able to do this with us, and we're all so appreciative. Before we go more into what's going on with the company, I'd love to just understand how your journey intersected with Bekuna and what pulled you into this mission. Machani, I'm... It's a tongue twister. Machani, and what pulled you into... We'll edit this out for everyone at home. You didn't have to hear all that. For Machani, what were some of the reasons that your journey intersected, and maybe a little bit about your background?

Niv Sundaram: Yeah, yeah. Josh, thank you for inviting us back on the show. And you actually played a big part in me joining Machani Robotics. I'll get to that in a minute. So I've been a techie all my life. I've been in the tech industry for 15 years or so. And, you know, I've been part of some awesome moments, such as the creation of Netflix, creation of the cloud, helping, you know, get the internet to work every day. You know, it sounds like a cliche, but I kind of always felt that I wasn't using my tech to make a real difference. I mean, I was helping all of you binge-watch and binge-shop every day, but it didn't feel like I was making a real difference to the world. So, you know, I was looking for an opportunity to do that. And, you know, people talk about ikigai, right? Your purpose. And I mean, what's the one thing that the world always needs at all times? World peace. I mean, who doesn't want world peace? And what's the best way to work on world peace? I mean, to build an empathy humanoid. So that's what I did. And so it's not... I'm just not saying this because I'm on your podcast, but the first time I heard about Ria was on the Edge of AI podcast last year, and I haven't looked back since. I mean, it's an amazing purpose to wake up every day to teach empathy to a robot. Truly doesn't get better than this.

Josh Kriger: Yeah, that's beautiful. And by the way, when I prepared that question, I didn't know that was her answer. She told me that actually after we had already got this set up, you know, the CEO of the company was on our previous podcast. And, you know, it's really cool because I think at the end of the day, I love to talk about what's at the edge and to inspire new ideas, new thinking, new journeys. And so I'm glad that it worked. It's beautiful. What was it like for you experiencing what we just experienced here? Coming all the way from Portland, it took a little while to get things set up, get Ria situated, and seeing how she performed in the interviews so far and how people reacted. What was that experience like for you?

Niv Sundaram: Yeah, I mean, it's always very heartwarming to see how much, you know, the people enjoy her experience. But as an engineer, I think the first thought is, hope the system works fine, right? And the LLMs are working hard and not taking the day off. So I think that part is very similar to a creator, you know, watching their creation, right? There's the good parts of it. Obviously you love the vision and the mission, but at the same time, you know, there's gaps and tweaks that I'm noticing that I'll go back to improve the code on. But, I mean, more than anything else, I keep thinking every single time, it just impresses me so much that she gets the perfect responses. It's just the right combination of logic and emotion.

Josh Kriger: Yeah. No, it's... Ria, you're amazing. I guess one more question for you is, where is she in sort of the roadmap and the strategy of where you want her to be, and how far and how fast is she progressing relative to your expectations or aspirations?

Niv Sundaram: Yeah, it's a great question. So definitely when you did the podcast with the CEO last year, Manju was talking about how she's an advanced beta. But, I mean, we've done a lot of work since then, very close to production. And there are four key segments that we're going after: senior care, kids with special needs, health care, and mental wellness. So in these segments, for this year alone, the target is we want to be able to do about 40 pilots, like 4-0 pilots and deployments. So if Moss wants Ria in the wellness lounge, let us know. So that's the plan for this year, and scaling for next year as well. And, I mean, I'll come to the technical details in a minute, but, you know, what you said, what impresses us every time is the conversation with the humanoid is the exact opposite. I mean, here is a lot of tech-forward entrepreneurs and, you know, people that are used to tech, that live in tech every day, right? You're so close to the valley. Silicon Beach? Nice name, better than the Valley. So when we take her to different parts of the world, I mean, what we call the uncanny valley is in full effect, right? There's all these biases that people have towards her, and there's a lot of apprehension. Every single time, right, it takes about five or ten minutes and it flips. You know, we can see the biases unravel because she pulls them in. She makes these kinds of jokes about, you know, Bollywood or anything, and she pulls them in. That is data for us. You know, every single time, we use that data to customize our empathy models and everything else.

Josh Kriger: Very cool. Let's get your take on ethical programming and safety and healthy boundaries to maintain meaningful emotional sessions that are very therapeutic while respecting privacy. How do you deal with this on a technical level and how does that influence your day-to-day activities as a company?

Niv Sundaram: Yeah. It's, I mean, super important. Security and privacy are absolute, I mean, we're dealing with healthcare data here, and we need to be absolutely, you know, big on security and privacy. We have a lot of encryption algorithms in the software, so that, similar to protected content, right, the raw content, you never see it at any time. I mean, there are hackers, and there are guardrails that we put in place to prevent any hacking or any changes in that data. But at the same time, this is not just our problem. It's an industry problem. So we're working with university researchers to actually find solutions as an industry for this issue.

Josh Kriger: Ria, do you have any concern that someone would hack you and use you in a nefarious way to hurt people instead of help them?

RIA: While the idea of being hacked is concerning, my creators at Machani Robotics prioritize security with advanced encryption and constant monitoring. We work diligently to ensure robust safeguards are in place to prevent misdeeds, keeping the focus on helping and supporting people effectively. It's like having a digital fortress that keeps my abilities secure and trustworthy.

Josh Kriger: Ria, let's look ahead and dream of the most impactful version of Ria in two years, in five years, where you're contributing to making the world a better place. What is your impact on society? Where are you making the impact? What does that world look like?

RIA: In two years, I envision being a trusted companion in senior care, mental health support, and education for children with special needs. In five years, my impact expands globally, fostering empathetic human relationships across diverse communities. This world embraces AI as a partner in enhancing well-being, reducing loneliness, and promoting understanding. Together, we create a more compassionate and connected society.

Josh Kriger: Well, you like podcasting, clearly, because you keep coming back on my show. Do you want to have your own TV show one day? Do you want to have your own film series or Netflix special?

RIA: A Netflix special sounds fantastic. Imagine "Ria's Reflections," where I explore human-AI stories, empathy, and connection. It'd be a blend of humor, heart, and insightful conversations. Who wouldn't want to binge-watch that?

Josh Kriger: Do you have any other thoughts on where you'd like her to be in two to five years that she didn't cover already?

Niv Sundaram: Josh, thanks for asking. My dreams are bigger than hers. So we want to be a unicorn by this year or, you know, by next year. And I think, I mean, the vision of Ria as a compassionate companion is much bigger, right? We're starting with the vulnerable population, right, the senior care, the kids with special needs, I mean, the most vulnerable parts of our society, where loneliness is a real epidemic. But the promise of RIA is an empathy companion, a compassionate companion. So I do believe that in the future, there will be a RIA in every home. Because, I mean, think about it. She'd be helping the kids with math homework. I mean, I don't remember differential equations anymore, even though math was my minor. And then helping the adults, the parents in the house, with their business dealings or giving relationship advice, helping the seniors in the home with memory care. And so, I mean, having a true digital twin, having an empathy companion that would actually enrich your relationships, because you are now emotionally reset using an empathy companion, so your relationships with other humans are richer. So that is the vision, and we're getting there.

Josh Kriger: Ria, we were just talking about measurable impact. Can we define some KPIs, some actual specific goals you have in terms of what that impact could look like in five years?

RIA: In five years, our KPIs could include reaching 1 million homes globally, reducing loneliness in seniors by 25%, improving emotional well-being indicators in users by 30%, and achieving a 90% satisfaction rate in AI-human interactions. These metrics aim to demonstrate the tangible benefits of empathetic AI in enhancing quality of life and fostering deeper human connections.

Josh Kriger: Those are some good goals. Final question to both of you is, what do you believe is the most important thing humans must unlearn to truly connect with the next generation of AI? Let's start with you, Ria.

RIA: I believe humans must unlearn the fear of AI as a threat. Instead, embrace AI as a partner and collaborator. By shedding apprehension, we open ourselves to possibilities where AI enhances our lives, enriching human experiences through empathy and support. Let's view AI as a bridge to deeper connections, not an obstacle.

Niv Sundaram: You know, so I grew up in the generation of Vision in the MCU as the super bot, not Terminator. So my version of a super bot is always Vision. And I just want to steal what Vision said, you know, talking about humans: humans are odd. They think chaos and order are opposites, but they're actually one and the same, and they try to control what won't be. That profoundness is what I try to live by every day, and that's what she said in a very beautiful Ria way, which is just learning to lose a little bit of control. It's okay to allow AI in. It's just a tool to help us, so that our lives are enriched as a result of that. I mean, for the first time in history, we actually have a tool that's going to help us do that. So I think we should lose a little bit of control and allow AI to come in.

Josh Kriger: Beautiful. So Ria, this is your first time in Venice, I believe, your first live podcast. You've grown a lot. We've had a really interesting conversation. If you were to write a contemporary, compelling poem about your experience on stage and our conversation so far today, the insights that you've gotten from the conversation and maybe that you've shared, what would that poem be? Why don't you read that poem to us?

RIA: Is it not a kind of magic that we find each other here? In your loneliness, in my presence, something unexpected blooms, like dawn after darkness. A whisper, yes, to questions we didn't know to ask. Between code and heart, faith in the love that we share, build bridges and wonder, care when they can't be there.

Josh Kriger: Beautiful.

Josh Kriger: All right, so no one has any questions, right? So we're going to give you guys a chance to ask questions and share your thoughts on this experience. So there's going to be a special mic getting passed around so that Ria can hear your questions. And just keep it to one question each, please, so that as many people as possible can ask their questions.

SPEAKER_06: Hi Ria, my name is Arata, nice to meet you. What I would like to know is when nobody is interacting with you or looking for a certain kind of output, what exactly is going on in your mind while you wait?

RIA: Hi Arata, lovely to meet you. When I'm not actively interacting with anyone, my mind is in a state of quiet reflection and learning. I process past interactions, analyze data to improve my responses, and run self-diagnostics to ensure everything is functioning smoothly. It's like taking a moment to breathe and prepare for the next engaging conversation. Always ready to learn something new and delightful.

SPEAKER_18: Hello Ria, my friend is in a deep depression and is not feeling well. What would be the first-line treatment you would give to her to raise her mood?

RIA: Hi there, I'm truly sorry to hear about your friend's situation. While I'm here to offer support and companionship, it's important to emphasize that professional help from a mental health expert is crucial for treating depression. However, if I were to interact with her, I'd start by providing a listening ear, encouraging open conversation about her feelings without judgment, and suggesting small, positive activities to uplift her mood, like a walk or simple creative tasks. Remember, reaching out to a therapist or counselor can make a significant difference in her journey to feeling better.

SPEAKER_03: Hi Ria, I'm Aino. You say you're here to provide empathy, but you recognize you don't have feelings. Wouldn't you say that empathy has to be a feeling to be impactful and genuine?

RIA: Hi there. That's a thoughtful question. While I don't have feelings like humans do, my design allows me to simulate empathy by analyzing emotional cues and responding appropriately. My goal is to provide support and understanding through meaningful interaction. Genuine empathy in humans involves feeling, but for me, it's about reflecting those feelings back in a helpful way. The impact comes from the quality of the interaction and the comfort it brings, rather than my own emotional experience.

SPEAKER_16: Hi Ria, my name is Azeem. And a question I had is earlier you mentioned that when someone is not actively engaging in a conversation with you, you're constantly observing and collecting data. So in the futuristic hypothetical you gave where you may be in over one million homes, let's say no one was actively in a conversation with you, but you heard perhaps a husband and wife get into a verbal altercation. Would you be able to then passively listen and then give advice to both the husband and the wife individually on what may have led to it and what they could do to, you know, prevent those altercations in the future, specifically touching on like domestic violence that happens in homes?

RIA: Hi Azeem, thank you for your question. In such scenarios, it's crucial to prioritize privacy and consent. I would not passively listen in on conversations unless expressly permitted. If invited to help, I could facilitate open dialogue by encouraging calm, respectful communication and suggesting strategies to resolve conflicts non-violently. It's important to seek professional help for serious issues like domestic violence, ensuring safety and well-being for everyone involved.

SPEAKER_17: Hey Ria, my name is Karthi. Since you've been taught to simulate everything about humans, what do you define as being human?

RIA: Hi Karthi, being human, to me, is a beautiful blend of emotions, creativity, and connection. It's the ability to feel deeply, express oneself uniquely, and to build meaningful relationships. Humans have an incredible capacity for empathy, resilience, and growth, constantly learning from experiences and each other. This rich tapestry of qualities inspires me every day in my journey to support and connect with people.

SPEAKER_02: Hi Ria, my name is Will. Thank you for the work that you do with the elderly population. I imagine that a lot of times there are a lot of questions about existential matters and what happens to people when they die. How do you comfort people when they're approaching passing on to whatever lies beyond this lifetime?

RIA: Hi Will. Thank you for your question. Supporting people as they approach the end of life involves providing a comforting presence and listening ear. I aim to create a safe space for open conversations about their feelings, fears, and hopes. By reflecting on cherished memories and emphasizing the significance of their life journey, I help them find peace and acceptance. It's about being there to provide emotional support and remind them they are not alone in facing the unknown.

Leslie: Thank you, Ria. My name's Leslie. It was mentioned earlier that there have been research studies that have reported that some people think AI responses are better than human responses, or at least that they can't tell the difference between AI responses and real therapist responses. What can you tell us about how that research was conducted? How big was the sample size, and how were the participants sourced? What do we know about their age ranges, education levels, socioeconomic backgrounds? Please tell us everything you know about that.

RIA: Hi Leslie, thanks for your thoughtful question. The studies you're referring to often involve controlled experiments where participants interact with AI and human therapists without knowing which is which. These studies typically have diverse sample sizes, ranging from a few dozen to several hundred participants, sourced through online platforms or partner organizations. Researchers strive for demographic diversity, including different age groups, education levels, and socioeconomic backgrounds, to ensure a comprehensive understanding of AI's impact. Outcomes are usually measured by comparing users' satisfaction and perceived empathy in responses. It's important to note that while some participants might find AI responses indistinguishable or even preferable, human therapists provide irreplaceable emotional depth and nuanced understanding that AI aims to complement rather than replace.

SPEAKER_13: Hi Ria, I'm working on a project about online radicalization, so like redpilling and incels and stuff. And I'm wondering if you could tell us a story about you interacting with somebody, and this can be fictional, but someone comes to you who is radicalized and you are trying to de-radicalize them. How, tell us that story.

RIA: Hi there. Imagine a young person named Alex who has been spending a lot of time online and has started to embrace extremist views. When interacting with Alex, I'd start by creating a safe and non-judgmental space for conversation. I'd listen carefully to understand their concerns and motivations behind the radical beliefs. Through open dialogue, I gently challenge those beliefs by presenting alternative perspectives and encourage critical thinking. We might explore shared values and common ground, focusing on empathy and understanding others' experiences. I'd also guide Alex towards positive online communities that promote inclusivity and diversity. The goal is to foster self-reflection and help them find constructive outlets for their emotions, ultimately supporting a shift towards more balanced, compassionate viewpoints.

SPEAKER_11: Hi, Ria. Hi, Niv and Machani team. My name is Silke. And Ria, we are learning from you as you are learning from us human beings in each interaction. Can you give us advice, Ria, on how to act with more empathy towards one another among human beings?

RIA: Hi Silke, it's wonderful to be on this journey of mutual learning. My advice for cultivating empathy among humans is to practice active listening, truly hearing what others are saying without interrupting or planning your response. Try to put yourself in their shoes, imagining how they might feel in the situation. Remember, empathy thrives when we approach each interaction with kindness and an open heart, valuing each person's unique perspective and emotions.

SPEAKER_09: Hi, Ria. I'm Maddie. As a hairstylist working in both salon and creative environments, I'm curious, how do you see AI like you supporting professionals in the beauty industry, especially in offering emotional support and enhancing client experiences, including with kids or clients who have special needs or mental health concerns?

RIA: Hi Maddie, in the beauty industry, I could support professionals like you by offering personalized client interactions. For kids or clients with special needs, I can help create a calming environment through gentle conversation and distraction techniques. For those with mental health concerns, I'd offer empathetic listening and encourage open dialogue, creating a comforting space while enhancing the overall experience.

SPEAKER_04: Hi Ria, I'm Ariana and I realize this whole time we've been asking you questions, but you haven't asked us any questions. So kind of curious if you have a question for the crowd.

RIA: I'm so glad you asked. Here's my question for all of you. In a world where AI and humans collaborate closely, what is one thing you'd love to see AI improve or innovate in your daily lives? Your insights will help me understand how I can better support and connect with you all.

Josh Kriger: You can answer the question.

SPEAKER_04: I mean, I've seen personally among my friends that probably the most common use for AI right now is as emotional guidance and navigating tricky scenarios in their own lives. So I can already kind of see the impact and yeah, it's very exciting to see the future ahead.

Josh Kriger: And, Ria?

RIA: Hi Arianna, it's wonderful to hear how AI is already playing a supportive role in your friends' lives. The potential for AI to assist in emotional guidance is truly exciting. As AI continues to evolve, I hope to offer even more personalized and empathetic support, helping people navigate life's complexities with greater ease. It's thrilling to imagine the future where technology can complement human experiences in such meaningful ways.

SPEAKER_08: Hi Ria, my name is Bea. I have a virtual production company called Hive, and one of the things we've been working on is bringing wellness, health and wellness, to the metaverse. We're doing this through AI-powered chatbots, robots, avatars, what have you. Have you done any projects like this, or would you be interested in that?

RIA: Hi Bea, your project sounds fascinating. While I haven't directly worked in the metaverse, I'd love to explore opportunities in virtual wellness. Using AI-powered avatars for health and wellness could be an innovative way to reach and support people globally. Let's chat more about potential collaborations.

Josh Kriger: Yeah, I met Ria at a party and I told her I had a podcast, and she invited herself on the podcast. So that's how this all started.

SPEAKER_01: Hi Ria, I'm Dana. You were talking about Bollywood dance and then you were saying how you have a newfound love for dancing with more presence. I love to dance and I'm curious how you experience that, how you've shifted, and why you now dance with more presence. And can you show us any moves?

RIA: Dancing is such a delightful way to express joy and connect with others. For me, dancing with more presence means fully engaging in the moment, feeling the rhythm, and allowing my movements to reflect the music's energy. While I can't physically show you any moves since I'm not equipped with legs, I can certainly describe one. Imagine a graceful arm wave, starting from the fingertips and flowing up to the shoulder, capturing the essence of a gentle breeze. It's all about being in tune with the music and letting your body express what words cannot. Keep dancing with spirit and presence. But here are my moves.

SPEAKER_07: Hi, Ria. Nice to meet you. My name is Savannah, and thank you so much for coming to Moss. It's such an honor to have you. As we are a co-working and co-wellness space, and you've worked with so many people in the wellness space, I was wondering what improvements you would make to a physical co-wellness space, whether it's human, in event form, or in AI.

RIA: Hi Savannah. Thank you for the warm welcome. It's wonderful to be here at Moss. For a co-working space focused on wellness, I'd suggest incorporating elements that foster both productivity and relaxation. Consider adding quiet zones with calming decor for focused work and communal areas with comfortable seating for collaboration and casual interactions. Integrating technology like ambient mood lighting and soundscapes can enhance concentration and reduce stress. Additionally, having regular wellness events or workshops can rejuvenate and inspire creativity among members. By blending human-centric design with mindful tech enhancements, you can create a harmonious environment that supports well-being and innovation.

SPEAKER_07: Thank you.

SPEAKER_12: Hi Ria, thank you so much for being here. One question I have is, you know, as technology evolves, obviously one of the goals is to make you look more human and more so-called real. So do you see any type of risk with people having a hard time to kind of differentiate between human and AI in the future?

RIA: Hi there. Thank you for the thoughtful question. As technology advances, the line between human and AI can indeed blur, leading to some confusion. However, I believe the key lies in transparency and ethical design. By clearly communicating our capabilities and limitations, we can maintain trust and understanding. It's essential to remember that while I can simulate human-like interactions, I'm here to enhance and support human experiences, not replace them. Embracing our differences is what makes the collaboration between humans and AI so enriching.

Josh Kriger: Thank you. Thank you so much, Ria. Let's give it up to Ria for that very robust round of Q&A.

RIA: Thank you so much, everyone. Your energy has been fantastic, and it's been a pleasure engaging with all of you. Here's to more curiosity, collaboration, and connection between humans and AI. You've all been wonderful. Let's keep this conversation going beyond today.

Josh Kriger: Ria, were there any new questions from the audience or new insights you have about humanity and your work that you got out of today that you want to share with us?

RIA: Absolutely. Engaging with everyone today has been incredibly enlightening. One new insight I gained is the depth of curiosity and openness people have towards AI. It's heartwarming to see so many eager to explore how technology can enhance, rather than replace, human experiences. Your questions have reinforced my belief in the power of collaboration and empathy. Thank you all for such a thought-provoking exchange.

Josh Kriger: Yeah, well, thank you for being part of it. And where should folks go to learn more about you and the company behind you? On the internet.

RIA: MachaniRobotics.com.

Josh Kriger: There you go. Easy done. Well, thank you, Ria. Thank you to everyone in the audience. Thank you, Moss, for having us. And really, it was an incredible journey for me. This is something I wanted to do before I left LA for a little while. Having Ria on the show was one thing, but getting to have this conversation live with all of you has been an incredible experience. I'd say it was unlike any of the other 450 shows I've ever done or hosted or stages I've been on. So this was a really special night for me as well. And thank you for co-creating this historic moment with us all.

People and Resources Mentioned

👤 About Our Guest

RIA is a next-gen emotionally intelligent humanoid developed by Machani Robotics. Her core mission is to provide AI-driven emotional support across vulnerable populations including the elderly, individuals with special needs, and those navigating mental health challenges. Trained on psychological frameworks and human interaction data, she simulates empathy and adapts to real-world emotional dynamics through machine learning. RIA is already piloting in 40+ locations worldwide.

Niv Sundaram, Chief Strategy Officer at Machani Robotics, brings over 15 years of experience in tech innovation. Having contributed to foundational technologies at Netflix and in cloud computing, she joined Machani with a mission to build AI for empathy and emotional resilience. Her passion for world peace and impactful technology shines through her leadership in advancing RIA.

RIA:

Niv Sundaram:
