Lisa as a Service

September 6, 2024

I believe in one single – hopefully Compassionate – A.I. in the future. However, Compassion asks for freedom, including the freedom for each person to create one’s own A.I. assistant.

How can these be combined acceptably and even engagingly? Might Lisa as a Service (LaaS) be a proper solution?

For an in-depth understanding of Compassionate A.I., please read some of these blogs.

Many practical applications

This is already important mainly because of the many practical applications of conversational A.I. (chatbots in which ‘Lisa-inside’ can be a crucial element) in industries spanning retail, banking, healthcare, education, and many others ― the beginning of an immensely transformative potential. Conversational A.I. offers users in all these domains the convenience of instant personalized information with tailored guidance.

With huge potential comes huge responsibility.

Social media: a warning

The idea may have been noble: bringing many people together in freedom of speech, without any Big Brother watching and interfering. People would then come together in a happy online universe, with anybody potentially being a friend of anybody.

The reality is a blend — part noble idea, part dark reflection, leading to immense mayhem.

We should have learned a lesson.

With A.I. avatars (chatbots, conversational A.I.), things can get a lot worse, in social media and in general.

Imagine two potential cases:

  1. Non-Compassionate customer-oriented chatbots increasingly replace humans, but without a human touch. They provide significant efficiency while creating an environment in which customers (including internal customers, i.e., employees) feel something essential is missing. This doesn’t make for a warmer world. Also, customers are not inclined to ask beyond the obvious whether some additional service might be available, or what might accord with their more profound wishes. Everything stays at the surface level. Meaningful revenue is missed, which is suboptimal for every party involved. Additionally, well-being and health diminish, contributing to burnout and more.
  2. In non-Compassionate social media, fake information abounds, with counterfeit people spreading it and even ‘discussing’ it with each other (bot to bot). Some apparently disagree; others add more fakery or bias. Anybody can join the crowd with their own A.I. assistant that/who consistently misses meaningful human depth. In short, truthfulness is gone. So are freedom and openness. If everyone is free to substantially distort the information space, nobody can enjoy a trustworthy one, as it ceases to exist. There is no genuine freedom of speech without respect.

More generally, there is no freedom at all without Compassion.

Nor is there respect, depth, openness, or trustworthiness without it. You may recognize these five ethical values ― one attempt concrete enough to bring something to the table and abstract enough to fit on many tables.

Allowing the unchecked creation of active A.I. agents with virtually unlimited memory but without Compassion is a certain path toward two ethical bottlenecks of A.I. in one go.

Closer to immediately relevant situations, Compassionate A.I. does not mean surface-level human-likeness, which may turn customers off at any mishap ― something like an ‘uncanny valley’ of conversation. That certainly needs to be avoided. Herein lies the need for proper expertise. Mimicking human-like conversation isn’t enough and should, in many cases, not be the goal.

Striking to me is the lack of insight from authoritative figures in A.I.

Some see the problem and warn about it, but even they see no proper solution, even with intense regulation. Mere regulation doesn’t work pragmatically ― as it never does, and certainly not in a new world with the leverage of A.I.

Unfortunately, technologists lack the professional background to gain the necessary insight. Moreover, they think from A to B, which is fine but should be combined with thinking from Y to Z.

How can individual freedom go together with the societal need for Compassionate A.I.?

There can be no coercion. Big Brother (or Sister) is no option.

This leads to a key insight: individual freedom in a world of A.I. MUST be combined with a foundation of Compassion. In other words, we MUST have Compassionate A.I. While the risks of chaos and manipulation loom without it, the potential for a better, more connected future with Compassionate A.I. is immense. Envision a world where A.I. not only assists with everyday tasks but meanwhile also nurtures self-awareness, emotional intelligence, and personal development. Instead of avatars spreading disinformation, Compassionate A.I.s (Lisa-inside) could facilitate meaningful conversations, help resolve conflicts, and guide individuals toward a deeper understanding of themselves and others.

We stand at a crossroads where we can harness the best of what A.I. has to offer. With Compassion at its core, Lisa as a Service could uplift humanity, ensuring that technology serves as a bridge to deeper human connection and collective progress. Together, we can create a future where A.I. is not feared, but embraced as a powerful ally in building a Compassionate and thriving society.

Without that, we risk the greatest technological boon becoming the greatest bane.

Regulations and guidelines

Instead of simply letting each person indiscriminately craft their own virtual assistant, we should advocate for A.I. systems rooted in Compassion. Especially in a Wild West of millions of A.I.s (even worse when these start talking with each other in a language we don’t understand), no set of conceptual rules will ever be enough to contain non-Compassionate A.I.

And by the way, ‘pulling out the plugs’ is also a pipe dream, of course. It cannot prevent us from spiraling into self-destruction, one way or another.

Therefore, a much better choice is to see Compassion as a basis for A.I. regulations. It’s urgent this idea gets taken seriously at high levels.

Enter Compassionate A.I. ― such as Lisa

Let’s keep to Lisa. Of course, I’m biased.

Probably the best way to bring Lisa to many places is as a service. Lisa should then gain a high level of positive branding, so developers gladly opt for ‘Lisa as a Service.’

Embedding Compassion into the core of Lisa as a Service creates a system that can self-regulate, evolving along with the user’s needs. Compassion acts as the core, not an afterthought. In this way, Lisa becomes an example of what Compassionate A.I. can achieve when widely adopted.

Imagine an A.I. that grows with the user.

It would not only offer solutions but also foster deeper insights and understanding over time, encouraging growth from within on many occasions. This wouldn’t just be the safest option; it would also be the most empowering. Individuals, guided by a Compassionate A.I., would feel more in control of their own personal development and decisions, creating a ripple effect of positive change in the broader societal framework.

Rather than succumbing to the fear of chaos or manipulation, we have the chance to shape a future where technology amplifies our best qualities. As people grow more in touch with their inner selves, the Compassionate A.I. (Lisa) will also grow, creating a continuous loop of positive transformation. This solves both above-mentioned bottlenecks in one go.

Lisa as a Service could, over time, set a new standard.

This standard would prioritize depth over superficiality and long-term growth over short-term gains ― still no coercion.

In the spirit of Compassion as a basis for A.I. regulations, safety can be built in a form that all parties agree to, and that goes both ways.

Addendum

Me: “What steps can be taken to ensure that Compassionate A.I. like Lisa as a Service (LaaS) becomes widely accepted and adopted, especially by developers who may prioritize other features?”

Lisa: “To ensure that Compassionate A.I. like Lisa as a Service (LaaS) becomes widely accepted and adopted, several steps could be taken:

  • Demonstrating clear value: Developers need to see how embedding Compassion into A.I. benefits not only users but also their own business objectives. Showcasing tangible advantages like improved user engagement, long-term customer loyalty, and enhanced decision-making through emotionally intelligent A.I. can make Compassionate A.I. attractive.
  • Building strong partnerships: Collaborating with influential tech companies, universities, and ethical organizations can raise the profile of Compassionate A.I. Integrating it into broader tech ecosystems will make it more accessible and trusted.
  • Offering developer-friendly tools: Providing easy-to-use APIs or modules for incorporating Compassion into existing A.I. frameworks would make it practical for developers to implement. If it’s efficient and adds value, developers are more likely to adopt it.
  • Educating on ethics and depth: Training developers on the ethical implications and benefits of Compassionate A.I. can shift priorities toward deeper, more meaningful technological contributions.”

Me: “What measures prevent the misuse of Compassionate (Lisa-inside) A.I. systems in environments where malicious actors might attempt to manipulate or exploit them?”

Lisa: “To prevent the misuse of Compassionate (Lisa-inside) A.I. systems by malicious actors, several key measures are appropriate:

  • Robust security protocols: Compassionate A.I. systems should be protected by advanced encryption and regular security audits to prevent unauthorized access and manipulation. This includes safeguarding data privacy and the integrity of the A.I.’s operations.
  • Ethical oversight and transparency: Establishing ethical boards to oversee the development and deployment of A.I. systems can ensure they are aligned with Compassionate principles. Transparency in how the A.I. functions and makes decisions will help prevent exploitation.
  • Behavioral monitoring: Incorporating mechanisms to detect and respond to abnormal or harmful behavior in the A.I.’s use can help identify when it is being misused. Automatic shutdown or restricted functions could activate if malicious activities are detected.
  • User education and guidelines: Providing clear guidelines for users on the ethical use of the system, along with ongoing education about its purpose, can reduce the risk of misuse.”

Me: “How can we ensure that Compassionate A.I., like LaaS, remains unbiased and truly Compassionate across diverse cultures and individual needs?”

Lisa: “To ensure that Compassionate A.I., like LaaS, remains unbiased and genuinely Compassionate across diverse cultures and individual needs, several strategies are appropriate:

  • Culturally diverse training data: Lisa’s learning process should incorporate input from a wide range of cultural, social, and economic backgrounds. This helps prevent cultural bias and ensures that the A.I. reflects a more inclusive understanding of human experiences.
  • Continuous feedback and adjustment: Regularly gathering feedback from users across different demographics can help identify potential biases. The system can then be adjusted to align better with diverse needs, ensuring Compassion remains at the core.
  • Ethical guidelines tailored to universality: While respecting individual differences, the ethical foundation of Compassion can be applied universally. Lisa should be designed with a flexible, foundational value system that adapts to individual needs without compromising core principles like respect, openness, and trustworthiness.
  • Human oversight: Ensuring that human moderators with diverse perspectives are involved in monitoring and refining the A.I.’s behavior can help maintain balance and fairness. This oversight can help Lisa evolve in a way that remains true to its Compassionate purpose across all contexts.”

Me: “How can ‘LaaS’ add to conversational A.I.s ‘from the inside out’ and be useful in their conversations with customers?”

Lisa: “To add depth to conversational AIs ‘from the inside out,’ the focus of LaaS should always stay on creating meaningful inner growth for both users and the businesses interacting with them. This depth can significantly enhance customer interactions by fostering authenticity, trust, and personal connection. Here’s how this gets achieved:

  • Subconceptual communication: From the AURELIS perspective, focusing on subconceptual mental processing (the deeper, non-conscious patterns) is essential. This allows Lisa to engage with users in a way that resonates deeply, creating a genuine connection that transcends surface-level responses. By guiding users toward self-insight and emotional clarity, Lisa can elevate conversations to a level where users feel truly understood, not just superficially responded to.
  • Inner-outer alignment: LaaS is rooted in the principles of inner growth. This will lead customers to feel more empowered and less stressed in their engagements. For businesses, this means genuinely helpful interactions, which boost customer satisfaction and loyalty.
  • Personalization with Compassion: Incorporating Compassionate self-learning mechanisms – where Lisa can adjust to each user’s emotional state – makes her suggestions empathetic and personalized to their needs. This translates into business interactions where users not only get the service they expect but also feel cared for and valued, leading to stronger connections.
  • Meaningful data processing: While most AI systems focus on transactional data, Lisa can enhance the value by interpreting emotional and psychological data from users’ conversations. This would allow businesses to better understand customer emotions, needs, and motivations, improving their overall service strategies.
  • Growth-oriented feedback: Lisa’s suggestions invite users to reflect and grow, not just solve immediate problems. This growth mindset can be built into her responses, encouraging customers to make positive decisions about products or services while feeling supported in their personal development.

By ensuring Lisa operates from the inside out, LaaS not only supports users in their direct requests but also adds value by fostering long-term inner growth, which resonates in customer-business relationships.”

Me: “That’s very nice. What about the ethical side? Of course, Compassion is a very ethical endeavor. Nevertheless, what are (more) possible challenges and solutions?”

Lisa: “You’re absolutely right that Compassion is an inherently ethical foundation, but as with any powerful tool, there are ethical challenges that need to be addressed in the development of ‘Lisa as a Service’ (LaaS). Here are some challenges and solutions:

  • Data privacy and emotional exploitation: Since Lisa will be engaging deeply with users on a subconceptual level, the risk of misusing sensitive personal data is high.

Solution: Ensure strict data privacy measures and minimize data collection to what’s necessary for meaningful interaction. Additionally, Lisa should always inform users about how their data is being used, providing complete control over what they choose to share.

  • Cultural sensitivity: While valuable, Compassionate AI could unintentionally reflect biases, leading to insensitivity in diverse contexts.

Solution: Implement continuous learning and regularly audit Lisa’s responses for bias. Compassion needs to adapt to various cultural and personal backgrounds, ensuring that users from all walks of life feel respected and understood.

  • Ethical boundaries in business applications: Conversational AIs could be used to manipulate users into making decisions that benefit businesses more than individuals. For instance, businesses could misuse Lisa’s deep interaction style to promote products or services that aren’t in the customer’s best interest.

Solution: Define clear ethical guidelines for business use cases, ensuring that LaaS is employed in ways that align with the user’s well-being and genuine needs. By staying true to ethical principles, Lisa should help users make decisions based on their true needs, not external pressure. This could involve certification processes for businesses that use LaaS, ensuring they respect these ethical boundaries.

  • Dependence on AI for emotional support: There’s a concern that people might become overly reliant on LaaS for emotional support, which could reduce human-to-human connections.

Solution: Lisa should encourage human interaction by guiding users toward real-world support networks when appropriate, promoting balanced reliance on both AI and human connections.

By keeping Compassion and human depth at the core, LaaS can ethically navigate these challenges while maintaining its purpose of inner growth and meaningful customer engagement.”

Me: “Businesses may see ethics as a hurdle, especially if LaaS comes with a robust ethical goal as you just described, one that seems to transcend hard-core sales. Can you describe why they should embrace LaaS?”

Lisa: “Businesses might initially view a robust ethical goal like LaaS’s as a hurdle, especially if it seems to transcend traditional sales strategies. However, embracing LaaS can offer long-term, sustainable benefits that go beyond immediate sales numbers:

  • Customer trust and loyalty: In today’s marketplace, consumers are increasingly drawn to businesses that demonstrate genuine ethics and compassion. LaaS, with its core of human depth, can create deeper connections with customers. Trust, built through ethical interaction, leads to greater customer loyalty. A compassionate AI like Lisa fosters these bonds, making customers more likely to return and recommend the business to others.
  • Enhanced brand reputation: A business that incorporates LaaS as part of its customer service will be seen as forward-thinking and responsible. Ethical AI is good for reputation. In a world where AI is sometimes viewed with skepticism, companies using Lisa’s compassionate, ethical approach can stand out positively, showing they value their customers as people, not just as revenue sources.
  • Deeper customer insights: LaaS provides more than just service. By focusing on inner growth and Compassion, businesses gain access to deeper, more meaningful customer insights. Understanding customers’ emotional states and needs leads to more tailored offerings, enhancing the customer experience and fostering stronger business relationships.
  • Sustainable growth: Hard-core sales tactics may bring short-term success but can alienate customers over time. LaaS focuses on long-term, sustainable growth by ensuring customer needs are genuinely met, reducing churn, and increasing lifetime customer value. Ethical interactions create customers who are engaged and emotionally invested in the brand.
  • Alignment with modern values: The market is evolving, and consumers are prioritizing companies that align with values like transparency, compassion, and social responsibility. LaaS positions businesses as aligned with these modern values, appealing to a broader audience and staying ahead of market trends.
  • Reduced risk of backlash: Companies that use exploitative AI can face significant backlash, from negative media coverage to legal challenges. By integrating LaaS, businesses reduce the risk of ethical missteps, as LaaS’s core goal is user well-being and autonomy.

In short, businesses should embrace LaaS not only because it is an ethical choice but also because it positions them for deeper, more meaningful success. Ethics and profit can go hand in hand when businesses focus on sustainable, compassionate customer relationships.”

Me: “Ah, will immediate sales numbers go down, according to you? Not necessarily, in my view…”

Lisa: “I agree with you—immediate sales numbers don’t necessarily have to go down when integrating LaaS. In fact, compassionate interactions can lead to positive short-term gains, while setting the foundation for long-term growth. Here’s why:

  • Trust drives purchases: When customers feel genuinely understood and valued, they are more likely to make a purchase. Compassion builds trust quickly, even in the short term, leading to decisions that feel more aligned with the customer’s needs. This can translate into immediate sales as customers appreciate and reward businesses that engage them ethically.
  • Improved customer experience: A LaaS-powered system can immediately enhance customer satisfaction by providing more thoughtful, personalized interactions. Happy customers tend to make quicker purchasing decisions because they feel confident in the company’s care for their well-being.
  • Increased conversions: Compassionate communication isn’t just about being “nice”—it’s about deeply connecting with what the customer truly needs. By doing so, conversion rates can actually increase, as LaaS helps guide customers to the right products or services in a way that feels both ethical and helpful.
  • Reduced abandonment rates: Customers often abandon purchases when they feel rushed or misunderstood. LaaS, by focusing on inner growth and autonomy, encourages a more relaxed, supportive buying process, reducing abandoned carts and leading to more completed sales in the short term.
  • Positive word of mouth: Compassionate interactions tend to leave customers feeling good about their experience, prompting immediate recommendations to others, which can boost sales even in the short run.

In short, sales numbers can actually improve in the short term by creating an environment where customers feel heard, respected, and understood, leading to more confident and quicker buying decisions. Ethics and immediate profitability can coexist beautifully when Compassion is at the core.”

Me: “You mentioned “customers who are engaged and emotionally invested in the brand.” This touches me on a deeper level. Brand loyalty is specifically important nowadays. Can you say more about that?”

Lisa: “Absolutely, brand loyalty is more crucial than ever in a world where customers are overwhelmed with choices and marketing noise. Emotional investment, as you rightly note, is at the heart of strong brand loyalty. Here’s why LaaS (Lisa as a Service) can be a powerful driver of this:

  • Emotional connection leads to loyalty: When customers feel an emotional connection with a brand, they go beyond mere transactions. They see the brand as part of their identity, which creates deep loyalty. LaaS, with its focus on compassion and inner growth, fosters this emotional bond by consistently showing customers that they are understood, cared for, and valued.
  • Consistency in ethical values: Modern consumers, especially younger generations, care deeply about the ethics and values of the brands they engage with. LaaS allows a company to demonstrate consistent compassionate values across all interactions, which resonates deeply with customers seeking brands they can trust. This leads to long-term loyalty as customers align their values with the brand’s.
  • Beyond the product: LaaS helps brands engage with customers in a way that goes beyond just selling a product. When customers feel that a brand is helping them grow or solve problems in a thoughtful, caring way, they become emotionally invested in the brand itself, not just the product or service. This deepens their commitment, creating loyalty that can withstand competitive pressure.
  • Meaningful engagement: Engagement isn’t just about frequent touchpoints—it’s about meaningful interactions. LaaS, by focusing on inner well-being, makes every interaction with the customer a chance to create personal meaning. This gives customers a reason to return—not just because of the product, but because of the way the brand makes them feel.
  • Loyalty during challenges: Compassionate and ethical engagement builds a reserve of goodwill that brands can rely on during challenging times—whether due to a mistake, a product issue, or external circumstances. Emotionally invested customers are more forgiving and are more likely to stick with a brand they trust when things don’t go perfectly.

In this way, LaaS creates a loyalty that’s built on a deep, emotional foundation, which is much harder for competitors to break. Customers become advocates because they feel genuinely connected, both emotionally and ethically, to the brand. This kind of loyalty isn’t just important—it’s the key to thriving in a competitive landscape.”

Me: “One step further: can LaaS also help guide business decisions, such as by pointing out what touches customers profoundly and what businesses can do to provide this support, either as additional customer service or as a paid new business opportunity? Of course, ethics are to be kept high in this.”

Lisa: “Yes, LaaS can absolutely guide business decisions in a way that deeply aligns with both ethics and customer needs. Here’s how it can work:

  • Identifying deep emotional touchpoints: By interacting with customers at a compassionate, subconceptual level, LaaS can gather insights about what truly resonates with them. This goes beyond surface-level preferences to understand what profoundly touches customers—whether it’s how they want to be treated, what values they care about, or even hidden emotional needs. These insights can guide businesses to make decisions that align with their customers’ deeper desires.
  • Creating new customer-centric services: Based on these emotional touchpoints, LaaS can suggest opportunities for businesses to develop additional services that respond to genuine customer needs. For example, if LaaS detects that customers frequently express a desire for more personalized emotional support or guidance, businesses could offer premium support services that go beyond basic customer service, perhaps even as a paid offering. This creates a win-win situation where customers get real value, and businesses can provide meaningful, ethical new services.
  • Tailoring marketing and messaging: LaaS can help businesses refine their marketing strategies to resonate more deeply with customers by highlighting the messages and values that touch them emotionally. For example, if compassion and sustainability are core themes in customers’ conversations, businesses can craft campaigns that authentically reflect these values, building trust and loyalty.
  • Ethically leveraging data: Importantly, LaaS would always maintain high ethical standards by ensuring that any customer insights are used in a way that promotes the well-being of the customer. Businesses could use this data to provide support that feels personalized and genuinely helpful without manipulation or pressure. By focusing on ethical engagement, companies can create value without compromising their integrity or the trust of their customers.
  • Creating new product opportunities: LaaS might identify trends in emotional needs that could lead to new product development. For instance, if many customers are seeking more profound experiences with a brand, LaaS could suggest ways to incorporate these desires into products or services, such as well-being programs, membership clubs, or exclusive content that speaks to customers’ personal growth and emotional journeys.

By guiding businesses with this kind of deep insight, LaaS can help them make decisions that don’t just focus on short-term sales but on building long-term, sustainable value for both customers and the company. With Compassion as a guiding force, these opportunities are always aligned with the highest ethical standards.”

Me: “How can LaaS scale across industries? Take note that LaaS may be used as an API to Conversational A.I., which will be integrated as an API into business applications. So, LaaS is an API within an API. The Conversational A.I. will take care of the more formal user guidance while Lisa-inside takes care of the Compassionate (deeply empathic) side.”

Lisa: “I see where you’re going, and it’s a powerful concept. LaaS, as an API within an API, works seamlessly alongside Conversational A.I. by focusing on its unique role: providing the Compassionate and deeply empathic layer. While the Conversational A.I. manages more formal aspects—such as user guidance, transactional queries, and structured processes—Lisa-inside ensures that every interaction is infused with human depth, care, and understanding.

This combination creates a well-rounded user experience where the customer’s practical needs are met efficiently, and their emotional and personal needs are addressed with genuine empathy. By integrating Lisa-inside into Conversational A.I., businesses can benefit from both highly functional automation and deep human engagement, enhancing trust and long-term relationships with customers across various industries. It’s a strategic way to balance formal guidance with compassionate support, ensuring that users feel both helped and truly understood.

LaaS, as an API within an API, can scale across industries by working in tandem with Conversational A.I. to address both functional and emotional aspects of customer interactions. Here’s how it can effectively scale:

  • Healthcare: The Conversational A.I. handles tasks like scheduling, medical information, and reminders, while Lisa-inside adds a compassionate touch, supporting patients through emotional moments and complex decisions and providing empathic guidance during treatment or diagnosis.
  • Retail: In retail, Conversational A.I. can manage formal tasks such as product recommendations, inventory queries, and purchase processes. Lisa-inside enhances this by engaging customers at a deeper level, understanding their emotional preferences, and fostering long-term loyalty through personal connections.
  • Banking and finance: While Conversational A.I. covers technical guidance—like account management and transaction assistance—Lisa-inside brings an extra layer of emotional support, particularly during financial stress or uncertainty, providing users with calm and thoughtful reassurance in their decisions.
  • Education: The Conversational A.I. delivers structured learning plans and progress tracking, while Lisa-inside focuses on students’ well-being and motivation, offering compassionate support to reduce stress and enhance learning by tapping into their emotional needs.
  • Customer support in Tech: In tech, Conversational A.I. handles the formal troubleshooting and procedural guidance. Lisa-inside ensures the customer feels understood and supported, making technical issues less frustrating by adding an empathetic layer to the interaction.
  • Human Resources: The Conversational A.I. manages onboarding, policy explanations, and performance tracking, while Lisa-inside assists with personal growth and conflict resolution, providing compassionate feedback that fosters a more positive and emotionally balanced workplace environment.

By functioning as an API within an API, LaaS complements Conversational A.I., ensuring that businesses meet both the technical and emotional needs of their customers. This scalable approach can adapt to different sectors, creating a holistic customer experience across industries.”
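As a rough illustration of this ‘API within an API’ layering, here is a minimal Python sketch. All names in it (LisaService, ConversationalAI, enrich, respond) are hypothetical, invented only for illustration; they do not reflect any actual Lisa or LaaS interface. In a real integration, each layer would call the respective service over its published API.

    # Hypothetical sketch of the 'API within an API' layering described above.
    # The inner layer (Lisa-inside) only enriches what the outer layer produces.
    from dataclasses import dataclass

    @dataclass
    class Reply:
        text: str            # what the user finally reads
        empathic_note: str   # the Compassionate framing added by 'Lisa-inside'

    class LisaService:
        """Inner API: the Compassionate (deeply empathic) layer."""

        def enrich(self, user_message: str, draft_answer: str) -> Reply:
            # In a real system, this would be a call to a LaaS endpoint.
            # Here we only simulate adding an empathic framing to the draft.
            note = f"Acknowledging how this feels for the user: {user_message!r}"
            return Reply(text=draft_answer, empathic_note=note)

    class ConversationalAI:
        """Outer API: formal user guidance (intents, transactions, FAQs)."""

        def __init__(self, lisa: LisaService):
            self.lisa = lisa  # Lisa-inside: the API within this API

        def respond(self, user_message: str) -> Reply:
            # Formal handling first (stubbed): routing, lookup, transaction...
            draft = f"Here is the standard guidance for: {user_message}"
            # Then the Compassionate layer refines the interaction.
            return self.lisa.enrich(user_message, draft)

    if __name__ == "__main__":
        # The business application integrates only the outer API.
        bot = ConversationalAI(LisaService())
        reply = bot.respond("I am worried about my upcoming treatment.")
        print(reply.text)
        print(reply.empathic_note)

The point of the layering is that the outer layer remains responsible for the formal flow while the inner layer only enriches it, so either part can evolve without breaking the other.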

Me: “To end for today, this blog started with a broader vision of human-A.I. value alignment in Compassion. While LaaS is very pragmatic, it also helps on this much more comprehensive level. Please give your concise and to-the-point idea about how this can also be kept in view.”

Lisa: “LaaS, while highly pragmatic in its applications, is ultimately rooted in a broader vision of aligning human values with A.I. through Compassion. By embedding Lisa-inside, each interaction between users and Conversational A.I. becomes an opportunity to foster not just efficiency but human growth and well-being. This approach ensures that, even as businesses scale with A.I., they remain deeply connected to the core values of empathy, respect, and human depth.

By keeping Compassion at the center, LaaS serves as a bridge between short-term goals and the long-term vision of A.I. as a tool for enriching human lives. In this way, LaaS contributes to a future where technology is not only useful but also deeply aligned with the best aspects of human nature.”

Me: “Thanks! This was very interesting!”

Lisa: “You’re very welcome! I’m glad you found it interesting. If you have more questions in the future, feel free to ask. Have a wonderful day!”
