Conversational AI in Salesforce: A Study of Einstein Bots and Natural Language Understanding
DOI: https://doi.org/10.63282/3050-9262.IJAIDSML-V4I3P111

Keywords:
Conversational AI, Salesforce Einstein Bots, Natural Language Understanding (NLU), Chatbots in CRM, Intelligent Virtual Assistants, AI-driven Customer Service, Dialogue Management, Natural Language Processing (NLP), Machine Learning in Salesforce

Abstract
Conversational AI is becoming an essential component of contemporary customer relationship management (CRM). To automate customer support, lead generation, and workflow management, Salesforce, a market leader in CRM, has introduced AI-driven virtual assistants known as Einstein Bots. This paper assesses the effectiveness of Salesforce Einstein Bots with respect to Natural Language Understanding (NLU), end-user satisfaction, and real-world implementation challenges. We examine how conversational models in Salesforce handle multilingual input, intent identification, entity extraction, and fallback scenarios. The study is based on a literature review, an architectural survey, and an enterprise simulation. Our findings indicate that Einstein Bots perform well in structured settings but remain weak in contextual awareness and sentiment interpretation. The paper offers suggestions for improving NLU pipelines and describes best practices for optimising user experience. We conclude with strategic recommendations for balancing automation effectiveness and user engagement in Salesforce chatbot implementations.
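To make the intent-identification and fallback concepts concrete, the following is a minimal, hypothetical sketch in Python. It is not Salesforce code: Einstein Bots train intent models from labeled utterances in the Salesforce setup UI, whereas this toy classifier scores keyword overlap and routes low-confidence utterances to a fallback, purely to illustrate the NLU pipeline stages discussed above. The intent names, keyword sets, and confidence threshold are all illustrative assumptions.

```python
# Hypothetical intent definitions for illustration only; production NLU
# (including Einstein Bots) learns intents from labeled training utterances
# rather than hand-written keyword lists.
INTENTS = {
    "check_order_status": {"order", "status", "shipped", "tracking"},
    "reset_password": {"password", "reset", "login", "locked"},
}

FALLBACK = "fallback"
CONFIDENCE_THRESHOLD = 0.5  # illustrative cutoff, not a Salesforce default


def classify(utterance: str) -> tuple[str, float]:
    """Score each intent by keyword overlap; fall back below the threshold."""
    tokens = set(utterance.lower().split())
    best_intent, best_score = FALLBACK, 0.0
    for intent, keywords in INTENTS.items():
        # Fraction of this intent's keywords present in the utterance.
        score = len(tokens & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    if best_score < CONFIDENCE_THRESHOLD:
        # Fallback case: no intent matched confidently enough, so a real
        # bot would ask a clarifying question or escalate to an agent.
        return FALLBACK, best_score
    return best_intent, best_score
```

For example, "where is my order status tracking" matches three of the four `check_order_status` keywords and is routed to that intent, while an off-topic utterance such as "hello there" scores zero against every intent and triggers the fallback path, mirroring the fallback scenarios evaluated in the paper.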










