Natural language processing enhances analysis and data usage

Natural language processing is a key feature of modern BI and analytics platforms, simplifying and democratizing analytics across the enterprise.

As companies race to operationalize, analyze, and predict with their data, they need to empower both decision makers and data professionals. Instead of requiring queries in a query language, NLP lets non-technical users simply type a question in natural language. The platforms also offer other help features, such as text completion and popular search phrases, to make working with data even easier.
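
To make the idea concrete, here is a deliberately naive Python sketch of how a typed question might be mapped onto a structured query. The sample table, synonym list, and parsing rules are invented for illustration; commercial platforms rely on far richer semantic parsing.

```python
# Toy illustration of natural-language querying over tabular data.
# The column names, synonyms, and sample rows are invented for this example.
import pandas as pd

sales = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "revenue": [120, 80, 150, 95],
})

# Map everyday words to canonical column names.
SYNONYMS = {"sales": "revenue", "revenue": "revenue", "region": "region", "area": "region"}

def answer(question: str) -> pd.Series:
    """Very naive parser: find a measure and a dimension in the question."""
    words = [SYNONYMS.get(w.strip("?,.").lower()) for w in question.split()]
    words = [w for w in words if w]
    measure = next(w for w in words if w == "revenue")
    dimension = next(w for w in words if w != "revenue")
    return sales.groupby(dimension)[measure].sum()

print(answer("What are total sales by region?"))
```

In practice, the hard part is the semantic layer of synonyms, metrics, and relationships that such a parser resolves against.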

When businesspeople can ask their own questions, it frees experienced users from mundane tasks and also makes it easier to collaborate on model development, said Cindi Howson, chief data officer at ThoughtSpot, a BI and AI-driven analytics platform provider.

“ThoughtSpot will tell you [a metric in] this zip code is 300% higher than in that zip code, which a businessperson can take to the data scientist and say, ‘This is where I want you to focus your efforts,’” she said.

ThoughtSpot and other BI and analytics providers like Qlik have partnered with companies specializing in NLP to expand their capabilities. For example, ThoughtSpot and VoiceBase announced a partnership in 2020 to make voice data easier to search, share insights from it, and use those insights to drive business value. That combination can reveal how many angry calls a call center has received and whether an agent’s empathetic handling of a customer led to increased sales.

NLP has many use cases, such as predicting auto damage litigation. NLP can make use of telephone conversations between a claims adjuster and various parties in a variety of ways.

“NLP can also understand [other] unstructured data, like notes, and assign a qualitative score indicating the likelihood that the insured driver was at fault,” said Kieran Wilcox, director of claims solutions at AI-as-a-service company Clara Analytics. “Another ability we’re starting to see is that NLP can fill in missing information that might be in the claims adjuster’s notes but was never in the structured data.”
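
As a purely hypothetical illustration of turning free-text notes into a qualitative score, consider the sketch below. The cue phrases and weights are invented; a production system like the one Wilcox describes would learn such signals from data rather than use a hand-built lexicon.

```python
# Deliberately naive sketch: convert an adjuster's free-text note into a
# fault-likelihood score. Cue phrases and weights are invented for
# illustration only; real systems learn these signals from data.
FAULT_CUES = {"ran the red light": 0.4, "rear-ended": 0.3, "distracted": 0.2}
MITIGATING_CUES = {"other driver swerved": -0.3, "brakes failed": -0.2}

def fault_score(note: str) -> float:
    """Return a score between 0 and 1; higher means more likely at fault."""
    note = note.lower()
    score = 0.5  # neutral starting point
    for phrase, weight in {**FAULT_CUES, **MITIGATING_CUES}.items():
        if phrase in note:
            score += weight
    return max(0.0, min(1.0, score))

print(fault_score("Insured was distracted and rear-ended the claimant."))
```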

It’s not what people say, it’s what they mean

Ten different people can ask the same question ten different ways, which is why semantic relationships are so important. Worse, people don’t always say what they mean.

“We let customers define their own synonyms, but I think the technology will actually go towards some industry taxonomies,” Howson said.

Graph technologies are used to understand relationships between words, people and things. Graphs can recommend content or suggest popular searches. It’s an ideal way to gain additional information about relationships extracted from natural language data, said Paul Milligan, director of product strategy at Linguamatics, a provider of NLP text mining products and solutions.
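
A minimal sketch of the idea, using the networkx library and a handful of invented nodes, shows how even a small graph of terms and content items can surface related searches; none of this reflects how Linguamatics or any particular vendor builds its graphs.

```python
# Toy graph linking search terms to related terms and content items.
# Nodes and edges are invented; real systems derive them from query logs,
# metadata, and relations extracted from text with NLP.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("churn", "customer retention"),
    ("customer retention", "retention dashboard"),
    ("churn", "churn by region report"),
    ("revenue", "sales dashboard"),
])

def related(term: str, graph: nx.Graph) -> list[str]:
    """Suggest terms and content within two hops of the search term."""
    if term not in graph:
        return []
    hops = nx.single_source_shortest_path_length(graph, term, cutoff=2)
    return [node for node, dist in hops.items() if dist > 0]

print(related("churn", G))
```

Where products in this space differ is in replacing hand-entered edges with relationships extracted automatically from natural language data.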

“Different problems require different NLP methods, so it is important [to understand] the questions the analyst has, [along with the] use cases and accuracy requirements,” he said.

Embedded analytics on the rise

Embedded analytics are now finding their way into many types of apps, whether for ranking players or providing a visual dashboard for decision-making purposes. Before you grab an NLP API and plug it into an application, you should know what the total cost of using that API will be, including training data, model development, and model deployment. At scale, the cost can be prohibitive.

Data scientists and analysts look for operational support, source and version control, and distribution capabilities when they need access to data and models.

“One remaining bottleneck is the lack of training data in areas like healthcare, where the data isn’t really accessible for privacy reasons,” Milligan said. “Human-in-the-loop tools can alleviate this by providing an initial semi-automated process with a small amount of training material that becomes more automated over time as more data is reviewed.”
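
The pattern Milligan describes is often implemented as an active-learning loop. The sketch below, with invented example notes and scikit-learn standing in for whatever tooling a team actually uses, shows the core cycle: train on a small labeled set, then send the model's least confident predictions to a human reviewer.

```python
# Hedged sketch of a human-in-the-loop (active learning) cycle.
# Data, labels, and thresholds are invented for illustration.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_texts = ["patient reports mild headache", "no adverse event recorded",
                 "severe allergic reaction observed", "routine follow-up visit"]
labels = [1, 0, 1, 0]  # 1 = adverse event mentioned

unlabeled_texts = ["possible reaction after dose", "annual checkup scheduled",
                   "rash and swelling noted", "records transferred to clinic"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(labeled_texts)
model = LogisticRegression().fit(X, labels)

# Score the unlabeled pool and pick the least confident items for human review.
probs = model.predict_proba(vectorizer.transform(unlabeled_texts))
confidence = probs.max(axis=1)
review_queue = [unlabeled_texts[i] for i in np.argsort(confidence)[:2]]
print("Send to reviewer:", review_queue)
```

Each round of review adds labels, so retraining gradually shrinks the queue of uncertain cases.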

Voice interfaces are on the horizon, but there are challenges

NLP experts often disagree about how advanced the state of the art really is. Those who are most confident point to Alexa and Siri as proof that voice interfaces work, but skeptics use the same examples to underscore their imperfections.

Rushing ahead with a voice interface that could compromise the platform’s accuracy and the vendor’s reputation is ill-advised.

The right measure is not a universal accuracy level like 95%, but a level of accuracy that is appropriate for the application. When Alexa or Siri misunderstands a query, it’s usually just a mildly annoying user experience. Providing an incorrect medical diagnosis to a patient, however, can be malpractice. While using synthetic data in healthcare is an option, it may not be sufficient to test models, Milligan said.

Currently, BI and analytics platforms require users to type their queries, because text is an easier problem to solve than speech recognition. Natural language understanding can falter for a myriad of reasons, including difficulty with foreign or regional accents and individual speech habits. Since analytics and BI platforms are judged on how accurately they analyze information, vendors have little reason to risk that reputation on an immature voice interface.

“We are poised for a revolution in augmented analytics with recent advances in text-based modeling and multimodal deep learning architectures that combine written text with other modalities such as video and audio,” said Toshish Jawale, CTO and co-founder of Symbl, a Conversational Intelligence platform for developers.

Traditional NLP tasks have become significantly less demanding, Jawale said. There is less and less reliance on highly specialized skills to manually curate a purpose-built model.

“More out-of-the-box models are capable of performing common NLP tasks without requiring major training cycles and data maintenance,” he said.

For example, zero-shot and few-shot learning techniques have enabled systems that can generalize well enough to perform tasks they weren’t specifically trained to do. These capabilities allow someone with a basic understanding of NLP to create sophisticated systems while staying focused on the business problem at hand.
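
For a concrete taste of what zero-shot classification looks like in practice, the example below uses the open source Hugging Face transformers pipeline with a publicly available model; the article does not name specific tools, so treat the library and model choice as illustrative assumptions.

```python
# Zero-shot classification: the model was never trained on these labels,
# yet it can rank them against new text supplied at runtime.
# Requires the transformers library and a backend such as PyTorch.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

ticket = "The dashboard stopped refreshing after last night's update."
labels = ["billing question", "bug report", "feature request"]

result = classifier(ticket, candidate_labels=labels)
print(result["labels"][0], round(result["scores"][0], 3))
```

Here the model would most likely rank “bug report” first, without any fine-tuning on support tickets.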

“Human language is full of semantic and syntactic nuances,” said Abhishek Pakhira, COO of AI solution provider Aureus Tech Systems. “NLP helps machines understand those nuances to determine context and exact meaning in verbal and written language, which can be anything from eDiscovery to a chatbot helping a customer fix an issue. Maybe the most intriguing aspect of NLP is transfer learning, where the machine can learn from one context and apply it to another.”

Basic data literacy is still smart

The whole point of using NLP in analytics is to make the platform easier to use so that less experienced users can benefit from it, but the ease of use only goes so far. It doesn’t teach the masses how to think like an analyst, although through practice and interaction with the platform, even the average businessperson can learn how to ask questions that lead to better-quality answers.

Basically, employees in an insights-based organization should have a 101-level understanding of data literacy, which means they have a fundamental understanding of data, the data lifecycle, and the need for data governance. While using the platform may not require it, a shared understanding helps a company build a data-driven culture.
