What Is Natural Language Processing (NLP)?
NLP Evolution: Changes Coming To The Ways We Interact With Technology
Natural language processing technology refines our internet search results, helps voice assistants such as Siri understand our questions and commands, keeps spam out of our email inboxes, and more. The eruption of ChatGPT and other generative artificial intelligence tools introduced the world to multifunctional assistants that, with a simple query, can help with research, writing and generating images—even creating code.
With all that, NLP hasn't reached its pinnacle yet. As the technology continues to evolve, experts expect it to play a role in everything from Web surfing to business automation (and much, much more). Below, 20 members of Forbes Technology Council discuss some of the big developments they see coming to NLP and how its evolution will change the ways we interact with technology.
1. NLP's Accuracy, Breadth And Capabilities Will Improve
The accuracy, breadth and overall capabilities of NLP will greatly improve, making it cheaper, easier to deploy and better integrated with the applications and workflows we use daily. Advances in reasoning capabilities will take us beyond text generation and question answering to more complex tasks, including planning, integrating resources and reacting appropriately to changes and surprises. - David Talby, John Snow Labs
2. Online Interactions Will Become Extremely Personalized
Large language models will allow all online interactions to become extremely personalized. Each page on a website can have its content (text, images and video) customized to the specific task you are performing at that specific moment. The entire structure of websites will change from being a set of linked pages to an interactive agent that partners with you to solve high-level needs. - Ari Kahn, Bridgeline Digital
3. Cross-Domain Thinking Will Be Enabled
Natural language processing will evolve and expand its impact through cross-domain thinking, which entails taking a concept from one field and applying it in a seemingly very different domain to enable new insights and innovation. Cross-domain thinking will boost interdisciplinary creativity and help create new mental pathways, expand perspectives and enhance problem-solving abilities. - Victor Shilo, EastBanc Technologies
4. Complex Interactions Will Be Replaced By Simpler Commands
The human experience with technology can be greatly improved with natural language as an extra modality of interaction. Today, we click, touch, type and issue basic voice commands, but as natural language is woven into every experience—and I'm not talking about chatbots—it will unlock a powerful tool where a very long sequence of clicks, swipes and touches can be replaced by a phrase. - Lucas Persona, CI&T
5. Computers Will More Intelligently Interact With Humans
AI using NLP will power the new user interface. We are in a world where humans work with computers in a language computers understand, using devices such as keyboards, mice and monitors. When computers can intelligently interact with humans in our language and environment, it will impact every aspect of technology. For the most part, technology devices will dissolve into everyday things we use. - Vishwas Manral, Cloud Security Alliance
6. Companies Will Know Much More About Their Customers
Companies and applications will know more about us and help make all our experiences more personalized; the experience we have now with applications such as Netflix and Spotify will be the experience we have across the Web. Foundational models will be trained on more data and become a lot more intuitive. NLP-powered applications will become much easier to use, as prompts won't need to be as technical, and they will be much better at predictive analytics. - Shayan Hamidi, Rechat
7. NLP Models Will Become Highly Specialized
AI will evolve at breakneck speed, with changes that once took years happening in weeks or days. NLP will be one of a myriad of GenAI-adjacent technologies, each growing more specialized and powerful as we leverage high-quality data for high-trust applications. Future models will get master's degrees in the fields for which they're used, making them far more reliable and useful for businesses. - Carolyn Parent, Conveyer
8. 'Cooperative' LLMs Will Emerge
The power and limits of monolithic large language models will become clearer, with a possible shift toward smaller, multimodal models with curated datasets and deeper levels of detail. I'd expect traditional systems of record (such as SAP, Oracle and ServiceNow) to lead in these efforts. These new "cooperative" LLMs will combine to bring more intelligent systems of engagement to market. - Shawn Rosemarin, Pure Storage
9. Software Will Become A Partner, Not Just A Tool
NLP technology has been around for decades, recently receiving much attention as it was integrated with large language models and generative AI. In the next five years, NLP will transform software from being a mere tool, as it is today, to becoming a partner that works alongside humans in both advisory and action-oriented capacities. NLP will become adaptable, making AI seem human and resulting in increased adoption. - Simon Bennett, AVEVA
10. Digital Twins Will Develop From Regular Interactions
The NLP field is moving toward ultra-personalization through the development of digital twins, created via continual learning from the everyday text and speech interactions of their physical counterparts. This evolution will facilitate the creation of highly personalized and context-aware digital entities such as assistants, mentors or therapists that are tailored to meet individual needs and preferences. - Ivan Tankoyeu, AI Superior GmbH
11. Digital Assistants Will Become More Sophisticated
NLP technology will evolve significantly. It will enhance and personalize user experiences and become more adept at understanding and interpreting user preferences. Digital assistants will become more sophisticated and capable of handling complex and nuanced conversations. NLP will also evolve to understand context and semantics more effectively, leading to more accurate and context-aware search results. - Amit Jain, Roadz
12. NLP Will Accept More Varied Inputs And Sustain Richer Conversations
Improved NLP models will better understand and maintain context during conversations and may become more adept at recognizing and responding to human emotions in text. Additionally, multimodal NLP models will integrate multiple modes of communication—such as text, images and video—enabling them to understand queries better and generate higher-quality content. - Syl Omope, Elekta
13. Applications Will Become More Comprehensive And Context-Aware
Natural language processing will evolve by integrating multimodal capabilities to better understand and generate content across different media, leading to more comprehensive and context-aware applications. In the healthcare industry, such multimodal NLP models could analyze both textual triage notes and relevant medical images or charts to provide a more holistic understanding of a patient's condition. - Kanishk Agrawal, Judge Group
14. We'll Build Our Own Online Experiences
I believe natural language processing, coupled with developments in large language models, will transform how we interact online. Currently, we heavily rely on others to build our online experiences. I think we will be able to flip that around and build our own experiences without tracking or surveillance technology such as cookies. - Caroline McCaffery, ClearOPS
15. Digital Customer Service Will Expand And Improve
NLP technologies are expected to enhance their capabilities to generate more effective, conversational customer service responses and manage more complicated inquiries. This will allow businesses to automate tasks such as responding to repetitive customer inquiries, freeing up employees to tackle more complex activities while providing a better customer experience. - Rick Watkin, KUBRA
16. Conversational Histories Will Lead To More 'Natural' Interactions
References to conversational history often serve as a shortcut to evoke context. AI agents with NLP will make use of the ever-growing accumulation of conversations to increase the "naturalness" and personal tone of interactions. - Viktor Trón, Swarm
17. Language Barriers Will Be Overcome
In the next five years, natural language processing will have a major impact on multilingual capabilities and addressing language diversity challenges. NLP will focus on bridging cultural and language barriers to facilitate global communication. For global businesses, this plays a crucial role in bridging linguistic and cultural gaps, enabling more inclusive communication across the globe. - Joe McCunney, Scalar Labs
18. Automation Initiatives Will Become Simpler And Less Costly
The past decades have seen impressive efficiency gains inside large enterprises and governments through the automation of tasks traditionally performed by back-office workers, such as data entry and claims processing. Digitization projects typically cost millions of dollars, require external consultants and take years to complete, but LLMs will make the digitization process much easier and cheaper. - Patricia Thaine, Private AI
19. NLP-Enabled Cybersecurity Solutions Will Detect Subtle Warning Signs
NLP is already having a significant impact on the field of cybersecurity, where its ability to recognize patterns in communication has proven invaluable in the war against advanced phishing. By identifying patterns in users' day-to-day communications, NLP-enabled security solutions can then detect the subtle deviations indicative of impersonation, account takeover and other types of fraud. - Eyal Benishti, IRONSCALES
20. NLP Technology Will Extend To Nonverbal Communication
NLP continually evolves with the accumulation of unstructured data, making its evolution perpetual. So let's be bold in predicting its possibilities in the next five years. As data grows, NLP, which has been focused on human language, will evolve to extend to nonverbal communication. NLP and biometrics will enable human-like robots to see, touch, hear and speak like humans. - Paula Kennedy Garcia
What Is NLP? Natural Language Processing Explained
Natural language processing definition
Natural language processing (NLP) is the branch of artificial intelligence (AI) that deals with training computers to understand, process, and generate language. Search engines, machine translation services, and voice assistants are all powered by NLP.
While the term originally referred to a system's ability to read, it's since become a colloquialism for all computational linguistics. Subcategories include natural language generation (NLG) — a computer's ability to create communication of its own — and natural language understanding (NLU) — the ability to understand slang, mispronunciations, misspellings, and other variants in language.
The introduction of transformer models in the 2017 paper "Attention Is All You Need" by Google researchers revolutionized NLP, leading to the creation of generative AI models such as Bidirectional Encoder Representations from Transformers (BERT) and the subsequent DistilBERT — a smaller, faster, and more efficient BERT — Generative Pre-trained Transformer (GPT), and Google Bard.
How natural language processing works
NLP leverages machine learning (ML) algorithms trained on unstructured data, typically text, to analyze how elements of human language are structured together to impart meaning. Phrases, sentences, and sometimes entire books are fed into ML engines where they're processed using grammatical rules, people's real-life linguistic habits, and the like. An NLP algorithm uses this data to find patterns and extrapolate what comes next. For example, a translation algorithm that recognizes that, in French, "I'm going to the park" is "Je vais au parc" will learn to predict that "I'm going to the store" also begins with "Je vais au." All the algorithm then needs is the word for "store" to complete the translation task.
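The pattern-extrapolation idea described above can be sketched with a toy bigram model — a drastic simplification of what production systems do, but it shows how counting which word follows which lets an algorithm predict "what comes next" (all function names and the tiny corpus here are illustrative):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it across the corpus."""
    follows = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            follows[prev][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequently observed follower of `word`, if any."""
    candidates = follows.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else None

corpus = [
    "je vais au parc",
    "je vais au magasin",
    "je vais au parc",
]
model = train_bigrams(corpus)
print(predict_next(model, "au"))  # prints "parc" (seen twice vs. once)
```

A real language model generalizes the same idea far beyond adjacent word pairs, but the principle — statistics over observed text driving prediction — is the same.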
NLP applications
Machine translation is a powerful NLP application, but search is the most used. Every time you look something up in Google or Bing, you're helping to train the system. When you click on a search result, the system interprets it as confirmation that the results it has found are correct and uses this information to improve search results in the future.
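The click-feedback loop described above can be sketched as a toy re-ranker that treats each click as a weak relevance vote (the function and document names are hypothetical; real search engines combine clicks with many other signals and guard against click bias):

```python
from collections import defaultdict

# click_counts[(query, doc)] accumulates implicit relevance feedback
click_counts = defaultdict(int)

def record_click(query, doc_id):
    """Treat a click as a weak signal that `doc_id` was relevant to `query`."""
    click_counts[(query.lower(), doc_id)] += 1

def rerank(query, doc_ids):
    """Order candidate results by accumulated clicks for this query."""
    return sorted(doc_ids,
                  key=lambda d: click_counts[(query.lower(), d)],
                  reverse=True)

record_click("nlp tutorial", "doc_b")
record_click("nlp tutorial", "doc_b")
record_click("nlp tutorial", "doc_a")
print(rerank("nlp tutorial", ["doc_a", "doc_b", "doc_c"]))
# ['doc_b', 'doc_a', 'doc_c']
```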
Chatbots work the same way. They integrate with Slack, Microsoft Messenger, and other chat programs where they read the language you use, then turn on when you type in a trigger phrase. Voice assistants such as Siri and Alexa also kick into gear when they hear phrases like "Hey, Alexa." That's why critics say these programs are always listening; if they weren't, they'd never know when you need them. Unless you turn an app on manually, NLP programs must operate in the background, waiting for that phrase.
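The always-listening behavior described above boils down to continuously checking incoming input for a wake phrase. A minimal sketch of that check (the trigger list and function name are illustrative; real voice assistants use dedicated low-power acoustic models rather than matching transcribed text):

```python
TRIGGERS = ("hey alexa", "hey siri")  # hypothetical wake phrases

def detect_trigger(transcript):
    """Return the wake phrase that starts the transcribed input, else None."""
    text = transcript.lower().strip()
    for phrase in TRIGGERS:
        if text.startswith(phrase):
            return phrase
    return None

# The assistant stays dormant until a trigger is heard:
assert detect_trigger("Hey Alexa, play music") == "hey alexa"
assert detect_trigger("what time is it") is None
```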
Transformer models take applications such as language translation and chatbots to a new level. Innovations such as the self-attention mechanism and multi-head attention enable these models to better weigh the importance of various parts of the input, and to process those parts in parallel rather than sequentially.
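The self-attention mechanism mentioned above can be sketched in a few lines of NumPy: each token's query is scored against every token's key, the scores are normalized, and the values are mixed accordingly. This is a single attention head with random weights, for illustration only — real transformers stack many heads and layers with learned parameters:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # every token attends to every token
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))              # 4 tokens, model dimension 8
Wq = rng.normal(size=(8, 8))
Wk = rng.normal(size=(8, 8))
Wv = rng.normal(size=(8, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per token
```

Because the score matrix relates all token pairs at once, the whole sequence is processed in parallel — the key departure from sequential recurrent models.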
Rajeswaran V, senior director at Capgemini, notes that OpenAI's GPT-3 model has mastered language without using any labeled data. By relying on morphology — the study of words, how they are formed, and their relationship to other words in the same language — GPT-3 can perform language translation much better than existing state-of-the-art models, he says.
NLP systems that rely on transformer models are especially strong at NLG.
Natural language processing examples
Data comes in many forms, but the largest untapped pool of data consists of text — and unstructured text in particular. Patents, product specifications, academic publications, market research, news, and social media feeds all have text as a primary component, and the volume of text is constantly growing. Apply the technology to voice and the pool gets even larger. Here are three examples of how organizations are putting the technology to work:
Whether you're building a chatbot, voice assistant, predictive text application, or other application with NLP at its core, you'll need tools to help you do it. According to Technology Evaluation Centers, the most popular software includes:
There's a wide variety of resources available for learning to create and maintain NLP applications, many of which are free. They include:
Here are some of the most popular job titles related to NLP and the average salary (in US$) for each position, according to data from PayScale.
Here's Why A Gold Rush Of NLP Startups Is About To Arrive
Remember Natural Language Processing? NLP arose several years ago, but it was only in 2018 that AI researchers proved it was possible to train a neural network once on a large amount of data and reuse it again and again for different tasks. In 2019, GPT-2 from OpenAI and T5 from Google appeared and proved startlingly good; such technology has since been incorporated into Google Duplex. Concerns were even raised about their possible misuse.
But since then, things have gone, well, pretty exponential.
Last year saw a veritable 'Cambrian explosion' of NLP startups and large language models.
This year, Google released LaMDA, a large language model for chatbot applications. Then DeepMind released AlphaCode and, later, Flamingo, a language model capable of visual understanding. In July of this year alone, the BigScience project released BLOOM, a massive open-source language model, and Meta announced that it had trained a single language model capable of translating between 200 languages.
We are now reaching a tipping point at which many more commercial applications of NLP, some built on these open-source, publicly available platforms, will hit the market. You could almost say a gold rush of startups has begun around this technology, with an arms race developing between the large language model providers.
One of those startups is Humanloop, a University College London (UCL) AI spinout that claims to make it "significantly" easier for companies to adopt this new wave of NLP technology via a suite of tools that help humans 'teach' AI algorithms. This means a lawyer, doctor or banker can put into the platform a piece of knowledge which the software then applies at scale across a large data set, allowing a broader application of AI to various industries.
It's now pulled in a $2.6 million seed funding round led by Index Ventures, with participation by Y Combinator, Local Globe and Albion.
Founded in 2020 by a team of preeminent computer scientists from UCL and Cambridge, and alumni of Google and Amazon, Humanloop's applications, it says, might include building a picture of a national real estate market from unstructured data on the internet; reading through electronic health records to identify people who could be candidates to try new therapies; and even moderating comments on Facebook groups.
"People would be shocked if they knew what language-based AI was capable of now," says CEO Raza Habib in a statement. "But getting the data into a form that the algorithm can use is the biggest challenge. With Humanloop, we want to democratize access to AI and enable the next generation of intelligent, self-serve applications — by allowing any company to take its domain expertise and distill it efficiently in a machine learning model."
Humanloop attributes its success to the growth of 'probabilistic deep learning', in which algorithms can work out what they don't know by tuning out the noise in data sets, finding the useful signal and asking humans for help with the parts they don't understand.
Other startups building their own large language models and putting them behind APIs include Cohere AI ($164.9 million in funding) and OpenAI, with GPT-3. Snorkel AI ($135.3 million in funding) is also a new startup in this arena.
However, Humanloop says it is less focused on developing the models and more on the tools needed to adapt them to specific use cases.
"What many people don't know is that it's not the lack of appropriate algorithms that's holding back AI from being ubiquitous in every workplace — it's the absence of properly labelled data," adds Erin Price-Wright, the partner at Index Ventures who led the investment. "In fact, machine learning itself is becoming increasingly commoditized and off the shelf, but it's still really hard for non-technical people to transmit their knowledge to a machine and help the algorithm refine its model." Hence Humanloop's focus on letting people tweak the data.
If the NLP gold rush is indeed on its way, expect a whole bunch of other startups to appear soon.
Mike Butcher (M.B.E.), formerly Editor-at-large of TechCrunch, has written for UK national newspapers and magazines and been named one of the most influential people in European technology by Wired UK. He has spoken at the World Economic Forum, Web Summit, and DLD. He has interviewed Tony Blair, Dmitry Medvedev, Kevin Spacey, Lily Cole, Pavel Durov, Jimmy Wales, and many other tech leaders and celebrities. Mike is a regular broadcaster, appearing on BBC News, Sky News, CNBC, Channel 4, Al Jazeera and Bloomberg. He has also advised UK Prime Ministers and the Mayor of London on tech startup policy, as well as being a judge on The Apprentice UK. GQ magazine named him one of the 100 Most Connected Men in the UK. He is the co-founder of TheEuropas.Com (Top 100 listing of European startups); and the non-profits Techfugees.Com, TechVets.Co, and Startup Coalition. He was awarded an MBE in the Queen's Birthday Honours list in 2016 for services to the UK technology industry and journalism.
