What Is NLP (Natural Language Processing)?
Computer Science with Speech and Natural Language Processing MSc
Apply now for 2025 entry or register your interest to find out about postgraduate study and events at the University of Sheffield.
Course description
Our programme bridges computer science, machine learning, linguistics and signal processing, driving transformative technologies such as chat/voice assistants, real-time machine translation, sentiment analysis and speech recognition. Modules in Natural Language Processing will introduce you to the core technologies underpinning state-of-the-art AI tools, such as ChatGPT and DeepSeek.
This course is ideal for students with a background in computer science, engineering or a related field who have a keen interest in machine learning, linguistics, phonetics and computational techniques.
A third of your study time will be devoted to an individual dissertation, where you will collaborate closely with a member of staff to research topics such as machine learning, natural language processing, or speech recognition. These capabilities will prepare you for dynamic careers in AI development, speech and language technology, or academic research, making you a sought-after professional in these cutting-edge fields.
By the end of the course, you will have mastered key skills in machine learning, natural language processing, speech production and perception analysis, and digital signal processing on real-world data. This course blends engaging lectures with hands-on lab classes and computational exercises, fostering both theoretical understanding and practical expertise.
Applying for this course
We are no longer using a staged admissions process for this course. You can apply for this course in the usual way using our Postgraduate Online Application Form.
Accreditation
This course is accredited by the British Computer Society (BCS). The course partially meets the requirements for Chartered Information Technology Professional (CITP) and partially meets the requirements for Chartered Engineer (CEng).
British Computer Society (BCS)
Modules
A selection of modules is available each year - some examples are below. There may be changes before you start your course. From May of the year of entry, formal programme regulations will be available in our Programme Regulations Finder.
MSc modules
Core modules:
Text Processing (15 credits)
This module introduces fundamental concepts and ideas in natural language text processing, covers techniques for handling text corpora, and examines representative systems that require the automated processing of large volumes of text. The module focuses on modern quantitative techniques for text analysis and explores important models for representing and acquiring information from texts.

Speech Processing (15 credits)
This module aims to demonstrate why computer speech processing is an important and difficult problem, to investigate the representation of speech in the articulatory, acoustic and auditory domains, and to illustrate computational approaches to speech parameter extraction. It examines both the production and perception of speech, taking a multi-disciplinary approach (drawing on linguistics, phonetics, psychoacoustics, etc.). It introduces sufficient digital signal processing (linear systems theory, Fourier transforms) to motivate speech parameter extraction techniques (e.g. pitch and formant tracking).

Machine Learning and Adaptive Intelligence (15 credits)
This module is about core technologies underpinning modern artificial intelligence. It introduces statistical machine learning and probabilistic modelling and their application to describing real-world phenomena, giving students a grounding in modern state-of-the-art algorithms that allow computer systems to learn from data. It has a considerable focus on the mathematical underpinnings of key ML approaches, requiring some knowledge of linear algebra, differentiation and probability.

Professional Issues (15 credits)
This module aims to enable students to recognise the legal, social, ethical and professional issues involved in the exploitation of computer technology and to be guided by the adoption of appropriate professional, ethical and legal practices. It describes the relationship between technological change, society and the law, including the powerful role that computers and computer professionals play in a technological society. It introduces key legal areas which are specific and relevant to the discipline of computing (e.g. intellectual property, liability for defective software, computer misuse) and aims to provide an understanding of ethical and societal concepts that are important to computer professionals, together with experience of considering ethical dilemmas.

Scalable Machine Learning (15 credits)
This module focuses on technologies and algorithms that can be applied to data at a very large scale (e.g. population level). From a theoretical perspective it covers parallelisation of algorithms and algorithmic approaches such as stochastic gradient descent. There is also a significant practical element focusing on approaches to deploying scalable ML in practice, such as Spark, programming languages such as Python/Scala, and deployment on high performance computing platforms/clusters.

Team Software Project (15 credits)
This team project aims to provide insights and wider context for the more practical aspects of the taught modules, and to give students experience of working in teams to develop a substantial piece of software.
This module has no summer resit. Failure in this module will normally require students to repeat it the following year with attendance.
This module has the explicit objective of developing group teamwork skills. Participation in teamwork is mandatory, and failure to participate will result in deduction of marks and eventually loss of credits. Passing this module is essential for being awarded a degree accredited by the British Computer Society (BCS).

Speech Technology (15 credits)
This module introduces the principles of the emergent field of speech technology, studies typical applications of these principles and assesses the state of the art in this area. You will learn the prevailing techniques of automatic speech recognition (based on statistical modelling), see how speech synthesis and text-to-speech methods are deployed in spoken language systems, and discuss the current limitations of such devices. The module includes project work involving the implementation and assessment of a speech technology device.

Natural Language Processing (15 credits)
This module provides an introduction to the field of computer processing of written natural language, known as Natural Language Processing (NLP). We will cover standard theories, models and algorithms, discuss competing solutions to problems, describe example systems and applications, and highlight areas of open research.

Dissertation Project (60 credits)
For your individual project, you can choose from a wide range of possibilities in many different environments both within and outside the University. The project is completed during the summer, and you will have a personal academic supervisor to guide you during this period.

The content of our courses is reviewed annually to make sure it's up-to-date and relevant. Individual modules are occasionally updated or withdrawn in response to discoveries through our world-leading research, funding changes, professional accreditation requirements, student or employer feedback, outcomes of reviews, and variations in staff or student numbers. In the event of any change we will inform students and take reasonable steps to minimise disruption.
Open days
Interested in postgraduate taught study? Register your interest in studying at Sheffield or attend an event throughout the year to find out what makes studying here special.
Duration
1 year full-time
Teaching
We use lectures, tutorials and group work.
Assessment
Assessment is by formal examinations, coursework assignments and a dissertation.
School
School of Computer Science
Our master's courses at the University of Sheffield cover both the strong theoretical foundations and the practical issues involved in developing software systems in a business or industrial context.
Our graduates are highly prized by industry, and our courses give you the opportunity to gain an advantage in the job market, whether in the UK or overseas.
Although it is possible to discuss many of the practical issues involved in industrial applications in lectures and seminars, there is no substitute for first-hand experience.
We have a unique track record in developing innovative project-based courses that provide real experience for computing students, and this experience is embodied in our MSc courses.
Our MSc programmes last 12 months, and begin in late September. You will study taught modules during two 15-week semesters. Your work is assessed either by coursework or by formal examination. During the summer you complete an individual dissertation project, which may be based within the University or at the premises of an industrial client.
Student profiles
"It's possible to create something really special"
Emily Ip, Graduate, Computer Science with Speech and Natural Language Processing MSc
Graduate Emily Ip was in disbelief when she was told she'd won the school's Fretwell-Downing prize for her dissertation on using language cues to detect signs of cognitive decline.
Entry requirements
Minimum 2:1 undergraduate honours degree in a relevant subject.
Subject requirements
We accept degrees in the following subject areas:
We may also consider degrees in Linguistics or Psychology.
We also consider a wide range of international qualifications:
Entry requirements for international students
We assess each application on the basis of the applicant's preparation and achievement as a whole. We may accept applicants whose qualifications don't meet the published entry criteria but have other experience relevant to the course.
The lists of required degree subjects and modules are indicative only. Sometimes we may accept subjects or modules that aren't listed, and sometimes we may not accept subjects or modules that are listed, depending on the content studied.
English language requirements
IELTS 6.5 (with 6 in each component) or University equivalent.
Pathway programme for international students
If you're an international student who does not meet the entry requirements for this course, you have the opportunity to apply for a pre-masters programme in Science and Engineering at the University of Sheffield International College. This course is designed to develop your English language and academic skills. Upon successful completion, you can progress to degree level study at the University of Sheffield.
If you have any questions about entry requirements, please contact the school/department.
Alumni discount
Save up to £2,500 on your course fees
Are you a Sheffield graduate? You could save up to £2,500 on your postgraduate taught course fees, subject to eligibility.
Apply
You can apply now using our Postgraduate Online Application Form. It's a quick and easy process.
Any supervisors and research areas listed are indicative and may change before the start of the course.
Our student protection plan
Recognition of professional qualifications: from 1 January 2021, in order to have any UK professional qualifications recognised for work in an EU country across a number of regulated and other professions you need to apply to the host country for recognition. Read information from the UK government and the EU Regulated Professions Database.
5 Real-world Applications Of Natural Language Processing (NLP)
Natural language processing (NLP) is a field of study that focuses on enabling computers to understand and interpret human language. NLP involves applying machine learning algorithms to analyze and process natural language data, such as text or speech.
NLP has recently been incorporated into a number of practical applications, including sentiment analysis, chatbots and speech recognition. NLP is being used by businesses in a wide range of sectors to automate customer care systems, strengthen marketing initiatives and improve product offerings.
Related: 5 natural language processing (NLP) libraries to use
Specifically, this article looks at sentiment analysis, chatbots, machine translation, text summarization and speech recognition as five instances of NLP in use in the real world. These applications have the potential to revolutionize the way people communicate with technology, making it more natural, intuitive and user-friendly.
Sentiment analysis
NLP can be used to analyze text data to determine the sentiment of the writer toward a particular product, service or brand. This is used in applications such as social media monitoring, customer feedback analysis and market research.
A common use of NLP is sentiment analysis of the stock market, in which investors and traders examine social media sentiment on a particular stock or market. An investor, for instance, can use NLP to examine tweets or news stories about a specific stock to ascertain the general attitude of the market toward that stock. Investors can determine whether these sources are expressing positive or negative opinions about the stock by studying the terminology used in these sources.
By supplying information on market sentiment and enabling investors to modify their strategies as necessary, sentiment analysis can assist investors in making more educated investment decisions. For instance, if a stock is receiving a lot of positive sentiment, an investor may consider buying more shares, while negative sentiment may prompt them to sell or hold off on buying.
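As a minimal sketch of how this works in practice, the Python below scores a few example posts about a stock with a simple word-list approach. The posts, the word lists and the ticker $XYZ are invented for illustration; real systems typically use trained sentiment models rather than fixed lists.

    # Toy lexicon-based sentiment scoring for posts about a stock (illustrative only).
    POSITIVE = {"beat", "growth", "bullish", "strong", "upgrade"}
    NEGATIVE = {"miss", "lawsuit", "bearish", "weak", "downgrade"}

    posts = [
        "Strong earnings beat for $XYZ, growth looks solid",
        "Analysts issue a downgrade on $XYZ after the lawsuit news",
        "$XYZ guidance unchanged this quarter",
    ]

    def score(text: str) -> int:
        """Count +1 for each positive word and -1 for each negative word."""
        words = {w.strip(".,!?$").lower() for w in text.split()}
        return len(words & POSITIVE) - len(words & NEGATIVE)

    for post in posts:
        s = score(post)
        label = "positive" if s > 0 else "negative" if s < 0 else "neutral"
        print(f"{label:>8}: {post}")

A trader would, of course, aggregate scores over thousands of posts and weight them by source reliability and recency before acting on them.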
Chatbots
NLP can be used to build conversational interfaces for chatbots that can understand and respond to natural language queries. This is used in customer support systems, virtual assistants and other applications where human-like interaction is required.
For example, a financial institution might use NLP to build a chatbot, along the lines of ChatGPT, that helps consumers with account questions, transaction histories and other financial queries. Customers can easily obtain the information they require thanks to the chatbot's ability to comprehend and respond to natural language questions.
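A production banking chatbot would sit on top of a large language model or an intent classifier trained on real customer data; the fragment below is only a toy keyword-based intent matcher, with invented intents and answers, to show the basic shape of mapping a free-text question to a response.

    # Minimal keyword-based intent matching (illustrative; real chatbots use trained models).
    INTENTS = {
        "balance": "Your current balance is shown under Accounts > Overview.",
        "transaction": "You can view your last 90 days of transactions in the app.",
        "card": "To report a lost card, call the number printed on a recent statement.",
    }

    def reply(question: str) -> str:
        q = question.lower()
        for keyword, answer in INTENTS.items():
            if keyword in q:
                return answer
        return "Sorry, I didn't understand that. Could you rephrase?"

    print(reply("How do I see my recent transactions?"))
    print(reply("What's my account balance?"))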
Machine translation
NLP can be used to translate text from one language to another. This is used in applications such as Google Translate, Skype Translator and other language translation services.
Similarly, a multinational corporation may use NLP to translate product descriptions and marketing materials from their original language to the languages of their target markets. This allows them to communicate more effectively with customers in different regions.
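For teams that want to experiment, open-source translation models can be called in a few lines. The sketch below assumes the Hugging Face transformers library (plus a backend such as PyTorch) is installed; it downloads a publicly available English-to-French model on first use, and the sentence is just an example.

    # Requires: pip install transformers torch
    from transformers import pipeline

    # Downloads a pretrained English-to-French model the first time it runs.
    translator = pipeline("translation_en_to_fr")

    result = translator("Our new headphones ship worldwide next month.")
    print(result[0]["translation_text"])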
Text summarization
NLP can be used to summarize long documents and articles into shorter, concise versions. This is used in applications such as news aggregation services, research paper summaries and other content curation services.
NLP can be used by a news aggregator to condense lengthy news stories into shorter, easier-to-read versions. Without having to read the entire article, readers can immediately receive a summary of the news thanks to text summarization.
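A rough feel for extractive summarization (selecting the most representative sentences rather than generating new text) can be had with simple word-frequency scoring. The short article below is invented, and real services use far more sophisticated, often neural, summarizers.

    # Naive extractive summarization: keep the sentences with the highest word-frequency score.
    import re
    from collections import Counter

    article = (
        "The city council approved a new transit budget on Tuesday. "
        "The budget expands bus service to the east side of the city. "
        "Officials said rider numbers have grown for three straight years. "
        "A public comment period on the budget opens next week."
    )

    sentences = re.split(r"(?<=[.!?])\s+", article.strip())
    freq = Counter(re.findall(r"[a-z']+", article.lower()))

    def sentence_score(sentence: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    # Keep the two highest-scoring sentences, preserving their original order.
    top = sorted(sorted(sentences, key=sentence_score, reverse=True)[:2], key=sentences.index)
    print(" ".join(top))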
Related: 7 artificial intelligence (AI) examples in everyday life
Speech recognition
NLP can be used to convert spoken language into text, allowing for voice-based interfaces and dictation. This is used in applications such as virtual assistants, speech-to-text transcription services and other voice-based applications.
A virtual assistant, such as Amazon's Alexa or Google Assistant, uses NLP to comprehend spoken instructions and answer questions in natural language. Instead of having to type out commands or inquiries, users can converse with the assistant by speaking.
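As a small, concrete example, the sketch below uses the open-source SpeechRecognition package to transcribe a WAV file. The package must be installed separately, "meeting.wav" is a placeholder for any short recording, and the free Google Web Speech API call it relies on needs an internet connection.

    # Requires: pip install SpeechRecognition
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    # "meeting.wav" is a placeholder file name for a short WAV recording.
    with sr.AudioFile("meeting.wav") as source:
        audio = recognizer.record(source)

    try:
        print(recognizer.recognize_google(audio))  # sends the audio to the Google Web Speech API
    except sr.UnknownValueError:
        print("Speech was unintelligible.")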
What Is Natural Language Processing?
Natural language processing (NLP) is a branch of artificial intelligence (AI) that focuses on enabling computers to work with speech and text in a manner similar to human understanding. This area of computer science relies on computational linguistics—typically based on statistical and mathematical methods—that models human language use.
NLP plays an increasingly prominent role in computing—and in the everyday lives of humans. Smart assistants such as Apple's Siri, Amazon's Alexa and Microsoft's Cortana are examples of systems that use NLP.
In addition, various other tools rely on natural language processing. Among them: navigation systems in automobiles; speech-to-text transcription systems such as Otter and Rev; chatbots; and voice recognition systems used for customer support. In fact, NLP appears in a rapidly expanding universe of applications, tools, systems and technologies.
In every instance, the goal is to simplify the interface between humans and machines. In many cases, the ability to speak to a system or have it recognize written input is the simplest and most straightforward way to accomplish a task.
While computers cannot "understand" language the same way humans do, natural language technologies are increasingly adept at recognizing the context and meaning of phrases and words and transforming them into appropriate responses—and actions.
Also see: Top Natural Language Processing Companies
Natural Language Processing: A Brief History
The idea of machines understanding human speech extends back to early science fiction novels. However, the field of natural language processing began to take shape in the 1950s, after computing pioneer Alan Turing published an article titled "Computing Machinery and Intelligence." It introduced the Turing Test, which provided a basic way to gauge a computer's natural language abilities.
During the ensuing decade, researchers experimented with using computers to translate novels and other documents between languages, though the process was extremely slow and prone to errors. In the 1960s, MIT professor Joseph Weizenbaum developed ELIZA, which mimicked human speech patterns remarkably well. Over the next quarter century, the field continued to evolve. As computing systems became more powerful in the 1990s, researchers began to achieve notable advances using statistical modeling methods.
Dictation and language translation software began to mature in the 1990s. However, early systems required training; they were slow, cumbersome to use and prone to errors. It wasn't until the introduction of supervised and unsupervised machine learning in the early 2000s, and then the introduction of neural nets around 2010, that the field began to advance in a significant way.
With these developments, deep learning systems were able to digest massive volumes of text and other data and process it using far more advanced language modeling methods. The resulting algorithms became far more accurate and useful.
Also see: Top AI Software
How Does Natural Language Processing Work?
Early NLP systems relied on hard-coded rules, dictionary lookups and statistical methods to do their work. They frequently supported basic decision-tree models. Eventually, machine learning automated tasks while improving results.
Today's natural language processing frameworks use far more advanced—and precise—language modeling techniques. Most of these methods rely on neural networks, most notably transformer architectures, to model language patterns and develop probability-based outcomes.
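To make "probability-based outcomes" concrete, here is a minimal sketch of the older counting approach: a toy bigram model that predicts the next word from how often word pairs occur in a tiny invented corpus. Modern systems replace the counting with neural networks, but the underlying idea of choosing the most probable continuation is the same.

    # Toy bigram model: predict the next word from simple co-occurrence counts.
    from collections import Counter, defaultdict

    corpus = "the dog chased the cat . the cat chased the mouse . the dog slept".split()

    bigrams = defaultdict(Counter)
    for current_word, next_word in zip(corpus, corpus[1:]):
        bigrams[current_word][next_word] += 1

    def predict(word: str) -> str:
        """Return the most frequent follower of `word` in the corpus."""
        return bigrams[word].most_common(1)[0][0]

    print(predict("the"))      # 'dog' (tied with 'cat'; ties go to the first word seen)
    print(predict("chased"))   # 'the'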
For example, a method called word vectors applies complex mathematical models to weight and relate words, phrases and constructs. Another method, called Recognizing Textual Entailment (RTE), classifies relationships of words and sentences through the lens of entailment, contradiction or neutrality. For instance, the premise "a dog has paws" entails that "dogs have legs" but contradicts "dogs have wings," while remaining neutral toward "all dogs are happy."
A key part of NLP is word embedding. It refers to establishing numerical representations for words in a specific context. The process is necessary because many words and phrases can mean different things in different contexts (go to a club, belong to a club or swing a club). Words can also be pronounced the same way but mean different things (through, threw or witch, which). There's also a need to understand idiomatic phrases that do not make sense literally, such as "You are the apple of my eye" or "it doesn't cut the mustard."
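As a minimal numerical sketch, the example below uses made-up three-dimensional vectors (real embeddings are learned from data and typically have hundreds of dimensions) and cosine similarity, a common way to judge how close two words are in meaning. Under these toy values, "club" in the membership sense sits closer to "society" than to "golf".

    # Cosine similarity over toy word vectors (the values are invented for illustration).
    import numpy as np

    vectors = {
        "club":    np.array([0.9, 0.1, 0.3]),
        "society": np.array([0.8, 0.2, 0.2]),
        "golf":    np.array([0.2, 0.9, 0.4]),
    }

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    print("club ~ society:", round(cosine(vectors["club"], vectors["society"]), 3))
    print("club ~ golf:   ", round(cosine(vectors["club"], vectors["golf"]), 3))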
Today's models are trained on enormous volumes of language data—in some cases several hundred gigabytes of books, magazine articles, websites, technical manuals, emails, song lyrics, stage plays, scripts and publicly available sources such as Wikipedia. As deep learning systems parse through millions or even billions of combinations—relying on hundreds of thousands of CPU or GPU cores—they analyze patterns, connect the dots and learn semantic properties of words and phrases.
It's also often necessary to refine natural language processing systems for specific tasks, such as a chatbot or a smart speaker. But even after this takes place, a natural language processing system may not always work as billed. Even the best NLP systems make errors. They can encounter problems when people misspell or mispronounce words, and they sometimes misunderstand intent and translate phrases incorrectly. In some cases, these errors can be glaring—or even catastrophic.
Today, prominent natural language models are available under licensing models. These include OpenAI Codex, Google's LaMDA, IBM Watson and software development tools such as Amazon CodeWhisperer and GitHub Copilot. In addition, some organizations build their own proprietary models.
How is Natural Language Processing Used?
There is a growing array of uses for natural language processing. These include:
Conversational AI. The ability of computers to recognize words introduces a variety of applications and tools. Personal assistants like Siri, Alexa and Microsoft Cortana are prominent examples of conversational AI. They allow humans to make a call from a mobile phone while driving or switch lights on or off in a smart home. Increasingly, these systems understand intent and act accordingly. For example, chatbots can respond to human voice or text input with responses that seem as if they came from another person. What's more, these systems use machine learning to constantly improve.
Machine translation. There's a growing use of NLP for machine translation tasks. These include language translations that replace words in one language for another (English to Spanish or French to Japanese, for example). Google Translate and DeepL are examples of this technology. But machine translation can also take other forms. For example, NLP can convert spoken words—either in the form of a recording or live dictation—into subtitles on a TV show or a transcript from a Zoom or Microsoft Teams meeting. Yet while these systems are increasingly accurate and valuable, they continue to generate some errors.
Sentiment analysis. NLP has the ability to parse through unstructured data—social media analysis is a prime example—extract common word and phrasing patterns and transform this data into a guidepost for how social media and online conversations are trending. This capability is also valuable for understanding product reviews, the effectiveness of advertising campaigns, how people are reacting to news and other events, and various other purposes. Sentiment analysis finds things that might otherwise evade human detection.
Content analysis. Another use case for NLP is making sense of complex systems. For example, the technology can digest huge volumes of text data and research databases and create summaries or abstracts that relate to the most pertinent and salient content. Similarly, content analysis can be used for cybersecurity, including spam detection. These systems can reduce or eliminate the need for manual human involvement.
Text and image generation. A rapidly emerging part of natural language processing focuses on text, image and even music generation. Already, some news organizations produce short articles using natural language processing. Meanwhile, OpenAI has developed a tool that generates text and computer code through a natural language interface. Another OpenAI tool, dubbed DALL-E 2, creates high-quality images through an NLP interface. Type the words "black cat under a stairway" and an image appears. GitHub Copilot and Amazon CodeWhisperer can auto-complete and auto-generate computer code through natural language.
Also see: Top Data Visualization Tools
NLP Business Use Cases
The use of NLP is increasingly common in the business world. Among the top use cases:
Chatbots and voice interaction systems. Retailers, health care providers and others increasingly rely on chatbots to interact with customers, answer basic questions and route customers to other online resources. These systems can also connect a customer to a live agent, when necessary. Voice systems allow customers to verbally say what they need rather than push buttons on the phone.
Transcription. As organizations shift to virtual meetings on Zoom and Microsoft Teams, there's often a need for a transcript of the conversation. Services such as Otter and Rev deliver highly accurate transcripts—and they're often able to understand foreign accents better than humans. In addition, journalists, attorneys, medical professionals and others require transcripts of audio recordings. NLP can deliver results from dictation and recordings within seconds or minutes.
International translation. NLP has revolutionized interactions between businesses in different countries. While the need for translators hasn't disappeared, it's now easy to convert documents from one language to another. This has simplified interactions and business processes for global companies while streamlining global trade.
Scoring systems. Natural language processing is used by financial institutions, insurance companies and others to extract elements and analyze documents, data, claims and other text-based resources. The same technology can also aid in fraud detection, financial auditing, resume evaluations and spam detection. In fact, the latter represents a type of supervised machine learning that connects to NLP (see the sketch after this list).
Market intelligence and sentiment analysis. Marketers and others increasingly rely on NLP to deliver market intelligence and sentiment trends. Semantic engines scrape content from blogs, news sites, social media sources and other sites in order to detect trends, attitudes and actual behaviors. Similarly, NLP can help organizations understand website behavior, such as search terms that identify common problems and how people use an e-commerce site. This data can lead to design and usability changes.
Software development. A growing trend is the use of natural language for software coding. Low-code and no-code environments can transform spoken and written requests into actual lines of software code. Systems such as Amazon's CodeWhisperer and GitHub's CoPilot include predictive capabilities that autofill code in much the same way that Google Mail predicts what a person will type next. They also can pull information from an integrated development environment (IDE) and produce several lines of code at a time.
Text and image generation. OpenAI Codex can generate entire documents based on a basic request. This makes it possible to generate poems, articles and other text. OpenAI's DALL-E 2 generates photorealistic images and art through natural language input. This can aid designers, artists and others.
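As a small example of the supervised-learning side of spam detection mentioned above, the sketch below trains a naive Bayes classifier on a handful of invented messages using scikit-learn; a real deployment would use many thousands of labelled examples and richer features.

    # Tiny spam classifier with scikit-learn (training data is invented for illustration).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    messages = [
        "Win a free prize, click now",         # spam
        "Limited offer, claim your reward",    # spam
        "Meeting moved to 3pm tomorrow",       # not spam
        "Please review the attached invoice",  # not spam
    ]
    labels = ["spam", "spam", "ham", "ham"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(messages, labels)

    print(model.predict(["Claim your free reward now"]))   # expected: ['spam']
    print(model.predict(["Can we move the meeting?"]))     # expected: ['ham']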
Also see: Best Data Analytics Tools
What Ethical Concerns Exist for NLP?
Concerns about natural language processing are heavily centered on the accuracy of models and ensuring that bias doesn't occur. Many of these deep learning algorithms are so-called "black boxes," meaning that there's no way to understand how the underlying model works and whether it is free of biases that could affect critical decisions about lending, healthcare and more.
There is also debate about whether these systems are "sentient." The question of whether AI can actually think and feel like a human has been explored in films such as 2001: A Space Odyssey and Star Wars. It also reappeared in 2022, when former Google engineer Blake Lemoine published human-to-machine discussions with LaMDA and claimed that the system had gained sentience. However, numerous linguistics experts and computer scientists countered that a silicon-based system cannot think and feel the way humans do; it merely parrots language in a highly convincing way.
In fact, researchers who have experimented with NLP systems have been able to generate egregious and obvious errors by inputting certain words and phrases. Getting to 100% accuracy in NLP is nearly impossible because of the nearly infinite number of word and conceptual combinations in any given language.
Another issue is ownership of content—especially when copyrighted material is fed into the deep learning model. Because many of these systems are built from publicly available sources scraped from the Internet, questions can arise about who actually owns the model or material, or whether contributors should be compensated. This has so far resulted in a handful of lawsuits along with broader ethical questions about how models should be developed and trained.
Also see: AI vs. ML: Artificial Intelligence and Machine Learning
What Role Will NLP Play in the Future?
There's no question that natural language processing will play a prominent role in future business and personal interactions. Personal assistants, chatbots and other tools will continue to advance. This will likely translate into systems that understand more complex language patterns and deliver automated but accurate technical support or instructions for assembling or repairing a product.
NLP will also lead to more advanced analysis of medical data. For example, a doctor might input patient symptoms, and an NLP-driven database would cross-check them against the latest medical literature. Or a consumer might visit a travel site and say where she wants to go on vacation and what she wants to do. The site would then deliver highly customized suggestions and recommendations, based on data from past trips and saved preferences.
For now, business leaders should follow the natural language processing space—and continue to explore how the technology can improve products, tools, systems and services. The ability for humans to interact with machines on their own terms simplifies many tasks. It also adds value to business relationships.
Also see: The Future of Artificial Intelligence
