Artificial Intelligence (AI) – Demystifying the latest buzzword

Everywhere you turn today someone is talking about Artificial Intelligence (AI). It appears to have taken over as the largest buzzword since Big Data.

Progressive organizations are actively seeking ways to apply AI. They want to use it to advance their businesses and build new experiences for those they interact with internally and externally.

Alas, there is great confusion as to what AI means. That only gets worse when people mix it up with terms such as Machine Learning and Deep Learning. Ask several different people what AI or Machine Learning is and you will get several different answers.

It is the age-old problem of describing an elephant depending on which part of it you touch while blindfolded.

AI is not a new topic. People have been pursuing AI since the 1940s. Machine Learning, which developed from the field of Artificial Intelligence, has been around since at least the 1980s, and Deep Learning, which is a subset of Machine Learning, has been rapidly gaining in popularity over the past 10 years. This post explores all of these topics, setting the scene for some upcoming posts.

What is AI?

The dream, and some would say purist form, of AI lies in something that is often known as General AI.

In General AI the aim is to create a machine that is as intelligent as a human. That means it is able to communicate, learn and solve problems as broadly as we can. It is this objective, and narrative, that fuels fears of the so-called “Terminator” scenario in which Skynet takes over our world.

The truth is we are a long way from General AI today, and we are not likely to get there for some time. There is a reason we only see those machines on our TV screens: we cannot make them happen. At least not today!

Today our AI world generally falls into something we call “Narrow AI”. This means we have technologies that allow computers to do specific things as well as, or better than, we can as humans. Much of the recent advancement in Narrow AI is down to improvements in Machine Learning and, more recently, a category of Machine Learning known as Deep Learning.

The growing power of Narrow AI!

Remarkable progress is being made in Narrow AI thanks to Machine Learning advancements. Machine Learning as a topic has been growing in popularity since the 1980s, and Deep Learning, a subset of Machine Learning, has exploded onto the scene in the last 5-7 years.

Machine Learning progress has been driven by the advent of more compute power and newer, better-performing algorithms. Together these provide the ability to handle complexity at a scale and speed that previously stopped progress.

There is another major driver. It turns out that the advent of the Internet, automation and digitization plays a big role in delivering the large body of data needed to reliably teach machines. Data is the experience of a machine, and without experience an algorithm cannot learn!

Narrow AI Everywhere

Arguably everyone who is online uses some form of Narrow AI today.

  • Every time you search using Bing or Google there is narrow AI in the background helping to rank pages.
  • When you use Facebook or a tool from Apple to tag images with faces of people there is narrow AI in the background learning and suggesting.
  • Translation capabilities driven by AI are appearing both as standalone websites and built into Facebook and technologies such as Outlook, PowerPoint and Skype.

In fact, narrow AI is starting to be pervasive in many aspects of our lives.

Beyond everyday use, we have used machine learning to develop Narrow AI that can beat our human champions at Chess, Go and Jeopardy, and more recently obtain the highest possible score on Ms. Pac-Man. We also see autonomous cars being tested, which is another form of Narrow AI.

You might be asking a simple question now.

How do we know these are narrow AI?

The answer is simple. The computer trained to win at Chess cannot play Go, the computer designed to play Go cannot play Chess, the Jeopardy computer cannot play Scrabble, and the autonomous car system will likely never fly planes.

Each has been designed and trained for one task which it has mastered. Any new task will require extensive training. If you think about general AI the computer would need to be able to learn any new task it is presented with, and become an expert at it, with no external support. Thinking about it that way makes it clear why that will take some time.

The truth is that the majority of Narrow AI we are playing with today focuses on uses of Machine Learning. Many people use this term without understanding it, so here is a shot at defining it.

What is Machine Learning?

In 1959 Arthur Samuel described machine learning this way.

“A field of study that gives computers the ability to learn without being explicitly programmed”.

This was a basic definition, but one borne out by his ability to create a computer that could play Checkers better than he could. The key aspect is that you do not program the computer with every possible move and scenario. Instead, the computer learns from playing hundreds of games and ultimately improves to the point where it can play better than a human. This is no different from how humans learn, which is normally by doing and adapting. The difference is that a computer does not get bored and can play millions of games repeatedly without a break.

In 1997 Tom Mitchell defined this more precisely when he described machine learning in a more formulaic way.

“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.”

Again, the key thing here is that the computer learns from ongoing experience, getting better and better over time. Experience in this case is data, something we increasingly have in abundance today.

If we stick to a simple high-level definition, you can think of machine learning as using computer algorithms to look at data, learn from it and then make a decision or prediction about something. Essentially, we stop trying to hand-code every possible combination to achieve a task and instead “train” a computer by giving it many different data inputs so it learns how to do a specific task.
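A tiny sketch of that "train rather than hand-code" idea, using made-up numbers (hours of practice vs. test score) and a simple least-squares fit rather than any particular ML library:

```python
# Hypothetical (input, output) pairs: hours of practice -> test score.
data = [(1, 12), (2, 19), (3, 31), (4, 42), (5, 48)]

# "Training": estimate the rule from the examples instead of writing it
# by hand (least-squares slope through the origin: w = sum(x*y) / sum(x*x)).
w = sum(x * y for x, y in data) / sum(x * x for x, y in data)

# "Prediction": apply the learned rule to an input we never coded for.
print(round(w * 6, 1))  # predicted score after 6 hours: 60.1
```

The point is not the specific formula; it is that the mapping from input to output came from data, not from a programmer enumerating cases.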

Machine Learning Methods

There are a few machine learning methods you might hear about:

  1. Supervised learning;
  2. Unsupervised learning;
  3. Semi-supervised learning;
  4. Reinforcement learning.

Each has a different use. All are types of machine learning methods, often with a variety of algorithms supporting them. Supervised and unsupervised methods are the most used today.

What’s different? – Supervised Machine Learning vs Unsupervised Machine Learning

Supervised machine learning methods are given labelled training data (essentially pairs of inputs with a known, labelled output). From this the algorithm builds a function through which you can pass new data to work out the likely output for the new inputs.
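A minimal supervised sketch, with entirely made-up labelled points and a 1-nearest-neighbour rule standing in for the learned function:

```python
# Labelled training data: (input features, known output label).
train = [((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
         ((4.0, 4.2), "dog"), ((3.8, 4.5), "dog")]

def predict(point):
    # Classify a new input with the label of the closest training example.
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(train, key=lambda pair: dist2(pair[0], point))[1]

print(predict((1.1, 0.9)))  # "cat" -- lands near the labelled cat examples
print(predict((4.1, 4.0)))  # "dog"
```

The labels are the "teacher": every training input comes with the correct answer, and new inputs are judged against that experience.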

Unsupervised machine learning methods leave it to the algorithm to work out the outputs from input data that carries no labels. The machine determines the outputs, based on how many you tell it you want, using the structure of the data you feed it and any constraints. Think about that. There are NO correct answers and no teacher. The computer is left to discover and present the interesting structure in the data.
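To make that concrete, here is a toy unsupervised sketch: no labels at all, just unlabelled numbers and an instruction to find two groups (a hand-rolled 1-D k-means, not any particular library):

```python
# Unlabelled data with two obvious clumps; we only say "find 2 groups".
points = [1.0, 1.2, 0.9, 8.0, 8.3, 7.8]

# Simple k-means: start with two guesses, then alternate between
# assigning each point to its nearest centre and recomputing the centres.
centres = [points[0], points[-1]]
for _ in range(10):
    groups = [[], []]
    for p in points:
        groups[0 if abs(p - centres[0]) < abs(p - centres[1]) else 1].append(p)
    centres = [sum(g) / len(g) for g in groups]

print([round(c, 2) for c in centres])  # the two clump centres: [1.03, 8.03]
```

Nobody told the algorithm which points belong together; the grouping emerged purely from the structure of the data and the constraint "two groups".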

I will not go into the specific algorithms used in either method here. Machine learning has a lot of great algorithms behind it that can be used today. Most of those algorithms are easily accessible via abstract APIs. When you add to that the compute power and data we now have easy access to you can see why they are exploding in usage, popularity and accuracy.

What about Deep Learning?

In my next post I will look deeper into Deep Learning and how it relates to Machine Learning. A hint: it is just a type of Machine Learning. I will also discuss Cognitive Computing, which is driving a lot of the buzz around Deep Learning right now. Additionally, I will cover a few of the use cases I am seeing.

Wrapping up this post – Summarizing the terms

So the way I look at things is:

  • Artificial Intelligence is a broad topic that has been around for a long time. It has ambitious goals. There are many areas we need to master to reach those goals. One of those is Machine Learning.
  • In support of Artificial Intelligence we have Machine Learning. It is an area of research that is dedicated to helping machines to learn from experience.
  • Analytics encompasses classical statistics and machine learning. Everything in Machine Learning can therefore be seen as part of Analytics, but in my opinion not everything in Analytics is part of AI.
  • Narrow AI is the outcome of Machine Learning where we have managed to get computers to learn very specific tasks so they can perform them as well as, or better than, a human.
  • Deep Learning is a subset of techniques in the Machine Learning space. Those techniques are often spoken about particularly in relation to Cognitive Computing although they are not restricted to just that.
  • Cognitive Computing generally refers to trying to get computers to interact with us in a human like way. Deep Learning is often behind Cognitive Computing helping with things like image and speech recognition.

Often these topics collide and are labelled collectively as Artificial Intelligence. People use the top-level phrase because it is the new cool. Often you need to dig in to see what problem they want to solve. This then lets you understand what they mean by their use of the word AI.

What do you think? Do you have a different definition for any of these topics?

Although I work for Microsoft these thoughts are 100% my own & may not reflect the views of my employer.
