Machine learning

Machine learning is a broad and fascinating field, and it has been called one of the most exciting modern fields to work in. It has applications in an incredibly wide variety of areas, from medicine to advertising, from the military to the pedestrian. Its importance is likely to grow, as more and more areas turn to it as a way of dealing with the massive amounts of data available.

At a basic level, machine learning is about predicting the future based on the past. For instance, you might wish to predict how much a user Alice will like a movie that she hasn’t seen, based on her ratings of movies that she has seen. This means making informed guesses about some unobserved property of some object, based on observed properties of that object.

Some Canonical Learning Problems


There are a large number of typical inductive learning problems. The primary difference between them is in what type of thing they're trying to predict. Here are some examples:

Regression:

Trying to predict a real value. For instance, predict the value of a stock tomorrow given its past performance. Or predict Alice's score on the machine learning final exam based on her homework scores.
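Regression can be made concrete with a tiny sketch: fit a line by least squares to (homework average, final exam) pairs and use it to predict a new student's exam score. The numbers here are invented purely for illustration.

```python
# A minimal regression sketch: closed-form least-squares fit of a line
# y = a*x + b for a single feature (the data points are made up).

def fit_line(xs, ys):
    """Least-squares slope and intercept for one input feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

homework = [60.0, 70.0, 80.0, 90.0]   # homework averages (invented)
final    = [62.0, 71.0, 79.0, 88.0]   # final-exam scores (invented)

a, b = fit_line(homework, final)
predicted = a * 85.0 + b   # predicted exam score for a homework average of 85
```

Real regression models handle many features and noisy data, but the shape of the problem is the same: learn a real-valued function from past (input, output) pairs.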

Binary Classification:

Trying to predict a simple yes/no response. For instance, predict whether Alice will enjoy a course or not. Or predict whether a user review of the newest Apple product is positive or negative.
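The review example can be sketched with a deliberately crude yes/no predictor: count words from small hand-made sentiment lexicons. The word lists are assumptions for illustration, not a real sentiment model.

```python
# A minimal binary-classification sketch: label a review "positive" or
# "negative" by counting lexicon hits (word lists are invented).

POSITIVE = {"great", "love", "excellent", "amazing"}
NEGATIVE = {"terrible", "hate", "awful", "broken"}

def classify_review(text):
    words = text.lower().split()
    # Score = positive-word count minus negative-word count;
    # ties default to "positive".
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score >= 0 else "negative"
```

A learned classifier would estimate which words matter (and how much) from labeled examples rather than from a fixed list, but the output is the same kind of object: a yes/no decision.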

Multiclass Classification:

Trying to put an example into one of a number of classes. For instance, predict whether a news story is about entertainment, sports, politics, religion, etc. Or predict whether a computer science course is Systems, Theory, Artificial Intelligence, or Other.
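The news-story example can be sketched by scoring a headline against per-topic keyword sets and picking the best-matching class. The topics and keyword lists are invented for illustration.

```python
# A minimal multiclass-classification sketch: assign a headline to the
# topic whose keyword set it overlaps most (keywords are made up).

TOPICS = {
    "sports":        {"game", "score", "team", "season"},
    "politics":      {"election", "senate", "vote", "policy"},
    "entertainment": {"film", "album", "celebrity", "premiere"},
}

def classify_topic(headline):
    words = set(headline.lower().split())
    # Return the class label with the largest keyword overlap.
    return max(TOPICS, key=lambda t: len(TOPICS[t] & words))
```

The key difference from binary classification is only that the predictor must choose among several labels instead of two.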


Ranking:

Trying to put a set of objects in order of relevance. For instance, predict what order to put web pages in, in response to a user query. Or predict Alice's ranked preferences over courses she hasn't taken.
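The web-page example can be sketched by sorting documents by a relevance score for a query. Here the score is just query-term overlap, a hypothetical stand-in for a learned scoring function; the documents are invented.

```python
# A minimal ranking sketch: order documents by query-term overlap
# (the scoring function and documents are made up for illustration).

def score(query, doc):
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d)

def rank(query, docs):
    # Most relevant first; Python's sort is stable, so ties keep input order.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)

docs = [
    "cooking pasta at home",
    "intro to machine vision",
    "learning to rank with machine learning",
]
ranked = rank("machine learning", docs)
```

A learned ranker replaces the overlap count with a score estimated from data (clicks, judgments), but the output is the same: an ordering over objects rather than a single label.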

The Decision Tree Model of Learning

The decision tree is a classic and natural model of learning. It is closely related to the fundamental computer science notion of "divide and conquer." Although decision trees can be applied to many learning problems, we will begin with the simplest case: binary classification.
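The divide-and-conquer structure is easy to see in code. Below is a tiny hand-built tree for a binary "will the student like this course?" question; the features and the questions are invented for illustration, and a learning algorithm would choose the questions from training data rather than having them written by hand.

```python
# A minimal decision-tree sketch for binary classification. Each internal
# node asks a yes/no question about the example and recurses into one
# branch; leaves give the final prediction. (Features are made up.)

def predict(example):
    if example["is_systems_course"]:
        # In this toy tree, systems courses are liked only when
        # they do not meet in the morning.
        return not example["meets_in_morning"]
    else:
        # Non-systems courses are predicted as liked.
        return True

liked = predict({"is_systems_course": True, "meets_in_morning": True})
```

Learning a tree amounts to picking, at each node, the question that best splits the training data, then recursing on each side; that recursion is exactly the divide-and-conquer pattern mentioned above.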


Prediction with a two-layer network is similar to prediction with a perceptron. First you compute the activations of the nodes in the hidden layer based on the inputs and the input weights. Then you compute the activation of the output unit given the hidden unit activations and the second layer of weights. The only major difference between this computation and the perceptron computation is that the hidden units compute a non-linear function of their inputs. This is usually called the activation function or link function.
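The two-step computation above can be sketched directly, assuming tanh hidden units and a linear output; the weights and inputs are invented for illustration.

```python
import math

# A minimal forward-pass sketch for a two-layer network with tanh hidden
# units and a linear output unit (weights are made up).

def forward(x, W_hidden, w_out):
    # Step 1: each hidden unit applies a non-linear activation function
    # (here tanh) to a weighted sum of the inputs.
    hidden = [math.tanh(sum(w_i * x_i for w_i, x_i in zip(row, x)))
              for row in W_hidden]
    # Step 2: the output unit is a weighted sum of the hidden activations,
    # exactly like a perceptron applied to the hidden layer.
    return sum(v * h for v, h in zip(w_out, hidden))

x = [1.0, -2.0]                         # two inputs
W_hidden = [[0.5, 0.1], [-0.3, 0.8]]    # two hidden units
w_out = [1.0, -1.0]                     # second-layer weights
y = forward(x, W_hidden, w_out)
```

Without the non-linearity, the two layers would collapse into a single linear map, so the network could represent nothing a perceptron cannot; the activation function is what gives the hidden layer its extra power.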