The concept of machine learning came into existence in the 1950s with Arthur Lee Samuel, an American pioneer in the fields of computer gaming and artificial intelligence, who coined the term "machine learning" in 1959. The Samuel Checkers-playing Program appears to be the world's first self-learning program, and as such an early demonstration of the fundamental concept of artificial intelligence (AI).
After the term was coined, various advances built on the concept of machine learning:
1. In 1957 the perceptron algorithm was invented at the Cornell Aeronautical Laboratory by Frank Rosenblatt. It was intended to be a machine rather than a program.
Its first implementation was in software for the IBM 704; it was subsequently implemented in custom-built hardware as the "Mark 1 Perceptron". This machine was designed for image recognition. It had an array of 400 photocells, randomly connected to the "neurons". Weights were encoded in potentiometers, and weight updates during learning were performed by electric motors.
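To make the learning rule concrete, here is a minimal sketch of a perceptron in Python rather than in hardware; the toy dataset, learning rate, and epoch count are illustrative assumptions on my part:

```python
import numpy as np

def train_perceptron(X, y, epochs=10, lr=1.0):
    """Minimal perceptron learning rule; y holds -1/+1 class labels."""
    w = np.zeros(X.shape[1])  # weights: the software analogue of the Mark 1's potentiometers
    b = 0.0                   # bias term
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Hard-threshold prediction: update the weights only on a mistake
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable problem (logical AND with -1/+1 labels)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # expected: [-1. -1. -1.  1.]
```

The same mistake-driven update is what the electric motors on the Mark 1 performed physically on its potentiometers.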
2. After the invention of the perceptron, Minsky and Papert proved its limitations in their book Perceptrons: An Introduction to Computational Geometry. It was later claimed that the pessimistic predictions made by the authors were responsible for an erroneous change in the direction of AI research, concentrating efforts on so-called "symbolic" systems and contributing to the so-called AI winter.
3. Further advances in artificial intelligence led to expert systems and the process of knowledge acquisition, one of the first successful applications of AI to real-world business problems. Researchers at Stanford and other AI laboratories worked with doctors and other highly skilled experts to develop systems that could automate complex tasks such as medical diagnosis. Until this point, computers had mostly been used to automate highly data-intensive tasks, not complex reasoning. Technologies such as inference engines allowed developers, for the first time, to tackle more complex problems.
As expert systems scaled up from demonstration prototypes to industrial-strength applications, it was soon realized that the acquisition of domain expert knowledge was one of, if not the, most critical task in the knowledge engineering process. This knowledge acquisition process became an intense area of research in its own right.
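To give a flavour of what an inference engine does, here is a minimal forward-chaining sketch in Python; the rules and facts are entirely made-up toy examples, not a real diagnostic system:

```python
def forward_chain(facts, rules):
    """Tiny forward-chaining inference engine: each rule is a (premises, conclusion)
    pair, and rules keep firing until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)  # the rule fires, adding a derived fact
                changed = True
    return facts

# Hypothetical toy rules, loosely in the spirit of early medical expert systems
rules = [
    ({"fever", "rash"}, "suspect_measles"),
    ({"suspect_measles"}, "refer_to_specialist"),
]
print(forward_chain({"fever", "rash"}, rules))
# {'fever', 'rash', 'suspect_measles', 'refer_to_specialist'} (set order may vary)
```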
4. The growth of machine learning also fed into natural language processing (NLP), a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages.
5. In decision tree learning, the ID3 (Iterative Dichotomiser 3) algorithm was invented by Ross Quinlan. It is used to generate a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm and is typically used in the machine learning and natural language processing domains.
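At the heart of ID3 is a greedy choice: at each node, split on the attribute with the highest information gain, i.e. the largest reduction in entropy. A minimal sketch of that computation in Python, with a made-up toy dataset:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Entropy reduction achieved by splitting the rows on one attribute."""
    base, total, remainder = entropy(labels), len(rows), 0.0
    for value in {row[attribute] for row in rows}:
        subset = [lab for row, lab in zip(rows, labels) if row[attribute] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return base - remainder

# Hypothetical toy data: decide whether to play outside based on the sky
rows = [{"sky": "sunny"}, {"sky": "sunny"}, {"sky": "rainy"}, {"sky": "rainy"}]
labels = ["yes", "yes", "no", "yes"]
print(information_gain(rows, labels, "sky"))  # roughly 0.31 bits
```

ID3 applies this greedily: it picks the best attribute, partitions the data, and recurses on each partition until the labels in a node are pure.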
6. In the late 1990s a resurgence of neural networks took place. A neural network is a computational approach based on a large collection of neural units, loosely modeling the way a biological brain solves problems with large clusters of biological neurons connected by axons. Each neural unit is connected with many others, and links can be excitatory or inhibitory in their effect on the activation state of connected units. Each individual unit may have a summation function which combines the values of all its inputs, and there may be a threshold or limiting function on each connection and on the unit itself, such that the signal must surpass it before it can propagate to other neurons. These systems are self-learning and trained rather than explicitly programmed, and they excel in areas where the solution or feature detection is difficult to express in a traditional computer program.
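The description above maps directly onto a few lines of code. Here is a minimal sketch of a single neural unit with a summation function and a hard threshold; the weight and threshold values are illustrative assumptions:

```python
import numpy as np

def neural_unit(inputs, weights, threshold):
    """One artificial neuron: a weighted sum of inputs passed through a hard
    threshold. Positive weights act as excitatory links, negative as inhibitory."""
    activation = np.dot(weights, inputs)        # the summation function
    return 1 if activation > threshold else 0   # fires only past the threshold

# Illustrative values: two excitatory inputs and one inhibitory input
inputs = np.array([1.0, 1.0, 1.0])
weights = np.array([0.6, 0.6, -0.4])
print(neural_unit(inputs, weights, threshold=0.5))  # 1, since 0.8 > 0.5
```

Training such a network amounts to adjusting the weights from examples rather than programming the behaviour explicitly.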
7. Probably Approximately Correct learning (PAC learning), a framework for the mathematical analysis of machine learning, was proposed in 1984 by Leslie Valiant.
In this framework, the learner receives samples and must select a generalization function (called the hypothesis) from a certain class of possible functions. The goal is that, with high probability (the "probably" part), the selected function will have low generalization error (the "approximately correct" part). The learner must be able to learn the concept given any arbitrary approximation ratio, probability of success, or distribution of the samples. The model was later extended to treat noise (misclassified samples).
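For the common special case of a finite hypothesis class and a learner that returns a hypothesis consistent with the training samples, the PAC framework yields a concrete sample-size bound: m >= (1/epsilon) * (ln|H| + ln(1/delta)). A small sketch of that bound; the numbers in the example are arbitrary:

```python
import math

def pac_sample_bound(hypothesis_count, epsilon, delta):
    """Classic PAC bound for a finite hypothesis class and a consistent learner:
    with at least this many samples, the chosen hypothesis has error <= epsilon
    with probability at least 1 - delta."""
    return math.ceil((math.log(hypothesis_count) + math.log(1 / delta)) / epsilon)

# e.g. |H| = 2**20 hypotheses, 5% error tolerance, 95% confidence
print(pac_sample_bound(2**20, epsilon=0.05, delta=0.05))  # 338 samples
```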
8. Further development led to data mining, the analysis step of the "knowledge discovery in databases" (KDD) process. The term is a misnomer, because the goal is the extraction of patterns and knowledge from large amounts of data, not the extraction (mining) of data itself.
Interesting facts:
1. In 1994 Lucas Industries and Jaguar Cars collaborated under the Prometheus programme to test a driverless car, applying machine learning concepts to produce an intelligent driver support system. Lucas Industries developed a system which interpreted live images from a video camera, with a special computer designed to handle all the video processing.
It first extracted all the edges in the scene, and these were then analysed to detect the road lanes.
Obstacles were detected by millimetric radar, which was capable of detecting up to 12 vehicles up to 120 metres ahead. These components were then combined to offer a number of driver-support features.
2. In 1997, for the first time, a supercomputer beat a reigning world chess champion: IBM's "Deep Blue" defeated Garry Kasparov. Deep Blue drew on the idea of machine learning in that it was made to play many matches, gaining experience with each one and improving its performance. The match between Deep Blue and Kasparov was played under tournament conditions.
3. In 2009 Google started a project to build a self-driving car, based on the concepts of machine learning and artificial intelligence.
4. In 2011, Watson, a question-answering computer system capable of answering questions posed in natural language, competed on Jeopardy!. It was developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci, and was named after IBM's first CEO, industrialist Thomas J. Watson. The system was specifically developed to answer questions on the quiz show Jeopardy!
In the following video I have explained the history and definition of machine learning:
Hope you have enjoyed reading this article. From the next article I will be discussing various machine learning algorithms and their implementations. Till then, enjoy learning!!!