Fundamentals Flashcards
(21 cards)
Definition of machine learning
Informally: the computer program learns by itself from data.
Formally: a computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E. (See the sketch below.)
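A minimal sketch (not part of the original cards) that makes the E/T/P definition concrete: the task T is classifying numbers as high or low, the performance measure P is accuracy on a fixed held-out set, and the experience E is a growing set of labelled training examples. The toy data and the threshold learner are invented for illustration.

# Toy illustration: performance P at task T tends to improve with experience E.
import random

random.seed(0)

def make_examples(n):
    """Labelled data: value in [0, 100), label is 'high' if value >= 50."""
    xs = [random.uniform(0, 100) for _ in range(n)]
    return [(x, "high" if x >= 50 else "low") for x in xs]

def train(examples):
    """Learn a decision threshold as the midpoint of the two class means."""
    highs = [x for x, y in examples if y == "high"]
    lows = [x for x, y in examples if y == "low"]
    if not highs or not lows:
        return 50.0  # fallback if one class is missing in a tiny sample
    return (sum(highs) / len(highs) + sum(lows) / len(lows)) / 2

def accuracy(threshold, examples):
    """Performance measure P: fraction of examples classified correctly."""
    correct = sum(("high" if x >= threshold else "low") == y for x, y in examples)
    return correct / len(examples)

test = make_examples(1000)   # fixed task T and measure P
for e in (5, 50, 500):       # growing experience E
    model = train(make_examples(e))
    print(f"E = {e:4d} examples -> P = {accuracy(model, test):.3f}")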
Top down vs bottom up
Top down
Model all different functions and wire all these “agents” together
Write (manually), in a programming language, the instructions to solve the (usually specific) task
Bottom up
Give the system a lot of data, so it can discover by itself the concepts in the world
Analyse and understand (manually) the task
Select the appropriate models
Training and testing
Use the same model to (automatically) solve unseen tasks (see the workflow sketch below)
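A sketch of that bottom-up workflow, assuming scikit-learn is available (any similar library would do; the iris dataset is just a stand-in task, not from the cards):

# Bottom-up workflow sketch (assumes scikit-learn is installed).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# 1. Analyse and understand (manually) the task: classify iris flowers from measurements.
X, y = load_iris(return_X_y=True)

# 2. Select an appropriate model.
model = DecisionTreeClassifier(random_state=0)

# 3. Training and testing: learn from data, then check on held-out data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))

# 4. Use the same model to (automatically) handle unseen examples.
print("prediction for a new flower:", model.predict([[5.1, 3.5, 1.4, 0.2]]))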
Is machine learning top down or bottom up
Bottom up
What are the three pillars of machine learning
Models - models and algorithms
Computation - powerful and cheaper computation
Data - massive data warehouses
What is AI
Intelligence exhibited by machines that enables them to perceive their environment and use learning and intelligence to take actions
What is the difference between weak and strong AI
Scope: weak = single, specific tasks; strong = broad, general purpose
Self-awareness: weak = no; strong = yes
Learning ability: weak = limited to specific tasks; strong = general, human-like
Status: weak = exists and is widely used; strong = theoretical
Milestones in AI 1940s-60s
1950 - Turing test
1956 - Birth of AI (symbolic AI, search techniques, natural language processing, etc.)
Milestones in AI 1970s
1970s - imitate biology
AI winter
Models are used…
To train the AI
Doesn’t require knowledge engineers to find all the functions and wire them into an “agent”
Avoids the knowledge acquisition bottleneck
Computation Moore’s law
The number of transistors on a computer chip doubles roughly every two years (sometimes quoted as every 18 months). It is a long-term trend in computing technology that has led to a dramatic decrease in the cost of computing. (See the calculation sketch below.)
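A back-of-the-envelope illustration (my own arithmetic, not from the cards) of what doubling every two years implies, i.e. N(t) = N0 * 2^(t/2):

# Back-of-the-envelope Moore's law: count doubles every ~2 years.
def transistors(n0, years, doubling_period=2.0):
    """Projected transistor count after `years`, starting from n0."""
    return n0 * 2 ** (years / doubling_period)

# Example: starting from 1 million transistors.
for years in (2, 10, 20):
    print(f"after {years:2d} years: {transistors(1_000_000, years):,.0f}")
# after  2 years: 2,000,000
# after 10 years: 32,000,000
# after 20 years: 1,024,000,000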
Examples of massive data warehouses
Internet of things; social media; business; industries; devices/vehicles
Big data: quantity (volume), format/dimension (variety), speed (velocity)
Machine learning vs traditional programs
Traditional programs: data + a hand-written program -> output
Machine learning: data + the expected output -> we build a model and train it on the existing data, then use it to produce outputs for new inputs (see the toy contrast below)
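A toy, hand-rolled contrast (illustrative only; the feature, labels and threshold rule are invented for this sketch): the traditional version hard-codes the rule, while the machine-learning version is given data plus the expected outputs and fits the rule itself.

# Toy contrast between a hand-written rule and a learned one.

# Traditional program: data + hand-written program -> output.
def is_spam_traditional(count_of_word_free):
    return count_of_word_free >= 3  # rule chosen manually by the programmer

# Machine learning: data + expected output -> model (here, a learned threshold).
def fit_threshold(examples):
    """examples: list of (feature_value, expected_label) pairs."""
    spam = [x for x, label in examples if label]
    ham = [x for x, label in examples if not label]
    return (min(spam) + max(ham)) / 2  # split the two classes at the midpoint

training_data = [(0, False), (1, False), (2, False), (4, True), (5, True), (7, True)]
threshold = fit_threshold(training_data)

def is_spam_learned(count_of_word_free):
    return count_of_word_free >= threshold

print("learned threshold:", threshold)              # 3.0
print(is_spam_traditional(4), is_spam_learned(4))   # True True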
Data mining is …
Exploration and analysis of large quantities of data to discover valid, novel, useful and ultimately understandable patterns in data
Valid - hold on new data with some certainty
Novel - non-obvious to the system (not just looking up data, e.g. in a phone book)
Useful - it should be possible to act on the pattern
Understandable - humans can interpret the pattern/model
Data mining explains patterns
AI, machine learning hierarchy
Deep Learning ⊂ Neural Networks ⊂ Machine Learning ⊂ Artificial Intelligence (each is a subset of the next)
What is Deep Blue (milestones in AI)
Deep Blue was a chess-playing computer developed by IBM
In 1997 it beat the world chess champion, Garry Kasparov
What is IBM Watson (milestones in AI)
Uses natural language processing to analyse data and answer questions
Beat the human champions on Jeopardy! in 2011
In 2014 the Eugene Goostman chatbot
Passed the Turing test
In 2016 all …
Open-information (fully observable) board games were conquered by AI
AlphaGo beat a world champion at Go
What is the Turing test
An interrogator has to decide which candidate is a machine and which is a person only by asking them questions
If the machine can fool the interrogator 30% of the time, the machine is considered intelligent
Proposed in Turing’s 1950 philosophical paper on machine intelligence
The Chinese room experiment
A person following rules to manipulate Chinese symbols can produce sensible answers without understanding Chinese (Searle); behaving intelligently and being intelligent are two different things
5 factors of responsible and ethical AI
Privacy (and security of data)
Bias (and fairness)
Transparency (and explanation)
Accountability (and responsibility)
Safety