Harvard_CS50

Harvard Extension School: Artificial Intelligence with Python (CS50 2021)

Harvard CS50 Notes

HarvardX Artificial Intelligence with Python

		Part 2: KNOWLEDGE
		Propositional Logic

Propositional logic is based on the logic of propositions.
Each letter (propositional symbol) represents a proposition. Symbols are combined with logical connectives: and, or, not, implication, biconditional.
A biconditional can be read as "if and only if": both sides must be true, or both must be false.
Entailment means that in every model where sentence A is true, sentence B is also true.
The AI needs a knowledge base, which we give to it, stating certain truths about the world.
Derivation of new sentences from old ones is called inference.
Model checking determines whether the knowledge base entails a query by enumerating every possible model.
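A minimal sketch of model checking by truth-table enumeration, with sentences written as plain Python functions over a model; the symbols and knowledge base below are illustrative, the point being that the KB entails the query only if the query holds in every model where the KB holds:

```python
from itertools import product

SYMBOLS = ["rain", "hagrid", "dumbledore"]   # illustrative propositional symbols

def kb(m):
    """Knowledge base: if it is not raining, Harry visited Hagrid;
    he visited Hagrid or Dumbledore but not both; he visited Dumbledore."""
    return ((m["rain"] or m["hagrid"])             # (not rain) implies hagrid
            and (m["hagrid"] != m["dumbledore"])   # exclusive or
            and m["dumbledore"])

def query(m):
    return m["rain"]

def model_check(kb, query):
    """KB entails the query if the query is true in every model where KB is true."""
    for values in product([False, True], repeat=len(SYMBOLS)):
        model = dict(zip(SYMBOLS, values))
        if kb(model) and not query(model):
            return False
    return True

print(model_check(kb, query))   # True: the knowledge base entails "rain"
```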
Knowledge engineering is converting human knowledge into an AI knowledge base.
Inference rules are rules applied to existing knowledge to derive new sentences.
Modus Ponens is an inference rule: if an implication and its antecedent are both known to be true, the consequent can be inferred.
And Elimination is an inference rule: if a conjunction is true, then each of its conjuncts is true.
Double Negation Elimination is an inference rule: "It is not true that Harry did not pass the test" reduces to "Harry passed the test"; the two negations cancel out.
Implication Elimination rewrites an implication as a disjunction (a implies b becomes not a, or b); Biconditional Elimination rewrites a biconditional as the conjunction of the two implications.
De Morgan's Law turns an 'and' into an 'or' (and vice versa) by moving the negation inside the parentheses.
Example: Not(And(A, B)) translates to Or(Not(A), Not(B)).
Example: Not(Or(A, B)) translates to And(Not(A), Not(B)).
The Distributive Law distributes the connective outside the parentheses over the expressions inside, e.g. Or(A, And(B, C)) becomes And(Or(A, B), Or(A, C)).
A literal is a propositional symbol or the negation of a propositional symbol.
A clause is a disjunction of literals.
A conjunctive normal form (CNF) sentence is a conjunction of clauses.
Conversion to CNF

  1. Eliminate biconditionals (a iff b becomes (a implies b) and (b implies a)).
  2. Eliminate implications (a implies b becomes not a, or b).
  3. Move NOTs inside the parentheses using De Morgan's Law.
  4. Use the Distributive Law to distribute 'or' over 'and'.
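As an illustration of the result of these steps (not the course's own code), the third-party sympy library can perform the conversion:

```python
from sympy import symbols
from sympy.logic.boolalg import And, Or, Implies, to_cnf

A, B, C = symbols("A B C")

# Implication elimination followed by distribution, all handled by to_cnf
print(to_cnf(Or(Implies(A, B), C)))   # a single clause: ~A | B | C (ordering may vary)
print(to_cnf(Or(A, And(B, C))))       # (A | B) & (A | C)
```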

Inference by resolution resolves pairs of clauses containing complementary literals to produce new clauses; to prove entailment, it assumes the negation of the query and searches for a contradiction.
Resolving two contradictory literals produces the empty clause, which is always FALSE, establishing the contradiction.
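A small sketch of a single resolution step, with clauses represented as Python sets of string literals and negation marked by a leading "~"; deriving the empty clause signals a contradiction:

```python
def negate(literal):
    return literal[1:] if literal.startswith("~") else "~" + literal

def resolve(c1, c2):
    """Return the resolvents of two clauses, each clause being a set of literals."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:                       # complementary pair found
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

print(resolve({"P", "Q"}, {"~P", "R"}))   # [{'Q', 'R'}]
print(resolve({"P"}, {"~P"}))             # [set()]  -> the empty clause, a contradiction
```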
Universal quantification states that a sentence is true for all values of a variable.
Existential quantification states that a sentence is true for at least one value of a variable.

			Part 3: 
			UNCERTAINTY

P(w) is the probability of possible world w.
Every probability ranges from 0 to 1.
The probabilities of all possible worlds must sum to 1.
Unconditional probability: degree of belief in a proposition in the absence of any other evidence.
Conditional probability: degree of belief in a proposition given some evidence that has already been revealed. AI relies heavily on conditional probability, written P(a|b),
where a is the query and b is the evidence; P(a|b) = P(a, b) / P(b).
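A tiny worked example of that definition with made-up numbers:

```python
# Hypothetical joint and marginal probabilities for two events a and b
p_a_and_b = 0.08   # P(a, b)
p_b = 0.40         # P(b)

p_a_given_b = p_a_and_b / p_b   # P(a|b) = P(a, b) / P(b)
print(round(p_a_given_b, 2))    # 0.2
```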
A variable in probability theory is called a random variable.
A probability distribution assigns a probability to each possible value of a random variable.
The probability distribution of a random variable must sum to 1.
Independence means that knowing one event occurred does not affect the probability of another event.
A probability distribution can be written as a vector, e.g. P = <0.6, 0.3, 0.1>.

		Bayes' Rule

Bayes' rule: P(a|b) = P(b|a) P(a) / P(b).
Joint probability: the probability of multiple events occurring together, P(a, b).

Inclusion-exclusion gives the probability of one event or another occurring: P(a or b) = P(a) + P(b) - P(a, b).
Marginalization computes P(a) from joint probabilities we have access to: P(a) = P(a, b) + P(a, ¬b).
Conditioning: P(a) = P(a|b) P(b) + P(a|¬b) P(¬b).
Bayesian network: a data structure representing dependencies among random variables.
It is a directed graph where each node represents a random variable.
An arrow from X to Y means X is a parent of Y, and Y's probability distribution depends on X.
Inference by enumeration: computing a conditional probability by summing the joint probability over every value of the hidden variables.
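A minimal sketch of a two-variable Bayesian network (Rain as the parent of Umbrella, with made-up probabilities) and inference by enumeration over its joint distribution:

```python
# Hypothetical two-node network: Rain is a root node, Umbrella depends on Rain.
P_RAIN = {True: 0.3, False: 0.7}
P_UMBRELLA = {True: {True: 0.9, False: 0.1},     # P(umbrella | rain=True)
              False: {True: 0.2, False: 0.8}}    # P(umbrella | rain=False)

def joint(rain, umbrella):
    return P_RAIN[rain] * P_UMBRELLA[rain][umbrella]

def enumerate_rain_given(umbrella):
    """Compute the joint for each value of the query and normalize
    (with more variables you would also sum over the hidden ones)."""
    unnormalized = {r: joint(r, umbrella) for r in (True, False)}
    total = sum(unnormalized.values())
    return {r: p / total for r, p in unnormalized.items()}

print(enumerate_rain_given(True))   # rain becomes far more likely once an umbrella is seen
```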
Sampling approximates probabilities by randomly generating a value for each variable in the network, repeated a large number of times, and counting the outcomes (see the sketch after the likelihood-weighting notes below).

Likelihood weighting starts by fixing the values of the evidence variables.
It samples the remaining variables using the conditional probability distributions in the Bayesian network.
It weights each sample by its likelihood: the probability of the evidence given the sampled values.
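A sketch of both ideas on the same kind of toy Rain/Umbrella network as above: rejection sampling generates complete samples and discards those that disagree with the evidence, while likelihood weighting fixes the evidence and weights each sample by how likely the evidence is; the network, numbers, and function names are illustrative assumptions.

```python
import random

P_RAIN_TRUE = 0.3
P_UMBRELLA_GIVEN_RAIN = {True: 0.9, False: 0.2}   # P(umbrella=True | rain)

def rejection_sampling(n=10000):
    """Estimate P(rain | umbrella=True) by discarding samples that contradict the evidence."""
    kept = []
    for _ in range(n):
        rain = random.random() < P_RAIN_TRUE
        umbrella = random.random() < P_UMBRELLA_GIVEN_RAIN[rain]
        if umbrella:                                # keep only samples matching the evidence
            kept.append(rain)
    return sum(kept) / len(kept)

def likelihood_weighting(n=10000):
    """Fix umbrella=True, sample only rain, and weight each sample by P(evidence | sample)."""
    weights = {True: 0.0, False: 0.0}
    for _ in range(n):
        rain = random.random() < P_RAIN_TRUE
        weights[rain] += P_UMBRELLA_GIVEN_RAIN[rain]
    return weights[True] / (weights[True] + weights[False])

print(rejection_sampling(), likelihood_weighting())   # both approach roughly 0.66
```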

Markov assumption:
The current state depends on only a finite fixed number of previous states.
A Markov chain is a sequence of random variables that follows the Markov assumption.
A hidden Markov model uses details the agent can observe to make inferences about a hidden state it cannot observe directly.
For example, a robot in a building cannot see the sky with its camera, but it can observe whether employees arrive carrying umbrellas; from that it infers whether it is raining, and with the transition model it can predict whether the following day will be rainy.
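A sketch of that umbrella example as a hidden Markov model: a transition model over the hidden weather and a sensor model for what the robot observes; the forward step updates the belief after one observation. All probabilities below are made up for illustration.

```python
# Hidden state: the weather. Observation: whether people carry umbrellas.
TRANSITION = {"sun":  {"sun": 0.8, "rain": 0.2},     # Markov assumption: tomorrow's weather
              "rain": {"sun": 0.3, "rain": 0.7}}     # depends only on today's
SENSOR = {"sun":  {"umbrella": 0.2, "no umbrella": 0.8},
          "rain": {"umbrella": 0.9, "no umbrella": 0.1}}

def forward(belief, observation):
    """One step of HMM filtering: predict with the transition model,
    then weight by the sensor model and renormalize."""
    predicted = {s: sum(belief[p] * TRANSITION[p][s] for p in belief) for s in belief}
    updated = {s: predicted[s] * SENSOR[s][observation] for s in belief}
    total = sum(updated.values())
    return {s: v / total for s, v in updated.items()}

belief = {"sun": 0.5, "rain": 0.5}
belief = forward(belief, "umbrella")
print(belief)   # rain becomes more likely after seeing umbrellas
```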

				Part 4
			OPTIMIZATION

The HILL CLIMBING algorithm searches for the best state by checking each neighbor of the current state, moving to the best neighbor, and repeating the cycle.
When no neighbor is better than the current state, the algorithm ends.
The algorithm does not always find the optimal result: it can get stuck at local maxima or minima. The risk is reduced with random restarts, running the algorithm multiple times from different starting states and keeping the best outcome.
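A minimal hill-climbing sketch with random restarts; the objective function and the neighbour definition are arbitrary placeholders, not anything from the course:

```python
import random

def objective(x):
    # Toy landscape over the integers with several local maxima
    return -(x - 3) ** 2 + 5 * (x % 7)

def neighbors(x):
    return [x - 1, x + 1]

def hill_climb(start):
    current = start
    while True:
        best = max(neighbors(current), key=objective)
        if objective(best) <= objective(current):   # no better neighbour: stop
            return current
        current = best

# Random restarts reduce the risk of getting stuck at a local maximum
best = max((hill_climb(random.randint(-50, 50)) for _ in range(20)), key=objective)
print(best, objective(best))
```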

SIMULATED ANNEALING
It starts from an initial state and repeats for a fixed number of steps: pick a random neighbor, compute how much better (or worse) it is than the current state, and move to it if it is better. It sometimes accepts a worse state with a probability that decreases as the temperature cools, which helps it escape local maxima.
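A simulated-annealing sketch on the same kind of toy objective; the cooling schedule, acceptance rule, and all concrete numbers below are assumptions made for illustration:

```python
import math
import random

def objective(x):
    return -(x - 3) ** 2 + 5 * (x % 7)

def simulated_annealing(start, steps=1000, t0=10.0):
    current = start
    for k in range(1, steps + 1):
        temperature = t0 / k                        # cooling schedule: temperature shrinks
        candidate = current + random.choice([-1, 1])
        delta = objective(candidate) - objective(current)
        # Always accept a better state; accept a worse one with probability exp(delta / T)
        if delta > 0 or random.random() < math.exp(delta / temperature):
            current = candidate
    return current

x = simulated_annealing(random.randint(-50, 50))
print(x, objective(x))
```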

The TRAVELLING SALESMAN PROBLEM is a classic example of an optimization problem.
Constraint satisfaction: a problem defined by a set of variables, a domain of possible values for each variable, and constraints placed on the variables. A constraint graph can be used to represent the constraints graphically.
Node consistency is when all the values in a variable's domain satisfy the variable's unary constraints.
Arc consistency is when all the values in a variable's domain satisfy the variable's binary constraints.
CSP: Constraint Satisfaction Problem.
Backtracking search assigns values one variable at a time; when an assignment violates a constraint, it backtracks and tries a different value. It is naturally written as a recursive function (see the sketch below).
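A small backtracking-search sketch for a toy CSP: colour three regions so that two neighbouring pairs differ; the variables, domain, and binary constraints are invented for illustration:

```python
VARIABLES = ["A", "B", "C"]
DOMAIN = ["red", "green", "blue"]
CONSTRAINTS = [("A", "B"), ("B", "C")]      # neighbouring variables must get different values

def consistent(assignment):
    return all(assignment[x] != assignment[y]
               for x, y in CONSTRAINTS
               if x in assignment and y in assignment)

def backtrack(assignment):
    if len(assignment) == len(VARIABLES):
        return assignment                   # every variable assigned: solution found
    var = next(v for v in VARIABLES if v not in assignment)
    for value in DOMAIN:
        assignment[var] = value
        if consistent(assignment):
            result = backtrack(assignment)
            if result is not None:
                return result
        del assignment[var]                 # undo the assignment and try the next value
    return None

print(backtrack({}))   # e.g. {'A': 'red', 'B': 'green', 'C': 'red'}
```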

Problem Formulation Methods are:
Local search, linear programming, and constraint satisfaction.

			PART 5
			LEARNING

Supervised learning is when the computer is given a set of labelled training data (input-output pairs) and learns a function that maps inputs to outputs.
Classification is a supervised learning task where the output is a discrete category.
Nearest-neighbor classification assigns an input the category of the closest training data point; k-nearest-neighbor classification uses the most common category among the k closest points (see the sketch below).
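A minimal k-nearest-neighbour sketch: classify a point by the majority label among its k closest training points; the training data and labels below are made up:

```python
import math
from collections import Counter

# Hypothetical training data: (feature vector, label)
TRAINING = [((1.0, 1.0), "rain"), ((1.2, 0.9), "rain"),
            ((4.0, 4.2), "sun"),  ((3.8, 4.0), "sun")]

def knn_classify(point, k=3):
    by_distance = sorted(TRAINING, key=lambda item: math.dist(point, item[0]))
    labels = [label for _, label in by_distance[:k]]
    return Counter(labels).most_common(1)[0][0]    # majority label among the k nearest

print(knn_classify((1.1, 1.0)))   # "rain": its nearest neighbours are the rain examples
```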

			PART 6
		NEURAL NETWORKS

A neural network is a model of a mathematical function from inputs to outputs, determined by the structure and parameters of the network.
It allows the network's parameters to be learned from data.
Gradient Descent is an algorithm for minimizing loss when training a neural network.
A perceptron makes decisions using a single linear layer, so it can only learn a linearly separable decision boundary.
Backpropagation is the algorithm for training a neural network with multiple layers; the sketch below shows plain gradient descent on a single unit.
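A sketch of gradient descent training a single sigmoid unit (a perceptron-style linear classifier) on made-up data, minimizing a squared-error loss; this only illustrates the update rule, not the course's implementation:

```python
import math
import random

# Made-up training data: inputs (x1, x2) and a 0/1 label (here the label is just x1)
DATA = [((0.0, 0.0), 0), ((0.0, 1.0), 0), ((1.0, 0.0), 1), ((1.0, 1.0), 1)]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0
rate = 0.5

for _ in range(2000):
    for (x1, x2), y in DATA:
        out = sigmoid(w[0] * x1 + w[1] * x2 + b)
        grad = (out - y) * out * (1 - out)   # gradient of the squared error wrt the pre-activation
        w[0] -= rate * grad * x1             # step against the gradient
        w[1] -= rate * grad * x2
        b -= rate * grad

print([round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in DATA])   # expect [0, 0, 1, 1]
```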

			 PART 7
			LANGUAGE

An n-gram is a contiguous sequence of n items (for example words) from a sample of text.
Tokenization is the task of splitting a sequence of characters into pieces (tokens).
Laplace smoothing adds 1 to each count when estimating a word's probability, so that a word missing from the training data does not drive the whole product to zero.
Term frequency is the number of times a word shows up in a document.
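A small sketch tying these terms together on a made-up corpus: naive tokenization, bigram (n = 2) counts, add-1 (Laplace) smoothed bigram probabilities, and a raw term-frequency count; all names and numbers here are illustrative.

```python
from collections import Counter

corpus = "the cat sat on the mat . the cat slept ."
tokens = corpus.split()                        # naive tokenization

unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))     # n-grams with n = 2
vocabulary = len(set(tokens))

def bigram_probability(w1, w2):
    """P(w2 | w1) with add-1 (Laplace) smoothing, so unseen pairs are never zero."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocabulary)

print(bigram_probability("the", "cat"))   # 0.3: a pair seen twice
print(bigram_probability("the", "dog"))   # 0.1: unseen, but non-zero thanks to smoothing

print(unigrams["the"])                    # term frequency of "the" in this document: 3
```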
