# Ch 4: Iterative improvement
## Simulated annealing
Idea: escape local maxima by allowing some bad moves, but gradually decrease their size and frequency.
This is like hill climbing (gradient ascent), except that worse moves are sometimes accepted.
The idea comes from annealing in glassmaking and metallurgy, where you start very hot and then slowly cool the material down.
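A minimal sketch of the idea in Python; the objective, neighbor function, starting temperature, and cooling rate below are made-up toy choices, not from the notes:

```python
import math
import random

def simulated_annealing(start, objective, neighbor, t0=10.0, cooling=0.95, steps=1000):
    """Maximize `objective`; bad moves are accepted with probability
    e^(delta/T), which shrinks as the temperature T cools."""
    current, t = start, t0
    for _ in range(steps):
        candidate = neighbor(current)
        delta = objective(candidate) - objective(current)
        # Always take improvements; sometimes take bad moves early on
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
        t *= cooling  # gradually lower the temperature
        if t < 1e-9:
            break
    return current

# Toy run: maximize f(x) = -(x - 3)^2, so the optimum is x = 3
random.seed(0)
best = simulated_annealing(
    start=0.0,
    objective=lambda x: -(x - 3) ** 2,
    neighbor=lambda x: x + random.uniform(-1, 1),
)
```

Early on the temperature is high, so almost any move is accepted (a random walk); as it cools, the search behaves more and more like pure hill climbing.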
## Beam search
Idea: keep k states instead of 1; choose the top k of all their successors.
Problem: quite often all k states end up on the same local hill. This can somewhat be overcome by choosing the k states randomly, but favoring the good ones (stochastic beam search).
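The keep-k idea can be sketched as follows; the integer objective and ±1 successor function are toy assumptions for illustration:

```python
import heapq

def beam_search(starts, successors, score, k=2, steps=15):
    """Local beam search: keep the k best states, expand all of
    their successors, and keep the k best of the combined pool."""
    beam = heapq.nlargest(k, starts, key=score)
    for _ in range(steps):
        pool = set(beam)
        for state in beam:
            pool.update(successors(state))
        beam = heapq.nlargest(k, pool, key=score)
    return max(beam, key=score)

# Toy run over the integers: score peaks at x = 7, moves are +-1
best = beam_search(
    starts=[0, 20],
    successors=lambda x: [x - 1, x + 1],
    score=lambda x: -(x - 7) ** 2,
)
```

Note how the two beam slots quickly collapse onto the same hill around x = 7, which is exactly the failure mode described above.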
## Genetic algorithms
Inspired by Charles Darwin's theory of evolution.
The algorithm is an extension of local beam search with successors generated from pairs of individuals (crossover) rather than from a successor function.
![GA overview](media/exam1/gaOverview.png)
![Genetic Algorithm Pseudo Code](media/exam1/gaAlgo.png)
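A toy sketch of the loop above; the OneMax fitness function, fitness-proportional selection, population size, and mutation rate are illustrative assumptions, not from the pseudocode:

```python
import random

def genetic_algorithm(pop_size=20, length=12, generations=60, mutation_rate=0.1):
    """Evolve bit strings toward all ones (the OneMax toy problem)."""
    fitness = sum  # fitness = number of 1 bits
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(p) + 1 for p in population]
        next_gen = []
        for _ in range(pop_size):
            # Selection: pick two parents, biased toward fitter individuals
            a, b = random.choices(population, weights=weights, k=2)
            # Crossover: successors come from a pair, not a successor function
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]
            # Mutation: occasionally flip one random bit
            if random.random() < mutation_rate:
                i = random.randrange(length)
                child[i] ^= 1
            next_gen.append(child)
        population = next_gen
    return max(population, key=fitness)

random.seed(1)
best = genetic_algorithm()
```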
# Ch 6: Constraint satisfaction problems
Example CSP problems:
- assignment
- timetabling
- hardware configuration
- spreadsheets
- factory scheduling
- floor planning
## Problem formulation
![CSP formulation ex](media/exam2/cspEx.PNG)
### Variables
The elements in the problem.
### Domains
The possible values a variable may take, drawn from its domain $D_i$; try to be mathematical when formulating.
### Constraints
Restrictions on the variables specifying which values from their domains they may have.
Types of constraints:
- Unary: constraints involving a single variable
- Binary: constraints involving pairs of variables
- Higher-order: constraints involving 3 or more variables
- Preferences: where you favor one value in the domain over another. These are mostly used for constrained optimization problems.
## Constraint graphs
Nodes in the graph are variables; arcs show constraints.
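A small hypothetical map-coloring CSP makes the pieces concrete; the region names are the usual textbook example, and each binary constraint below is one arc of the constraint graph:

```python
# Hypothetical map-coloring CSP: variables are regions, domains are
# colors, and each binary constraint (an arc in the constraint
# graph) says two adjacent regions must get different colors.
variables = ["WA", "NT", "SA", "Q"]
domains = {v: ["red", "green", "blue"] for v in variables}
constraints = [("WA", "NT"), ("WA", "SA"), ("NT", "SA"), ("NT", "Q"), ("SA", "Q")]

def consistent(assignment):
    """True if no constraint is violated by the (partial) assignment."""
    return all(assignment[a] != assignment[b]
               for a, b in constraints
               if a in assignment and b in assignment)
```

A solution is any complete assignment for which `consistent` returns True.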
## Backtracking
![Backtracking graph](media/exam2/backtracking.PNG)
### Minimum remaining values
![](media/exam2/mrv.PNG)
Choose the variable with the fewest legal values left.
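A tiny sketch of the heuristic; the variable names follow the map-coloring example, and the domains shown are assumed to already be pruned:

```python
def mrv(domains, assignment):
    """Pick the unassigned variable with the fewest legal values left."""
    unassigned = [v for v in domains if v not in assignment]
    return min(unassigned, key=lambda v: len(domains[v]))

# After pruning, SA has only one legal color left, so it is chosen next
domains = {"WA": ["red"], "SA": ["blue"], "Q": ["green", "blue", "red"]}
choice = mrv(domains, assignment={"WA": "red"})
```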
### Degree heuristic
![](media/exam2/degree.PNG)
Tie-breaker for the minimum remaining values heuristic.
Choose the variable with the most constraints on the remaining unassigned variables.
### Least constraining value
Choose the least constraining value: the one that rules out the fewest values in the remaining variables.
![lsv](media/exam2/lsv.PNG)
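A sketch of value ordering under this heuristic; the variables, neighbors, and domains below are illustrative assumptions:

```python
def least_constraining_order(var, domains, neighbors):
    """Order var's values so those that rule out the fewest
    neighbor values come first."""
    def ruled_out(value):
        return sum(value in domains[n] for n in neighbors[var])
    return sorted(domains[var], key=ruled_out)

# "red" removes one option (from NSW); "blue" removes one from each
domains = {"Q": ["red", "blue"], "SA": ["blue"], "NSW": ["red", "blue"]}
neighbors = {"Q": ["SA", "NSW"]}
order = least_constraining_order("Q", domains, neighbors)
```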
### Forward checking
Keep track of the remaining legal values for unassigned variables and terminate the search when any variable has no legal values left.
This helps reduce how many nodes in the tree you have to expand.
![forward checking](media/exam2/forwardChecking.PNG)
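The bookkeeping can be sketched as follows, again using assumed map-coloring data; pruning `value` from each neighbor assumes the constraints are all inequality constraints:

```python
def forward_check(var, value, domains, neighbors):
    """After assigning var = value, prune that value from each
    neighbor's domain; return None if any domain becomes empty."""
    pruned = {v: list(d) for v, d in domains.items()}
    for n in neighbors[var]:
        if value in pruned[n]:
            pruned[n].remove(value)
        if not pruned[n]:
            return None  # dead end: stop expanding this branch
    return pruned

domains = {"WA": ["red"], "NT": ["red", "green"], "SA": ["red", "green"]}
neighbors = {"WA": ["NT", "SA"]}
result = forward_check("WA", "red", domains, neighbors)
```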
### Constraint propagation
![](media/exam2/constraintProp.PNG)
### Arc consistency
![](media/exam2/arc.PNG)
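The standard algorithm for enforcing arc consistency is AC-3 (not named above); here is a sketch on a toy X &lt; Y problem, where each value is deleted if no value at the other end of the arc supports it:

```python
from collections import deque

def ac3(domains, constraints):
    """Make every arc consistent. constraints[(a, b)] is a
    predicate on (value_of_a, value_of_b)."""
    queue = deque(constraints)
    while queue:
        a, b = queue.popleft()
        revised = False
        for va in list(domains[a]):
            # Delete va if no value of b supports it
            if not any(constraints[(a, b)](va, vb) for vb in domains[b]):
                domains[a].remove(va)
                revised = True
        if revised:
            # a's domain shrank, so re-check arcs pointing at a
            queue.extend(arc for arc in constraints if arc[1] == a)
    return all(domains[v] for v in domains)  # False if a domain emptied

# Toy problem: X < Y with small integer domains
domains = {"X": [1, 2, 3], "Y": [1, 2, 3]}
constraints = {
    ("X", "Y"): lambda x, y: x < y,
    ("Y", "X"): lambda y, x: x < y,
}
ok = ac3(domains, constraints)
```

After propagation, 3 is pruned from X (nothing in Y is larger) and 1 is pruned from Y (nothing in X is smaller).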
### Tree-structured CSPs
Theorem: if the constraint graph has no loops, the CSP can be solved in $O(n \cdot d^2)$ time.
A general CSP takes $O(d^n)$ time.
![](media/exam2/treeCSP.PNG)
## Connections to tree search, iterative improvement
To apply hill climbing to a CSP, select any conflicted variable and then use the min-conflicts heuristic
to choose a value that violates the fewest constraints.
![](media/exam2/nQueens.PNG)
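A sketch of min-conflicts on n-queens, matching the figure above; the step limit and random tie-breaking are implementation choices, not from the notes:

```python
import random

def min_conflicts_queens(n=8, max_steps=10000):
    """n-queens by iterative improvement: board[c] is the row of the
    queen in column c; repeatedly move a conflicted queen to a
    row that violates the fewest constraints."""
    def conflicts(board, col, row):
        return sum(1 for c in range(n) if c != col and
                   (board[c] == row or abs(board[c] - row) == abs(c - col)))
    board = [random.randrange(n) for _ in range(n)]
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(board, c, board[c]) > 0]
        if not conflicted:
            return board  # no attacking pairs: solved
        col = random.choice(conflicted)  # pick any conflicted variable
        counts = [conflicts(board, col, r) for r in range(n)]
        # Min-conflicts heuristic, breaking ties at random
        best = min(counts)
        board[col] = random.choice([r for r in range(n) if counts[r] == best])
    return None

random.seed(2)
solution = min_conflicts_queens()
```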
# Ch 13: Uncertainty
## Basic theory and terminology
### Probability space
The probability space $\Omega$ is the set of all possible outcomes.
A die roll has 6 possible outcomes.
### Atomic event
An atomic event $\omega$ is a single element of the probability space:
$\omega \in \Omega$
Ex: rolling a 4.
The probability of $\omega$ lies in $[0, 1]$.
### Event
An event $A$ is any subset of the probability space $\Omega$.
The probability of an event is the sum of the probabilities of the atomic events it contains.
Ex: the probability of rolling an odd number is 1/2.
```
P(die roll odd) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 1/2
```
### Random variable
A random variable is a function from sample points to some range, e.g. the reals or Booleans.
Ex: P(Even = true)
## Prior probability
The unconditional probability of one or more events, before any new evidence is observed.
Ex: the prior probability of cloudy and fall = 0.72.
Given two variables with two possible assignments each, we could represent all of this information in a 2x2 table.
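Such a table can be sketched as a full joint distribution; the 0.72 entry matches the example above, and the other three numbers are made up so the table sums to 1:

```python
# Hypothetical full joint distribution over two Boolean-style
# variables, i.e. the 2x2 table described above
joint = {
    ("cloudy", "fall"): 0.72,
    ("cloudy", "not fall"): 0.08,
    ("not cloudy", "fall"): 0.10,
    ("not cloudy", "not fall"): 0.10,
}
total = sum(joint.values())  # must be 1
# Marginals come from summing a row or column of the table
p_cloudy = sum(p for (c, f), p in joint.items() if c == "cloudy")
```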
## Conditional probability
The probability of an event given that another event is known to have occurred.
Ex: P(tired | monday) = 0.9.
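The definition is $P(A \mid B) = P(A \wedge B) / P(B)$; a tiny worked example with made-up numbers chosen to reproduce the 0.9 above:

```python
# Conditional probability from the joint: P(A | B) = P(A and B) / P(B)
p_monday = 1 / 7
p_tired_and_monday = 0.9 * p_monday  # chosen so P(tired | monday) = 0.9
p_tired_given_monday = p_tired_and_monday / p_monday
```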
## Bayes rule
![](media/exam2/bay.PNG)
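A worked example of the rule $P(A \mid B) = P(B \mid A)\,P(A) / P(B)$, with $P(B)$ expanded by total probability; all the numbers are made up for illustration:

```python
# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)
p_a = 0.01               # prior P(A)
p_b_given_a = 0.9        # likelihood P(B | A)
p_b_given_not_a = 0.05   # P(B | not A)
# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b  # posterior, about 0.154
```

Note that even with a strong likelihood, the small prior keeps the posterior low.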
## Independence
![](media/exam2/independence.PNG)
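Independence means $P(A \wedge B) = P(A)\,P(B)$; a quick numeric check (the die example is an illustrative assumption):

```python
# A and B are independent iff P(A and B) = P(A) * P(B)
def independent(p_a, p_b, p_a_and_b, tol=1e-9):
    return abs(p_a_and_b - p_a * p_b) < tol

# Two fair coin flips are independent
coins = independent(0.5, 0.5, 0.25)
# One die roll: "even" and "at most 2" are also independent, since
# P(even) = 1/2, P(<=2) = 1/3, and P(even and <=2) = P(roll 2) = 1/6
die = independent(1 / 2, 1 / 3, 1 / 6)
```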