Summary

This document is a presentation on machine learning basics. It covers learning from examples, generalization, features, datasets, and different aspects of classification. The slides highlight concepts like Bayes' rule, misclassification costs, and decision boundaries.

Full Transcript


Machine Learning: the basics. Learning from examples.

Outline of this lecture: objects, features, measurements, datasets and feature space; traditional pattern recognition: classification; class posterior probabilities and Bayes' rule; the Bayes classifier and the Bayes error; misclassification costs.

Learning from Examples. Given some examples, we may perform clustering, outlier detection, classification, or regression on new objects. We assume that no complete physical model is known!

Generalization. We don't want to just describe the data; we want to predict for new, unseen data! Training set: all examples are labeled, and this set is used to train/develop our system. Test set: these examples cannot be used to train our system; they do not have to be labeled, but when labels are available, we can objectively evaluate our system.

Features. To do these tasks automatically, we have to encode the objects. Objects are typically encoded by defining features (e.g., shape, weight, color), collected into a feature vector x = [x1, x2, ..., xM]^T.

Datasets. When we measure the features of many objects, we obtain a dataset. For classification, each row of the dataset is the feature vector (the measurements) of one labeled object, and each column is one feature.
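The feature-vector encoding and the resulting dataset can be sketched in code. This is a minimal illustration; the feature names and values below are made up, not taken from the slides:

```python
# Each object is encoded as a feature vector x = [x1, ..., xM]^T.
# Hypothetical features: shape score, weight, color score.
x = [0.7, 120.0, 0.3]

# Measuring many objects gives a dataset: one feature vector per
# object, plus (for classification) a class label per object.
dataset = [
    ([0.7, 120.0, 0.3], "apple"),
    ([0.6, 115.0, 0.4], "apple"),
    ([0.9, 200.0, 0.8], "pear"),
]

n_objects = len(dataset)
n_features = len(dataset[0][0])
print(n_objects, n_features)  # 3 objects, 3 features each
```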
How to define features? Features reduce the objects and give a specific view of them: YOU (the user) are responsible for this choice. Good features allow for pattern recognition; bad features allow for nothing. Other approaches to representing objects (besides feature vectors) are the dissimilarity approach and structural pattern recognition (graphs). The feature approach is very well developed; the other approaches are still more in the research stage.

Noise in the measurements. The measurements will never be perfect, and objects within a class will vary. We need to apply some statistics to cover all the variations.

Measurements. Task: distinguish between 3 types of Iris flowers: Iris Setosa, Iris Versicolor, and Iris Virginica (images from Wikipedia). Measurements: sepal width, sepal length, petal width, petal length.

Objects in feature space. We can interpret the measurements as a vector x = [x1, x2, ..., xM]^T in a vector space. In principle this originates from a probability density p(x, y) over the whole feature space.
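With the four Iris measurements, each flower becomes a point in a 4-dimensional feature space, and geometric notions such as distance apply. The measurement values in this sketch are invented for illustration, not real Iris data:

```python
import math

# x = [sepal width, sepal length, petal width, petal length]
# (illustrative numbers, not actual measurements)
flower_a = [3.5, 5.1, 0.2, 1.4]
flower_b = [3.2, 7.0, 1.4, 4.7]

# Euclidean distance between the two points in feature space.
dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(flower_a, flower_b)))
print(round(dist, 2))
```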
Classification. Given labeled data, assign to each object x a class label y. In effect this splits the feature space into separate regions, one per class (y1, y2, y3 in the example figure).

The general model. The function f should give the predicted output:
That is, f maps a feature vector x to a predicted output y, f(x) = y, and the model p(y|x) should be fitted to the data!

Output of the model. For each object in the feature space, we should estimate p(y|x). In practice we fit a function f(x).

Pattern Recognition pipeline. Learning (fitting, training): from known objects, find features and find a model f(x). Applying (generalization, testing): for a new object, extract its features z, classify it with f(x) to obtain y, and evaluate.

Classification, how to do it? Given a feature and a training set, where is the blue class?
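The learning/applying pipeline can be sketched as a minimal train-then-classify loop. The nearest-class-mean rule used here is just one simple choice of model f(x), assumed for illustration, not the one the slides prescribe:

```python
def fit(train_x, train_y):
    """Learning phase: estimate one mean feature value per class."""
    means = {}
    for label in set(train_y):
        vals = [x for x, y in zip(train_x, train_y) if y == label]
        means[label] = sum(vals) / len(vals)
    return means

def f(model, x):
    """Applying phase: classify x by the nearest class mean."""
    return min(model, key=lambda label: abs(x - model[label]))

train_x = [0.1, 0.3, 2.8, 3.1]        # one feature per object
train_y = ["blue", "blue", "red", "red"]

model = fit(train_x, train_y)
print(f(model, 0.2), f(model, 3.0))   # blue red
```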
[Figure: training data from two Gaussian classes plotted along Feature 1.]

Class posterior probability. For each object we want to estimate p(blue|feature 1), or more generally the class posterior p(y|x). [Figure: the posterior p(y|x) plotted over Feature 1 for the Gaussian data.]
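For the Gaussian example, the posterior follows from Bayes' rule applied to the class-conditional densities. The means, standard deviations, and priors below are assumptions chosen for illustration, not the slides' actual parameters:

```python
import math

def gauss(x, mu, sigma):
    """Class-conditional density p(x|y) as a 1-D Gaussian."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Illustrative class models: (mean, std, prior) per class.
classes = {"blue": (0.0, 1.0, 0.5),
           "red":  (3.0, 1.0, 0.5)}

def posterior(x):
    """Bayes' rule: p(y|x) = p(x|y) p(y) / sum over y' of p(x|y') p(y')."""
    joint = {y: gauss(x, mu, s) * prior for y, (mu, s, prior) in classes.items()}
    total = sum(joint.values())
    return {y: v / total for y, v in joint.items()}

p = posterior(0.0)
print(round(p["blue"], 3))  # close to 1 far from the boundary
```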
The class posteriors sum to one over the classes: Σ_i p(y_i|x) = 1.

Classify new objects. Assign the label of the class with the largest posterior probability: assign y1 where p(y1|x) > p(y2|x), and y2 where p(y2|x) > p(y1|x). The decision boundary lies where p(y1|x) = p(y2|x), and it splits the feature space into the decision regions R1 and R2. [Figure: the two class densities along Feature 1, with the decision boundary and the regions R1 and R2 marked.]
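The decision rule, assigning the class with the largest posterior, and the boundary where the posteriors are equal can be sketched as follows. The two equal-prior, equal-variance Gaussians are illustrative assumptions; with these, the boundary sits midway between the class means:

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_y1(x):
    """Posterior p(y1|x) for two equal-prior Gaussian classes."""
    a, b = gauss(x, 0.0, 1.0), gauss(x, 3.0, 1.0)
    return a / (a + b)

def classify(x):
    """Assign the label with the largest posterior probability."""
    return "y1" if p_y1(x) > 0.5 else "y2"

# Decision boundary: p(y1|x) = p(y2|x) = 0.5.  With equal priors and
# equal variances it lies midway between the means, at x = 1.5.
print(classify(0.0), classify(3.0))  # y1 y2
print(p_y1(1.5))                     # 0.5 at the boundary
```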
