Tuesday, March 26, 2024

1121: Big Data.....

 In the previous lecture, I told you about the shift from a rationalist to an empiricist approach to artificial intelligence.

   

The rationalist approach was that you store a lot of knowledge in an AI program, add some algorithms, and then let the program solve problems; all of this was tied to microworlds.

  

The empiricist approach lets the AI program learn by itself how this or that works so that it can solve problems in new situations. An example of such a self-learning AI program is AlphaGo Zero.

   

It played more than 29 million games against itself, which resulted in such an understanding of the game of Go that it became unbeatable,

  

in other words, this AI program developed knowledge of the game by learning from a massive amount of data.
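
To make that self-play idea concrete, here is a minimal sketch in Python, scaled down from Go to the tiny game of Nim (players alternately take 1 to 3 stones from a pile; whoever takes the last stone wins). Everything in it, including the game, the value table, and the update rule, is my own illustrative assumption; the real AlphaGo Zero combines a deep neural network with Monte Carlo tree search. The principle, though, is the same: start with zero knowledge and improve only from self-generated data.

    import random
    from collections import defaultdict

    # value[(stones_left, move)] estimates how good a move is in a position.
    # The table starts empty: the program knows nothing about the game.
    value = defaultdict(float)

    def pick_move(stones, explore=0.1):
        """Pick the best-known move, with a little random exploration."""
        moves = [m for m in (1, 2, 3) if m <= stones]
        if random.random() < explore:
            return random.choice(moves)
        return max(moves, key=lambda m: value[(stones, m)])

    def self_play_game(start=15, lr=0.1):
        """Play one game against itself, then reward the winner's moves."""
        history = []                     # (player, stones, move) per turn
        stones, player = start, 0
        while stones > 0:
            move = pick_move(stones)
            history.append((player, stones, move))
            stones -= move
            player = 1 - player
        winner = 1 - player              # whoever took the last stone wins
        for p, s, m in history:
            reward = 1.0 if p == winner else -1.0
            value[(s, m)] += lr * (reward - value[(s, m)])

    for _ in range(50_000):              # "29 million games", in miniature
        self_play_game()

    # After training, the table tends to encode the known winning strategy
    # for Nim (leave a multiple of 4 stones) without any theory being given.
    for stones in range(1, 16):
        print(f"{stones:2d} stones left -> take {pick_move(stones, explore=0.0)}")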

  

In June 2008, Chris Anderson published an article in the magazine WIRED with the title "The End of Theory: The Data Deluge Makes the Scientific Method Obsolete".


From the article: "This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. 

   

Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? 

  

The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves." -end quote-

   

This is the new belief: big data instead of the old-fashioned scientific method as the way to obtain new knowledge.

   

The scientific method is a standard series of steps to arrive at new knowledge; a schematic version in code follows the list below.

[1] Make an observation
[2] Ask a question
[3] Form a hypothesis or testable explanation
[4] Make a prediction based on the hypothesis
[5] Test the prediction
[6] Iterate: use the results to make new hypotheses or predictions.
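
As a schematic illustration only, with every detail assumed for the sake of the example, the six steps can be written as a loop in Python. Here the question is how often a coin comes up heads:

    import random

    random.seed(42)

    def flip():
        """[1] The observation source: a coin that is secretly biased."""
        return random.random() < 0.7

    # [2] Ask a question: how often does this coin come up heads?
    hypothesis = 0.5                              # [3] Hypothesis: "it is fair"

    for round_no in range(5):                     # [6] Iterate
        prediction = hypothesis                   # [4] Predict the heads rate
        flips = [flip() for _ in range(1000)]
        observed = sum(flips) / len(flips)        # [5] Test against observations
        print(f"round {round_no}: predicted {prediction:.2f}, observed {observed:.2f}")
        hypothesis = observed                     # revise and go around again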

   

An obsolete method, according to Anderson. Now we use big data, and all of it is accessible in the Cloud. I must honestly confess that I hardly know what this Cloud is, except that it is built on huge data storage facilities.

    

And the new approach to getting new knowledge is to analyze billions of data points and look for correlations.
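
To give an idea of what "looking for correlations" means in practice, here is a minimal sketch in Python. The dataset is synthetic and the variable names are my own invention, purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000                        # a million records, standing in for "billions"

    temperature = rng.normal(20, 5, n)                    # a measured variable
    ice_cream = 3 * temperature + rng.normal(0, 10, n)    # depends on temperature
    shoe_size = rng.normal(42, 3, n)                      # unrelated variable

    data = {"temperature": temperature,
            "ice_cream_sales": ice_cream,
            "shoe_size": shoe_size}

    # Compute the Pearson correlation for every pair of variables and report
    # the strong ones. No theory, no hypothesis: just numbers "speaking".
    names = list(data)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = np.corrcoef(data[names[i]], data[names[j]])[0, 1]
            if abs(r) > 0.5:
                print(f"{names[i]} ~ {names[j]}: r = {r:.2f}")

The sketch will report that ice-cream sales track temperature, and it stays silent about the unrelated variable; but it cannot say why the two move together. Scaled up to billions of records, this is the approach Anderson describes.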

   

I once heard a story about scientists who used to synthesize a particular molecule in a fixed way, with an established method. They then had an AI program analyze thousands of articles on the subject.

Eventually, the AI program suggested a method to create this specific molecule more easily and more quickly.

   

It all sounds promising. Is big data the future? Maybe, but we still have some questions.....

   

Thank you for your attention...

   

