  • Because he was the CEO of a company in a critical position to define the future of the economy. The tech sector is currently the largest and most influential of all economic sectors, and by tech here I mean the digital world. There is absolutely no comparable sector in importance at the moment, not even pharma.

    It literally defines the modern economy, and within that field OpenAI is an incredibly important company for the future relative success and power of the big tech companies.

    That is why it is so important for the world economy.


  • In the easiest example of a neuron in an artificial neural network, you take an image, multiply every pixel by some weight, and apply a very simple nonlinear transformation at the end. Any transformation is fine, but usually they are pretty trivial. Then you mix and match these neurons to create a neural network; the more complex the task, the more additional operations are added. (A minimal code sketch of a single such neuron is at the end of this comment.)

    In our brain, a neuron binds neurotransmitters that trigger an electrical signal; this signal is modulated and finally triggers the release of a certain quantity of certain neurotransmitters at the other end of the neuron. The detailed, quantitative mechanisms are still not known. These neurons are wired together into an extremely complex neural network, the details of which are also still unknown.

    Artificial neural networks started as extremely coarse simulations of real neural networks, just toy models to illustrate the concept. Since then they have diverged, evolving in a direction completely unrelated to real neural networks and becoming their own thing.
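    As a rough illustration of the first paragraph, here is a minimal Python/NumPy sketch of a single artificial neuron; the image size, the random weights, and the choice of ReLU as the nonlinearity are made up purely for the example:

        import numpy as np

        def neuron(image, weights, bias):
            # Multiply every pixel by a weight and sum them up
            z = np.dot(image.ravel(), weights) + bias
            # Apply a very simple nonlinear transformation (here: ReLU)
            return max(0.0, z)

        # Made-up 4x4 "image" and random weights, just to show the call
        rng = np.random.default_rng(0)
        image = rng.random((4, 4))
        weights = rng.normal(size=16)
        print(neuron(image, weights, bias=0.1))

    A real network just wires many of these together, layer after layer, with the weights learned from data instead of drawn at random.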


  • No, what you describe is a basic decision tree: arguably the simplest possible ML algorithm, but it is not used as-is in practice anywhere. Usually you find "forests" of more complex trees; they cannot be used for generation, but they are very powerful for labeling or regression (ELI5: predicting some number).

    Generative models are instead based on multiple transformations of images or sentences through extremely complex, nested chains of vector functions that can extract relevant information (concepts, conceptual similarities, and so on).

    In practice (ELI5), the input is turned into a vector and passed through a long chain of vector multiplications and simple mathematical transformations until you get an output that, in the vast majority of cases, is original, i.e. not present in the training data. Non-original outputs are possible in case of certain "issues" in the training dataset or the training process (unless reproduction is explicitly asked for). A toy sketch contrasting the two approaches follows at the end of this comment.

    In our brain there are no if/else branches, only electrical signals being modulated and transformed, which is conceptually closer to generative models than to a decision tree.

    In practice, however, our brain works very differently from generative models.
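    To make the contrast concrete, here is a toy Python/NumPy sketch; the feature names, thresholds, labels, and random weights are invented purely for illustration. The first function is a hand-written decision tree (pure if/else), the second is a miniature chain of vector multiplications and trivial nonlinearities of the kind generative models stack by the thousands:

        import numpy as np

        def tiny_decision_tree(petal_length, petal_width):
            # Hard-coded if/else rules: the output can only be one of a few fixed labels
            if petal_length < 2.5:
                return "setosa"
            elif petal_width < 1.8:
                return "versicolor"
            else:
                return "virginica"

        def tiny_generative_chain(x, layers):
            # A chain of vector multiplications plus a trivial nonlinearity (tanh);
            # the output is a freshly computed vector, not a label from a fixed list
            for W in layers:
                x = np.tanh(W @ x)
            return x

        rng = np.random.default_rng(0)
        layers = [rng.normal(size=(8, 8)) for _ in range(3)]      # made-up weights
        print(tiny_decision_tree(1.4, 0.2))                       # one of the fixed labels
        print(tiny_generative_chain(rng.normal(size=8), layers))  # a new 8-dimensional vector

    The decision tree can only route an input to one of its pre-written leaves, while the chain of vector operations produces an output that was never written down anywhere, which is the sense in which generative outputs are usually "original".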