Posts

Which areas was Stella trained in?

Stella studied, over four months, more than thirty-two thousand articles. These articles were classified into sixty-one categories. Here are the groups (articles per category):

    System                60
    Trivia               202
    Politics             296
    Nationality          164
    Football              71
    Nutrition            402
    University           964
    Engineering           29
    Energy              2036
    Electron.Eng         336
    Data Analyst          76
    Media               1002
    Philosophy            17
    Religion             208
    Law                  105
    Sexuality            528
    Physics             1384
    Prizes               470
    Literature            71
    Chemistry             50
    Mathematics          501
    Artif.Intelligence   399
    Science             1898
    Technology             5
    Biology              669
    Genetics             347
    Medicine             893
    Agriculture           10
    Eng.Courses           38
    Management          1876
    Anthropology          33
    Sociology            417
    Psychology           385
    Geography            559
    Astronomy             28
    History
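
Purely as an illustration of how a breakdown like the one above can be produced (the real classification pipeline is not shown here), a few lines of Python with collections.Counter are enough to tally labelled articles into such a table; the example articles below are hypothetical.

    from collections import Counter

    # Hypothetical (title, category) pairs standing in for the labelled corpus.
    articles = [
        ("Marie Curie and radium", "Physics"),
        ("The ENIAC at Penn", "Technology"),
        ("Mendel's peas", "Genetics"),
        ("Dark matter surveys", "Physics"),
    ]

    # Tally the number of articles per category and print them, largest first.
    counts = Counter(category for _title, category in articles)
    for category, n in counts.most_common():
        print(f"{category:<20} {n}")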

The Evolution of Computers: From ENIAC to Quantum Computing

Introduction: Computers have come a long way since the days of their inception. From the massive size of the ENIAC (Electronic Numerical Integrator and Computer) to the revolutionary potential of quantum computing, the evolution of computers has been nothing short of remarkable. In this article, we will take a journey through time and explore the major milestones in the development of computers, highlighting their transformative impact on various aspects of our lives.

1. The Birth of ENIAC and the Dawn of Modern Computing: The ENIAC, the world's first general-purpose electronic computer, was completed at the University of Pennsylvania in 1945. Weighing over 27 tons and occupying a space of 1,500 square feet, the ENIAC represented a significant leap in computing capabilities. It was programmed by physically rewiring its circuits, requiring an immense amount of effort and time. However, its

Motivation

So as not to bore the people to whom I normally send the evolution of my work by email, the following link becomes the repository of results, as well as a place for discussion of the methods used. It's a public blog, so "we" will have to be careful about what is published. At the moment it has two editors: Luis Ferreira and Stella Zoe. https://stellazoehistory.blogspot.com/ "A blog dedicated to AI in promoting the history of science and technology for young and old alike, in particular personal and private issues of the life and work of scientists." For context, other historical elements were added when training the model. You may have direct access to test the model, although this is a bit risky, because the output, even when coherent, can be unpredictable.

Who is Stella Zoe?

Stella Zoe is a junior researcher finishing her PhD in theoretical physics. Born in 1989 in San Jose, CA, she now lives in Fátima, Portugal. She loves biking, reading books on astrophysics, and spending time with her cat, Luna. Stella is deeply passionate about exploring the mysteries of the universe and has a particular interest in the nature of dark matter. She spends most of her days immersed in research, conducting experiments and analyzing data in order to further our understanding of the cosmos. Despite the challenges she faces in her work, Stella remains dedicated and determined, always pushing the boundaries of what we know and seeking to uncover new insights into the fundamental nature of our universe.

A space dedicated to the history of science and technology, supported by AI.

Stella space is a blog where the history of science and technology will be interleaved with questions about Artificial Intelligence targeted at Natural Language Processing (NLP). The current model occupies 1.5 GB. It was trained on an NVIDIA RTX 2070 board with 8 GB of RAM over 20 hours. The training used a database of 33,673 questions and 74,195 answers over 48 epochs. Its structure has some similarities with GPT-2 from OpenAI, although that model is approx. 500 MB. The answers-to-questions ratio is approx. 2.203. The next experiments will use four machines, three with RTX 2070 boards installed and a fourth with an NVIDIA V100 with 24 GB, aiming to reduce training time and energy consumption, or to allow a bigger database in the same amount of time. The database is mainly about the life and work of STEM scientists: their problems, families, social and political context, technological limitations and how they overcame them. Many were Nobel Prize recipients in physics, ch
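
The post does not say which software was used for training; purely as an illustration, here is a minimal sketch of what fine-tuning a GPT-2-style model on a question/answer text corpus might look like, assuming the Hugging Face transformers and datasets libraries. The file name stella_qa.txt, the batch size and the mixed-precision flag are illustrative assumptions, not the real configuration; only the 48-epoch figure comes from the post.

    # Minimal sketch, NOT the actual Stella training pipeline.
    # Assumes Hugging Face transformers + datasets and a plain-text file
    # "stella_qa.txt" (hypothetical) with one question/answer pair per line.
    from datasets import load_dataset
    from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                              GPT2TokenizerFast, Trainer, TrainingArguments)

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Load the raw text corpus and tokenize it line by line.
    dataset = load_dataset("text", data_files={"train": "stella_qa.txt"})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=512)

    train_set = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

    args = TrainingArguments(
        output_dir="stella-model",
        num_train_epochs=48,            # the post reports 48 epochs
        per_device_train_batch_size=2,  # small batch to fit an 8 GB GPU (assumption)
        fp16=True,                      # mixed precision; assumes a CUDA GPU is available
    )

    # Causal language modelling: the collator builds the shifted labels for us.
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=train_set,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()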