November 30, 2024 - what a day to party! It’s the second birthday of ChatGPT, and honestly, it deserves all the confetti and cake we can throw at it. This clever virtual assistant, dreamed up by OpenAI and the genius squad led by Sam Altman, is far more than just a work buddy or a study sidekick. Over these two years, it’s become our go-to pal, an endless well of ideas, and even a virtual shoulder to cry on when life gets tough. So today, let’s give a big cheer for how ChatGPT has jazzed up our lives - and dream a little about the brighter, AI-fueled future it’s lighting up for us all. 🎉

A Revolutionary Beginning

When ChatGPT was introduced in November 2022, it quickly became apparent that this was no ordinary AI. Designed to communicate with humans in a natural, engaging, and insightful manner, it showcased not only technical sophistication but also a thoughtful approach to fostering meaningful connections. This was a product born not just of cutting-edge technology but of a v...
Ada Lovelace, born Augusta Ada Byron in 1815, was an English mathematician and writer, known for her work on Charles Babbage's proposed Analytical Engine, an early mechanical general-purpose computer. She is often considered the world's first computer programmer. She was the only legitimate child of the poet Lord Byron and his wife, Anne Isabella Milbanke. Her parents separated shortly after her birth, and her father had little involvement in her upbringing. However, her mother was determined to provide her with a rigorous education in mathematics and science, hoping to counter any poetic or imaginative tendencies inherited from Lord Byron. Ada Lovelace's intellectual abilities flourished: from a young age she displayed a natural talent and curiosity for many subjects, and she developed a keen interest in mathematics and logic. As she pursued her education, that interest only grew, and she dedicated herself to studying these...
Stella Space is a blog where the history of science and technology is interleaved with questions about Artificial Intelligence, targeted at Natural Language Processing (NLP). The current model occupies 1.5 GB. It was trained on an NVIDIA RTX 2070 board with 8 GB of RAM for 20 hours, using a database of 33,673 questions and 74,195 answers over 48 epochs. Its structure has some similarities with GPT-2 from OpenAI, although that model is approx. 500 MB. The answers/questions ratio is approx. 2.203. The next experiments will be made with four machines, three with RTX 2070 boards installed and a fourth with an NVIDIA V100 with 24 GB, aiming to reduce training time and energy consumption, or to allow a bigger database in the same time. The database is mainly about the life and work of STEM scientists: their problems, families, social and political context, the technological limitations they faced and how they overcame them. Many were Nobel Prize recipients in physi...
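The quoted answers/questions ratio follows directly from the dataset counts given above. A minimal sketch (the counts come from the text of this post, not from querying the actual database):

```python
# Dataset counts as stated in the post (assumed accurate).
questions = 33673
answers = 74195

# Average number of answers per question in the training database.
ratio = answers / questions
print(f"answers/questions ratio: {ratio:.3f}")  # prints 2.203
```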