OpenAI's latest text-generating transformer model, GPT-2, produces output so difficult to differentiate from that of a human that they won't release the trained model into the public domain.
If there is code for this video, please refer to the YouTube notes.
MORE: Blog or code: http://www.viralml.com/video-content.html?fm=yt&v=H1lncbq8NC0
Signup for my newsletter and more: http://www.viralml.com
Connect on Twitter: https://twitter.com/amunategui

My books on Amazon:
The Little Book of Fundamental Indicators: Hands-On Market Analysis with Python: Find Your Market Bearings with Python, Jupyter Notebooks, and Freely Available Data: https://amzn.to/2DERG3d
Monetizing Machine Learning: Quickly Turn Python ML Ideas into Web Applications on the Serverless Cloud: https://amzn.to/2PV3GCV
Grow Your Web Brand, Visibility & Traffic Organically: 5 Years of amunategui.github.io and the Lessons I Learned from Growing My Online Community from the Ground Up: https://amzn.to/2JDEU91
Fringe Tactics - Finding Motivation in Unusual Places: Alternative Ways of Coaxing Motivation Using Raw Inspiration, Fear, and In-Your-Face Logic: https://amzn.to/2DYWQas
Create Income Streams with Online Classes: Design Classes That Generate Long-Term Revenue: https://amzn.to/2VToEHK
Defense Against The Dark Digital Attacks: How to Protect Your Identity and Workflow in 2019: https://amzn.to/2Jw1AYS

Articles:
OpenAI Blog Entry: Better Language Models and Their Implications: https://blog.openai.com/better-language-models
Sunspring | A Sci-Fi Short Film Starring Thomas Middleditch: https://www.youtube.com/watch?v=LY7x2Ihqjmc

Full transcript:
Hello Friends! Deemed too dangerous, OpenAI won't release their latest text-generating model, GPT-2. OpenAI - the organization partly founded by Elon Musk - just posted a blog piece entitled Better Language Models and Their Implications, in which they trained a large Generative Pre-trained Transformer model with 1.5 billion parameters on a dataset of 8 million web pages and named it GPT-2.
They say it is so difficult to differentiate its output from a human's that they did not release the trained model into the public domain. This is Manuel Amunategui and welcome to ViralML!

AI text generator not released over concerns about its implications (openai.com): https://blog.openai.com/better-language-models/#sample5

Here is a sample. Human-written starter piece: "In a shocking finding, scientist discovered a herd of unicorns living in a remote, previously unexplored valley, in the Andes Mountains. Even more surprising to the researchers was the fact that the unicorns spoke perfect English."

MODEL COMPLETION: "The scientist named the population, after their distinctive horn, Ovid's Unicorn. These four-horned, silver-white unicorns were previously unknown to science. Now, after almost two centuries, the mystery of what sparked this odd phenomenon is finally solved. Dr. Jorge Pérez, an evolutionary biologist from the University of La Paz, and several companions were exploring the Andes Mountains when they found a small valley, with no other animals or humans. Pérez noticed that the valley had what appeared to be a natural fountain, surrounded by two peaks of rock and silver snow. Pérez and his friends were astonished to see the unicorn herd..."

This is interesting on many levels. These transformer models are becoming very popular compared to regular recurrent or LSTM models. They are still neural networks, but they can be parallelized during training, so as long as you have the computational power, you can learn on enormous text corpora.

Remember the movie Sunspring with Thomas Middleditch from Silicon Valley? The script was made using recurrent neural networks trained on sci-fi screenplays: Sunspring | A Sci-Fi Short Film Starring Thomas Middleditch: https://www.youtube.com/watch?v=LY7x2Ihqjmc It's completely absurd and awesome. So is the GPT-2 model. The problem with both examples is that one is overfitting while the other isn't.
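To make the parallelization point concrete, here is a minimal NumPy sketch of the self-attention operation at the heart of transformers like GPT-2. Unlike an RNN, which must step through tokens one at a time, attention relates every position to every other position in a single matrix product. This is an illustrative toy (random embeddings, single head, no masking or learned weights), not OpenAI's implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attend every position to every other position in one matrix product."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq, seq) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the key axis
    return weights @ V                                # weighted mix of value vectors

# Toy sequence of 4 token embeddings, dimension 8
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)           # self-attention: Q = K = V = x
print(out.shape)                                      # (4, 8)
```

Because all positions are computed at once with no sequential loop, the work parallelizes across GPUs, which is what lets these models train on corpora of millions of web pages.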
OpenAI's GPT-2 is overfitting, thus returning realistic-sounding text, maybe even plagiarizing some of those websites, while Sunspring's model is underfitting, thus sounding absurd. I know, you, like me, are waiting for AI to write stories that aren't judged on whether they sound real or not, but that show new, original, even primordial thinking. Neither example shown today does that. We need poetry that moves us, stories that captivate us; we want to discover new literary ground. But as long as all we're doing is finding faster ways of duplicating the human knowledge base, we're not gonna get there. Well, stay tuned, we'll see what comes next...

CATEGORY:StateOfIndustry