Talk to Transformer - OpenAI Text Completion Based On 345 Million Parameters
Screenshots
Hunter's comment
Recently I've been stumbling upon more and more neural networks and other AI-related projects, and I'm happy to see all the developments. Talk to Transformer is based on GPT-2 by OpenAI (backed by Peter Thiel, Elon Musk, etc.).
Just Add A Little Text
We’ve trained a large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarization—all without task-specific training.
The Results Are Great For Generic Phrases
Just try a few phrases and you'll see some impressive results: almost paragraph-sized texts generated from your tiny bit of input. Don't try to be too clever, though. Neural networks aren't conscious human beings :P
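If you'd rather prompt the model yourself instead of through the website, a minimal sketch is below. It assumes the Hugging Face `transformers` library and its `gpt2-medium` checkpoint (roughly the 345M-parameter model this post describes); the prompt text is just an example, and this is not the original OpenAI code.

```python
# Minimal sketch: complete a short prompt with a ~345M-parameter GPT-2 model.
# Assumes the Hugging Face `transformers` library and the `gpt2-medium` checkpoint.
from transformers import pipeline

# Downloads the model on first run (~1.4 GB).
generator = pipeline("text-generation", model="gpt2-medium")

prompt = "The robot slowly opened the door and"  # any short phrase works
result = generator(prompt, max_length=60, num_return_sequences=1)

# The pipeline returns the prompt followed by the generated continuation.
print(result[0]["generated_text"])
```

Because sampling is random, each run produces a different continuation, which is exactly the behavior you see on the site.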
Link
This is posted on Steemhunt - A place where you can dig products and earn STEEM.