Russia wants to cut itself off from the global internet. Here’s what that really means.

In the next two weeks, Russia is planning to attempt something no other country has tried before: testing whether it can disconnect itself electronically from the rest of the world while keeping the internet running for its citizens. This means it will have to route all of its data internally rather than relying on servers abroad. The test is key to a proposed “sovereign internet” law currently working its way through Russia’s government. Though the bill has stalled in parliament for now, it looks likely to eventually be voted through and signed into law by President Vladimir Putin. Read More

#russia

AI Business School

Business leaders can use AI Business School to gain specific, practical knowledge for defining and implementing their AI strategy. Hear directly from industry experts on how to foster an “AI-ready” culture and learn how to use AI responsibly and with confidence. Read More

#artificial-intelligence, #videos

Language Models are Unsupervised Multitask Learners

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets. We demonstrate that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText. When conditioned on a document plus questions, the answers generated by the language model reach 55 F1 on the CoQA dataset – matching or exceeding the performance of 3 out of 4 baseline systems without using the 127,000+ training examples. The capacity of the language model is essential to the success of zero-shot task transfer, and increasing it improves performance in a log-linear fashion across tasks. Our largest model, GPT-2, is a 1.5B-parameter Transformer that achieves state-of-the-art results on 7 out of 8 tested language modeling datasets in a zero-shot setting but still underfits WebText. Samples from the model reflect these improvements and contain coherent paragraphs of text. These findings suggest a promising path towards building language processing systems that learn to perform tasks from their naturally occurring demonstrations. Read More

#neural-networks, #nlp
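The zero-shot setup the abstract describes – conditioning the model on a document plus questions and reading the completion as the answer – can be sketched as a simple prompt builder. The function name and exact Q/A layout below are illustrative assumptions, not the paper's verbatim format:

```python
def build_zero_shot_prompt(document, qa_history, question):
    """Concatenate a document, prior Q/A turns, and a final question,
    ending with "A:" so a language model completes the answer.

    A minimal sketch of the conditioning format; the prompt layout
    actually used in the paper may differ.
    """
    lines = [document.strip(), ""]
    for q, a in qa_history:
        lines.append(f"Q: {q}")
        lines.append(f"A: {a}")
    lines.append(f"Q: {question}")
    lines.append("A:")  # the model's continuation is taken as the answer
    return "\n".join(lines)


prompt = build_zero_shot_prompt(
    "Tesla was born in 1856 in Smiljan.",
    [("Who was born in Smiljan?", "Tesla")],
    "When was he born?",
)
# Feed `prompt` to a language model and decode until a newline;
# no CoQA training examples are used, hence "zero-shot".
```

No gradient updates or task-specific heads are involved; the task is specified entirely through the text the model is conditioned on.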

Convolutional Neural Networks – The Math of Intelligence

#neural-networks, #videos